WorldWideScience

Sample records for reliable experimental approach

  1. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque(R) drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  2. Approach to reliability assessment

    International Nuclear Information System (INIS)

    Green, A.E.; Bourne, A.J.

    1975-01-01

Experience has shown that reliability assessments can play an important role in the early design and subsequent operation of technological systems where reliability is at a premium. The approaches to and techniques for such assessments, which are outlined in the paper, have been successfully applied in a variety of applications ranging from individual equipment to large and complex systems. The general approach involves the logical and systematic establishment of the purpose, performance requirements and reliability criteria of systems. This is followed by an appraisal of likely system achievement based on an understanding of different types of variational behavior. A fundamental reliability model emerges from the correlation between the appropriate Q and H functions for performance requirement and achievement. This model may cover the complete spectrum of performance behavior in all the system dimensions.

  3. New Approaches to Reliability Assessment

    DEFF Research Database (Denmark)

    Ma, Ke; Wang, Huai; Blaabjerg, Frede

    2016-01-01

    of energy. New approaches for reliability assessment are being taken in the design phase of power electronics systems based on the physics-of-failure in components. In this approach, many new methods, such as multidisciplinary simulation tools, strength testing of components, translation of mission profiles......, and statistical analysis, are involved to enable better prediction and design of reliability for products. This article gives an overview of the new design flow in the reliability engineering of power electronics from the system-level point of view and discusses some of the emerging needs for the technology...

  4. Experimental Test and Simulations on a Linear Generator-Based Prototype of a Wave Energy Conversion System Designed with a Reliability-Oriented Approach

    Directory of Open Access Journals (Sweden)

    Valeria Boscaino

    2017-01-01

Full Text Available In this paper, we propose a reliability-oriented design of a linear generator-based prototype of a wave energy conversion (WEC) system, useful for the production of hydrogen in a sheltered water area such as the Mediterranean Sea. The hydrogen production has been confirmed by extensive experimental testing and simulation. The system design aims to enhance robustness and reliability and is based on an analysis of the main WEC failures reported in the literature. The results of this analysis led to improvements that were applied to a WEC system prototype for hydrogen production and storage. The proposed WEC system includes the electrical linear generator, the power conversion system, and a sea-water electrolyzer. A modular architecture is conceived to allow easy extension of the power capability of the marine plant. The experimental results obtained on the permanent magnet linear electric generator allowed identification of the stator winding typology and, consequently, sizing of the power electronics system. The produced hydrogen has supplied a low-power fuel cell stack directly connected to the hydrogen output of the electrolyzer. The small-scale prototype is designed to be installed, in the near future, in the Mediterranean Sea. As shown by experimental and simulation results, the small-scale prototype is suitable for hydrogen production and storage from sea water in this area.

  5. Experimental approaches and applications

    CERN Document Server

    Crasemann, Bernd

    1975-01-01

    Atomic Inner-Shell Processes, Volume II: Experimental Approaches and Applications focuses on the physics of atomic inner shells, with emphasis on experimental aspects including the use of radioactive atoms for studies of atomic transition probabilities. Surveys of modern techniques of electron and photon spectrometry are also presented, and selected practical applications of inner-shell processes are outlined. Comprised of six chapters, this volume begins with an overview of the general principles underlying the experimental techniques that make use of radioactive isotopes for inner-sh

  6. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    CIE 2015 August 2-5, 2015, Boston, Massachusetts, USA [DRAFT] DETC2015-46982 DEVELOPMENT OF A CONSERVATIVE MODEL VALIDATION APPROACH FOR RELIABLE...obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the...3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account

  7. A reliability program approach to operational safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1985-01-01

    A Reliability Program (RP) model based on proven reliability techniques is being formulated for potential application in the nuclear power industry. Methods employed under NASA and military direction, commercial airline and related FAA programs were surveyed and a review of current nuclear risk-dominant issues conducted. The need for a reliability approach to address dependent system failures, operating and emergency procedures and human performance, and develop a plant-specific performance data base for safety decision making is demonstrated. Current research has concentrated on developing a Reliability Program approach for the operating phase of a nuclear plant's lifecycle. The approach incorporates performance monitoring and evaluation activities with dedicated tasks that integrate these activities with operation, surveillance, and maintenance of the plant. The detection, root-cause evaluation and before-the-fact correction of incipient or actual systems failures as a mechanism for maintaining plant safety is a major objective of the Reliability Program. (orig./HP)

  8. An approach for assessing human decision reliability

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-01-01

This paper presents a method to study human reliability in decision situations related to nuclear power plant disturbances. Decisions often play a significant role in the handling of emergency situations. The method may be applied to probabilistic safety assessments (PSAs) in cases where decision making is an important dimension of an accident sequence; such situations are frequent, e.g., in accident management. In this paper, a modelling approach for decision reliability studies is first proposed. Then, a case study with two decision situations with relatively different characteristics is presented, and the qualitative and quantitative findings of the study are discussed. In very simple decision cases with time pressure, time reliability correlation proved to be a feasible reliability modelling method. In all other decision situations, more advanced probabilistic decision models have to be used. Finally, decision probability assessment using simulator run results and expert judgement is presented.
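The time reliability correlation mentioned in the abstract can be illustrated with a minimal sketch. Assuming operator response time is lognormally distributed (a common assumption in human reliability analysis, not a claim about this paper's actual model; all numbers below are invented), the probability of failing to respond within the available time window is the lognormal survival function:

```python
import math

def nonresponse_probability(t_available, median, sigma):
    """P(response time > t_available) for a lognormal response-time model.

    median: median response time (minutes); sigma: log-scale spread.
    """
    if t_available <= 0:
        return 1.0
    z = (math.log(t_available) - math.log(median)) / sigma
    # Standard normal survival function via erf.
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

# Illustrative case: median crew response 5 min, 30 min available.
hep = nonresponse_probability(30.0, median=5.0, sigma=0.8)
```

At the median response time the non-response probability is exactly 0.5, and it decays as the available time window grows, which is the qualitative shape a time reliability correlation encodes.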

  9. Experimental research of fuel element reliability

    International Nuclear Information System (INIS)

    Cech, B.; Novak, J.; Chamrad, B.

    1980-01-01

    The rate and extent of the damage of the can integrity for fission products is the basic criterion of reliability. The extent of damage is measurable by the fission product leakage into the reactor coolant circuit. An analysis is made of the causes of the fuel element can damage and a model is proposed for testing fuel element reliability. Special experiments should be carried out to assess partial processes, such as heat transfer and fuel element surface temperature, fission gas liberation and pressure changes inside the element, corrosion weakening of the can wall, can deformation as a result of mechanical interactions. The irradiation probe for reliability testing of fuel elements is described. (M.S.)

  10. Reliability Approach of a Compressor System using Reliability Block ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... This paper presents a reliability analysis of such a system using reliability ... Keywords-compressor system, reliability, reliability block diagram, RBD .... the same structure has been kept with the three subsystems: air flow, oil flow and .... and Safety in Engineering Design", Springer, 2009. [3] P. O'Connor ...
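The reliability block diagram (RBD) evaluation described in this record reduces to products for series structures and complements of products for parallel structures. A minimal sketch (the subsystem names and reliability values below are illustrative assumptions, not figures from the paper):

```python
from functools import reduce

def series(reliabilities):
    """A series RBD works only if every block works."""
    return reduce(lambda acc, r: acc * r, reliabilities, 1.0)

def parallel(reliabilities):
    """A parallel RBD fails only if every block fails."""
    return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), reliabilities, 1.0)

# Hypothetical compressor subsystems in series; the oil-flow
# subsystem is modeled with a redundant (parallel) pump pair.
air_flow = 0.95
oil_flow = parallel([0.90, 0.90])  # 1 - 0.1 * 0.1 = 0.99
control = 0.98
system_reliability = series([air_flow, oil_flow, control])
```

The two helpers compose recursively, so any series-parallel RBD can be evaluated by nesting them.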

  11. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where...... the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena...... identification. Application of the proposed method can be found in many real world systems....
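The effect of statistical dependence that the record's structural reliability method accounts for can be seen in a small Monte Carlo sketch (the reliability indices and correlation below are invented for illustration): positively correlated component safety margins make a series system more reliable than the independence assumption predicts.

```python
import math
import random

random.seed(42)

def series_reliability(beta1, beta2, rho, n=200_000):
    """Monte Carlo reliability of a 2-component series system whose
    component safety margins are correlated standard normals.
    Component i fails when its margin Z_i < -beta_i."""
    ok = 0
    for _ in range(n):
        u1, u2 = random.gauss(0, 1), random.gauss(0, 1)
        z1 = u1
        z2 = rho * u1 + math.sqrt(1 - rho * rho) * u2  # Cholesky step
        if z1 > -beta1 and z2 > -beta2:
            ok += 1
    return ok / n

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

r_corr = series_reliability(1.5, 1.5, rho=0.8)
r_indep = phi(1.5) ** 2  # what the independence assumption would give
```

With strongly correlated margins the two components tend to fail together, so the probability that at least one fails (which is what kills a series system) is lower than under independence.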

  12. Inverse Reliability Task: Artificial Neural Networks and Reliability-Based Optimization Approaches

    OpenAIRE

Lehký, David; Slowik, Ondřej; Novák, Drahomír

    2014-01-01

Part 7: Genetic Algorithms; International audience; The paper presents two alternative approaches to solving the inverse reliability task – determining the design parameters that achieve desired target reliabilities. The first approach is based on utilization of artificial neural networks and small-sample simulation (Latin hypercube sampling). The second approach treats the inverse reliability task as a reliability-based optimization task using the double-loop method, again with small-sample simulation. Efficie...

  13. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed, along with the advantages and disadvantages of each. Practical advice on how to assess reliability using open source software is provided.
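One of the single-administration reliability coefficients such a chapter typically covers, Cronbach's alpha, takes only a few lines with the standard library. A minimal sketch (the tiny data set is made up for illustration):

```python
from statistics import variance

def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per test item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Three items that rank five respondents identically -> alpha = 1.0.
perfect = [[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]
```

Because the same variance estimator is used in numerator and denominator, the choice of sample versus population variance cancels in the ratio.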

  14. Experimental design a chemometric approach

    CERN Document Server

    Deming, SN

    1987-01-01

    Now available in a paperback edition is a book which has been described as ``...an exceptionally lucid, easy-to-read presentation... would be an excellent addition to the collection of every analytical chemist. I recommend it with great enthusiasm.'' (Analytical Chemistry). Unlike most current textbooks, it approaches experimental design from the point of view of the experimenter, rather than that of the statistician. As the reviewer in `Analytical Chemistry' went on to say: ``Deming and Morgan should be given high praise for bringing the principles of experimental design to the level of the p

  15. Novel approach for evaluation of service reliability for electricity customers

    Institute of Scientific and Technical Information of China (English)

Jiang, John N.

    2009-01-01

Understanding the value of reliability to electricity customers is important for market-based reliability management. This paper proposes a novel approach to evaluating reliability for electricity customers by using an indifference curve between economic compensation for power interruption and service reliability of electricity. The indifference curve is formed by calculating different network expansion planning schemes for different customer reliability requirements. This reveals the economic value of different reliability levels for electricity customers, so that reliability based on a market supply-demand mechanism can be established and economic signals can be provided for reliability management and enhancement.

  16. Different Reliability Assessment Approaches for Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kramer, Morten Mejlhede; Sørensen, John Dalsgaard

    2015-01-01

    Reliability assessments are of importance for wave energy converters (WECs) due to the fact that accessibility might be limited in case of failure and maintenance. These failure rates can be adapted by reliability considerations. There are two different approaches to how reliability can...

  17. Assessing high reliability via Bayesian approach and accelerated tests

    International Nuclear Information System (INIS)

    Erto, Pasquale; Giorgio, Massimiliano

    2002-01-01

Sometimes the assessment of very high reliability levels is difficult for the following main reasons: - the high reliability level of each item makes it impossible to obtain, in a reasonably short time, a sufficient number of failures; - the high cost of the high reliability items to submit to life tests makes it unfeasible to collect enough data for 'classical' statistical analyses. In this context, the paper presents a Bayesian solution to the problem of estimating the parameters of the Weibull-inverse power law model on the basis of a limited number (say six) of life tests, carried out at different stress levels, all higher than the normal one. The over-stressed (i.e. accelerated) tests allow the use of experimental data obtained in a reasonably short time. The Bayesian approach makes it possible to reduce the required number of failures by adding the available a priori engineers' knowledge to the failure information. This engineers' involvement conforms to the most advanced management policy, which aims at involving everyone's commitment in order to obtain total quality. A Monte Carlo study of the non-asymptotic properties of the proposed estimators and a comparison with the properties of maximum likelihood estimators conclude the work.
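The record's full Weibull-inverse power law analysis is beyond a few lines, but the core Bayesian idea (prior engineering knowledge plus a handful of accelerated failures) can be sketched with a deliberately simplified model: shape fixed at 1 (exponential lifetimes), a known inverse-power-law acceleration exponent, and a conjugate gamma prior on the use-level failure rate. Every number below is an illustrative assumption, not data from the paper.

```python
def gamma_posterior(prior_a, prior_b, failures):
    """Gamma(a, b) prior on the use-level failure rate, updated with
    accelerated-test failures. Each failure is (time, stress_ratio);
    inverse power law: equivalent use-level time = time * ratio**n_exp."""
    n_exp = 2.0  # assumed inverse-power-law exponent
    a = prior_a + len(failures)
    b = prior_b + sum(t * (ratio ** n_exp) for t, ratio in failures)
    return a, b  # posterior mean failure rate = a / b

# Six hypothetical failures at 2x and 3x normal stress (hours, ratio).
data = [(120, 2), (150, 2), (90, 2), (40, 3), (55, 3), (35, 3)]
a_post, b_post = gamma_posterior(prior_a=1.0, prior_b=5000.0, failures=data)
rate = a_post / b_post  # point estimate of the use-level failure rate
```

The prior (a, b) encodes the engineers' a priori knowledge as "a pseudo-failures in b equivalent hours"; the accelerated data then shifts the posterior exactly as the abstract describes, without requiring failures at normal stress.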

  18. A Latent Class Approach to Estimating Test-Score Reliability

    Science.gov (United States)

    van der Ark, L. Andries; van der Palm, Daniel W.; Sijtsma, Klaas

    2011-01-01

    This study presents a general framework for single-administration reliability methods, such as Cronbach's alpha, Guttman's lambda-2, and method MS. This general framework was used to derive a new approach to estimating test-score reliability by means of the unrestricted latent class model. This new approach is the latent class reliability…

  19. Reliability-based optimal structural design by the decoupling approach

    International Nuclear Information System (INIS)

    Royset, J.O.; Der Kiureghian, A.; Polak, E.

    2001-01-01

    A decoupling approach for solving optimal structural design problems involving reliability terms in the objective function, the constraint set or both is discussed and extended. The approach employs a reformulation of each problem, in which reliability terms are replaced by deterministic functions. The reformulated problems can be solved by existing semi-infinite optimization algorithms and computational reliability methods. It is shown that the reformulated problems produce solutions that are identical to those of the original problems when the limit-state functions defining the reliability problem are affine. For nonaffine limit-state functions, approximate solutions are obtained by solving series of reformulated problems. An important advantage of the approach is that the required reliability and optimization calculations are completely decoupled, thus allowing flexibility in the choice of the optimization algorithm and the reliability computation method
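For the affine limit-state case in which the reformulation is exact, the failure probability has a closed form: if g(X) = a0 + a·X with X independent standard normals, then g is normal with mean a0 and standard deviation ||a||, so Pf = Φ(−a0/||a||) = Φ(−β). A minimal sketch (coefficients invented for illustration):

```python
import math

def affine_failure_probability(a0, coeffs):
    """Exact P(g(X) < 0) for g(X) = a0 + sum(a_i * X_i),
    with X_i independent standard normals."""
    beta = a0 / math.sqrt(sum(a * a for a in coeffs))  # reliability index
    # Standard normal CDF at -beta, via erf.
    return 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))

pf = affine_failure_probability(2.0, [1.0, 1.0])  # beta = sqrt(2)
```

It is this deterministic expression for the reliability term that lets the optimization loop run without re-invoking a reliability solver at every iterate.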

  20. Parts and Components Reliability Assessment: A Cost Effective Approach

    Science.gov (United States)

    Lee, Lydia

    2009-01-01

System reliability assessment is a methodology which incorporates reliability analyses performed at the parts and components level, such as Reliability Prediction, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), to assess risks, perform design tradeoffs, and therefore ensure effective productivity and/or mission success. The system reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standard-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems, based on failure rate estimates published by United States (U.S.) military or commercial standards and handbooks. Many of these standards are globally accepted and recognized. The reliability assessment is especially useful during the initial stages, when the system design is still in development and hard failure data is not yet available, or when manufacturers are not contractually obliged by their customers to publish reliability estimates/predictions for their parts and components. This paper presents a methodology to assess system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success in an efficient manner, at low cost, and on a tight schedule.

  1. Simulation Approach to Mission Risk and Reliability Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  2. Approach to developing reliable space reactor power systems

    International Nuclear Information System (INIS)

    Mondt, J.F.; Shinbrot, C.H.

    1991-01-01

    The Space Reactor Power System Project is in the engineering development phase of a three-phase program. During Phase II, the Engineering Development Phase, the SP-100 Project has defined and is pursuing a new approach to developing reliable power systems. The approach to developing such a system during the early technology phase is described in this paper along with some preliminary examples to help explain the approach. Developing reliable components to meet space reactor power system requirements is based on a top down systems approach which includes a point design based on a detailed technical specification of a 100 kW power system

  3. Experimental approach to explosive nucleosynthesis

    International Nuclear Information System (INIS)

    Kubono, S.

    1991-07-01

Recent developments in experimental studies of explosive nucleosynthesis, especially the rapid proton process and primordial nucleosynthesis, are discussed with emphasis on unstable nuclei. New developments in experimental methods for nuclear astrophysics that use unstable nuclear beams are also discussed. (author)

  4. Reliability analysis - systematic approach based on limited data

    International Nuclear Information System (INIS)

    Bourne, A.J.

    1975-11-01

    The initial approaches required for reliability analysis are outlined. These approaches highlight the system boundaries, examine the conditions under which the system is required to operate, and define the overall performance requirements. The discussion is illustrated by a simple example of an automatic protective system for a nuclear reactor. It is then shown how the initial approach leads to a method of defining the system, establishing performance parameters of interest and determining the general form of reliability models to be used. The overall system model and the availability of reliability data at the system level are next examined. An iterative process is then described whereby the reliability model and data requirements are systematically refined at progressively lower hierarchic levels of the system. At each stage, the approach is illustrated with examples from the protective system previously described. The main advantages of the approach put forward are the systematic process of analysis, the concentration of assessment effort in the critical areas and the maximum use of limited reliability data. (author)

  5. A computational Bayesian approach to dependency assessment in system reliability

    International Nuclear Information System (INIS)

    Yontay, Petek; Pan, Rong

    2016-01-01

    Due to the increasing complexity of engineered products, it is of great importance to develop a tool to assess reliability dependencies among components and systems under the uncertainty of system reliability structure. In this paper, a Bayesian network approach is proposed for evaluating the conditional probability of failure within a complex system, using a multilevel system configuration. Coupling with Bayesian inference, the posterior distributions of these conditional probabilities can be estimated by combining failure information and expert opinions at both system and component levels. Three data scenarios are considered in this study, and they demonstrate that, with the quantification of the stochastic relationship of reliability within a system, the dependency structure in system reliability can be gradually revealed by the data collected at different system levels. - Highlights: • A Bayesian network representation of system reliability is presented. • Bayesian inference methods for assessing dependencies in system reliability are developed. • Complete and incomplete data scenarios are discussed. • The proposed approach is able to integrate reliability information from multiple sources at multiple levels of the system.
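The conditional-probability structure that such a Bayesian network captures can be illustrated by brute-force enumeration over a two-component system (all probabilities below are invented for illustration): failure of the root component raises the conditional failure probability of the dependent one, so the joint failure probability exceeds the independence product.

```python
from itertools import product

# A tiny two-node network: P(c1 fails) and P(c2 fails | c1 state).
p_c1 = 0.10
p_c2_given_c1 = {True: 0.30, False: 0.05}  # True = c1 failed

def p_system_fails():
    """Parallel system: fails only when both components fail.
    Enumerate all (c1, c2) failure-state combinations."""
    total = 0.0
    for c1, c2 in product([True, False], repeat=2):
        p = p_c1 if c1 else 1 - p_c1
        p *= p_c2_given_c1[c1] if c2 else 1 - p_c2_given_c1[c1]
        if c1 and c2:
            total += p
    return total
```

Real Bayesian network tools replace this enumeration with efficient inference, but the arithmetic being performed is exactly this sum over joint states.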

  6. Network reliability assessment using a cellular automata approach

    International Nuclear Information System (INIS)

    Rocco S, Claudio M.; Moreno, Jose Ali

    2002-01-01

    Two cellular automata (CA) models that evaluate the s-t connectedness and shortest path in a network are presented. CA based algorithms enhance the performance of classical algorithms, since they allow a more reliable and straightforward parallel implementation resulting in a dynamic network evaluation, where changes in the connectivity and/or link costs can readily be incorporated avoiding recalculation from scratch. The paper also demonstrates how these algorithms can be applied for network reliability evaluation (based on Monte-Carlo approach) and for finding s-t path with maximal reliability
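The cellular automaton idea for s-t connectedness amounts to a synchronous "infection" rule: a node switches on when any neighbor is on, iterated to a fixed point. A minimal sketch on an adjacency-list graph (the example network is made up; a parallel implementation would update all cells simultaneously, which the dictionary comprehension mimics):

```python
def st_connected(adj, s, t):
    """Synchronous CA update: each cell (node) turns on if any
    neighbor is on; iterate until the global state stops changing."""
    state = {v: (v == s) for v in adj}
    while True:
        new = {v: state[v] or any(state[u] for u in adj[v]) for v in adj}
        if new == state:
            return state[t]
        state = new

# A 5-node network with no link between nodes 2 and 3.
net = {0: [1], 1: [0, 2], 2: [1], 3: [4], 4: [3]}
```

Because each sweep only reads the previous state, a link removal or cost change can be applied to `adj` and the iteration resumed from the current state, which is the incremental-update advantage the abstract highlights.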

  7. Analytical approach for confirming the achievement of LMFBR reliability goals

    International Nuclear Information System (INIS)

    Ingram, G.E.; Elerath, J.G.; Wood, A.P.

    1981-01-01

    The approach, recommended by GE-ARSD, for confirming the achievement of LMFBR reliability goals relies upon a comprehensive understanding of the physical and operational characteristics of the system and the environments to which the system will be subjected during its operational life. This kind of understanding is required for an approach based on system hardware testing or analyses, as recommended in this report. However, for a system as complex and expensive as the LMFBR, an approach which relies primarily on system hardware testing would be prohibitive both in cost and time to obtain the required system reliability test information. By using an analytical approach, results of tests (reliability and functional) at a low level within the specific system of interest, as well as results from other similar systems can be used to form the data base for confirming the achievement of the system reliability goals. This data, along with information relating to the design characteristics and operating environments of the specific system, will be used in the assessment of the system's reliability

  8. Experimental Approach to Teaching Fluids

    Science.gov (United States)

    Stern, Catalina

    2015-11-01

    For the last 15 years we have promoted experimental work even in the theoretical courses. Fluids appear in the Physics curriculum of the National University of Mexico in two courses: Collective Phenomena in their sophomore year and Continuum Mechanics in their senior year. In both, students are asked for a final project. Surprisingly, at least 85% choose an experimental subject even though this means working extra hours every week. Some of the experiments were shown in this congress two years ago. This time we present some new results and the methodology we use in the classroom. I acknowledge support from the Physics Department, Facultad de Ciencias, UNAM.

  9. Engineering systems reliability, safety, and maintenance an integrated approach

    CERN Document Server

    Dhillon, B S

    2017-01-01

    Today, engineering systems are an important element of the world economy and each year billions of dollars are spent to develop, manufacture, operate, and maintain various types of engineering systems around the globe. Many of these systems are highly sophisticated and contain millions of parts. For example, a Boeing jumbo 747 is made up of approximately 4.5 million parts including fasteners. Needless to say, reliability, safety, and maintenance of systems such as this have become more important than ever before.  Global competition and other factors are forcing manufacturers to produce highly reliable, safe, and maintainable engineering products. Therefore, there is a definite need for the reliability, safety, and maintenance professionals to work closely during design and other phases. Engineering Systems Reliability, Safety, and Maintenance: An Integrated Approach eliminates the need to consult many different and diverse sources in the hunt for the information required to design better engineering syste...

  10. Reliability of four experimental mechanical pain tests in children

    DEFF Research Database (Denmark)

    Søe, Ann-Britt Langager; Thomsen, Lise L; Tornoe, Birte

    2013-01-01

In order to study pain in children, it is necessary to determine whether pain measurement tools used in adults are reliable measurements in children. The aim of this study was to explore the intrasession reliability of pressure pain thresholds (PPT) in healthy children. Furthermore, the aim was also to study the intersession reliability of the following four tests: (1) Total Tenderness Score; (2) PPT; (3) Visual Analog Scale score at suprapressure pain threshold; and (4) area under the curve (stimulus-response functions for pressure versus pain).

  11. Relevance and reliability of experimental data in human health risk assessment of pesticides.

    Science.gov (United States)

    Kaltenhäuser, Johanna; Kneuer, Carsten; Marx-Stoelting, Philip; Niemann, Lars; Schubert, Jens; Stein, Bernd; Solecki, Roland

    2017-08-01

    Evaluation of data relevance, reliability and contribution to uncertainty is crucial in regulatory health risk assessment if robust conclusions are to be drawn. Whether a specific study is used as key study, as additional information or not accepted depends in part on the criteria according to which its relevance and reliability are judged. In addition to GLP-compliant regulatory studies following OECD Test Guidelines, data from peer-reviewed scientific literature have to be evaluated in regulatory risk assessment of pesticide active substances. Publications should be taken into account if they are of acceptable relevance and reliability. Their contribution to the overall weight of evidence is influenced by factors including test organism, study design and statistical methods, as well as test item identification, documentation and reporting of results. Various reports make recommendations for improving the quality of risk assessments and different criteria catalogues have been published to support evaluation of data relevance and reliability. Their intention was to guide transparent decision making on the integration of the respective information into the regulatory process. This article describes an approach to assess the relevance and reliability of experimental data from guideline-compliant studies as well as from non-guideline studies published in the scientific literature in the specific context of uncertainty and risk assessment of pesticides. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  12. Integrated approach to economical, reliable, safe nuclear power production

    International Nuclear Information System (INIS)

    1982-06-01

    An Integrated Approach to Economical, Reliable, Safe Nuclear Power Production is the latest evolution of a concept which originated with the Defense-in-Depth philosophy of the nuclear industry. As Defense-in-Depth provided a framework for viewing physical barriers and equipment redundancy, the Integrated Approach gives a framework for viewing nuclear power production in terms of functions and institutions. In the Integrated Approach, four plant Goals are defined (Normal Operation, Core and Plant Protection, Containment Integrity and Emergency Preparedness) with the attendant Functional and Institutional Classifications that support them. The Integrated Approach provides a systematic perspective that combines the economic objective of reliable power production with the safety objective of consistent, controlled plant operation

  13. Some approaches to system reliability improvement in engineering design

    International Nuclear Information System (INIS)

    Shen, Kecheng.

    1990-01-01

    In this thesis some approaches to system reliability improvement in engineering design are studied. In particular, the thesis aims at developing alternative methodologies for ranking of component importance which are more related to the design practice and which are more useful in system synthesis than the existing ones. It also aims at developing component reliability models by means of stress-strength interference which will enable both component reliability prediction and design for reliability. A new methodology for ranking of component importance is first developed based on the notion of the increase of the expected system yield. This methodology allows for incorporation of different improvement actions at the component level such as parallel redundancy, standby redundancy, burn-in, minimal repair and perfect replacement. For each of these improvement actions, the increase of system reliability is studied and used as the component importance measure. A possible connection between the commonly known models of component lifetimes and the stress-strength interference models is suggested. Under some general conditions the relationship between component failure rate and the stress and strength distribution characteristics is studied. A heuristic approach for obtaining bounds on failure probability through stress-strength interference is also presented. A case study and a worked example are presented, which illustrate and verify the developed importance measures and their applications in the analytical as well as synthetical work of engineering design. (author)

  14. Reliability of four experimental mechanical pain tests in children

    Directory of Open Access Journals (Sweden)

    Soee AL

    2013-02-01

    Full Text Available. Ann-Britt L Soee,1 Lise L Thomsen,2 Birte Tornoe,1,3 Liselotte Skov1. 1Department of Pediatrics, Children’s Headache Clinic, Copenhagen University Hospital Herlev, Copenhagen, Denmark; 2Department of Neuropediatrics, Juliane Marie Centre, Copenhagen University Hospital Rigshospitalet, København Ø, Denmark; 3Department of Physiotherapy, Medical Department O, Copenhagen University Hospital Herlev, Herlev, Denmark. Purpose: In order to study pain in children, it is necessary to determine whether pain measurement tools used in adults are reliable measurements in children. The aim of this study was to explore the intrasession reliability of pressure pain thresholds (PPT) in healthy children. Furthermore, the aim was also to study the intersession reliability of the following four tests: (1) Total Tenderness Score; (2) PPT; (3) Visual Analog Scale score at suprapressure pain threshold; and (4) area under the curve (stimulus–response functions for pressure versus pain). Participants and methods: Twenty-five healthy school children, 8–14 years of age, participated. Test 2, PPT, was repeated three times at 2-minute intervals on the same day to estimate PPT intrasession reliability using Cronbach’s alpha. Tests 1–4 were repeated after a median of 21 days (interquartile range 10.5–22), and Pearson’s correlation coefficient was used to describe the intersession reliability. Results: The PPT test was precise and reliable (Cronbach’s alpha ≥ 0.92). All tests showed a good to excellent correlation between days (intersession r = 0.66–0.81). There were no indications of significant systematic differences found in any of the four tests between days. Conclusion: All tests seemed to be reliable measurements in pain evaluation in healthy children aged 8–14 years. Given the small sample size, this conclusion needs to be confirmed in future studies. Keywords: repeatability, intraindividual reliability, pressure pain threshold, pain measurement, algometer
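    Cronbach's alpha, used above for the intrasession PPT reliability, can be computed directly from repeated measurements. A minimal sketch with invented scores (k repetitions by n subjects; these are not the study's data):

```python
def cronbach_alpha(repetitions):
    """Cronbach's alpha from k repetitions (items), each a list of n subject scores."""
    k = len(repetitions)
    n = len(repetitions[0])

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(sample_var(rep) for rep in repetitions)
    totals = [sum(rep[i] for rep in repetitions) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / sample_var(totals))

# Invented PPT-like scores (kPa): 3 repetitions x 4 subjects
reps = [[310, 250, 402, 295],
        [305, 260, 398, 300],
        [315, 255, 405, 290]]
alpha = cronbach_alpha(reps)   # close to 1 when repetitions are highly consistent
```

    High alpha here reflects that within-subject variation across repetitions is small relative to between-subject variation, which is exactly the intrasession consistency the study reports.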

  15. Approach to assurance of reliability of linear accelerator operation observations

    International Nuclear Information System (INIS)

    Bakov, S.M.; Borovikov, A.A.; Kavkun, S.L.

    1994-01-01

    A systematic approach to ensuring the reliability of observations of linear accelerator operation is proposed. The basic principles of the method are the use of dependences between facility parameters and a reduction in the number of data acquisition channels in the system without replacement of a failed channel by a reserve one. The signal commutation unit, whose introduction into the data acquisition system substantially increases the reliability of the measurement system through active redundancy, is considered in detail. 8 refs., 6 figs

  16. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  17. Predicting risk and human reliability: a new approach

    International Nuclear Information System (INIS)

    Duffey, R.; Ha, T.-S.

    2009-01-01

    Learning from experience describes human reliability and skill acquisition, and the resulting theory has been validated by comparison against millions of outcome data from multiple industries and technologies worldwide. The resulting predictions were used to benchmark the classic first generation human reliability methods adopted in probabilistic risk assessments. The learning rate, probabilities and response times are also consistent with the existing psychological models for human learning and error correction. The new approach also implies a finite lower bound probability that is not predicted by empirical statistical distributions that ignore the known and fundamental learning effects. (author)
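    The learning-curve behaviour with a finite lower-bound probability described above can be sketched with a standard exponential learning form. The equation and parameters below are illustrative assumptions, not the authors' exact model:

```python
import math

def error_rate(n, p0=0.1, p_min=1e-5, k=0.05):
    """Hypothetical learning curve: the error probability after n accumulated
    experience units decays from an initial rate p0 toward a finite floor p_min,
    rather than toward zero."""
    return p_min + (p0 - p_min) * math.exp(-k * n)
```

    The key qualitative feature matching the abstract is the floor: no matter how much experience accumulates, the predicted probability never falls below p_min, unlike empirical distributions fitted without a learning term.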

  18. An approach for assessing ALWR passive safety system reliability

    International Nuclear Information System (INIS)

    Hake, T.M.

    1991-01-01

    Many of the advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive rather than active systems to perform safety functions. Despite the reduced redundancy of the passive systems as compared to active systems in current plants, the assertion is that the overall safety of the plant is enhanced due to the much higher expected reliability of the passive systems. In order to investigate this assertion, a study is being conducted at Sandia National Laboratories to evaluate the reliability of ALWR passive safety features in the context of probabilistic risk assessment (PRA). The purpose of this paper is to provide a brief overview of the approach to this study. The quantification of passive system reliability is not as straightforward as for active systems, due to the lack of operating experience, and to the greater uncertainty in the governing physical phenomena. Thus, the adequacy of current methods for evaluating system reliability must be assessed, and alternatives proposed if necessary. For this study, the Westinghouse Advanced Passive 600 MWe reactor (AP600) was chosen as the advanced reactor for analysis, because of the availability of AP600 design information. This study compares the reliability of AP600 emergency cooling system with that of corresponding systems in a current generation reactor

  19. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

    The reliability of human operators in process control is sensitive to the context. In many contemporary human reliability analysis (HRA) methods, this is not sufficiently taken into account. The aim of this article is to integrate probabilistic and psychological approaches to human reliability. This is achieved, first, by adopting methods that adequately reflect the essential features of the process control activity, and secondly, by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context promote the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints of activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model gives a tool by which psychological methodology may be interpreted and utilized for reliability analysis

  20. Unlocking water markets: an experimental approach

    Science.gov (United States)

    Cook, J.; Rabotyagov, S.

    2011-12-01

    Water markets are frequently referred to as a promising approach to alleviate stress on water systems, especially as future hydrologic assessments suggest increasing demand and less reliable supply. Yet, despite decades of advocacy by water resource economists, water markets (leases and sales of water rights between willing buyers and sellers) have largely failed to develop in the western US. Although there are a number of explanations for this failure, we explore one potential reason that has received less attention: farmers as sellers may have preferences for different elements of a water market transaction that are not captured in the relative comparison of their profits from farming and their profits from agreeing to a deal. We test this explanation by recruiting irrigators with senior water rights in the upper Yakima River Basin in Washington state to participate in a series of experimental auctions. In concept, the Yakima Basin is well situated for water market transactions, as it has significant water shortages for junior water users in ~15% of years and projections show these are likely to increase in the future. Participants were asked a series of questions about the operation of a hypothetical 100-acre timothy hay farm, including the type of buyer, how the water bank is managed, the lease type, and the offer price. Results from 7 sessions with irrigators (n=49) and a comparison group of undergraduates (n=38) show that irrigators are more likely to accept split-season than full-season leases (controlling for differences in farm profits), and are more likely to accept a lease from an irrigation district and less likely to accept an offer from a Developer. Most notably, we find farmers were far more likely than students to reject offers from buyers even though it would increase their winnings from the experiment. These results could be used in ongoing water supply policy debates in the Yakima Basin to simulate the amount of water that could be freed by water

  1. Reliability analysis with linguistic data: An evidential network approach

    International Nuclear Information System (INIS)

    Zhang, Xiaoge; Mahadevan, Sankaran; Deng, Xinyang

    2017-01-01

    In practical applications of reliability assessment of a system in-service, information about the condition of a system and its components is often available in text form, e.g., inspection reports. Estimation of the system reliability from such text-based records becomes a challenging problem. In this paper, we propose a four-step framework to deal with this problem. In the first step, we construct an evidential network with the consideration of available knowledge and data. Secondly, we train a Naive Bayes text classification algorithm based on the past records. By using the trained Naive Bayes algorithm to classify the new records, we build interval basic probability assignments (BPA) for each new record available in text form. Thirdly, we combine the interval BPAs of multiple new records using an evidence combination approach based on evidence theory. Finally, we propagate the interval BPA through the evidential network constructed earlier to obtain the system reliability. Two numerical examples are used to demonstrate the efficiency of the proposed method. We illustrate the effectiveness of the proposed method by comparing with Monte Carlo Simulation (MCS) results. - Highlights: • We model reliability analysis with linguistic data using evidential network. • Two examples are used to demonstrate the efficiency of the proposed method. • We compare the results with Monte Carlo Simulation (MCS).
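    The evidence-combination step of the framework can be illustrated with Dempster's rule over a two-state component frame {working, failed}. The BPAs below are invented for illustration, and this sketch uses point-valued BPAs rather than the interval-valued extension described in the paper:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for BPAs keyed by frozenset focal elements."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y                 # mass assigned to the empty set
    return {k: v / (1 - conflict) for k, v in combined.items()}

W = frozenset({"working"})
F = frozenset({"failed"})
T = W | F                                     # total ignorance

# Hypothetical BPAs from two classified inspection records
m1 = {W: 0.6, T: 0.4}
m2 = {W: 0.5, F: 0.3, T: 0.2}
posterior = dempster_combine(m1, m2)
```

    The conflict term (here 0.6 x 0.3 = 0.18) is discarded and the remaining mass renormalized, which is the standard behaviour of Dempster's rule.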

  2. The DYLAM approach for the dynamic reliability analysis of systems

    International Nuclear Information System (INIS)

    Cojazzi, Giacomo

    1996-01-01

    In many real systems, failures occurring to the components, control failures and human interventions often interact with the physical system evolution in such a way that a simple reliability analysis, de-coupled from process dynamics, is very difficult or even impossible. In the last ten years many dynamic reliability approaches have been proposed to properly assess the reliability of these systems characterized by dynamic interactions. The DYLAM methodology, now implemented in its latest version, DYLAM-3, offers a powerful tool for integrating deterministic and failure events. This paper describes the main features of the DYLAM-3 code with reference to the classic fault-tree and event-tree techniques. Some aspects connected to the practical problems underlying dynamic event-trees are also discussed. A simple system, already analyzed with other dynamic methods is used as a reference for the numerical applications. The same system is also studied with a time-dependent fault-tree approach in order to show some features of dynamic methods vs classical techniques. Examples including stochastic failures, without and with repair, failures on demand and time dependent failure rates give an extensive overview of DYLAM-3 capabilities
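    The "time dependent failure rates" case mentioned at the end of the abstract can be sketched with a Weibull model, a common choice for components whose hazard grows with time (the parameters below are arbitrary, not from the DYLAM-3 applications):

```python
import math

def weibull_unreliability(t, eta, beta):
    """F(t) = 1 - exp(-(t/eta)^beta); beta > 1 gives an increasing failure rate."""
    return 1.0 - math.exp(-((t / eta) ** beta))

def weibull_hazard(t, eta, beta):
    """h(t) = (beta/eta) * (t/eta)^(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# Illustrative wear-out component: scale eta = 1000 h, shape beta = 2
f_at_eta = weibull_unreliability(1000, 1000, 2.0)   # 1 - 1/e at t = eta
```

    With beta = 1 the hazard is constant and the model reduces to the exponential case; beta > 1 is the aging behaviour that makes the time-dependent analysis in the paper non-trivial.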

  3. Experimental design research approaches, perspectives, applications

    CERN Document Server

    Stanković, Tino; Štorga, Mario

    2016-01-01

    This book presents a new, multidisciplinary perspective on and paradigm for integrative experimental design research. It addresses various perspectives on methods, analysis and overall research approach, and how they can be synthesized to advance understanding of design. It explores the foundations of experimental approaches and their utility in this domain, and brings together analytical approaches to promote an integrated understanding. The book also investigates where these approaches lead to and how they link design research more fully with other disciplines (e.g. psychology, cognition, sociology, computer science, management). Above all, the book emphasizes the integrative nature of design research in terms of the methods, theories, and units of study—from the individual to the organizational level. Although this approach offers many advantages, it has inherently led to a situation in current research practice where methods are diverging and integration between individual, team and organizational under...

  4. Molecular approach of uranyl/mineral surfaces: experimental approach

    International Nuclear Information System (INIS)

    Drot, R.

    2009-01-01

    The author reports an experimental approach in which different spectroscopic approaches are coupled (laser spectroscopy, X-ray absorption spectroscopy, vibrational spectroscopy) to investigate the mechanisms controlling actinide sorption processes by different substrates, in order to assess radioactive waste storage site safety. Different substrates have been considered: monocrystalline or powdered TiO2, montmorillonite, and gibbsite

  5. An approach for assessing ALWR passive safety system reliability

    International Nuclear Information System (INIS)

    Hake, T.M.

    1991-01-01

    Many advanced light water reactor designs incorporate passive rather than active safety features for front-line accident response. A method for evaluating the reliability of these passive systems in the context of probabilistic risk assessment has been developed at Sandia National Laboratories. This method addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. These processes provide the system's driving force; examples are natural circulation and gravity-induced injection. This paper describes the method, and provides some preliminary results of application of the approach to the Westinghouse AP600 design

  6. Reliability assessment using degradation models: bayesian and classical approaches

    Directory of Open Access Journals (Sweden)

    Marta Afonso Freitas

    2010-04-01

    Full Text Available. Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests, in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work for which there are characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed by using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheel degradation.
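    The degradation-path idea can be sketched very simply: fit a trend to each unit's degradation measurements and extrapolate to a failure threshold, giving a "pseudo failure time" per unit. This is a minimal classical sketch with invented data, far simpler than the four methods presented in the article:

```python
def fit_line(times, ys):
    """Ordinary least-squares fit y = a + b*t."""
    n = len(times)
    mt = sum(times) / n
    my = sum(ys) / n
    b = (sum((t - mt) * (y - my) for t, y in zip(times, ys))
         / sum((t - mt) ** 2 for t in times))
    return my - b * mt, b

def pseudo_failure_time(times, ys, threshold):
    """Extrapolate the fitted degradation path to the failure threshold."""
    a, b = fit_line(times, ys)
    return (threshold - a) / b

# Invented wheel-wear measurements (mm) at inspection times (1000 km)
t_fail = pseudo_failure_time([0, 100, 200, 300], [0.0, 1.1, 1.9, 3.1], threshold=10.0)
```

    Collecting pseudo failure times across units yields a sample of lifetimes that can then be analyzed with ordinary life-data methods, even when no unit has actually failed.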

  7. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

G. W. Parry; J.A. Forester; V.N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing

    2013-09-01

    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System) that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), and the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  8. Failure mode and effect analysis experimental reliability determination for the CANDU reactor equipment

    International Nuclear Information System (INIS)

    Vieru, G.

    1996-01-01

    This paper describes the experimental tests performed in order to prove the reliability parameters for certain equipment manufactured at INR Pitesti for NPP Cernavoda. The tests were specified by the Technical Specifications and test procedures. A comparison of the reliability parameters of the Canadian-manufactured equipment and the INR-manufactured equipment is also given. The results of the tests and the conclusions are presented. (author)

  9. Reliability optimization using multiobjective ant colony system approaches

    International Nuclear Information System (INIS)

    Zhao Jianhua; Liu Zhaoheng; Dao, M.-T.

    2007-01-01

    The multiobjective ant colony system (ACS) meta-heuristic has been developed to provide solutions for the reliability optimization problem of series-parallel systems. This type of problem involves selection of components with multiple choices and redundancy levels that produce maximum benefits, and is subject to cost and weight constraints at the system level. These are very common and realistic problems encountered in the conceptual design of many engineering systems. It is becoming increasingly important to develop efficient solutions to these problems because many mechanical and electrical systems are becoming more complex, even as development schedules get shorter and reliability requirements become very stringent. The multiobjective ACS algorithm offers distinct advantages for these problems compared with alternative optimization methods, and can be applied to a more diverse problem domain with respect to the type or size of the problems. Through the combination of probabilistic search, multiobjective formulation of local moves and the dynamic penalty method, the multiobjective ACSRAP allows us to obtain an optimal design solution very frequently and more quickly than some other heuristic approaches. The proposed algorithm was successfully applied to an engineering design problem of a gearbox with multiple stages
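    For a series-parallel system of the kind described, each subsystem with n identical redundant components of reliability r has reliability 1-(1-r)^n, and the system reliability is the product over subsystems. The sketch below replaces the ant colony search with a brute-force enumeration over small allocations, just to make the objective and constraint concrete (the component data are invented):

```python
from itertools import product

def subsystem_rel(r, n):
    """Reliability of n identical redundant components in parallel."""
    return 1 - (1 - r) ** n

def system_rel(rels, alloc):
    """Series system of parallel subsystems."""
    out = 1.0
    for r, n in zip(rels, alloc):
        out *= subsystem_rel(r, n)
    return out

def best_allocation(rels, costs, budget, max_red=4):
    """Exhaustive search for the redundancy allocation maximizing system
    reliability under a single cost constraint."""
    best = None
    for alloc in product(range(1, max_red + 1), repeat=len(rels)):
        cost = sum(c * n for c, n in zip(costs, alloc))
        if cost <= budget:
            rel = system_rel(rels, alloc)
            if best is None or rel > best[1]:
                best = (alloc, rel)
    return best

alloc, rel = best_allocation(rels=[0.9, 0.8], costs=[2, 3], budget=10)
```

    Enumeration is only feasible for tiny problems; the combinatorial growth of this search space is precisely why meta-heuristics such as ACS are used for realistic system sizes.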

  10. Efficient approach for reliability-based optimization based on weighted importance sampling approach

    International Nuclear Information System (INIS)

    Yuan, Xiukai; Lu, Zhenzhou

    2014-01-01

    An efficient methodology is presented to perform reliability-based optimization (RBO). It is based on an efficient weighted approach for constructing an approximation of the failure probability as an explicit function of the design variables, which is referred to as the ‘failure probability function (FPF)’. It expresses the FPF as a weighted sum of sample values obtained in the simulation-based reliability analysis. The required computational effort for decoupling in each iteration is just a single reliability analysis. After the approximation of the FPF is established, the target RBO problem can be decoupled into a deterministic one. Meanwhile, the proposed weighted approach is combined with a decoupling approach and a sequential approximate optimization framework. Engineering examples are given to demonstrate the efficiency and accuracy of the presented methodology
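    The core reweighting trick behind a weighted FPF can be sketched in one dimension: a single sample set drawn from an importance-sampling density is reused to estimate the failure probability at several design points by weighting each sample with the ratio of densities. The limit state, densities, and design variable below are invented for illustration and are much simpler than the paper's formulation:

```python
import random
from statistics import NormalDist

def fpf_importance_sampling(design_means, c=3.0, n=100_000, seed=2):
    """Estimate p_f(mu) = P(X > c) with X ~ N(mu, 1) for several design means mu,
    reusing ONE sample set drawn from the IS density h = N(c, 1)."""
    rng = random.Random(seed)
    xs = [rng.gauss(c, 1.0) for _ in range(n)]
    h = NormalDist(c, 1.0)
    h_pdf = [h.pdf(x) for x in xs]
    estimates = {}
    for mu in design_means:
        f = NormalDist(mu, 1.0)
        # weighted sum over failed samples: I(x > c) * f(x; mu) / h(x)
        estimates[mu] = sum(f.pdf(x) / hx
                            for x, hx in zip(xs, h_pdf) if x > c) / n
    return estimates

est = fpf_importance_sampling([2.0, 2.5])   # FPF evaluated at two design points
```

    The benefit mirrors the abstract's claim: once the samples exist, evaluating the failure probability at a new design point costs only a reweighting, not a new reliability analysis.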

  11. Design and experimentation of an empirical multistructure framework for accurate, sharp and reliable hydrological ensembles

    Science.gov (United States)

    Seiller, G.; Anctil, F.; Roy, R.

    2017-09-01

    This paper outlines the design and experimentation of an Empirical Multistructure Framework (EMF) for lumped conceptual hydrological modeling. This concept is inspired from modular frameworks, empirical model development, and multimodel applications, and encompasses the overproduce and select paradigm. The EMF concept aims to reduce subjectivity in conceptual hydrological modeling practice and includes model selection in the optimisation steps, reducing initial assumptions on the prior perception of the dominant rainfall-runoff transformation processes. EMF generates thousands of new modeling options from, for now, twelve parent models that share their functional components and parameters. Optimisation resorts to ensemble calibration, ranking and selection of individual child time series based on optimal bias and reliability trade-offs, as well as accuracy and sharpness improvement of the ensemble. Results on 37 snow-dominated Canadian catchments and 20 climatically-diversified American catchments reveal the excellent potential of the EMF in generating new individual model alternatives, with high respective performance values, that may be pooled efficiently into ensembles of seven to sixty constitutive members, with low bias and high accuracy, sharpness, and reliability. A group of 1446 new models is highlighted to offer good potential on other catchments or applications, based on their individual and collective interests. An analysis of the preferred functional components reveals the importance of the production and total flow elements. Overall, results from this research confirm the added value of ensemble and flexible approaches for hydrological applications, especially in uncertain contexts, and open up new modeling possibilities.

  12. Bayesian approach in the power electric systems study of reliability ...

    African Journals Online (AJOL)

    Keywords: Reliability - Power System - Bayes Theorem - Weibull Model - Probability

  13. Toward a Cooperative Experimental System Development Approach

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Mogensen, Preben Holst

    1997-01-01

    This chapter represents a step towards the establishment of a new system development approach, called Cooperative Experimental System Development (CESD). CESD seeks to overcome a number of limitations in existing approaches: specification oriented methods usually assume that system design can be based solely on observation and detached reflection; prototyping methods often have a narrow focus on the technical construction of various kinds of prototypes; Participatory Design techniques—including the Scandinavian Cooperative Design (CD) approaches—seldom go beyond the early analysis/design activities of development projects. In contrast, the CESD approach is characterized by its focus on: active user involvement throughout the entire development process; prototyping experiments closely coupled to work-situations and use-scenarios; transforming results from early cooperative analysis...

  14. Reliability of infarct volumetry: Its relevance and the improvement by a software-assisted approach.

    Science.gov (United States)

    Friedländer, Felix; Bohmann, Ferdinand; Brunkhorst, Max; Chae, Ju-Hee; Devraj, Kavi; Köhler, Yvette; Kraft, Peter; Kuhn, Hannah; Lucaciu, Alexandra; Luger, Sebastian; Pfeilschifter, Waltraud; Sadler, Rebecca; Liesz, Arthur; Scholtyschik, Karolina; Stolz, Leonie; Vutukuri, Rajkumar; Brunkhorst, Robert

    2017-08-01

    Despite the efficacy of neuroprotective approaches in animal models of stroke, their translation has so far failed from bench to bedside. One reason is presumed to be a low quality of preclinical study design, leading to bias and a low a priori power. In this study, we propose that the key read-out of experimental stroke studies, the volume of the ischemic damage as commonly measured by free-handed planimetry of TTC-stained brain sections, is subject to an unrecognized low inter-rater and test-retest reliability with strong implications for statistical power and bias. As an alternative approach, we suggest a simple, open-source, software-assisted method, taking advantage of automatic-thresholding techniques. The validity and the improvement of reliability by an automated method to tMCAO infarct volumetry are demonstrated. In addition, we show the probable consequences of increased reliability for precision, p-values, effect inflation, and power calculation, exemplified by a systematic analysis of experimental stroke studies published in the year 2015. Our study reveals an underappreciated quality problem in translational stroke research and suggests that software-assisted infarct volumetry might help to improve reproducibility and therefore the robustness of bench to bedside translation.
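    The automatic-thresholding idea referred to above can be sketched with Otsu's method, a standard histogram-based threshold. This is a generic pure-Python sketch on a synthetic pixel list; the paper's actual software and parameters are not reproduced here:

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: pick the gray level maximizing between-class variance."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b, sum_b = 0, 0.0
    for t in range(levels):
        w_b += hist[t]                  # background weight
        if w_b == 0:
            continue
        w_f = total - w_b               # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b               # background mean
        m_f = (sum_all - sum_b) / w_f   # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic bimodal "section": dark infarct-like pixels vs bright background
pixels = [25] * 100 + [205] * 100
t = otsu_threshold(pixels)
```

    Because the threshold is computed from the histogram rather than drawn by hand, the same section always yields the same segmentation, which is the reliability gain the study quantifies.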

  15. [Cognitive experimental approach to anxiety disorders].

    Science.gov (United States)

    Azaïs, F

    1995-01-01

    Cognitive psychology proposes a functional model to explain the mental organisation leading to emotional disorders. Among these disorders, the anxiety spectrum is a domain in which this model offers an efficient and comprehensive approach to the pathology. A number of behavioral and cognitive psychotherapeutic methods draw on these cognitive references, but the theoretical concepts of cognitive "schemata" and cognitive "processes" evoked to describe mental functioning in anxiety need an experimental approach for a better rational understanding. Cognitive functions such as perception, attention and memory can be explored efficiently in this domain, allowing a more precise study of each stage of information processing. The cognitive model proposed in the psychopathology of anxiety suggests that anxious subjects are characterized by biases in the processing of emotionally valenced information. This hypothesis suggests functional interference in information processing in these subjects, leading to an anxious response to most stimuli. An experimental approach permits exploration of this hypothesis, using many tasks to test the different cognitive dysfunctions evoked in anxious cognitive organisation. The impairments revealed in anxiety disorders seem to result from specific biases in threat-related information processing, involving several stages of cognitive processing. Semantic interference, attentional bias, implicit memory bias and priming effects are the disorders most often observed in anxious pathology, such as simple phobia, generalised anxiety, panic disorder and post-traumatic stress disorder. These results suggest a top-down organisation of information processing in anxious subjects, who tend to detect, perceive and label many situations as threatening. The processes of reasoning and elaboration are consequently impaired in their adaptive function with respect to threat, leading to the anxious response observed in clinical

  16. Soft computing approach for reliability optimization: State-of-the-art survey

    International Nuclear Information System (INIS)

    Gen, Mitsuo; Yun, Young Su

    2006-01-01

    In the broadest sense, reliability is a measure of performance of systems. As systems have grown more complex, the consequences of their unreliable behavior have become severe in terms of cost, effort, lives, etc., and the interest in assessing system reliability and the need for improving the reliability of products and systems have become very important. Most solution methods for reliability optimization assume that systems have redundant components in series and/or parallel configurations and that alternative designs are available. Reliability optimization problems concentrate on optimal allocation of redundant components and optimal selection of alternative designs to meet system requirements. In the past two decades, numerous reliability optimization techniques have been proposed. Generally, these techniques can be classified as linear programming, dynamic programming, integer programming, geometric programming, heuristic methods, Lagrangean multiplier methods and so on. A Genetic Algorithm (GA), as a soft computing approach, is a powerful tool for solving various reliability optimization problems. In this paper, we briefly survey GA-based approaches for various reliability optimization problems, such as reliability optimization of redundant systems, reliability optimization with alternative design, reliability optimization with time-dependent reliability, reliability optimization with interval coefficients, bicriteria reliability optimization, and reliability optimization with fuzzy goals. We also introduce hybrid approaches for combining GA with fuzzy logic, neural networks and other conventional search techniques. Finally, we present experiments on various reliability optimization problems using the hybrid GA approach
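    A minimal GA for redundancy allocation, in the spirit of the surveyed approaches: integer chromosomes encode per-subsystem redundancy levels, and a static penalty handles the cost constraint. This is a toy sketch with invented component data, not any specific algorithm from the survey:

```python
import random

def system_rel(alloc, rels):
    """Series system of parallel groups: n_i identical components of reliability r_i."""
    out = 1.0
    for n, r in zip(alloc, rels):
        out *= 1 - (1 - r) ** n
    return out

def fitness(alloc, rels, costs, budget):
    """System reliability minus a static penalty for exceeding the cost budget."""
    cost = sum(n * c for n, c in zip(alloc, costs))
    return system_rel(alloc, rels) - 0.5 * max(0, cost - budget)

def ga(rels, costs, budget, pop_size=30, gens=60, max_red=4, seed=0):
    rng = random.Random(seed)
    fit = lambda a: fitness(a, rels, costs, budget)
    pop = [[rng.randint(1, max_red) for _ in rels] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fit, reverse=True)
        elite = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, len(rels))  # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:             # mutation: re-randomize one gene
                child[rng.randrange(len(child))] = rng.randint(1, max_red)
            children.append(child)
        pop = elite + children
    return max(pop, key=fit)

best = ga(rels=[0.9, 0.8], costs=[2, 3], budget=10)
```

    The penalty formulation is one of several constraint-handling choices discussed in this literature; dynamic and adaptive penalties, or repair operators, are common alternatives.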

  17. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  18. Building and integrating reliability models in a Reliability-Centered-Maintenance approach

    International Nuclear Information System (INIS)

    Verite, B.; Villain, B.; Venturini, V.; Hugonnard, S.; Bryla, P.

    1998-03-01

    Electricite de France (EDF) has recently developed its OMF-Structures method, designed to optimize risk-based preventive maintenance of passive structures such as pipes and supports. In particular, the reliability performance of components needs to be determined; this is a two-step process, consisting of a qualitative sort followed by a quantitative evaluation, involving two types of models. Initially, degradation models are widely used to exclude some components from the field of preventive maintenance. The reliability of the remaining components is then evaluated by means of quantitative reliability models. The results are then included in a risk indicator that is used to directly optimize preventive maintenance tasks. (author)

  19. Approach for an integral power transformer reliability model

    NARCIS (Netherlands)

    Schijndel, van A.; Wouters, P.A.A.F.; Steennis, E.F.; Wetzer, J.M.

    2012-01-01

    In electrical power transmission and distribution networks power transformers represent a crucial group of assets both in terms of reliability and investments. In order to safeguard the required quality at acceptable costs, decisions must be based on a reliable forecast of future behaviour. The aim

  20. Wind turbine reliability : a database and analysis approach.

    Energy Technology Data Exchange (ETDEWEB)

    Linsday, James (ARES Corporation); Briand, Daniel; Hill, Roger Ray; Stinebaugh, Jennifer A.; Benjamin, Allan S. (ARES Corporation)

    2008-02-01

    The US wind industry has experienced remarkable growth since the turn of the century. At the same time, the physical size and electrical generation capabilities of wind turbines have also grown remarkably. As the market continues to expand, and as wind generation continues to gain a significant share of the generation portfolio, the reliability of wind turbine technology becomes increasingly important. This report addresses how operations and maintenance costs are related to unreliability, that is, the failures experienced by systems and components. Reliability tools are demonstrated, the data needed to understand and catalog failure events are described, and practical wind turbine reliability models are illustrated, including preliminary results. This report also presents a continuing process for controlling industry requirements, needs, and expectations related to Reliability, Availability, Maintainability, and Safety. A simply stated goal of this process is to better understand and improve the operable reliability of wind turbine installations.

  1. Monte Carlo simulation - a powerful tool to support experimental activities in structure reliability

    International Nuclear Information System (INIS)

    Yuritzinn, T.; Chapuliot, S.; Eid, M.; Masson, R.; Dahl, A.; Moinereau, D.

    2003-01-01

    Monte-Carlo Simulation (MCS) can have different uses in supporting structure reliability investigations and assessments. In this paper we focus our interest on the use of MCS as a numerical tool to support the fitting of the experimental data related to toughness experiments. (authors)
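
    As a hedged sketch of the general idea, and not the authors' actual model, Monte Carlo sampling from fitted distributions can turn experimental scatter into a failure-probability estimate. The normal distributions and every parameter below are illustrative assumptions, standing in for curves fitted to toughness data.

```python
import random

random.seed(42)

def mc_failure_probability(n=100_000):
    """Crude Monte Carlo estimate of P(applied load > toughness).

    Both distributions are illustrative stand-ins for distributions
    fitted to experimental toughness data; no parameter here comes
    from the paper."""
    failures = 0
    for _ in range(n):
        toughness = random.gauss(120.0, 15.0)  # sampled material toughness
        applied = random.gauss(70.0, 10.0)     # sampled applied load
        if applied > toughness:
            failures += 1
    return failures / n

p_f = mc_failure_probability()
```

    With these assumed normals the estimate comes out at a fraction of a percent; in an actual assessment the sampled distributions would be those fitted to the experimental data.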

  2. Development of a quality-assessment tool for experimental bruxism studies: reliability and validity.

    Science.gov (United States)

    Dawson, Andreas; Raphael, Karen G; Glaros, Alan; Axelsson, Susanna; Arima, Taro; Ernberg, Malin; Farella, Mauro; Lobbezoo, Frank; Manfredini, Daniele; Michelotti, Ambra; Svensson, Peter; List, Thomas

    2013-01-01

    To combine empirical evidence and expert opinion in a formal consensus method in order to develop a quality-assessment tool for experimental bruxism studies in systematic reviews. Tool development comprised five steps: (1) preliminary decisions, (2) item generation, (3) face-validity assessment, (4) reliability and discriminative validity assessment, and (5) instrument refinement. The kappa value and phi coefficient were calculated to assess inter-observer reliability and discriminative ability, respectively. Following preliminary decisions and a literature review, a list of 52 items to be considered for inclusion in the tool was compiled. Eleven experts were invited to join a Delphi panel and 10 accepted. Four Delphi rounds reduced the preliminary tool, the Quality-Assessment Tool for Experimental Bruxism Studies (Qu-ATEBS), to 8 items: study aim, study sample, control condition or group, study design, experimental bruxism task, statistics, interpretation of results, and conflict of interest statement. Consensus among the Delphi panelists yielded good face validity. Inter-observer reliability was acceptable (k = 0.77), and discriminative validity was excellent (phi coefficient 1.0). In conclusion, Qu-ATEBS, developed for systematic reviews of experimental bruxism studies, exhibits face validity, excellent discriminative validity, and acceptable inter-observer reliability. Development of quality assessment tools for many other topics in the orofacial pain literature is needed and may follow the described procedure.
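
    The two agreement statistics named above can be sketched in a few lines. This is a generic illustration of Cohen's kappa and the phi coefficient for binary item ratings, not the authors' code; the example ratings are invented.

```python
import math

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' binary judgments (1 = item passes)."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    p1 = sum(r1) / n
    p2 = sum(r2) / n
    p_e = p1 * p2 + (1 - p1) * (1 - p2)            # agreement expected by chance
    return (p_o - p_e) / (1 - p_e)

def phi_coefficient(x, y):
    """Phi coefficient from the 2x2 contingency table of two binary variables."""
    a = sum(1 for i, j in zip(x, y) if i and j)
    b = sum(1 for i, j in zip(x, y) if i and not j)
    c = sum(1 for i, j in zip(x, y) if not i and j)
    d = sum(1 for i, j in zip(x, y) if not i and not j)
    denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom if denom else 0.0
```

    With perfect agreement both statistics equal 1, which matches the interpretation of the phi = 1.0 discriminative-validity result reported above.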

  3. Assuring the reliability of structural components - experimental data and non-destructive examination requirements

    International Nuclear Information System (INIS)

    Lucia, A.C.

    1984-01-01

    The probability of failure of a structural component can be estimated either by statistical methods or by a probabilistic structural reliability approach (where failure is seen as a level crossing of a damage stochastic process which develops in space and in time). The probabilistic approach has the advantage that it yields not only an absolute value of the failure probability but also a great deal of additional information. Its disadvantage is its complexity. It is discussed for the following situations: reliability of a structural component, material properties, data for fatigue crack growth evaluation, a benchmark exercise on reactor pressure vessel failure probability computation, and non-destructive examination for assuring a given level of structural reliability. (U.K.)

  4. Experimental approaches to heavy ion fusion

    International Nuclear Information System (INIS)

    Obayashi, H.; Fujii-e, Y.; Yamaki, T.

    1986-01-01

    As a feasibility study on the heavy-ion-beam induced inertial fusion (HIF) approach, a conceptual plant design called HIBLIC-I has been worked out since 1982. The characteristic features of this design are summarized. To confirm them experimentally and prove them at least in principle, considerations are given to possible experimental programs that would provide substantial information on the critical phenomena. In HIBLIC-I, an accelerator complex is adopted as the driver system to provide 6 beams of 208Pb+1 ions at 15 GeV, which are simultaneously focussed on a single-shell, three-layered target. The target is designed to give an energy gain of 100, so that the total beam energy of 4 MJ with 160 TW power may release 400 MJ of fusion energy. The reactor chamber is cylindrical with a double-walled structure made of HT-9. There are three layers of liquid Li flow inside the reactor. The innermost layer forms a Li curtain which is effective in recovering the residual cavity pressure. A thick upward flow serves as coolant and tritium breeder. Tritium is recovered by an yttrium gettering system. The driver system operates at a repetition rate of 10 Hz and supplies beams to 10 reactor chambers. The plant yield of fusion power then becomes 4000 MWt, corresponding to a net electric output of 1.5 GW. Experimental programs related to HIBLIC-I are described and discussed, including heavy-ion-beam experiments and proposals to probe the lithium curtain with electron beams so as to clarify the key phenomena in the HIBLIC-I cavity. (Nogami, K.)

  5. A penalty guided stochastic fractal search approach for system reliability optimization

    International Nuclear Information System (INIS)

    Mellal, Mohamed Arezki; Zio, Enrico

    2016-01-01

    Modern industry requires components and systems with high reliability levels. In this paper, we address the system reliability optimization problem. A penalty guided stochastic fractal search approach is developed for solving reliability allocation, redundancy allocation, and reliability–redundancy allocation problems. Numerical results of ten case studies are presented as benchmark problems for highlighting the superiority of the proposed approach compared to others from literature. - Highlights: • System reliability optimization is investigated. • A penalty guided stochastic fractal search approach is developed. • Results of ten case studies are compared with previously published methods. • Performance of the approach is demonstrated.
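
    The penalty-guided idea can be illustrated generically: infeasible designs are not discarded but scored down in proportion to their constraint violation, so the search can traverse infeasible regions. The sketch below uses a plain random search as a stand-in for the paper's stochastic fractal search; the component reliabilities, costs, budget, and penalty weight are all invented for illustration.

```python
import random

random.seed(0)

R = [0.80, 0.85, 0.90]   # illustrative component reliabilities, one per stage
C = [2.0, 3.0, 1.5]      # illustrative per-component costs
BUDGET = 20.0            # illustrative total cost constraint

def system_reliability(n):
    """Series system of parallel stages: stage i holds n[i] redundant parts."""
    rel = 1.0
    for r, k in zip(R, n):
        rel *= 1.0 - (1.0 - r) ** k
    return rel

def penalized(n):
    """Objective guided by a linear penalty on budget violation."""
    cost = sum(c * k for c, k in zip(C, n))
    return system_reliability(n) - 0.1 * max(0.0, cost - BUDGET)

# Random search over redundancy levels, seeded with the minimal design.
candidates = [[1] * len(R)] + [
    [random.randint(1, 5) for _ in R] for _ in range(2000)
]
best = max(candidates, key=penalized)
```

    Any metaheuristic (the fractal search included) can drive the same penalized objective; only the candidate-generation step changes.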

  6. System ergonomics as an approach to improve human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1988-01-01

    The application of systems techniques to ergonomic problems is called system ergonomics. It enables improvements in human reliability through design measures. The precondition for this is knowledge of how information processing is performed by man and machine. By considering sensory processing, cognitive processing, and motor processing separately, it is possible to form a more exact idea of the system element 'man'. The system element 'machine' is well described by differential equations which allow an ergonomic assessment of its maneuverability. Knowledge of the information processing of man and machine enables a task analysis. This reveals, on the one hand, the human limits depending on the different properties of the task and, on the other hand, suitable ergonomic solution proposals which improve the reliability of the total system. A disadvantage, however, is that the change in human reliability brought about by such measures cannot at present be quantified numerically. (orig.)

  7. Evaluation of reliability assurance approaches to operational nuclear safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1984-01-01

    This report discusses the results of research to evaluate existing and/or recommended safety/reliability assurance activities among nuclear and other high technology industries for potential nuclear industry implementation. Since the Three Mile Island (TMI) accident, there has been increased interest in the use of reliability programs (RP) to assure the performance of nuclear safety systems throughout the plant's lifetime. Recently, several Nuclear Regulatory Commission (NRC) task forces or safety issue review groups have recommended RPs for assuring the continuing safety of nuclear reactor plants. 18 references

  8. Bayesian approach in the power electric systems study of reliability ...

    African Journals Online (AJOL)

    Subsequently, Bayesian methodologies are framed within a broader class of problems, based on the definition of a suitable 'state vector' and of a vector describing the system performances, aiming at the definition and the calculation or estimation of system reliability. The purpose of our work is to establish a useful model ...

  9. Creation of reliable relevance judgments in information retrieval systems evaluation experimentation through crowdsourcing: a review.

    Science.gov (United States)

    Samimi, Parnia; Ravana, Sri Devi

    2014-01-01

    Test collections are used to evaluate information retrieval systems in laboratory-based evaluation experiments. In a classic setting, generating relevance judgments involves human assessors and is a costly and time-consuming task. Researchers and practitioners are still challenged to perform reliable and low-cost evaluations of retrieval systems. Crowdsourcing, as a novel method of data acquisition, is broadly used in many research fields. It has been proven that crowdsourcing is an inexpensive and quick solution, as well as a reliable alternative, for creating relevance judgments. One of the applications of crowdsourcing in IR is to judge the relevance of query-document pairs. In order to have a successful crowdsourcing experiment, the relevance judgment tasks should be designed precisely, with an emphasis on quality control. This paper explores different factors that influence the accuracy of relevance judgments produced by workers and how to strengthen the reliability of judgments in a crowdsourcing experiment.

  10. Experimental approach to Chernobyl hot particles

    International Nuclear Information System (INIS)

    Tcherkezian, V.; Shkinev, V.; Khitrov, L.; Kolesov, G.

    1994-01-01

    An experimental approach to the investigation of Chernobyl hot particles and some results are presented in this study. Hot particles (HP) were picked out from soil samples collected during the 1986-1990 radiogeochemical expeditions in the contaminated zone (within 30 km of the Nuclear Power Plant). A number of hot particles were studied to estimate their contribution to the total activity, investigate their surface morphology, and determine their size distribution. The hot particles' contribution to the total activity in the 30 km zone was found to be not less than 65%. Investigation of the HP element composition (by neutron activation analysis and EPMA) and radionuclide composition (direct alpha- and gamma-spectrometry, including determination of Pu and Am in HP) revealed certain peculiarities of the HP collected in the vicinity of the damaged Nuclear Power Plant. Some particles were shown to contain uranium and fission products in proportions correlating with those in the partially burnt fuel, which proves their 'fuel' origin. Another part of the HP samples revealed element fractionation as well as the presence of some terrestrial components. (Author)

  11. Reliability of an experimental method to analyse the impact point on a golf ball during putting.

    Science.gov (United States)

    Richardson, Ashley K; Mitchell, Andrew C S; Hughes, Gerwyn

    2015-06-01

    This study aimed to examine the reliability of an experimental method for identifying the location of the impact point on a golf ball during putting. Forty trials were completed using a mechanical putting robot set to reproduce a putt of 3.2 m, with four different putter-ball combinations. After locating the centre of the dimple pattern (centroid), the following variables were tested: distance of the impact point from the centroid, angle of the impact point from the centroid, and distance of the impact point from the centroid derived from the X, Y coordinates. Good to excellent reliability was demonstrated in all impact variables, reflected in very strong relative (ICC = 0.98-1.00) and absolute reliability (SEM% = 0.9-4.3%). The highest SEM% observed was 7% for the angle of the impact point from the centroid. In conclusion, the experimental method was shown to be reliable at locating the centroid of a golf ball, therefore allowing identification of the point of impact with the putter head, and is suitable for use in subsequent studies.
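
    The absolute-reliability figure quoted (SEM%) follows from the standard relation SEM = SD × sqrt(1 − ICC), expressed as a percentage of the mean. A minimal sketch with made-up trial values (not the study's data):

```python
import math
import statistics

def sem_percent(measurements, icc):
    """Standard error of measurement as a percentage of the mean.

    SEM = SD * sqrt(1 - ICC); SEM% = 100 * SEM / mean."""
    sd = statistics.stdev(measurements)      # sample standard deviation
    mean = statistics.fmean(measurements)
    sem = sd * math.sqrt(1.0 - icc)
    return 100.0 * sem / mean

# Illustrative repeated impact-distance readings (mm), not the study's data.
trials = [3.1, 3.3, 3.0, 3.2, 3.4]
sem_pct = sem_percent(trials, 0.98)          # ICC in the paper's reported range
```

    A high ICC shrinks the SEM toward zero, which is why ICCs of 0.98-1.00 coincide with the small SEM% values reported above.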

  12. Using Bayesian belief networks for reliability management : construction and evaluation: a step by step approach

    NARCIS (Netherlands)

    Houben, M.J.H.A.

    2010-01-01

    In the capital goods industry, there is a growing need to manage reliability throughout the product development process. A number of trends can be identified that have a strong effect on the way in which reliability prediction and management is approached, i.e.: - The lifecycle costs approach that

  13. The DYLAM approach to systems safety and reliability assessment

    International Nuclear Information System (INIS)

    Amendola, A.

    1988-01-01

    A survey of the principal features and applications of DYLAM (Dynamic Logical Analytical Methodology) is presented. Its basic principles can be summarized as follows: after a particular modelling of the component states, computerized heuristic procedures generate stochastic configurations of the system, while the resulting physical processes are simultaneously simulated, both to account for possible interactions between the physics and the states and to search for dangerous system configurations and their probabilities. The association of probabilistic techniques for describing the states with physical equations for describing the process results in a very powerful tool for the safety and reliability assessment of systems potentially subject to dangerous accidental transients. A comprehensive picture of DYLAM's capability for manifold applications can be obtained from the review of the case studies analyzed (LMFBR core accident, systems reliability assessment, accident simulation, man-machine interaction analysis, chemical reactor safety, etc.)

  14. Monolithic QCL design approaches for improved reliability and affordability

    Science.gov (United States)

    Law, K. K.

    2013-12-01

    Many advances have been made recently in mid-wave infrared and long-wave infrared quantum cascade lasers (QCLs) technologies, and there is an increasing demand for these laser sources for ever expanding Naval, DoD and homeland security applications. We will discuss in this paper a portfolio of various Naval Air Warfare Weapons Division's current and future small business innovative research programs and efforts on significantly improving QCLs' performance, affordability, and reliability.

  15. A quantitative approach to wind farm diversification and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Degeilh, Yannick; Singh, Chanan [Department of Electrical and Computer Engineering, Texas A and M University, College Station, TX 77843 (United States)

    2011-02-15

    This paper proposes a general planning method to minimize the variance of aggregated wind farm power output by optimally distributing a predetermined number of wind turbines over a preselected number of potential wind farming sites. The objective is to facilitate high wind power penetration through the search for steadier overall power output. Another optimization formulation that takes into account the correlations between wind power outputs and load is also presented. Three years of wind data from the recent NREL/3TIER study in the western US provides the statistics for evaluating each site upon their mean power output, variance and correlation with each other so that the best allocations can be determined. The reliability study reported in this paper investigates the impact of wind power output variance reduction on a power system composed of a virtual wind power plant and a load modeled from the 1996 IEEE RTS. Some traditional reliability indices such as the LOLP are calculated and it is eventually shown that configurations featuring minimal global power output variances generally prove the most reliable provided the sites are not significantly correlated with the modeled load. Consequently, the choice of uncorrelated/negatively correlated sites is favored. (author)
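
    The paper's multi-site allocation generalizes a familiar closed form: for two sites, the variance-minimizing split of turbines mirrors the two-asset minimum-variance portfolio. The sketch below, with assumed per-site output standard deviations and correlation, is illustrative only and is not the authors' formulation.

```python
import math

def min_variance_split(sd1, sd2, rho):
    """Fraction of capacity at site 1 minimizing Var(w*P1 + (1-w)*P2).

    Classic two-asset minimum-variance result, clamped to [0, 1]."""
    cov = rho * sd1 * sd2
    denom = sd1 ** 2 + sd2 ** 2 - 2.0 * cov
    if denom == 0.0:
        return 0.5
    return min(1.0, max(0.0, (sd2 ** 2 - cov) / denom))

def combined_sd(w, sd1, sd2, rho):
    """Standard deviation of the aggregated output for a given split w."""
    var = (w * sd1) ** 2 + ((1 - w) * sd2) ** 2 \
        + 2.0 * w * (1 - w) * rho * sd1 * sd2
    return math.sqrt(var)

# Two equally variable sites with negative correlation favor an even split.
w_star = min_variance_split(10.0, 10.0, -0.5)
```

    With many sites the same logic becomes a quadratic program over the full covariance matrix, which is essentially the optimization the paper sets up.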

  16. A quantitative approach to wind farm diversification and reliability

    International Nuclear Information System (INIS)

    Degeilh, Yannick; Singh, Chanan

    2011-01-01

    This paper proposes a general planning method to minimize the variance of aggregated wind farm power output by optimally distributing a predetermined number of wind turbines over a preselected number of potential wind farming sites. The objective is to facilitate high wind power penetration through the search for steadier overall power output. Another optimization formulation that takes into account the correlations between wind power outputs and load is also presented. Three years of wind data from the recent NREL/3TIER study in the western US provides the statistics for evaluating each site upon their mean power output, variance and correlation with each other so that the best allocations can be determined. The reliability study reported in this paper investigates the impact of wind power output variance reduction on a power system composed of a virtual wind power plant and a load modeled from the 1996 IEEE RTS. Some traditional reliability indices such as the LOLP are calculated and it is eventually shown that configurations featuring minimal global power output variances generally prove the most reliable provided the sites are not significantly correlated with the modeled load. Consequently, the choice of uncorrelated/negatively correlated sites is favored. (author)

  17. Ensemble of different approaches for a reliable person re-identification system

    Directory of Open Access Journals (Sweden)

    Loris Nanni

    2016-07-01

    An ensemble of approaches for reliable person re-identification is proposed in this paper. The proposed ensemble is built by combining widely used person re-identification systems using different color spaces and some variants of state-of-the-art approaches that are proposed in this paper. Different descriptors are tested, and both texture and color features are extracted from the images; the different descriptors are then compared using different distance measures (e.g., the Euclidean distance, angle, and the Jeffrey distance). To improve performance, a method based on skeleton detection, extracted from the depth map, is also applied when the depth map is available. The proposed ensemble is validated on three widely used datasets (CAVIAR4REID, IAS, and VIPeR), keeping the same parameter set of each approach constant across all tests to avoid overfitting and to demonstrate that the proposed system can be considered a general-purpose person re-identification system. Our experimental results show that the proposed system offers significant improvements over baseline approaches. The source code used for the approaches tested in this paper will be available at https://www.dei.unipd.it/node/2357 and http://robotics.dei.unipd.it/reid/.

  18. Some developments in human reliability analysis approaches and tools

    Energy Technology Data Exchange (ETDEWEB)

    Hannaman, G W; Worledge, D H

    1988-01-01

    Since human actions have been recognized as an important contributor to safety of operating plants in most industries, research has been performed to better understand and account for the way operators interact during accidents through the control room and equipment interface. This paper describes the integration of a series of research projects sponsored by the Electric Power Research Institute to strengthen the methods for performing the human reliability analysis portion of the probabilistic safety studies. It focuses on the analytical framework used to guide the analysis, the development of the models for quantifying time-dependent actions, and simulator experiments used to validate the models.

  19. Identification of Black Spots Based on Reliability Approach

    Directory of Open Access Journals (Sweden)

    Ahmadreza Ghaffari

    2013-12-01

    Identifying crash "black spots", "hot spots", or "high-risk" locations is one of the most important and prevalent concerns in traffic safety, and various methods have been devised and presented for solving this issue. In this paper, a new method based on reliability analysis is presented to identify black spots. Reliability analysis provides an ordered framework for considering the probabilistic nature of engineering problems, so crashes, with their probabilistic nature, can be treated within it. In this study, the application of this new method was compared with the commonly implemented Frequency and Empirical Bayesian methods using simulated data. The results indicated that the traditional methods can lead to inconsistent predictions because they disregard the variance of the number of crashes at each site and depend only on the mean of the data.

  20. Fault detection and reliability, knowledge based and other approaches

    International Nuclear Information System (INIS)

    Singh, M.G.; Hindi, K.S.; Tzafestas, S.G.

    1987-01-01

    These proceedings are split up into four major parts in order to reflect the most significant aspects of reliability and fault detection as viewed at present. The first part deals with knowledge-based systems and comprises eleven contributions from leading experts in the field. The emphasis here is primarily on the use of artificial intelligence, expert systems and other knowledge-based systems for fault detection and reliability. The second part is devoted to fault detection of technological systems and comprises thirteen contributions dealing with applications of fault detection techniques to various technological systems such as gas networks, electric power systems, nuclear reactors and assembly cells. The third part of the proceedings, which consists of seven contributions, treats robust, fault tolerant and intelligent controllers and covers methodological issues as well as several applications ranging from nuclear power plants to industrial robots to steel grinding. The fourth part treats fault tolerant digital techniques and comprises five contributions. Two papers, one on reactor noise analysis, the other on reactor control system design, are indexed separately. (author)

  1. Fuzzy Goal Programming Approach in Selective Maintenance Reliability Model

    Directory of Open Access Journals (Sweden)

    Neha Gupta

    2013-12-01

    In the present paper, we have considered the allocation problem of repairable components for a parallel-series system as a multi-objective optimization problem and have discussed two different models. In the first model, the reliabilities of the subsystems are considered as different objectives. In the second model, the cost and time spent on repairing the components are considered as two different objectives. These two models are formulated as multi-objective nonlinear programming problems (MONLPP), and a fuzzy goal programming method is used to work out the compromise allocation in a multi-objective selective maintenance reliability model: we define the membership functions of each objective function, transform them into equivalent linear membership functions by first-order Taylor series, and finally, by forming a fuzzy goal programming model, obtain a desired compromise allocation of maintenance components. A numerical example is also worked out to illustrate the computational details of the method.

  2. A priori and a posteriori approaches in human reliability

    International Nuclear Information System (INIS)

    Griffon-Fouco, M.; Gagnolet, P.

    1981-09-01

    The French atomic energy commission (CEA) and the French electric power utility (EDF) are conducting joint studies on human factors in nuclear safety. This paper deals with these studies, which combine two approaches: - An a posteriori approach, to determine the rate of human errors and their causes: an analysis of incident data banks and an analysis of human errors on a simulator are presented. - An a priori approach, to identify the potential factors behind human errors: an analysis of control room design and an analysis of the writing of procedures are presented. The possibility of combining these two approaches to prevent and quantify human errors is discussed

  3. Sensitivity based reduced approaches for structural reliability analysis

    Indian Academy of Sciences (India)

    captured by a safety-factor based approach due to the intricate nonlinear ... give the accounts of extensive research works which have been done over ... (ii) simulation based methods, for example, importance sampling (Bucher 1988; Mahade-.

  4. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, boundary conditions, etc. Reliability methods measure the structural safety condition and determine the optimal combination of design parameters based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory and optimization, is the most commonly used approach to minimize the structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
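
    As a minimal illustration of how a Bayesian method absorbs incomplete or accumulating test evidence into a reliability estimate, consider a generic conjugate Beta-Binomial update. This is not the paper's formulation; the prior and the test counts are invented.

```python
def beta_update(alpha, beta, survived, failed):
    """Conjugate Beta-Binomial update for a survival probability.

    Each observed survival adds 1 to alpha; each failure adds 1 to beta."""
    return alpha + survived, beta + failed

def beta_mean(alpha, beta):
    """Posterior point estimate of the reliability."""
    return alpha / (alpha + beta)

# Vague Beta(1, 1) prior updated with 48 survivals in 50 illustrative tests.
a, b = beta_update(1.0, 1.0, 48, 2)
posterior_reliability = beta_mean(a, b)
```

    The same update can be applied incrementally as new test batches arrive, which is one way partial information enters an RBDO loop.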

  5. A New Approach to Structural Reliability in Fatigue Failure

    Science.gov (United States)

    1998-03-01

    Authors: Dr. Sia Nemat-Nasser (PI), Dr. Joseph Zarka. Publications: Huang, J., J. Zarka and P. Navidi, "A New Approach in Reliability of Welded Structures," Proceedings of the ASCE

  6. Correlating neutron yield and reliability for selecting experimental parameters for a plasma focus machine

    International Nuclear Information System (INIS)

    Pross, G.

    Possibilities of optimizing focus machines with a given energy content in the sense of high neutron yield and high reliability of the discharges are investigated experimentally. For this purpose, a focus machine of the Mather type with an energy content of 12 kJ was constructed. The following experimental parameters were varied: the material of the insulator in the ignition zone, the structure of the outside electrode, the length of the inside electrode, the filling pressure and the amount and polarity of the battery voltage. An important part of the diagnostic program consists of measurements of the azimuthal and axial current distribution in the accelerator, correlated with short-term photographs of the luminous front as a function of time. The results are given. A functional schematic has been drafted for focus discharge as an aid in extensive optimization of focus machines, combining findings from theory and experiments. The schematic takes into account the multiparameter character of the discharge and clarifies relationships between the experimental parameters and the target variables neutron yield and reliability

  7. A damage mechanics based approach to structural deterioration and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharya, B.; Ellingwood, B. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Civil Engineering]

    1998-02-01

    Structural deterioration often occurs without perceptible manifestation. Continuum damage mechanics defines structural damage in terms of the material microstructure and relates the damage variable to the macroscopic strength or stiffness of the structure. This enables one to predict the state of damage prior to the initiation of a macroscopic flaw and allows one to estimate the residual strength/service life of an existing structure. The accumulation of damage is a dissipative process that is governed by the laws of thermodynamics. Partial differential equations for damage growth in terms of the Helmholtz free energy are derived from fundamental thermodynamical conditions. Closed-form solutions to the equations are obtained under uniaxial loading for ductile deformation damage as a function of plastic strain, for creep damage as a function of time, and for fatigue damage as a function of the number of cycles. The proposed damage growth model is extended into the stochastic domain by considering fluctuations in the free energy, and closed-form solutions of the resulting stochastic differential equation are obtained in each of the three cases mentioned above. A reliability analysis of a ring-stiffened cylindrical steel shell subjected to corrosion, accidental pressure, and temperature is performed.

  8. A damage mechanics based approach to structural deterioration and reliability

    International Nuclear Information System (INIS)

    Bhattcharya, B.; Ellingwood, B.

    1998-02-01

    Structural deterioration often occurs without perceptible manifestation. Continuum damage mechanics defines structural damage in terms of the material microstructure, and relates the damage variable to the macroscopic strength or stiffness of the structure. This enables one to predict the state of damage prior to the initiation of a macroscopic flaw, and allows one to estimate the residual strength/service life of an existing structure. The accumulation of damage is a dissipative process that is governed by the laws of thermodynamics. Partial differential equations for damage growth in terms of the Helmholtz free energy are derived from fundamental thermodynamic conditions. Closed-form solutions to the equations are obtained under uniaxial loading for ductile deformation damage as a function of plastic strain, for creep damage as a function of time, and for fatigue damage as a function of the number of cycles. The proposed damage growth model is extended into the stochastic domain by considering fluctuations in the free energy, and closed-form solutions of the resulting stochastic differential equation are obtained in each of the three cases mentioned above. A reliability analysis of a ring-stiffened cylindrical steel shell subjected to corrosion, accidental pressure, and temperature is performed.

  9. An Integrated Approach to Establish Validity and Reliability of Reading Tests

    Science.gov (United States)

    Razi, Salim

    2012-01-01

    This study presents the processes of developing a reading test and establishing its reliability and validity through an integrative approach, since conventional reliability and validity measures reveal the difficulty of a reading test only superficially. In this respect, analysing the vocabulary frequency of the test is regarded as a more suitable way…

  10. Experimental Research of Reliability of Plant Stress State Detection by Laser-Induced Fluorescence Method

    Directory of Open Access Journals (Sweden)

    Yury Fedotov

    2016-01-01

    Full Text Available Experimental laboratory investigations of the laser-induced fluorescence spectra of watercress and lawn grass were conducted. The fluorescence spectra were excited by a YAG:Nd laser emitting at 532 nm. It was established that the influence of stress caused by mechanical damage, overwatering, and soil pollution is manifested in changes of the spectral shapes. The mean values and confidence intervals for the ratio of the two fluorescence maxima near 685 and 740 nm were estimated. It is shown that the fluorescence ratio can be considered a reliable characteristic of the plant stress state.
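
    As an illustration of the ratio statistic used above, the sketch below computes the two-peak fluorescence ratio and a normal-approximation confidence interval from repeated measurements. The spectrum layout, peak search windows, and numbers are hypothetical, not the authors' processing pipeline:

```python
# Illustrative sketch (not the authors' code): estimate the two-peak
# chlorophyll fluorescence ratio and a ~95% confidence interval
# from repeated measurements.
import math
import statistics

def fluorescence_ratio(spectrum, peak1=685.0, peak2=740.0, window=5.0):
    """Ratio of the maximum intensities found near the two fluorescence peaks.

    `spectrum` maps wavelength (nm) to intensity; the peak positions and
    the search window are assumptions for illustration.
    """
    def peak_max(center):
        return max(v for wl, v in spectrum.items() if abs(wl - center) <= window)
    return peak_max(peak1) / peak_max(peak2)

def mean_with_ci(ratios, z=1.96):
    """Mean and normal-approximation confidence interval of ratio measurements."""
    m = statistics.mean(ratios)
    half = z * statistics.stdev(ratios) / math.sqrt(len(ratios))
    return m, (m - half, m + half)
```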

  11. Reliability and E.M.C. approach to power systems

    International Nuclear Information System (INIS)

    Pantucek, E.

    2012-01-01

    Compliance with the requirements and principles of electromagnetic compatibility is a basic precondition for the interaction of electrical and electronic subsystems in complex installations. Electromagnetic compatibility means not only meeting the complex of requirements for emissions and immunity; it also involves solutions for interconnection, equipotential bonding, shielding and filtering, and, not least, a heuristic multidisciplinary approach to the electrical installation. Respecting the principles of electromagnetic compatibility is a means of guaranteeing the sustainable use of electronic devices in the power system. (Authors)

  12. A double-loop adaptive sampling approach for sensitivity-free dynamic reliability analysis

    International Nuclear Information System (INIS)

    Wang, Zequn; Wang, Pingfeng

    2015-01-01

    Dynamic reliability measures the reliability of an engineered system considering time-variant operation conditions and component deterioration. Due to high computational costs, conducting dynamic reliability analysis at an early system design stage remains challenging. This paper presents a confidence-based meta-modeling approach, referred to as double-loop adaptive sampling (DLAS), for efficient sensitivity-free dynamic reliability analysis. The DLAS builds a Gaussian process (GP) model sequentially to approximate extreme system responses over time, so that Monte Carlo simulation (MCS) can be employed directly to estimate dynamic reliability. A generic confidence measure is developed to evaluate the accuracy of the dynamic reliability estimate obtained with the MCS approach based on the developed GP models. A double-loop adaptive sampling scheme is developed to efficiently update the GP model in a sequential manner, by considering system input variables and time concurrently in two sampling loops. The model updating process using the developed sampling scheme can be terminated once a user-defined confidence target is satisfied. The developed DLAS approach eliminates the computationally expensive sensitivity analysis process and thus substantially improves the efficiency of dynamic reliability analysis. Three case studies are used to demonstrate the efficacy of DLAS for dynamic reliability analysis. - Highlights: • Developed a novel adaptive sampling approach for dynamic reliability analysis. • Developed a new metric to quantify the accuracy of dynamic reliability estimation. • Developed a new sequential sampling scheme to efficiently update surrogate models. • Three case studies were used to demonstrate the efficacy of the new approach. • Case study results showed substantially enhanced efficiency with high accuracy.
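
    The core idea of DLAS (approximate the extreme response over time with a surrogate, then run MCS directly on the surrogate) can be sketched as follows. This is not the authors' algorithm: a cubic polynomial stands in for the Gaussian process, the limit state g(x, t) is a toy assumption, and the adaptive sampling and confidence measure are omitted:

```python
# Sketch of the surrogate-plus-MCS idea behind DLAS (illustrative, not DLAS itself).
import numpy as np

rng = np.random.default_rng(0)

# Toy time-variant limit state: g(x, t) = x - 0.1 t - 0.05 sin t; failure when g < 0.
T_GRID = np.linspace(0.0, 10.0, 201)

def extreme_response(x):
    """'Expensive' inner evaluation: worst (minimum) g over the service period."""
    return float(np.min(x - 0.1 * T_GRID - 0.05 * np.sin(T_GRID)))

# Outer loop: fit a cheap surrogate of the extreme response from a handful of
# expensive evaluations (a cubic polynomial stands in for the GP model).
x_train = np.linspace(-1.0, 4.0, 8)
y_train = np.array([extreme_response(x) for x in x_train])
coeffs = np.polyfit(x_train, y_train, 3)

# Direct MCS on the surrogate: dynamic reliability = P(extreme response > 0)
# for a random input x ~ N(2, 0.5^2).
x_mc = rng.normal(loc=2.0, scale=0.5, size=100_000)
reliability = float(np.mean(np.polyval(coeffs, x_mc) > 0.0))
```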

  13. Computational and Experimental Approaches to Visual Aesthetics

    Science.gov (United States)

    Brachmann, Anselm; Redies, Christoph

    2017-01-01

    Aesthetics has been the subject of long-standing debates by philosophers and psychologists alike. In psychology, it is generally agreed that aesthetic experience results from an interaction between perception, cognition, and emotion. By experimental means, this triad has been studied in the field of experimental aesthetics, which aims to gain a better understanding of how aesthetic experience relates to fundamental principles of human visual perception and brain processes. Recently, researchers in computer vision have also taken an interest in the topic, giving rise to the field of computational aesthetics. With computing hardware and methodology developing at a high pace, the modeling of perceptually relevant aspects of aesthetic stimuli has huge potential. In this review, we present an overview of recent developments in computational aesthetics and how they relate to experimental studies. In the first part, we cover topics such as the prediction of ratings, style and artist identification, as well as computational methods in art history, such as the detection of influences among artists or of forgeries. We also describe currently used computational algorithms, such as classifiers and deep neural networks. In the second part, we summarize results from the field of experimental aesthetics and cover several isolated image properties that are believed to have an effect on the aesthetic appeal of visual stimuli. Their relation to each other and to findings from computational aesthetics is discussed. Moreover, we compare the strategies in the two fields of research and suggest that both fields would greatly profit from a joint research effort. We hope to encourage researchers from both disciplines to work more closely together in order to understand visual aesthetics from an integrated point of view. PMID:29184491

  14. A heuristic-based approach for reliability importance assessment of energy producers

    International Nuclear Information System (INIS)

    Akhavein, A.; Fotuhi Firuzabad, M.

    2011-01-01

    Reliability of energy supply is one of the most important aspects of service quality. On the one hand, customers usually have different expectations for service reliability and price. On the other hand, providing different levels of reliability at load points is a challenge for system operators. In order to take reasonable decisions and overcome difficulties in implementing reliability, market players need to know the impacts of their assets on system and load-point reliabilities. One tool to specify the reliability impacts of assets is the criticality or reliability importance measure, by which system components can be ranked based on their effect on reliability. Conventional methods for determining reliability importance are essentially based on risk sensitivity analysis and hence impose a prohibitive calculation burden in large power systems. An approach is proposed in this paper to determine the reliability importance of energy producers from the perspective of consumers or distribution companies in a composite generation and transmission system. In the presented method, while avoiding an immense computational burden, the energy producers are ranked based on their rating, unavailability and impact on power flows in the lines connecting to the considered load points. Study results on the IEEE reliability test system show successful application of the proposed method. - Research highlights: → A required reliability level at load points is a concern in modern power systems. → It is important to assess the reliability importance of energy producers or generators. → Generators can be ranked based on their impacts on power flow to a selected area. → Ranking of generators is an efficient tool to assess their reliability importance.

  15. A hybrid approach to quantify software reliability in nuclear safety systems

    International Nuclear Information System (INIS)

    Arun Babu, P.; Senthil Kumar, C.; Murali, N.

    2012-01-01

    Highlights: ► A novel method to quantify software reliability using software verification and mutation testing in nuclear safety systems. ► Contributing factors that influence the software reliability estimate. ► An approach to help regulators verify the reliability of safety critical software during the licensing process. -- Abstract: Technological advancements have led to the use of computer based systems in safety critical applications. As computer based systems are being introduced in nuclear power plants, effective and efficient methods are needed to ensure dependability and compliance with the high reliability requirements of systems important to safety. Even after several years of research, quantification of software reliability remains a controversial and unresolved issue. Moreover, existing approaches have assumptions and limitations which are not acceptable for safety applications. This paper proposes a theoretical approach combining software verification and mutation testing to quantify software reliability in nuclear safety systems. The theoretical results obtained suggest that software reliability depends on three factors: the test adequacy, the amount of software verification carried out, and the reusability of verified code in the software. The proposed approach may help regulators in licensing computer based safety systems in nuclear reactors.

  16. Rationality and drug use: an experimental approach.

    Science.gov (United States)

    Blondel, Serge; Lohéac, Youenn; Rinaudo, Stéphane

    2007-05-01

    In rational addiction theory, higher discount rates encourage drug use. We test this hypothesis in the general framework of rationality and behaviour under risk. We do so using an experimental design with real monetary incentives. The decisions of 34 drug addicts are compared with those of a control group. The decisions of drug users (DU) are not any less consistent with standard theories of behaviour over time and under risk. Further, there is no difference in the estimated discount rate between drug users and the control group, but the former do appear to be more risk-seeking.

  17. Gas phase reactive collisions, experimental approach

    Directory of Open Access Journals (Sweden)

    Canosa A.

    2012-01-01

    Full Text Available Since 1937, when the first molecule in space was identified, more than 150 molecules have been detected. Understanding the fate of these molecules requires a clear view of their photochemistry and of their reactivity with other partners. It is then crucial to identify the main processes that produce and destroy them. In this chapter, a general view of experimental techniques able to deliver gas phase chemical kinetics data at low and very low temperatures is presented. These techniques apply to the study of reactions between neutral reactants on the one hand and reactions involving charged species on the other.

  18. Reliability Coupled Sensitivity Based Design Approach for Gravity Retaining Walls

    Science.gov (United States)

    Guha Ray, A.; Baidya, D. K.

    2012-09-01

    Sensitivity analysis involving different random variables and different potential failure modes of a gravity retaining wall focuses on the fact that high sensitivity of a particular variable in a particular mode of failure does not necessarily imply a remarkable contribution to the overall failure probability. The present paper aims at identifying a probabilistic risk factor (R_f) for each random variable based on the combined effects of the failure probability (P_f) of each mode of failure of a gravity retaining wall and the sensitivity of each of the random variables to these failure modes. P_f is calculated by Monte Carlo simulation, and the sensitivity analysis of each random variable is carried out by F-test analysis. The structure, redesigned by modifying the original random variables with the risk factors, is safe against all the variations of the random variables. It is observed that R_f for the friction angle of the backfill soil (φ_1) increases and that for the cohesion of the foundation soil (c_2) decreases with an increase in the variation of φ_1, while R_f for the unit weights (γ_1 and γ_2) of both soils and for the friction angle of the foundation soil (φ_2) remains almost constant under variation of the soil properties. The results compare well with some of the existing deterministic and probabilistic methods and are found to be cost-effective. It is seen that if the variation of φ_1 remains within 5 %, a significant reduction in cross-sectional area can be achieved, but if the variation is more than 7-8 %, the structure needs to be modified. Finally, design guidelines for different wall dimensions, based on the present approach, are proposed.
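
    A minimal sketch of the Monte Carlo step for P_f, here for a single sliding failure mode of a generic gravity wall with Rankine active thrust. The geometry, load model, and distribution parameters below are illustrative assumptions, not the paper's wall or its F-test sensitivity procedure:

```python
# Illustrative Monte Carlo estimate of P(FS < 1) for a sliding check
# of a generic gravity wall (all numbers are assumptions).
import math
import random

random.seed(42)

def sliding_fs(phi1_deg, gamma1, phi2_deg, c2, H=5.0, W=150.0, B=3.0):
    """Factor of safety against sliding (Rankine active thrust, per unit length)."""
    ka = math.tan(math.radians(45.0 - phi1_deg / 2.0)) ** 2
    pa = 0.5 * gamma1 * ka * H * H                       # active thrust (kN/m)
    resistance = W * math.tan(math.radians(phi2_deg)) + c2 * B
    return resistance / pa

def monte_carlo_pf(n=100_000):
    """Failure probability P(FS < 1) by direct Monte Carlo simulation."""
    failures = 0
    for _ in range(n):
        phi1 = random.gauss(30.0, 2.0)     # backfill friction angle (deg)
        gamma1 = random.gauss(18.0, 0.9)   # backfill unit weight (kN/m^3)
        phi2 = random.gauss(25.0, 2.0)     # foundation friction angle (deg)
        c2 = random.gauss(15.0, 3.0)       # foundation cohesion (kPa)
        if sliding_fs(phi1, gamma1, phi2, c2) < 1.0:
            failures += 1
    return failures / n

pf = monte_carlo_pf()
```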

  19. Flaw shape reconstruction – an experimental approach

    Directory of Open Access Journals (Sweden)

    Marilena STANCULESCU

    2009-05-01

    Full Text Available Flaws can be classified as acceptable and unacceptable. As a result of nondestructive testing, one makes the Accept/Reject decision regarding the tested product with respect to some acceptability criteria. In order to make the right decision, one should know the shape and dimensions of the flaw. On the other hand, flaws considered to be acceptable develop in time, such that they can become unacceptable. In this case, knowledge of the shape and dimensions of the flaw allows the remaining service life of the product to be determined. For interior flaw shape reconstruction the best procedure is the use of the difference static magnetic field. This is a stationary magnetic field problem, but one complicated by nonlinear media. This paper presents the results of experimental work on test specimens with and without flaws.

  20. Life cycle reliability assessment of new products—A Bayesian model updating approach

    International Nuclear Information System (INIS)

    Peng, Weiwen; Huang, Hong-Zhong; Li, Yanfeng; Zuo, Ming J.; Xie, Min

    2013-01-01

    The rapidly increasing pace and continuously evolving reliability requirements of new products have made life cycle reliability assessment of new products an imperative yet difficult task. While much work has been done to separately estimate the reliability of new products in specific stages, a gap exists in carrying out life cycle reliability assessment throughout all life cycle stages. We present a Bayesian model updating approach (BMUA) for life cycle reliability assessment of new products. Novel features of this approach are the development of Bayesian information toolkits that separately include a “reliability improvement factor” and an “information fusion factor”, which allow the integration of subjective information in a specific life cycle stage and the transition of integrated information between adjacent life cycle stages. They lead to the unique characteristic of the BMUA that information generated throughout the life cycle stages is integrated coherently. To illustrate the approach, an application to the life cycle reliability assessment of a newly developed Gantry Machining Center is shown.

  1. Experimental approach to fission process of actinides

    Energy Technology Data Exchange (ETDEWEB)

    Baba, Hiroshi [Osaka Univ., Toyonaka (Japan). Faculty of Science

    1997-07-01

    From the experimental point of view, it seems that the mechanism of the nuclear fission process remains unsolved even after Bohr and Wheeler's study of 1939. This is especially marked with respect to the mass distribution in asymmetric fission. The energy dependence of the mass distribution can be explained under the assumption of two-mode fission. Further, it was demonstrated that the symmetric and asymmetric fission components have different saddle and scission points. Thus, the presence of a two-mode fission mechanism was confirmed. Here, the transition in the nuclear fission mechanism and its cause were investigated. As the cause of this transition, four plausible candidates (a contribution of multiple-chance fission, disappearance of shell effects, onset of fission following collective excitation due to the GDR, and nuclear phase transition) were examined at an excitation energy of 14.0 MeV, and it was suggested that the transition might be related to phase transition. In addition, the mechanism of nuclear fission at low energy and the multi-mode hypothesis were examined by determining the energies for thermal neutron fission ({sup 233,235}U and {sup 239}Pu) and spontaneous fission ({sup 252}Cf). (M.N.)

  2. A new approach for reliability analysis with time-variant performance characteristics

    International Nuclear Information System (INIS)

    Wang, Zequn; Wang, Pingfeng

    2013-01-01

    Reliability represents the safety level in industrial practice and may vary with time owing to time-variant operating conditions and component deterioration throughout a product's life-cycle. Thus, the capability to perform time-variant reliability analysis is of vital importance in practical engineering applications. This paper presents a new approach, referred to as nested extreme response surface (NERS), that can efficiently tackle the time dependency issue in time-variant reliability analysis and solve such problems by integrating easily with advanced time-independent tools. The key of the NERS approach is to build a nested response surface of time corresponding to the extreme value of the limit state function by employing a Kriging model. To obtain the data for the Kriging model, the efficient global optimization technique is integrated with NERS to extract the extreme time responses of the limit state function for any given system input. An adaptive response prediction and model maturation mechanism is developed based on the mean square error (MSE) to concurrently improve the accuracy and computational efficiency of the proposed approach. With the nested response surface of time, the time-variant reliability analysis can be converted into a time-independent one, and existing advanced reliability analysis methods can be used. Three case studies are used to demonstrate the efficiency and accuracy of the NERS approach.
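
    The extreme-time-response extraction at the heart of NERS can be illustrated in one dimension. Here a golden-section search stands in for the efficient global optimization technique, and the unimodal limit state g(x, t) is a toy assumption:

```python
# Sketch: for a fixed input x, find the worst-over-time response of a toy
# limit state g(x, t). The search and the limit state are illustrative only.
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Golden-section search for the minimiser of a unimodal f on [a, b]."""
    gr = (math.sqrt(5.0) - 1.0) / 2.0
    c, d = b - gr * (b - a), a + gr * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - gr * (b - a)
        else:
            a, c = c, d
            d = a + gr * (b - a)
    return (a + b) / 2.0

def extreme_time_response(x, t_min=0.0, t_max=10.0):
    """Worst (minimum) value of g(x, t) over the time window, and where it occurs."""
    g = lambda t: x + 0.1 * (t - 6.0) ** 2 - 0.3   # unimodal in t, minimum at t = 6
    t_star = golden_section_min(g, t_min, t_max)
    return t_star, g(t_star)
```

With the extreme response in hand, the time-variant problem collapses to a time-independent one over x, as the abstract describes.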

  3. Extending Failure Modes and Effects Analysis Approach for Reliability Analysis at the Software Architecture Design Level

    NARCIS (Netherlands)

    Sözer, Hasan; Tekinerdogan, B.; Aksit, Mehmet; de Lemos, Rogerio; Gacek, Cristina

    2007-01-01

    Several reliability engineering approaches have been proposed to identify and recover from failures. A well-known and mature approach is the Failure Mode and Effect Analysis (FMEA) method that is usually utilized together with Fault Tree Analysis (FTA) to analyze and diagnose the causes of failures.

  4. A rule induction approach to improve Monte Carlo system reliability assessment

    International Nuclear Information System (INIS)

    Rocco S, Claudio M.

    2003-01-01

    A Decision Tree (DT) approach to build empirical models for use in Monte Carlo reliability evaluation is presented. The main idea is to develop an estimation algorithm, by training a model on a restricted data set, and replacing the Evaluation Function (EF) by a simpler calculation, which provides reasonably accurate model outputs. The proposed approach is illustrated with two systems of different size, represented by their equivalent networks. The robustness of the DT approach as an approximated method to replace the EF is also analysed. Excellent system reliability results are obtained by training a DT with a small amount of information
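
    A minimal sketch of the Monte Carlo loop, with the exact Evaluation Function (an s-t connectivity check on the classic bridge network) that a trained decision tree would replace. The network and the edge reliability p = 0.9 are assumptions for illustration:

```python
# Illustrative Monte Carlo network reliability estimate; the exact EF below
# is the costly step a trained decision tree would approximate.
import random

random.seed(7)

# Classic two-terminal "bridge" network: nodes 0 (source) .. 3 (sink).
EDGES = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]

def system_works(up):
    """Evaluation Function: s-t connectivity over the surviving edges."""
    adj = {n: [] for n in range(4)}
    for (u, v), ok in zip(EDGES, up):
        if ok:
            adj[u].append(v)
            adj[v].append(u)
    seen, stack = {0}, [0]
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return 3 in seen

def mc_reliability(p=0.9, n=50_000):
    """Crude Monte Carlo estimate with i.i.d. edge reliability p."""
    hits = sum(system_works([random.random() < p for _ in EDGES]) for _ in range(n))
    return hits / n

r_hat = mc_reliability()
```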

  5. Pitting corrosion and structural reliability of corroding RC structures: Experimental data and probabilistic analysis

    International Nuclear Information System (INIS)

    Stewart, Mark G.; Al-Harthy, Ali

    2008-01-01

    A stochastic analysis is developed to assess the temporal and spatial variability of pitting corrosion on the reliability of corroding reinforced concrete (RC) structures. The structure considered herein is a singly reinforced RC beam with Y16 or Y27 reinforcing bars. Experimental data obtained from corrosion tests are used to characterise the probability distribution of pit depth. The RC beam is discretised into a series of small elements and maximum pit depths are generated for each reinforcing steel bar in each element. The loss of cross-sectional area, reduction in yield strength and reduction in flexural resistance are then inferred. The analysis considers various member spans, loading ratios, bar diameters and numbers of bars in a given cross-section, and moment diagrams. It was found that the maximum corrosion loss in a reinforcing bar conditional on beam collapse was no more than 16%. The probabilities of failure considering spatial variability of pitting corrosion were up to 200% higher than probabilities of failure obtained from a non-spatial analysis after 50 years of corrosion. This shows the importance of considering spatial variability in a structural reliability analysis for deteriorating structures, particularly for corroding RC beams in flexure
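
    The spatial-variability effect described above can be sketched as follows: discretising the member into more elements exposes deeper worst-case pits and raises the failure probability. The Gumbel pit-depth parameters, the hemispherical pit-loss geometry, and the 75% capacity threshold below are illustrative assumptions, not the paper's calibrated model:

```python
# Illustrative sketch of spatial vs non-spatial pitting-corrosion reliability.
import math
import random

random.seed(1)

D0 = 16.0  # bar diameter (mm), matching the Y16 bars in the record

def gumbel_max_pit(mu=3.0, beta=1.0):
    """Sample a maximum pit depth (mm) from a Gumbel (extreme value) distribution."""
    u = random.random()
    return mu - beta * math.log(-math.log(u))

def remaining_area_fraction(pit_depth, d=D0):
    """Crude hemispherical-pit model: fraction of the bar cross-section remaining."""
    a_pit = math.pi * pit_depth ** 2 / 2.0
    a0 = math.pi * d ** 2 / 4.0
    return max(0.0, 1.0 - a_pit / a0)

def beam_pf(n_elements, threshold=0.75, n_sim=20_000):
    """P(the worst element falls below `threshold` of its original capacity)."""
    failures = 0
    for _ in range(n_sim):
        worst = min(remaining_area_fraction(gumbel_max_pit())
                    for _ in range(n_elements))
        if worst < threshold:
            failures += 1
    return failures / n_sim

# More elements -> deeper expected worst pit -> higher failure probability.
pf_non_spatial = beam_pf(1)
pf_spatial = beam_pf(20)
```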

  6. A novel ontology approach to support design for reliability considering environmental effects.

    Science.gov (United States)

    Sun, Bo; Li, Yu; Ye, Tianyuan; Ren, Yi

    2015-01-01

    Environmental effects are not considered sufficiently in product design, and reliability problems caused by environmental effects are very prominent. This paper proposes a method for applying an ontology approach in product design so that, during product reliability design and analysis, knowledge of environmental effects can be reused. First, the relationship between environmental effects and product reliability is analyzed. Then an environmental effects ontology is designed to describe the domain knowledge of environmental effects, and the related concepts are formally defined using the ontology approach. This model can be applied to organize environmental effects knowledge for different environments. Finally, rubber seals used in a subhumid acid-rain environment are taken as an example to illustrate the application of the ontological model to reliability design and analysis.

  7. A structural approach to constructing perspective efficient and reliable human-computer interfaces

    International Nuclear Information System (INIS)

    Balint, L.

    1989-01-01

    The principles of human-computer interface (HCI) realizations are investigated with the aim of getting closer to a general framework and thus to a more or less solid background for constructing perspective efficient, reliable and cost-effective human-computer interfaces. On the basis of characterizing and classifying the different HCI solutions, the fundamental problems of interface construction are pointed out, especially with respect to possibilities of human error. The evolution of HCI realizations is illustrated by summarizing the main properties of past, present and foreseeable future interface generations. HCI modeling is pointed out to be a crucial problem in theoretical and practical investigations. Suggestions are presented concerning HCI structure (hierarchy and modularity), HCI functional dynamics (mapping from input to output information), minimization of system failures caused by human error (error-tolerance, error-recovery and error-correction), as well as a cost-effective HCI design and realization methodology (universal and application-oriented vs. application-specific solutions). The concept of RISC-based and SCAMP-type HCI components is introduced with the aim of having a reduced interaction scheme in communication and a well defined architecture in the internal structure of HCI components. HCI efficiency and reliability are dealt with by taking into account complexity and flexibility. The application of fast computerized prototyping is also briefly investigated as an experimental means of achieving simple, parametrized, invariant HCI models. Finally, a concise outline of an approach to constructing ideal HCIs is suggested, emphasizing the open questions and the need for future work related to the proposals. (author). 14 refs, 6 figs

  8. Alternative approach to automated management of load flow in engineering networks considering functional reliability

    Directory of Open Access Journals (Sweden)

    Ирина Александровна Гавриленко

    2016-02-01

    Full Text Available An approach to automated management of load flow in engineering networks considering functional reliability is proposed in the article. The improvement of the concept of operational and strategic management of load flow in engineering networks is considered. The verbal statement of the problem for the thesis research is defined, namely the development of information technology for exact calculation of the functional reliability of a network, that is, the risk of undersupply of the purpose-oriented product to consumers.

  9. Development of a Reliability Program approach to assuring operational nuclear safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1985-01-01

    A Reliability Program (RP) model based on proven reliability techniques used in other high technology industries is being formulated for potential application in the nuclear power industry. Research findings are discussed. The reliability methods employed under NASA and military direction, commercial airline and related FAA programs were surveyed with several reliability concepts (e.g., quantitative reliability goals, reliability centered maintenance) appearing to be directly transferable. Other tasks in the RP development effort involved the benchmarking and evaluation of the existing nuclear regulations and practices relevant to safety/reliability integration. A review of current risk-dominant issues was also conducted using results from existing probabilistic risk assessment studies. The ongoing RP development tasks have concentrated on defining a RP for the operating phase of a nuclear plant's lifecycle. The RP approach incorporates safety systems risk/reliability analysis and performance monitoring activities with dedicated tasks that integrate these activities with operating, surveillance, and maintenance of the plant. The detection, root-cause evaluation and before-the-fact correction of incipient or actual systems failures as a mechanism for maintaining plant safety is a major objective of the RP

  10. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    DEFF Research Database (Denmark)

    Chen, B. H.; Micheletti, M.; Baganz, F.

    2009-01-01

    Reliable models of enzyme kinetics are required for the effective design of bioconversion processes. Kinetic expressions for the enzyme-catalysed reaction rate, however, are frequently complex, and establishing accurate values of the kinetic parameters normally requires a large number of experiments. These can be both time consuming and expensive when working with the types of non-natural chiral intermediates important in pharmaceutical syntheses. This paper presents an automated microscale approach to the rapid and cost-effective generation of reliable kinetic models useful for bioconversion process…-erythrulose. Experiments were performed using automated microwell studies at the 150 or 800 μL scale. The derived kinetic parameters were then verified in a second round of experiments, where model predictions showed excellent agreement with experimental data obtained under conditions not included in the original…

  11. Mechanical behaviour of the heel pad: experimental and numerical approach

    DEFF Research Database (Denmark)

    Matteoli, Sara; Fontanella, C. G.; Virga, A.

    The aim of the present work was to investigate the stress relaxation phenomena of the heel pad region under different loading conditions. A 31-year-old healthy female was enrolled in this study and her left foot underwent both MRI and experimental compression tests. Experimental results were compared with those obtained from finite element analysis performed on a numerical 3D subject-specific heel pad model built on the basis of the MRI. The calcaneal fat pad tissue was described with a visco-hyperelastic model, while a fiber-reinforced hyperelastic model was formulated for the skin. The reliability…

  12. An integrated approach to estimate storage reliability with initial failures based on E-Bayesian estimates

    International Nuclear Information System (INIS)

    Zhang, Yongjin; Zhao, Ming; Zhang, Shitao; Wang, Jiamei; Zhang, Yanjun

    2017-01-01

    Highlights: • An integrated approach to estimate the storage reliability is proposed. • A non-parametric measure to estimate the number of failures and the reliability at each testing time is presented. • An E-Bayesian method to estimate the failure probability is introduced. • The possible initial failures in storage are introduced. • The non-parametric estimates of failure numbers can be used in the parametric models.
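
    The E-Bayesian step can be sketched for a binomial storage-test model: with r failures observed in n tests and a Beta(a, b) prior, the Bayes estimate of the failure probability is (r + a)/(n + a + b), and the E-Bayesian estimate averages it over a hyper-prior on (a, b). The uniform hyper-prior ranges below are assumptions for illustration:

```python
# Illustrative E-Bayesian estimate of a failure probability (binomial model).
def bayes_estimate(r, n, a, b):
    """Posterior mean of the failure probability p with a Beta(a, b) prior,
    given r failures observed in n storage tests."""
    return (r + a) / (n + a + b)

def e_bayes_estimate(r, n, c=4.0, grid=200):
    """E-Bayesian estimate: the Bayes estimate averaged over uniform
    hyper-priors a ~ U(0, 1) and b ~ U(1, c), via the midpoint rule."""
    total = 0.0
    for i in range(grid):
        a = (i + 0.5) / grid                         # midpoint of the i-th a-cell
        for j in range(grid):
            b = 1.0 + (c - 1.0) * (j + 0.5) / grid   # midpoint of the j-th b-cell
            total += bayes_estimate(r, n, a, b)
    return total / (grid * grid)
```

Note that the estimate stays strictly positive even with zero observed failures, which is the point of the Bayesian treatment for highly reliable stored items.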

  13. Reliability-redundancy optimization by means of a chaotic differential evolution approach

    International Nuclear Information System (INIS)

    Coelho, Leandro dos Santos

    2009-01-01

    The reliability design is related to the performance analysis of many engineering systems. Reliability-redundancy optimization problems involve the selection of components with multiple choices and redundancy levels that produce maximum benefit, subject to cost, weight, and volume constraints. Classical mathematical methods have failed in handling nonconvexities and nonsmoothness in optimization problems. As an alternative to the classical optimization approaches, meta-heuristics have been given much attention by many researchers due to their ability to find an almost global optimal solution in reliability-redundancy optimization problems. Evolutionary algorithms (EAs), paradigms of the evolutionary computation field, are stochastic and robust meta-heuristics useful for solving reliability-redundancy optimization problems. EAs such as genetic algorithms, evolutionary programming, evolution strategies and differential evolution are being used to find global or near-global optimal solutions. A differential evolution approach based on chaotic sequences using Lozi's map for reliability-redundancy optimization problems is proposed in this paper. The proposed method has a fast convergence rate but also maintains the diversity of the population so as to escape from local optima. An application example in reliability-redundancy optimization based on the overspeed protection system of a gas turbine is given to show its usefulness and efficiency. Simulation results show that the application of deterministic chaotic sequences instead of random sequences is a possible strategy to improve the performance of differential evolution.
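
    A bare-bones sketch of the chaotic ingredient: the Lozi map (with the classical parameters a = 1.7, b = 0.5) generates a deterministic sequence, rescaled to (0, 1), that replaces uniform random numbers in a minimal DE/rand/1/bin loop. The sphere objective is a toy assumption, not the paper's gas-turbine overspeed example:

```python
# Illustrative Lozi-map-driven differential evolution (not the authors' code).
import itertools

def lozi_sequence(n, a=1.7, b=0.5, x0=0.1, y0=0.1):
    """Chaotic sequence from the Lozi map, rescaled to [0, 1] as a drop-in
    replacement for uniform random numbers."""
    xs = []
    x, y = x0, y0
    for _ in range(n):
        x, y = 1.0 - a * abs(x) + y, b * x
        xs.append(x)
    lo, hi = min(xs), max(xs)
    return [(v - lo) / (hi - lo) for v in xs]

def chaotic_de(f, bounds, pop=20, gens=80, F=0.7, CR=0.9):
    """Minimal DE/rand/1/bin in which every 'random' draw comes from the Lozi map."""
    nxt = itertools.cycle(lozi_sequence(10_000)).__next__
    dim = len(bounds)
    X = [[lo + nxt() * (hi - lo) for lo, hi in bounds] for _ in range(pop)]
    fit = [f(x) for x in X]
    for _ in range(gens):
        for i in range(pop):
            r1, r2, r3 = (int(nxt() * pop) % pop for _ in range(3))
            # Mutation + binomial crossover, then clipping to the bounds.
            trial = [X[r1][d] + F * (X[r2][d] - X[r3][d]) if nxt() < CR else X[i][d]
                     for d in range(dim)]
            trial = [min(max(v, lo), hi) for v, (lo, hi) in zip(trial, bounds)]
            ft = f(trial)
            if ft < fit[i]:          # greedy one-to-one selection
                X[i], fit[i] = trial, ft
    best = min(range(pop), key=fit.__getitem__)
    return X[best], fit[best]
```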

  14. Managing Cybersecurity Research and Experimental Development: The REVO Approach

    OpenAIRE

    Dan Craigen; Drew Vandeth; D’Arcy Walsh

    2013-01-01

    We present a systematic approach for managing a research and experimental development cybersecurity program that must be responsive to continuously evolving cybersecurity and other operational concerns. The approach will be of interest to research-program managers, academe, corporate leads, government leads, chief information officers, chief technology officers, and social and technology policy analysts. The approach is compatible with international standards and procedures published by the...

  15. Reliability model analysis and primary experimental evaluation of laser triggered pulse trigger

    International Nuclear Information System (INIS)

    Chen Debiao; Yang Xinglin; Li Yuan; Li Jin

    2012-01-01

    A high performance pulse trigger can enhance the performance and stability of the PPS. It is necessary to evaluate the reliability of the LTGS pulse trigger, so we establish a reliability analysis model of this pulse trigger based on the CARMES software; the reliability evaluation accords with the statistical results. (authors)

  16. Application of safety and reliability approaches in the power sector: Inside-sectoral overview

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    This chapter summarizes the state-of-the-art and state-of-practice on the applications of safety and reliability approaches in the power sector. The nature and composition of this industrial sector, including the characteristics of major hazards, are summarized. The present situation with regard to a number of key technical aspects involved in the use of safety and reliability approaches in the power sector is discussed. Based on this review, a Technology Maturity Matrix is synthesized. Barriers to the wider use of risk and reliability methods in the design and operation of power installations are identified, and possible ways of overcoming these barriers are suggested. Key issues and priorities for research are identified.

  17. Semiconductor laser engineering, reliability and diagnostics a practical approach to high power and single mode devices

    CERN Document Server

    Epperlein, Peter W

    2013-01-01

    This reference book provides a fully integrated novel approach to the development of high-power, single-transverse mode, edge-emitting diode lasers by addressing the complementary topics of device engineering, reliability engineering and device diagnostics in the same book, and thus closes the gap in the current book literature. Diode laser fundamentals are discussed, followed by an elaborate discussion of problem-oriented design guidelines and techniques, and by a systematic treatment of the origins of laser degradation and a thorough exploration of the engineering means to enhance the optical strength of the laser. Stability criteria of critical laser characteristics and key laser robustness factors are discussed along with clear design considerations in the context of reliability engineering approaches and models, and typical programs for reliability tests and laser product qualifications. Novel, advanced diagnostic methods are reviewed to discuss, for the first time in detail in book literature, performa...

  18. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    can be used to understand how an amino acid change affects the protein. The experimental methods that provide the most detailed structural information on proteins are X-ray crystallography and NMR spectroscopy. However, these methods are labor intensive and currently cannot be carried out on a genomic scale. Nonetheless, Structural Genomics projects are being pursued by more than a dozen groups and consortia worldwide and as a result the number of experimentally determined structures is rising exponentially. Based on the expectation that protein structures will continue to be determined at an ever-increasing rate, reliable structure prediction schemes will become increasingly valuable, leading to information on protein function and disease for many different proteins. Given known genetic variability and experimentally determined protein structures, can we accurately predict the effects of single amino acid substitutions? An objective assessment of this question would involve comparing predicted and experimentally determined structures, which thus far has not been rigorously performed. The completed research leveraged existing expertise at LLNL in computational and structural biology, as well as significant computing resources, to address this question.

  19. reliability reliability

    African Journals Online (AJOL)

    eobe

    Corresponding author, Tel: +234-703. RELIABILITY .... V , , given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory,. Technical ...

  20. Reliability and vulnerability analyses of critical infrastructures: Comparing two approaches in the context of power systems

    International Nuclear Information System (INIS)

    Johansson, Jonas; Hassel, Henrik; Zio, Enrico

    2013-01-01

    Society depends on services provided by critical infrastructures, and hence it is important that they are reliable and robust. Two main approaches for gaining the knowledge required for designing and improving critical infrastructures are reliability analysis and vulnerability analysis. The former analyses the ability of the system to perform its intended function; the latter analyses its inability to withstand strains and the effects of the consequent failures. The two approaches have similarities but also some differences with respect to what type of information they generate about the system. In this view, the main purpose of this paper is to discuss and contrast these approaches. To strengthen the discussion and exemplify its findings, a Monte Carlo-based reliability analysis and a vulnerability analysis are considered in their application to a relatively simple, but representative, system: the IEEE RTS96 electric power test system. The exemplification reveals that reliability analysis provides a good picture of the system's likely behaviour, but fails to capture a large portion of the high consequence scenarios, which are instead captured in the vulnerability analysis. Although these scenarios might be estimated to have small probabilities of occurrence, they should be identified, considered and treated cautiously, as probabilistic analyses should not be the only input to decision-making for the design and protection of critical infrastructures. The general conclusion that can be drawn from the findings of the example is that vulnerability analysis should be used to complement reliability studies, as well as other forms of probabilistic risk analysis. Measures should be sought for reducing both the vulnerability, i.e. improving the system's ability to withstand strains and stresses, and the reliability, i.e. improving the likely behaviour

  1. Experimental approach to high power long duration neutral beams

    International Nuclear Information System (INIS)

    Horiike, Hiroshi

    1981-12-01

    Experimental studies of ion sources and beam dumps for the development of a high power long duration neutral beam injector for JT-60 are presented. Long pulse operation of high power beams requires a high degree of reliability. To develop a reliable ion source with a large extraction area, a new duoPIGatron ion source with a coaxially shaped intermediate electrode is proposed and tested. The magnetic configuration is examined numerically to obtain a high current arc discharge and a source plasma with small density variation. Experimental results show that primary electrons were fed widely from the cathode plasma region to the source plasma region and that a dense, uniform source plasma could be obtained easily. Source plasma characteristics are studied, and a comparison with other sources is also described. To develop the extraction electrode of a high power ion source, experimental studies were made on the cooling of the electrode. Long pulse beams were extracted safely under the condition of high heat loading on the electrode. Finally, a burnout study for the development of high power beam dumps is presented. Burnout data were obtained from subcooled forced-convective boiling of water in a copper finned tube irradiated by high power ion beams. The results yield simple burnout correlations which can be used for the prediction of the burnout heat flux of the beam dump. (author)

  2. A Data-Driven Reliability Estimation Approach for Phased-Mission Systems

    Directory of Open Access Journals (Sweden)

    Hua-Feng He

    2014-01-01

    Full Text Available We attempt to address the issues associated with reliability estimation for phased-mission systems (PMS) and present a novel data-driven approach to reliability estimation for PMS using condition monitoring information and degradation data of such systems under a dynamic operating scenario. In this sense, this paper differs from the existing methods, which consider only the static scenario without using real-time information and aim to estimate the reliability for a population rather than for an individual. In the presented approach, to establish a linkage between the historical data and real-time information of an individual PMS, we adopt a stochastic filtering model to model the phase duration and obtain an updated estimate of the mission time by Bayes' law at each phase. Meanwhile, the lifetime of the PMS is estimated from degradation data, which are modeled by an adaptive Brownian motion. As such, the mission reliability can be obtained in real time through the estimated distribution of the mission time in conjunction with the estimated lifetime distribution. We demonstrate the usefulness of the developed approach via a numerical example.
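
    The Brownian-motion (Wiener-process) degradation model mentioned above has a closed-form lifetime distribution: the first passage time to a fixed failure threshold is inverse Gaussian. The sketch below, which is not the paper's adaptive filtering scheme and uses simple estimators and illustrative names, fits drift and diffusion from one degradation path and evaluates the resulting reliability.

```python
from math import erf, exp, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def fit_wiener(times, values):
    """Simple drift/diffusion estimates from one degradation path."""
    mu = (values[-1] - values[0]) / (times[-1] - times[0])
    acc = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        dx = values[i] - values[i - 1]
        acc += (dx - mu * dt) ** 2 / dt
    return mu, sqrt(acc / (len(times) - 1))

def mission_reliability(t, w, mu, sigma):
    """R(t) = P(first passage of the Wiener process to threshold w occurs after t).

    Uses the inverse-Gaussian first-passage CDF; note that exp(2*mu*w/sigma**2)
    can overflow for very small sigma.
    """
    s = sigma * sqrt(t)
    cdf = phi((mu * t - w) / s) + exp(2.0 * mu * w / sigma ** 2) * phi(-(mu * t + w) / s)
    return 1.0 - cdf
```

With drift 1, diffusion 1, and threshold 10, the mean lifetime is about 10, so the reliability is near 1 at t = 5 and small at t = 15.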

  3. Reliability improvement in GaN HEMT power device using a field plate approach

    Science.gov (United States)

    Wu, Wen-Hao; Lin, Yueh-Chin; Chin, Ping-Chieh; Hsu, Chia-Chieh; Lee, Jin-Hwa; Liu, Shih-Chien; Maa, Jer-shen; Iwai, Hiroshi; Chang, Edward Yi; Hsu, Heng-Tung

    2017-07-01

    This study investigates the effect of implementing a field plate on a GaN high-electron-mobility transistor (HEMT) to improve power device reliability. The results indicate that the field plate structure reduces the peak electrical field and interface traps in the device, resulting in higher breakdown voltage, lower leakage current, smaller current collapse, and better threshold voltage control. Furthermore, after high voltage stress, steady dynamic on-resistance and gate capacitance degradation improvement were observed for the device with the field plate. This demonstrates that GaN device reliability can be improved by using the field plate approach.

  4. Probability of extreme interference levels computed from reliability approaches: application to transmission lines with uncertain parameters

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper deals with the risk analysis of an EMC fault using a statistical approach based on reliability methods from probabilistic engineering mechanics. A computation of the probability of failure (i.e. the probability of exceeding a threshold) of the current induced by crosstalk is established by taking into account uncertainties on the input parameters that influence the levels of interference in the context of transmission lines. The study has allowed us to evaluate the probability of failure of the induced current by using reliability methods with a relatively low computational cost compared to Monte Carlo simulation. (authors)
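
    The reliability-method computation itself is not reproduced here, but the quantity being estimated, the probability that an induced current exceeds a threshold, can be sketched with the crude Monte Carlo baseline that the paper compares against. The surrogate crosstalk model and the input distributions below are hypothetical stand-ins for the paper's transmission-line model.

```python
import random
from math import sqrt

def prob_of_failure(model, sample_inputs, threshold, n=100_000, seed=42):
    """Crude Monte Carlo estimate of P(model(x) > threshold) with a 95% half-width."""
    random.seed(seed)
    hits = sum(1 for _ in range(n) if model(sample_inputs()) > threshold)
    p = hits / n
    half_width = 1.96 * sqrt(p * (1.0 - p) / n)  # normal approximation
    return p, half_width

# Hypothetical surrogate for the crosstalk-induced current: it grows with
# line height and coupling length (both uncertain).
def induced_current(x):
    height, length = x
    return 2.0 * height * length

def sample():
    return (random.gauss(1.0, 0.1), random.gauss(1.0, 0.1))

p, err = prob_of_failure(induced_current, sample, threshold=2.5)
```

Reliability methods such as FORM reach comparable accuracy with far fewer model evaluations, which is the point made in the abstract; the Monte Carlo half-width above shrinks only as 1/sqrt(n).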

  5. Development of slim-maud: a multi-attribute utility approach to human reliability evaluation

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1984-01-01

    This paper describes further work on the Success Likelihood Index Methodology (SLIM), a procedure for quantitatively evaluating human reliability in nuclear power plants and other systems. SLIM was originally developed by Human Reliability Associates during an earlier contract with Brookhaven National Laboratory (BNL). A further development of SLIM, SLIM-MAUD (Multi-Attribute Utility Decomposition) is also described. This is an extension of the original approach using an interactive, computer-based system. All of the work described in this report was supported by the Human Factors and Safeguards Branch of the US Nuclear Regulatory Commission
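
    SLIM's quantification step can be summarized as: compute a Success Likelihood Index (SLI) as a weighted sum of performance-shaping-factor ratings, then calibrate it to a human error probability (HEP) via log10(HEP) = a*SLI + b using two anchor tasks with known HEPs. The sketch below is illustrative; the weights, ratings, and anchor values are invented, not taken from the report.

```python
from math import log10

def sli(weights, ratings):
    """Success Likelihood Index: weighted sum of PSF ratings (weights sum to 1)."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * r for w, r in zip(weights, ratings))

def calibrate(sli1, hep1, sli2, hep2):
    """Solve log10(HEP) = a*SLI + b from two anchor tasks with known HEPs."""
    a = (log10(hep1) - log10(hep2)) / (sli1 - sli2)
    b = log10(hep1) - a * sli1
    return a, b

def hep(sli_value, a, b):
    """Human error probability for a task with the given SLI."""
    return 10.0 ** (a * sli_value + b)

# Illustrative anchors: a well-supported task (SLI 0.9, HEP 1e-4) and a poorly
# supported one (SLI 0.2, HEP 1e-1).
a, b = calibrate(0.9, 1e-4, 0.2, 1e-1)
task_hep = hep(sli([0.4, 0.3, 0.3], [0.8, 0.5, 0.6]), a, b)
```

The MAUD extension automates the elicitation of the weights and ratings interactively; the arithmetic above is unchanged.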

  6. Experimental approaches for studying non-equilibrium atmospheric plasma jets

    Energy Technology Data Exchange (ETDEWEB)

    Shashurin, A., E-mail: ashashur@purdue.edu [School of Aeronautics & Astronautics, Purdue University, West Lafayette, Indiana 47907 (United States); Keidar, M. [Department of Mechanical and Aerospace Engineering, The George Washington University, Washington, District of Columbia 20052 (United States)

    2015-12-15

    This work reviews recent research efforts undertaken in the area of non-equilibrium atmospheric plasma jets, with a special focus on experimental approaches. The physics of small non-equilibrium atmospheric plasma jets operating in the kHz frequency range at powers of around a few watts will be analyzed, including the mechanism of breakdown, the process of ionization-front propagation, the electrical coupling of the ionization front with the discharge electrodes, the distributions of excited and ionized species, discharge current spreading, the transient dynamics of various plasma parameters, etc. Experimental diagnostic approaches utilized in the field will be considered, including Rayleigh microwave scattering, Thomson laser scattering, electrostatic streamer scatterers, optical emission spectroscopy, fast photographing, etc.

  7. The experimental and shell model approach to 100Sn

    International Nuclear Information System (INIS)

    Grawe, H.; Maier, K.H.; Fitzgerald, J.B.; Heese, J.; Spohr, K.; Schubart, R.; Gorska, M.; Rejmund, M.

    1995-01-01

    The present status of the experimental approach to 100 Sn and its shell model structure is given. New developments in experimental techniques, such as low-background isomer spectroscopy and charged particle detection in 4π, are surveyed. Based on recent experimental data, shell model calculations are used to predict the structure of the single- and two-nucleon neighbours of 100 Sn. The results are compared to the systematics of Coulomb energies and spin-orbit splitting and discussed with respect to future experiments. (author). 51 refs, 11 figs, 1 tab

  8. Reliability of objects in aerospace technologies and beyond: Holistic risk management approach

    Science.gov (United States)

    Shai, Yair; Ingman, D.; Suhir, E.

    A "high level", deductive-reasoning-based ("holistic") approach is aimed at the direct analysis of the behavior of a system as a whole, rather than at an attempt to understand the system's behavior by first conducting a "low level", inductive-reasoning-based analysis of the behavior and contributions of the system's elements. The holistic view of treatment is widely accepted in medical practice, and the "holistic health" concept upholds that all aspects of people's needs (psychological, physical or social) should be seen as a whole, and that a disease is caused by the combined effect of physical, emotional, spiritual, social and environmental imbalances. Holistic reasoning is applied in our analysis to model the behavior of engineering products ("species") subjected to various economic, marketing, and reliability "health" factors. Vehicular products (cars, aircraft, boats, etc.), e.g., might still be robust enough, but could be out-of-date, or functionally obsolete, or their further use might be viewed as unjustifiably expensive. High-level-performance functions (HLPF) are the essential feature of the approach. HLPFs are, in effect, "signatures" of the "species" of interest. The HLPFs describe, in a "holistic", and certainly in a probabilistic, way, numerous complex multi-dependable relations among the representatives of the "species" under consideration. Numerous inter-related "stresses", both actual ("physical") and nonphysical, which affect the probabilistic predictions, are inherently taken into account by the HLPFs. There is no need, and it might even be counter-productive, to conduct tedious, time- and labor-consuming experimentation and to invest a significant amount of time and resources to accumulate "representative statistics" to predict the governing probabilistic characteristics of the system behavior, such as, e.g., the life expectancy of a particular type of product.

  9. Beyond reliability, multi-state failure analysis of satellite subsystems: A statistical approach

    International Nuclear Information System (INIS)

    Castet, Jean-Francois; Saleh, Joseph H.

    2010-01-01

    Reliability is widely recognized as a critical design attribute for space systems. In recent articles, we conducted nonparametric analyses and Weibull fits of satellite and satellite subsystem reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we extend our investigation of failures of satellites and satellite subsystems beyond the binary concept of reliability to the analysis of their anomalies and multi-state failures. In reliability analysis, the system or subsystem under study is considered to be either in an operational or a failed state; multi-state failure analysis introduces 'degraded states' or partial failures, and thus provides more insight, through finer resolution, into the degradation behavior of an item and its progression towards complete failure. The database used for the statistical analysis in the present work identifies five states for each satellite subsystem: three degraded states, one fully operational state, and one failed state (complete failure). Because our dataset is right-censored, we calculate the nonparametric probability of transitioning between states for each satellite subsystem with the Kaplan-Meier estimator, and we derive confidence intervals for each probability of transitioning between states. We then conduct parametric Weibull fits of these probabilities using the Maximum Likelihood Estimation (MLE) approach. After validating the results, we compare the reliability versus multi-state failure analyses of three satellite subsystems: the thruster/fuel; the telemetry, tracking, and control (TTC); and the gyro/sensor/reaction wheel subsystems. The results are particularly revealing of the insights that can be gleaned from multi-state failure analysis and the deficiencies, or blind spots, of the traditional reliability analysis. In addition to the specific results provided here, which should prove particularly useful to the space industry, this work highlights the importance
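
    The Kaplan-Meier estimator mentioned above handles right-censored data by shrinking the at-risk set at each observed time: at every failure time t, the survival estimate is multiplied by (1 - deaths/at-risk). A minimal sketch of that calculation follows; the satellite dataset itself is not reproduced, and the example observations are illustrative.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate; events[i] is 1 for failure, 0 for censoring."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = seen = 0
        while i < len(data) and data[i][0] == t:  # group ties at the same time
            seen += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))  # the curve steps down only at failure times
        n_at_risk -= seen
    return curve

# Toy data: failures at t = 1, 2, 4; censored observations at t = 3 and 5.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
```

Note how the censored unit at t = 3 contributes to the at-risk count before t = 3 but causes no step; this is exactly the mechanism that makes the estimator suitable for a right-censored satellite dataset.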

  10. An efficient particle swarm approach for mixed-integer programming in reliability-redundancy optimization applications

    International Nuclear Information System (INIS)

    Santos Coelho, Leandro dos

    2009-01-01

    The reliability-redundancy optimization problems can involve the selection of components with multiple choices and redundancy levels that produce maximum benefits, and are subject to cost, weight, and volume constraints. Many classical mathematical methods have failed in handling nonconvexities and nonsmoothness in reliability-redundancy optimization problems. As an alternative to the classical optimization approaches, meta-heuristics have been given much attention by many researchers due to their ability to find almost globally optimal solutions. One of these meta-heuristics is particle swarm optimization (PSO). PSO is a population-based heuristic optimization technique inspired by the social behavior of bird flocking and fish schooling. This paper presents an efficient PSO algorithm based on a Gaussian distribution and a chaotic sequence (PSO-GC) to solve reliability-redundancy optimization problems. In this context, two examples of reliability-redundancy design problems are evaluated. Simulation results demonstrate that the proposed PSO-GC is a promising optimization technique. PSO-GC performs well for the two examples of mixed-integer programming in reliability-redundancy applications considered in this paper. The solutions obtained by the PSO-GC are better than the previously best-known solutions available in the recent literature

  11. An efficient particle swarm approach for mixed-integer programming in reliability-redundancy optimization applications

    Energy Technology Data Exchange (ETDEWEB)

    Santos Coelho, Leandro dos [Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR, Imaculada Conceicao, 1155, 80215-901 Curitiba, Parana (Brazil)], E-mail: leandro.coelho@pucpr.br

    2009-04-15

    The reliability-redundancy optimization problems can involve the selection of components with multiple choices and redundancy levels that produce maximum benefits, and are subject to cost, weight, and volume constraints. Many classical mathematical methods have failed in handling nonconvexities and nonsmoothness in reliability-redundancy optimization problems. As an alternative to the classical optimization approaches, meta-heuristics have been given much attention by many researchers due to their ability to find almost globally optimal solutions. One of these meta-heuristics is particle swarm optimization (PSO). PSO is a population-based heuristic optimization technique inspired by the social behavior of bird flocking and fish schooling. This paper presents an efficient PSO algorithm based on a Gaussian distribution and a chaotic sequence (PSO-GC) to solve reliability-redundancy optimization problems. In this context, two examples of reliability-redundancy design problems are evaluated. Simulation results demonstrate that the proposed PSO-GC is a promising optimization technique. PSO-GC performs well for the two examples of mixed-integer programming in reliability-redundancy applications considered in this paper. The solutions obtained by the PSO-GC are better than the previously best-known solutions available in the recent literature.
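
    A PSO variant of this kind replaces some uniform random draws with Gaussian draws and drives the inertia weight with a chaotic sequence. The sketch below is an illustrative reconstruction under stated assumptions: the logistic map, the coefficient values, the inertia range, and the sphere objective are all invented, and the paper's exact PSO-GC formulation and its mixed-integer handling may differ.

```python
import random

def logistic_stream(x=0.48, r=4.0):
    """Chaotic sequence in (0, 1) from the logistic map."""
    while True:
        x = r * x * (1.0 - x)
        yield x

def pso_gc(obj, bounds, pop=30, iters=200, c1=1.5, c2=1.5, seed=3):
    """PSO with Gaussian acceleration draws and a chaotic inertia weight."""
    random.seed(seed)
    dim = len(bounds)
    chaos = logistic_stream()
    X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    V = [[0.0] * dim for _ in range(pop)]
    P = [x[:] for x in X]                      # personal bests
    pf = [obj(x) for x in X]
    gi = min(range(pop), key=lambda k: pf[k])
    G, gf = P[gi][:], pf[gi]                   # global best
    for _ in range(iters):
        w = 0.1 + 0.4 * next(chaos)            # chaotic inertia weight in (0.1, 0.5)
        for i in range(pop):
            for j in range(dim):
                V[i][j] = (w * V[i][j]
                           + c1 * abs(random.gauss(0.0, 1.0)) * (P[i][j] - X[i][j])
                           + c2 * abs(random.gauss(0.0, 1.0)) * (G[j] - X[i][j]))
                lo, hi = bounds[j]
                X[i][j] = min(max(X[i][j] + V[i][j], lo), hi)
            f = obj(X[i])
            if f < pf[i]:
                P[i], pf[i] = X[i][:], f
                if f < gf:
                    G, gf = X[i][:], f
    return G, gf

# Stand-in continuous objective (sphere); the paper's two reliability-redundancy
# design examples are not reproduced here.
best, val = pso_gc(lambda v: sum(t * t for t in v), [(-10.0, 10.0)] * 3)
```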

  12. What Shapes the Intention to Study Abroad? An Experimental Approach

    Science.gov (United States)

    Petzold, Knut; Moog, Petra

    2018-01-01

    In contrast to previous studies, this investigation aims to gain deeper insights into the causes of the intention to study abroad by using an experimental approach. Although international experience is often considered important, many students at German universities do not even consider studying abroad. Referring to the Theory of Rational Choice (RCT)…

  13. Safety, reliability, risk management and human factors: an integrated engineering approach applied to nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Silva, Eliane Magalhaes Pereira da; Costa, Antonio Carlos Lopes da; Reis, Sergio Carneiro dos [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)], e-mail: vasconv@cdtn.br, e-mail: silvaem@cdtn.br, e-mail: aclc@cdtn.br, e-mail: reissc@cdtn.br

    2009-07-01

    Nuclear energy has an important engineering legacy to share with the conventional industry. Much of the development of the tools related to safety, reliability, risk management, and human factors is associated with nuclear plant processes, mainly because of the public concern about nuclear power generation. Despite the close association between these subjects, there are some important differences in approach. The reliability engineering approach uses several techniques to minimize the component failures that cause the failure of complex systems. These techniques include, for instance, redundancy, diversity, standby sparing, safety factors, and reliability centered maintenance. On the other hand, system safety is primarily concerned with hazard management, that is, the identification, evaluation and control of hazards. Rather than just looking at failure rates or engineering strengths, system safety examines the interactions among system components. The events that cause accidents may be complex combinations of component failures, faulty maintenance, design errors, human actions, or actuation of instrumentation and control. System safety thus deals with a broader spectrum of risk management, including: ergonomics, legal requirements, quality control, public acceptance, political considerations, and many other non-technical influences. Taking care of these subjects individually can compromise the completeness of the analysis and the measures associated with both risk reduction and the improvement of safety and reliability. By analyzing together the engineering systems and controls of a nuclear facility, their management systems and operational procedures, and the human factors engineering, many benefits can be realized. This paper proposes an integration of these issues based on the application of systems theory. (author)

  14. Safety, reliability, risk management and human factors: an integrated engineering approach applied to nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Silva, Eliane Magalhaes Pereira da; Costa, Antonio Carlos Lopes da; Reis, Sergio Carneiro dos

    2009-01-01

    Nuclear energy has an important engineering legacy to share with the conventional industry. Much of the development of the tools related to safety, reliability, risk management, and human factors is associated with nuclear plant processes, mainly because of the public concern about nuclear power generation. Despite the close association between these subjects, there are some important differences in approach. The reliability engineering approach uses several techniques to minimize the component failures that cause the failure of complex systems. These techniques include, for instance, redundancy, diversity, standby sparing, safety factors, and reliability centered maintenance. On the other hand, system safety is primarily concerned with hazard management, that is, the identification, evaluation and control of hazards. Rather than just looking at failure rates or engineering strengths, system safety examines the interactions among system components. The events that cause accidents may be complex combinations of component failures, faulty maintenance, design errors, human actions, or actuation of instrumentation and control. System safety thus deals with a broader spectrum of risk management, including: ergonomics, legal requirements, quality control, public acceptance, political considerations, and many other non-technical influences. Taking care of these subjects individually can compromise the completeness of the analysis and the measures associated with both risk reduction and the improvement of safety and reliability. By analyzing together the engineering systems and controls of a nuclear facility, their management systems and operational procedures, and the human factors engineering, many benefits can be realized. This paper proposes an integration of these issues based on the application of systems theory. (author)

  15. Stability of nanofluids: Molecular dynamic approach and experimental study

    International Nuclear Information System (INIS)

    Farzaneh, H.; Behzadmehr, A.; Yaghoubi, M.; Samimi, A.; Sarvari, S.M.H.

    2016-01-01

    Highlights: • Nanofluid stability is investigated and discussed. • A molecular dynamic approach, considering different forces on the nanoparticles, is adopted. • Stability diagrams are presented for different thermo-fluid conditions. • An experimental investigation is carried out to confirm the theoretical approach. - Abstract: Nanofluids as volumetric absorbents in solar energy conversion devices or as working fluids in different heat exchangers have been proposed by various researchers. However, the dispersion stability of nanofluids is an important issue that must be well addressed before any industrial application. Conditions such as a severe temperature gradient, a high temperature of the heat transfer fluid, the nanoparticle mean diameter, and the types of nanoparticles and base fluid are among the most influential parameters for the stability of a nanofluid. A molecular dynamic approach, considering the kinetic energy of nanoparticles and the DLVO potential energy between nanoparticles, is adopted to study nanofluid stability for different nanofluids at different working conditions. Different forces such as Brownian, thermophoresis, drag and DLVO are considered to construct the stability diagrams. The latter present the conditions for which a nanofluid can be stable. In addition, an experimental investigation is carried out to find a stable nanofluid and to show the validity of the theoretical approach. There is good agreement between the experimental and theoretical results, which confirms the validity of our theoretical approach.
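
    The DLVO energy used in such stability analyses is the sum of an attractive van der Waals term and a repulsive electric double-layer term; for two equal spheres at close separation (Derjaguin approximation) a common form is V(h) ≈ -AR/(12h) + 2πεRψ²exp(-κh). The sketch below evaluates this curve and the height of its repulsive barrier in kT units; all material parameters are illustrative, not the paper's.

```python
from math import exp, pi

KB = 1.380649e-23  # Boltzmann constant, J/K

def dlvo_potential(h, radius, hamaker, eps, psi, kappa):
    """Sphere-sphere DLVO energy (J) at surface separation h (m), valid for h << radius."""
    v_vdw = -hamaker * radius / (12.0 * h)                        # van der Waals attraction
    v_edl = 2.0 * pi * eps * radius * psi ** 2 * exp(-kappa * h)  # double-layer repulsion
    return v_vdw + v_edl

def barrier_height_kt(radius, hamaker, eps, psi, kappa, temp=300.0):
    """Height of the DLVO energy barrier in kT units, scanned over 0.01-100 nm."""
    best = float("-inf")
    for i in range(1, 10000):
        h = i * 1e-11  # 0.01 nm steps
        best = max(best, dlvo_potential(h, radius, hamaker, eps, psi, kappa))
    return best / (KB * temp)

# Illustrative parameters (roughly oxide-like particles in water): 25 nm radius,
# Hamaker constant 5e-20 J, water permittivity, 25 mV surface potential,
# 10 nm Debye length.
barrier = barrier_height_kt(25e-9, 5e-20, 6.95e-10, 25e-3, 1e8)
```

A barrier of many kT suggests a kinetically stable dispersion; heating typically lowers ψ and raises particle kinetic energy, which is one route by which thermal conditions degrade stability.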

  16. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    OpenAIRE

    Xiao, Ning-Cong; Li, Yan-Feng; Wang, Zhonglai; Peng, Weiwen; Huang, Hong-Zhong

    2013-01-01

    In this paper, a combination of the maximum entropy method and Bayesian inference for reliability assessment of deteriorating systems is proposed. Due to various uncertainties, scarce data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have been proved to be useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to cal...

  17. Design for Six Sigma: Approach for reliability and low-cost manufacturing.

    Directory of Open Access Journals (Sweden)

    Jesus Gerardo Cruz Alvarez

    2012-07-01

    Full Text Available The aim of this study is to discuss new product development based on a traditional stage-gate process and to examine how new product development [NPD] tools, such as lean design for Six Sigma, can accelerate the achievement of the main goals of NPD: reliable product quality, cost-effective implementation, and desired time-to-market. These new tools must be incorporated into a new approach to NPD based on the Advanced Product and Quality Planning methodology.

  18. Approaches of data combining for reliability assessments with taking into account the priority of data application

    International Nuclear Information System (INIS)

    Zelenyj, O.V.; Pecheritsa, A.V.

    2004-01-01

    Based on the available experience in assessing the risk from Ukrainian NPPs' operational events, as well as on the results of the State review of PSA studies for pilot units, it should be noted that historical information on domestic NPP operation is not always available or properly used in the implementation of these activities. Several approaches for combining available generic and specific information for reliability parameter assessment (taking into account the priority of data application) are briefly described in the article, along with some recommendations on how to apply these approaches.

  19. Different Approaches for Ensuring Performance/Reliability of Plastic Encapsulated Microcircuits (PEMs) in Space Applications

    Science.gov (United States)

    Gerke, R. David; Sandor, Mike; Agarwal, Shri; Moor, Andrew F.; Cooper, Kim A.

    2000-01-01

    Engineers within the commercial and aerospace industries are using trade-off and risk analysis to aid in reducing spacecraft system cost while increasing performance and maintaining high reliability. In many cases, Commercial Off-The-Shelf (COTS) components, which include Plastic Encapsulated Microcircuits (PEMs), are candidate packaging technologies for spacecraft due to their lower cost, lower weight and enhanced functionality. Establishing and implementing a parts program that effectively and reliably makes use of these potentially less reliable, but state-of-the-art, devices has become a significant portion of the job for the parts engineer. Assembling a reliable high-performance electronic system that includes COTS components requires that the end user assume a risk. To minimize the risk involved, companies have developed methodologies by which they use accelerated stress testing to assess the product and reduce the risk to the total system. Currently, there are no industry standard procedures for accomplishing this risk mitigation. This paper will present the approaches for reducing the risk of using PEMs devices in space flight systems as developed by two independent Laboratories. The JPL procedure primarily involves tailored screening with an accelerated stress philosophy, while the APL procedure is primarily a lot qualification procedure. Both Laboratories have successfully reduced the risk of using the particular devices for their respective systems and mission requirements.

  20. Reliability assessment of serviceability performance of braced retaining walls using a neural network approach

    Science.gov (United States)

    Goh, A. T. C.; Kulhawy, F. H.

    2005-05-01

    In urban environments, one major concern with deep excavations in soft clay is the potentially large ground deformations in and around the excavation. Excessive movements can damage adjacent buildings and utilities. There are many uncertainties associated with the calculation of the ultimate or serviceability performance of a braced excavation system. These include the variabilities of the loadings, geotechnical soil properties, and engineering and geometrical properties of the wall. A risk-based approach to serviceability performance failure is necessary to incorporate systematically the uncertainties associated with the various design parameters. This paper demonstrates the use of an integrated neural network-reliability method to assess the risk of serviceability failure through the calculation of the reliability index. By first performing a series of parametric studies using the finite element method and then approximating the non-linear limit state surface (the boundary separating the safe and failure domains) through a neural network model, the reliability index can be determined with the aid of a spreadsheet. Two illustrative examples are presented to show how the serviceability performance for braced excavation problems can be assessed using the reliability index.
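The reliability index described in the record above can be illustrated with a minimal first-order (FOSM) Monte Carlo sketch. The deflection model, distributions, and numbers below are hypothetical placeholders, standing in for the paper's finite element/neural network surrogate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical uncertain inputs (not from the paper):
# soil stiffness (MPa) and surcharge load (kPa), both lognormal.
stiffness = rng.lognormal(mean=np.log(40.0), sigma=0.15, size=n)
load = rng.lognormal(mean=np.log(50.0), sigma=0.20, size=n)

# Stand-in for the finite-element / neural-network surrogate:
# wall deflection grows with load and shrinks with stiffness.
deflection_mm = 30.0 * (load / 50.0) / (stiffness / 40.0)

limit_mm = 45.0                      # serviceability limit on deflection
g = limit_mm - deflection_mm         # limit state: g < 0 means failure

beta = g.mean() / g.std()            # first-order reliability index
p_f = (g < 0).mean()                 # Monte Carlo failure probability
print(f"beta ~ {beta:.2f}, P(failure) ~ {p_f:.4f}")
```

In the paper itself, the neural network approximates the non-linear limit state surface so that the index can be obtained without rerunning the finite element model for every sample.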

  1. The neural correlates of consciousness: new experimental approaches needed?

    Science.gov (United States)

    Hohwy, Jakob

    2009-06-01

    It appears that consciousness science is progressing soundly, in particular in its search for the neural correlates of consciousness. There are two main approaches to this search, one is content-based (focusing on the contrast between conscious perception of, e.g., faces vs. houses), the other is state-based (focusing on overall conscious states, e.g., the contrast between dreamless sleep vs. the awake state). Methodological and conceptual considerations of a number of concrete studies show that both approaches are problematic: the content-based approach seems to set aside crucial aspects of consciousness; and the state-based approach seems over-inclusive in a way that is hard to rectify without losing sight of the crucial conscious-unconscious contrast. Consequently, the search for the neural correlates of consciousness is in need of new experimental paradigms.

  2. A New Approach for Reliability Life Prediction of Rail Vehicle Axle by Considering Vibration Measurement

    Directory of Open Access Journals (Sweden)

    Meral Bayraktar

    2014-01-01

    Full Text Available The effect of vibration on the axle has been considered. Vibration measurements at different speeds have been performed on the axle of a running rail vehicle to characterize the displacement, acceleration, and time- and frequency-domain response. Based on the experimental work, an equivalent stress has been used to estimate the life of the axles at 90% and 10% reliability. The calculated life values of the rail vehicle axle have been compared with real-life data, and the life predicted when vibration effects are taken into account is found to be in good agreement with the actual life of the axle.
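The 90%/10% reliability lives referred to above can be sketched by combining a Basquin S-N curve with a Weibull life distribution. Every constant here (fatigue coefficients, stress amplitude, Weibull shape) is an illustrative assumption, not a value from the paper.

```python
import math

# Hypothetical Basquin S-N curve: N = (sigma_f / sigma_a) ** (1 / b)
sigma_f = 900.0     # fatigue strength coefficient, MPa (assumed)
b = 0.09            # Basquin exponent (assumed)
sigma_a = 120.0     # equivalent stress amplitude incl. vibration, MPa (assumed)

median_life = (sigma_f / sigma_a) ** (1.0 / b)   # cycles at 50% reliability

# Scatter in life modelled as a Weibull distribution around the median.
shape = 2.0                                       # Weibull shape (assumed)
scale = median_life / math.log(2.0) ** (1.0 / shape)

def life_at_reliability(R):
    """Cycles survived by a fraction R of axles: R(N) = exp(-(N/scale)^shape)."""
    return scale * (-math.log(R)) ** (1.0 / shape)

print(f"life at 90% reliability: {life_at_reliability(0.90):.3e} cycles")
print(f"life at 10% reliability: {life_at_reliability(0.10):.3e} cycles")
```

The 90% reliability life (the life that 90% of axles survive) is necessarily shorter than the 10% reliability life, which is why the paper reports both as bounds.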

  3. Managing Cybersecurity Research and Experimental Development: The REVO Approach

    Directory of Open Access Journals (Sweden)

    Dan Craigen

    2013-07-01

    Full Text Available We present a systematic approach for managing a research and experimental development cybersecurity program that must be responsive to continuously evolving cybersecurity, and other, operational concerns. The approach will be of interest to research-program managers, academe, corporate leads, government leads, chief information officers, chief technology officers, and social and technology policy analysts. The approach is compatible with international standards and procedures published by the Organisation for Economic Co-operation and Development (OECD) and the Treasury Board of Canada Secretariat (TBS). The key benefits of the approach are the following: (i) the breadth of the overall (cybersecurity) space is described; (ii) depth statements about specific (cybersecurity) challenges are articulated and mapped to the breadth of the problem; (iii) specific (cybersecurity) initiatives that have been resourced through funding or personnel are tracked and linked to specific challenges; and (iv) progress is assessed through key performance indicators. Although we present examples from cybersecurity, the method may be transferred to other domains. We have found the approach to be rigorous yet adaptive to change; it challenges an organization to be explicit about the nature of its research and experimental development in a manner that fosters alignment with evolving business priorities, knowledge transfer, and partner engagement.

  4. Hydrodynamic cavitation: from theory towards a new experimental approach

    Science.gov (United States)

    Lucia, Umberto; Gervino, Gianpiero

    2009-09-01

    Hydrodynamic cavitation is analysed by a global thermodynamics principle following an approach based on the maximum irreversible entropy variation that has already given promising results for open systems and has been successfully applied in specific engineering problems. In this paper we present a new phenomenological method to evaluate the conditions inducing cavitation. We think this method could be useful in the design of turbomachinery and related technologies: it represents both an original physical approach to cavitation and a saving in design costs, because the theoretical analysis could allow engineers to reduce the experimental tests required during the design process.

  5. Commutative and Non-commutative Parallelogram Geometry: an Experimental Approach

    OpenAIRE

    Bertram, Wolfgang

    2013-01-01

    By "parallelogram geometry" we mean the elementary, "commutative", geometry corresponding to vector addition, and by "trapezoid geometry" a certain "non-commutative deformation" of the former. This text presents an elementary approach via exercises using dynamical software (such as geogebra), hopefully accessible to a wide mathematical audience, from undergraduate students and high school teachers to researchers, proceeding in three steps: (1) experimental geometry, (2) algebra (linear algebr...

  6. Decision Making under Ecological Regime Shift: An Experimental Economic Approach

    OpenAIRE

    Kawata, Yukichika

    2011-01-01

    Environmental economics postulates the assumption of homo economicus and presumes that externality occurs as a result of the rational economic activities of economic agents. This paper examines this assumption using an experimental economic approach in the context of regime shift, which has been receiving increasing attention. We observe that when externality does not exist, economic agents (subjects of the experiment) act economically rationally, but when externality exists, economic agents avoi...

  7. Reliable multicast for the Grid: a case study in experimental computer science.

    Science.gov (United States)

    Nekovee, Maziar; Barcellos, Marinho P; Daw, Michael

    2005-08-15

    In its simplest form, multicast communication is the process of sending data packets from a source to multiple destinations in the same logical multicast group. IP multicast allows the efficient transport of data through wide-area networks, and its potentially great value for the Grid has been highlighted recently by a number of research groups. In this paper, we focus on the use of IP multicast in Grid applications, which require high-throughput reliable multicast. These include Grid-enabled computational steering and collaborative visualization applications, and wide-area distributed computing. We describe the results of our extensive evaluation studies of state-of-the-art reliable-multicast protocols, which were performed on the UK's high-speed academic networks. Based on these studies, we examine the ability of current reliable multicast technology to meet the Grid's requirements and discuss future directions.

  8. Friction force experimental approach in High School Physics classes

    Directory of Open Access Journals (Sweden)

    Marco Aurélio Alvarenga Monteiro

    2012-12-01

    Full Text Available http://dx.doi.org/10.5007/2175-7941.2012v29n3p1121 In this paper we propose and describe an experimental activity for addressing the concept of friction in High School Physics practical classes. We use a simple, low-cost device that enables the determination of the coefficient of static friction between two materials through three different procedures. The results were consistent, with small percentage deviation, which lends reliability to the activity and can stimulate discussions in class. The activity also allows greater contextualization of concepts that are usually discussed only theoretically, which demands a higher level of abstraction from the students. This can stimulate discussions and greater interaction between teacher and students.
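Two of the classic procedures for determining the static friction coefficient can be sketched as follows: tilting the plane until the block slips gives mu_s = tan(theta_c), and pulling horizontally with a spring scale gives mu_s = F/N. The readings below are made-up classroom numbers, not the paper's data.

```python
import math

# Procedure 1 (assumed readings): tilt the plane until the block slips.
theta_c_deg = 21.0                      # critical angle at onset of sliding
mu_incline = math.tan(math.radians(theta_c_deg))

# Procedure 2 (assumed readings): pull horizontally with a spring scale.
mass_kg = 0.250
g = 9.81                                # m/s^2
F_slip_N = 0.95                         # pulling force at onset of sliding
mu_pull = F_slip_N / (mass_kg * g)      # mu = F / N with N = m * g

deviation = abs(mu_incline - mu_pull) / mu_incline * 100
print(f"mu (incline) = {mu_incline:.3f}, mu (pull) = {mu_pull:.3f}, "
      f"deviation = {deviation:.1f}%")
```

Comparing the percentage deviation between independent procedures is exactly the coherence check the paper uses to argue the activity is reliable.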

  9. The Bayes linear approach to inference and decision-making for a reliability programme

    International Nuclear Information System (INIS)

    Goldstein, Michael; Bedford, Tim

    2007-01-01

    In reliability modelling it is conventional to build sophisticated models of the probabilistic behaviour of the component lifetimes in a system in order to deduce information about the probabilistic behaviour of the system lifetime. Decision modelling of the reliability programme therefore requires, a priori, an even more sophisticated set of models in order to capture the evidence the decision maker believes may be obtained from different types of data acquisition. Bayes linear analysis is a methodology that uses expectation rather than probability as the fundamental expression of uncertainty. By working only with expected values, a simpler level of modelling is needed compared to full probability models. In this paper we consider the Bayes linear approach to the estimation of the mean time to failure (MTTF) of a component. The model built will take account of the variance in our estimate of the MTTF, based on a variety of sources of information.
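The Bayes linear adjustment works only with first and second moments: the adjusted expectation is E_D(X) = E(X) + Cov(X,D) Var(D)^{-1} (D - E(D)), with adjusted variance Var(X) - Cov(X,D)^2 / Var(D). Below is a one-dimensional sketch for an MTTF, with made-up prior moments rather than the paper's elicited values.

```python
# Prior beliefs about the component MTTF X (hours) -- assumed values.
E_X, Var_X = 1000.0, 200.0**2

# D: mean of n observed lifetimes.  If lifetimes have variance sigma2 given X,
# then E(D) = E(X), Var(D) = Var(X) + sigma2/n, Cov(X, D) = Var(X).
n, sigma2 = 10, 300.0**2
E_D = E_X
Var_D = Var_X + sigma2 / n
Cov_XD = Var_X

d_obs = 1150.0   # observed sample mean lifetime (assumed)

# Bayes linear adjusted expectation and variance of X given D = d_obs.
E_adj = E_X + Cov_XD / Var_D * (d_obs - E_D)
Var_adj = Var_X - Cov_XD**2 / Var_D

print(f"adjusted E(MTTF)   = {E_adj:.1f} h")
print(f"adjusted Var(MTTF) = {Var_adj:.0f} h^2")
```

No lifetime distribution is ever specified; only expectations, variances, and covariances are needed, which is the simplification the abstract emphasizes.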

  10. Reliability modeling of a hard real-time system using the path-space approach

    International Nuclear Information System (INIS)

    Kim, Hagbae

    2000-01-01

    A hard real-time system, such as a fly-by-wire system, fails catastrophically (e.g. losing stability) if its control inputs are not updated by its digital controller computer within a certain timing constraint called the hard deadline. To assess and validate those systems' reliabilities by using a semi-Markov model that explicitly contains the deadline information, we propose a path-space approach deriving the upper and lower bounds of the probability of system failure. These bounds are derived by using only simple parameters, and they are especially suitable for highly reliable systems which should recover quickly. Analytical bounds are derived for both the exponential and Weibull failure distributions commonly encountered, which have proven effective through numerical examples, while considering three repair strategies: repair-as-good-as-new, repair-as-good-as-old, and repair-better-than-old.
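The role of the hard deadline can be illustrated by the probability that a recovery time exceeds the deadline, under the two failure-time families named in the abstract. The deadline, mean recovery time, and Weibull shape below are illustrative assumptions, not values from the paper.

```python
import math

deadline = 0.05          # hard deadline, seconds (assumed)
mean_recovery = 0.02     # mean controller recovery time, seconds (assumed)

# Exponential recovery: P(T > d) = exp(-d / mean)
p_exp = math.exp(-deadline / mean_recovery)

# Weibull recovery with shape k and the same mean:
# mean = scale * Gamma(1 + 1/k)  =>  scale = mean / Gamma(1 + 1/k)
k = 2.0
scale = mean_recovery / math.gamma(1.0 + 1.0 / k)
p_wbl = math.exp(-(deadline / scale) ** k)

print(f"P(miss deadline), exponential:   {p_exp:.4f}")
print(f"P(miss deadline), Weibull (k=2): {p_wbl:.6f}")
```

Even with identical mean recovery times, the tail beyond the deadline differs by an order of magnitude, which is why the bounds must be worked out separately for each distribution.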

  11. USING A TOTAL QUALITY STRATEGY IN A NEW PRACTICAL APPROACH FOR IMPROVING THE PRODUCT RELIABILITY IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Cristiano Fragassa

    2014-09-01

    Full Text Available In this paper a Total Quality Management strategy is proposed, refined, and used with the aim of improving the quality of large-mass industrial products far beyond the technical specifications demanded at the end-customer level. This approach combines standard and non-standard tools used for Reliability, Availability and Maintainability analysis. The procedure also realizes a stricter correlation between theoretical evaluation methods and experimental evidence as part of a modern integrated method for strengthening quality in design and process. A commercial intake manifold, widespread in the market, is used as a test case for the validation of the methodology. As an additional general result, the research underlines the impact of Total Quality Management and its tools on the development of innovation.

  12. Approaches to determining the reliability of a multimodal three-dimensional dynamic signature

    Directory of Open Access Journals (Sweden)

    Yury E. Kozlov

    2018-03-01

    Full Text Available The market for modern mobile applications has increasingly strict requirements for authentication-system reliability. This article examines an authentication method using a multimodal three-dimensional dynamic signature (MTDS), which can be used both as a main and as an additional method of user authentication in mobile applications. It is based on the use of a gesture in the air, performed with two independent mobile devices, as an identifier. The MTDS method has certain advantages over currently used biometric methods, including fingerprint authentication, face recognition, and voice recognition. A multimodal three-dimensional dynamic signature allows quickly changing the authentication gesture, as well as concealing the authentication procedure by using gestures that do not attract attention. Despite all its advantages, the MTDS method has certain limitations, the main one being the functionally dynamic complex (FDC) of skills required to repeat an authentication gesture accurately. Creating a correct MTDS therefore requires a system for assessing the reliability of gestures. Approaches to this task are grouped in this article according to their methods of implementation. Two of the approaches can be implemented only with the use of a server as a centralized MTDS processing center, and one approach can be implemented using a smartphone's own computing resources. The final part of the article provides data from testing one of these methods on a template performing MTDS authentication.

  13. Orotracheal Intubation Using the Retromolar Space: A Reliable Alternative Intubation Approach to Prevent Dental Injury

    Directory of Open Access Journals (Sweden)

    Linh T. Nguyen

    2016-01-01

    Full Text Available Despite recent advances in airway management, perianesthetic dental injury remains one of the most common anesthesia-related adverse events and cause for malpractice litigation against anesthesia providers. Recommended precautions for prevention of dental damage may not always be effective because these techniques involve contact and pressure exerted on vulnerable teeth. We describe a novel approach using the retromolar space to insert a flexible fiberscope for tracheal tube placement as a reliable method to achieve atraumatic tracheal intubation. Written consent for publication has been obtained from the patient.

  14. Machine Learning Approach for Software Reliability Growth Modeling with Infinite Testing Effort Function

    Directory of Open Access Journals (Sweden)

    Subburaj Ramasamy

    2017-01-01

    Full Text Available Reliability is one of the quantifiable software quality attributes. Software Reliability Growth Models (SRGMs) are used to assess the reliability achieved at different times of testing. Traditional time-based SRGMs may not be accurate enough in all situations where test effort varies with time. To overcome this limitation, test effort was used instead of time in SRGMs. In the past, finite test effort functions were proposed, which may not be realistic, since at infinite testing time the test effort will be infinite. Hence in this paper, we propose an infinite test effort function in conjunction with a classical Nonhomogeneous Poisson Process (NHPP) model. We use an Artificial Neural Network (ANN) for training the proposed model with software failure data. Here it is possible to get a large set of weights for the same model that describe the past failure data equally well. We use a machine learning approach to select the appropriate set of weights for the model that describes both the past and the future data well. We compare the performance of the proposed model with an existing model using practical software failure data sets. The proposed log-power TEF-based SRGM describes all types of failure data equally well, improves the accuracy of parameter estimation over existing TEFs, and can be used for software release time determination as well.
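A log-power test effort function (TEF) of the form W(t) = a (ln(1+t))^b is unbounded as t grows, which is the "infinite TEF" property the abstract argues for. Plugged into a classical NHPP mean value function m(t) = N(1 - e^{-phi W(t)}), it gives a model of the kind described; the parameter values below are illustrative assumptions, and the paper estimates them with an ANN rather than fixing them by hand.

```python
import math

# Log-power test effort function (unbounded as t -> infinity).
def W(t, a=4.0, b=1.5):
    return a * math.log(1.0 + t) ** b

# Classical NHPP mean value function driven by cumulative test effort.
def m(t, N=120.0, phi=0.05, **kw):
    """Expected cumulative failures by time t; N = total faults, phi = detectability."""
    return N * (1.0 - math.exp(-phi * W(t, **kw)))

for t in (10, 100, 1000):
    print(f"t = {t:5d}  effort = {W(t):8.2f}  expected failures = {m(t):7.2f}")
```

Because W(t) diverges, m(t) approaches the total fault content N only in the limit, so no artificial cap on test effort is imposed.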

  15. Approaches to Demonstrating the Reliability and Validity of Core Diagnostic Criteria for Chronic Pain.

    Science.gov (United States)

    Bruehl, Stephen; Ohrbach, Richard; Sharma, Sonia; Widerstrom-Noga, Eva; Dworkin, Robert H; Fillingim, Roger B; Turk, Dennis C

    2016-09-01

    The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks-American Pain Society Pain Taxonomy (AAPT) is designed to be an evidence-based multidimensional chronic pain classification system that will facilitate more comprehensive and consistent chronic pain diagnoses, and thereby enhance research, clinical communication, and ultimately patient care. Core diagnostic criteria (dimension 1) for individual chronic pain conditions included in the initial version of AAPT will be the focus of subsequent empirical research to evaluate and provide evidence for their reliability and validity. Challenges to validating diagnostic criteria in the absence of clear and identifiable pathophysiological mechanisms are described. Based in part on previous experience regarding the development of evidence-based diagnostic criteria for psychiatric disorders, headache, and specific chronic pain conditions (fibromyalgia, complex regional pain syndrome, temporomandibular disorders, pain associated with spinal cord injuries), several potential approaches for documentation of the reliability and validity of the AAPT diagnostic criteria are summarized. The AAPT is designed to be an evidence-based multidimensional chronic pain classification system. Conceptual and methodological issues related to demonstrating the reliability and validity of the proposed AAPT chronic pain diagnostic criteria are discussed. Copyright © 2016 American Pain Society. Published by Elsevier Inc. All rights reserved.

  16. Bayesian approach for the reliability assessment of corroded interdependent pipe networks

    International Nuclear Information System (INIS)

    Ait Mokhtar, El Hassene; Chateauneuf, Alaa; Laggoune, Radouane

    2016-01-01

    Pipelines under corrosion are subject to varying environmental conditions, and consequently it is difficult to build realistic corrosion models. In the present work, a Bayesian methodology is proposed that allows updating the corrosion model parameters according to the evolution of environmental conditions. For the reliability assessment of dependent structures, Bayesian networks are used to provide an interesting qualitative and quantitative description of the information in the system. The qualitative contribution lies in the modeling of a complex system, composed of dependent pipelines, as a Bayesian network. The quantitative one lies in the evaluation of the dependencies between pipelines through a new method for the generation of conditional probability tables. The effectiveness of Bayesian updating is illustrated through an application in which the new reliability of degraded (corroded) pipe networks is assessed. - Highlights: • A methodology for Bayesian network modeling of pipe networks is proposed. • A Bayesian approach based on the Metropolis–Hastings algorithm is used for corrosion model updating. • The reliability of the corroded pipe network is assessed by considering the interdependencies between the pipelines.
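A minimal Metropolis-Hastings update of the kind named in the highlights can be sketched for a power-law corrosion-depth model d(t) = k t^n. The model form, the flat prior, the noise level, and the synthetic inspection data are all assumptions for illustration, not the paper's model.

```python
import math
import random

random.seed(1)

# Synthetic inspection data: (pipe age in years, measured pit depth in mm).
data = [(5, 1.1), (10, 1.6), (15, 2.1), (20, 2.4)]
n_exp = 0.5            # fixed power-law exponent (assumed known here)
sigma = 0.15           # measurement noise std dev, mm (assumed)

def log_post(k):
    """Log-posterior of corrosion rate k under a flat prior on k > 0."""
    if k <= 0:
        return -math.inf
    return sum(-0.5 * ((d - k * t**n_exp) / sigma) ** 2 for t, d in data)

k, chain = 0.5, []
for _ in range(20_000):
    k_prop = k + random.gauss(0.0, 0.05)          # random-walk proposal
    if math.log(random.random()) < log_post(k_prop) - log_post(k):
        k = k_prop                                 # accept proposal
    chain.append(k)

posterior = chain[5_000:]                          # discard burn-in
k_mean = sum(posterior) / len(posterior)
print(f"posterior mean corrosion rate k ~ {k_mean:.3f}")
```

Each new inspection campaign simply extends `data` and reruns the sampler, which is the updating behaviour the methodology relies on as environmental conditions evolve.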

  17. Experimental Study on Strain Reliability of Embroidered Passive UHF RFID Textile Tag Antennas and Interconnections

    Directory of Open Access Journals (Sweden)

    Xiaochen Chen

    2017-01-01

    Full Text Available We present embroidered antennas and interconnections in passive UHF RFID textile tags and test their strain reliability. First, we fabricate tag antennas on two different stretchable fabric substrates using five different embroidery patterns and choose the most stretchable ones for testing. Next, the tag ICs are attached by sewing and gluing, and the tag reliability during repeated stretching cycles is evaluated through wireless measurements. Initially, the chosen tags achieve read ranges of 6–8 meters and can be strained to 140–150% of their original length. After 100 stretching cycles to 80% of their maximum strain, the read ranges of the tags with glued interconnections remain close to their initial values, and the read ranges of the tags with sewed interconnections still retain 70%–85% of their initial values. However, some challenges with reproducibility remain to be solved.

  18. Experimental approaches for evaluating the invasion risk of biofuel crops

    International Nuclear Information System (INIS)

    Luke Flory, S; Sollenberger, Lynn E; Lorentz, Kimberly A; Gordon, Doria R

    2012-01-01

    There is growing concern that non-native plants cultivated for bioenergy production might escape and result in harmful invasions in natural areas. Literature-derived assessment tools used to evaluate invasion risk are beneficial for screening, but cannot be used to assess novel cultivars or genotypes. Experimental approaches are needed to help quantify invasion risk but protocols for such tools are lacking. We review current methods for evaluating invasion risk and make recommendations for incremental tests from small-scale experiments to widespread, controlled introductions. First, local experiments should be performed to identify conditions that are favorable for germination, survival, and growth of candidate biofuel crops. Subsequently, experimental introductions in semi-natural areas can be used to assess factors important for establishment and performance such as disturbance, founder population size, and timing of introduction across variable habitats. Finally, to fully characterize invasion risk, experimental introductions should be conducted across the expected geographic range of cultivation over multiple years. Any field-based testing should be accompanied by safeguards and monitoring for early detection of spread. Despite the costs of conducting experimental tests of invasion risk, empirical screening will greatly improve our ability to determine if the benefits of a proposed biofuel species outweigh the projected risks of invasions. (letter)

  19. An Open Modelling Approach for Availability and Reliability of Systems - OpenMARS

    CERN Document Server

    Penttinen, Jussi-Pekka; Gutleber, Johannes

    2018-01-01

    This document introduces and specifies OpenMARS, an open modelling approach for the availability and reliability of systems. It supports the most common risk assessment and operation modelling techniques. Uniquely, OpenMARS allows combining and connecting models defined with different techniques. This ensures that a modeller has a high degree of freedom to describe the modelled system accurately, without the limitations imposed by any individual technique. The OpenMARS model definition is specified with a tool-independent tabular format, which supports managing models developed in a collaborative fashion. Our research originates in the Future Circular Collider (FCC) study, where we developed the unique features of our concept to model the availability and luminosity production of particle colliders. We were motivated to describe our approach in detail because we see potential further applications in performance and energy efficiency analyses of large scientific infrastructures or industrial processe...

  20. Developing a novel hierarchical approach for multiscale structural reliability predictions for ultra-high consequence applications

    Energy Technology Data Exchange (ETDEWEB)

    Emery, John M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Coffin, Peter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Robbins, Brian A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carroll, Jay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Field, Richard V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jeremy Yoo, Yung Suk [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kacher, Josh [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    Microstructural variabilities are among the predominant sources of uncertainty in structural performance and reliability. We seek to develop efficient algorithms for multiscale calculations for polycrystalline alloys such as aluminum alloy 6061-T6 in environments where ductile fracture is the dominant failure mode. Our approach employs concurrent multiscale methods, but does not focus on their development. They are a necessary but not sufficient ingredient to multiscale reliability predictions. We have focused on how to efficiently use concurrent models for forward propagation because practical applications cannot include fine-scale details throughout the problem domain due to exorbitant computational demand. Our approach begins with a low-fidelity prediction at the engineering scale that is subsequently refined with multiscale simulation. The results presented in this report focus on plasticity and damage at the meso-scale, efforts to expedite Monte Carlo simulation with microstructural considerations, modeling aspects regarding geometric representation of grains and second-phase particles, and contrasting algorithms for scale coupling.
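The cost of the brute-force Monte Carlo that the report seeks to expedite can be illustrated by the standard error of a failure-probability estimate. The stress/strength model below is a deliberately crude placeholder for the meso-scale simulation, with invented parameters.

```python
import math
import random

random.seed(42)

def fails():
    """Placeholder meso-scale outcome: failure when random stress > strength."""
    strength = random.gauss(310.0, 15.0)   # MPa, microstructure-driven scatter
    stress = random.gauss(260.0, 20.0)     # MPa, load uncertainty
    return stress > strength

n = 200_000
hits = sum(fails() for _ in range(n))
p = hits / n
stderr = math.sqrt(p * (1 - p) / n)
print(f"P(failure) ~ {p:.4f} +/- {2 * stderr:.4f} (95% CI)")

# Resolving a probability near 1e-6 this way would need on the order of 1e8+
# full meso-scale simulations, which motivates the surrogate-assisted and
# hierarchical refinement strategies described in the report.
```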

  1. Reliability-oriented multi-objective optimal decision-making approach for uncertainty-based watershed load reduction

    International Nuclear Information System (INIS)

    Dong, Feifei; Liu, Yong; Su, Han; Zou, Rui; Guo, Huaicheng

    2015-01-01

    Water quality management and load reduction are subject to inherent uncertainties in watershed systems and to competing decision objectives. Optimal decision-making modeling in watershed load reduction therefore faces the following challenges: (a) it is difficult to obtain absolutely “optimal” solutions, and (b) decision schemes may be vulnerable to failure. The probability that solutions remain feasible under uncertainty is defined as reliability. A reliability-oriented multi-objective (ROMO) decision-making approach was proposed in this study for optimal decision making with stochastic parameters and multiple decision reliability objectives. Lake Dianchi, one of the three most eutrophic lakes in China, was examined as a case study for optimal watershed nutrient load reduction to restore lake water quality. This study aimed to maximize reliability levels from considerations of cost and load reductions. The Pareto solutions of the ROMO optimization model were generated with a multi-objective evolutionary algorithm, yielding schemes representing different biases towards reliability. The Pareto fronts of six maximum allowable emission (MAE) scenarios were obtained, which indicated that decisions may be unreliable under impractical load reduction requirements. A decision scheme identification process was conducted using the back propagation neural network (BPNN) method to provide a shortcut for identifying schemes at specific reliability levels for decision makers. The model results indicated that the ROMO approach can offer decision makers great insights into reliability tradeoffs and can thus help them to avoid ineffective decisions. - Highlights: • Reliability-oriented multi-objective (ROMO) optimal decision approach was proposed. • The approach can avoid specifying reliability levels prior to optimization modeling. • Multiple reliability objectives can be systematically balanced using Pareto fronts. • Neural network model was used to
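The Pareto solutions referred to in the record above can be characterized by simple non-dominance filtering over candidate schemes (an evolutionary algorithm just searches the candidate space more cleverly). The scheme names, costs, and reliability values below are invented for illustration.

```python
# Candidate load-reduction schemes: (name, cost in M$, reliability of meeting
# the water-quality target).  Values are invented for illustration.
schemes = [
    ("A", 12.0, 0.60), ("B", 18.0, 0.75), ("C", 25.0, 0.90),
    ("D", 20.0, 0.70), ("E", 30.0, 0.92), ("F", 16.0, 0.74),
]

def dominates(x, y):
    """x dominates y if it is no worse on both objectives and better on one
    (cost is minimized, reliability is maximized)."""
    return (x[1] <= y[1] and x[2] >= y[2]) and (x[1] < y[1] or x[2] > y[2])

pareto = [s for s in schemes
          if not any(dominates(o, s) for o in schemes if o is not s)]
print("Pareto-optimal schemes:", [s[0] for s in pareto])
```

Scheme D, for instance, is dominated (B is both cheaper and more reliable) and drops off the front; the surviving schemes express the cost-reliability trade-off the decision maker chooses among.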

  2. Reliability-oriented multi-objective optimal decision-making approach for uncertainty-based watershed load reduction

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Feifei [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Liu, Yong, E-mail: yongliu@pku.edu.cn [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Institute of Water Sciences, Peking University, Beijing 100871 (China); Su, Han [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Zou, Rui [Tetra Tech, Inc., 10306 Eaton Place, Ste 340, Fairfax, VA 22030 (United States); Yunnan Key Laboratory of Pollution Process and Management of Plateau Lake-Watershed, Kunming 650034 (China); Guo, Huaicheng [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China)

    2015-05-15

    Water quality management and load reduction are subject to inherent uncertainties in watershed systems and to competing decision objectives. Optimal decision-making modeling in watershed load reduction therefore faces the following challenges: (a) it is difficult to obtain absolutely “optimal” solutions, and (b) decision schemes may be vulnerable to failure. The probability that solutions remain feasible under uncertainty is defined as reliability. A reliability-oriented multi-objective (ROMO) decision-making approach was proposed in this study for optimal decision making with stochastic parameters and multiple decision reliability objectives. Lake Dianchi, one of the three most eutrophic lakes in China, was examined as a case study for optimal watershed nutrient load reduction to restore lake water quality. This study aimed to maximize reliability levels from considerations of cost and load reductions. The Pareto solutions of the ROMO optimization model were generated with a multi-objective evolutionary algorithm, yielding schemes representing different biases towards reliability. The Pareto fronts of six maximum allowable emission (MAE) scenarios were obtained, which indicated that decisions may be unreliable under impractical load reduction requirements. A decision scheme identification process was conducted using the back propagation neural network (BPNN) method to provide a shortcut for identifying schemes at specific reliability levels for decision makers. The model results indicated that the ROMO approach can offer decision makers great insights into reliability tradeoffs and can thus help them to avoid ineffective decisions. - Highlights: • Reliability-oriented multi-objective (ROMO) optimal decision approach was proposed. • The approach can avoid specifying reliability levels prior to optimization modeling. • Multiple reliability objectives can be systematically balanced using Pareto fronts. • Neural network model was used to

  3. Strength and Reliability of Wood for the Components of Low-cost Wind Turbines: Computational and Experimental Analysis and Applications

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan

    2009-01-01

    This paper reports the latest results of a comprehensive program of experimental and computational analysis of the strength and reliability of wooden parts of low-cost wind turbines. The possibilities of predicting the strength and reliability of different types of wood are studied in a series of experiments and computational investigations. Low-cost testing machines have been designed and employed for the systematic analysis of different sorts of Nepali wood to be used for wind turbine construction. At the same time, computational micromechanical models of the deformation and strength of wood are developed, which should provide the basis for microstructure-based correlation of the observable and service properties of wood. Some correlations between the microstructure, strength, and service properties of wood have been established.

  4. Systematic review of survival time in experimental mouse stroke with impact on reliability of infarct estimation

    DEFF Research Database (Denmark)

    Klarskov, Carina Kirstine; Klarskov, Mikkel Buster; Hasseldam, Henrik

    2016-01-01

    A number of reasons for the translational problems from mouse experimental stroke to clinical trials probably exist, including infarct size estimations around the peak time of edema formation. Furthermore, edema is a more prominent feature of stroke in mice than in humans, because of the tendency to produce larger infarcts with more substantial edema. Purpose: This paper will give an overview of previous studies of experimental mouse stroke and correlate survival time to the peak time of edema formation. Furthermore, investigations of whether the included studies corrected the infarct measurements for edema … of the investigated process. Our findings indicate a need for more research in this area, and for establishment of a common correction methodology.

  5. A novel approach for reliable detection of cathepsin S activities in mouse antigen presenting cells.

    Science.gov (United States)

    Steimle, Alex; Kalbacher, Hubert; Maurer, Andreas; Beifuss, Brigitte; Bender, Annika; Schäfer, Andrea; Müller, Ricarda; Autenrieth, Ingo B; Frick, Julia-Stefanie

    2016-05-01

    Cathepsin S (CTSS) is a eukaryotic protease mostly expressed in professional antigen presenting cells (APCs). Since CTSS activity regulation plays a role in the pathogenesis of various autoimmune diseases like multiple sclerosis, atherosclerosis, Sjögren's syndrome and psoriasis, as well as in cancer progression, there is an ongoing interest in the reliable detection of cathepsin S activity. Various applications have been developed for specific detection of this enzyme. However, most of them have only been shown to be suitable for human samples, do not deliver quantitative results, or require technical equipment that is not commonly available in a standard laboratory. We have tested a fluorogenic substrate, Mca-GRWPPMGLPWE-Lys(Dnp)-DArg-NH2, that has been described to specifically detect CTSS activities in human APCs, for its potential use with mouse samples. We have modified the protocol and thereby offer a cheap, easy, reproducible and quick activity assay to detect CTSS activities in mouse APCs. Since most basic research on CTSS is performed in mice, this method closes a gap and offers a possibility for reliable and quantitative CTSS activity detection that can be performed in almost every laboratory. Copyright © 2016. Published by Elsevier B.V.

  6. A shortened version of the THERP/Handbook approach to human reliability analysis for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1986-01-01

    The approach to human reliability analysis (HRA) known as THERP/Handbook has been applied to several probabilistic risk assessments (PRAs) of nuclear power plants (NPPs) and other complex systems. The approach is based on a thorough task analysis of the man-machine interfaces, including the interactions among the people, involved in the operations being assessed. The idea is to assess fully the underlying performance shaping factors (PSFs) and dependence effects which result either in reliable or unreliable human performance
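
The dependence effects named above are handled in THERP by adjusting a basic human error probability (BHEP) for the level of dependence between successive tasks. A minimal sketch of the five standard dependence equations tabulated in NUREG/CR-1278 follows; the example BHEP of 0.003 is a hypothetical value, not one taken from a specific PRA.

```python
# THERP dependence equations (NUREG/CR-1278): conditional error probability
# of task B given failure on the preceding task A, per dependence level.
def conditional_hep(bhep, level):
    equations = {
        "ZD": lambda n: n,                  # zero dependence
        "LD": lambda n: (1 + 19 * n) / 20,  # low dependence
        "MD": lambda n: (1 + 6 * n) / 7,    # moderate dependence
        "HD": lambda n: (1 + n) / 2,        # high dependence
        "CD": lambda n: 1.0,                # complete dependence
    }
    return equations[level](bhep)

for level in ("ZD", "LD", "MD", "HD", "CD"):
    print(level, round(conditional_hep(0.003, level), 4))
```

Even low dependence raises a 0.003 BHEP by more than an order of magnitude, which is why the task analysis of interactions among people matters so much in this approach.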

  7. Olive pomace based lightweight concrete, an experimental approach and contribution

    Directory of Open Access Journals (Sweden)

    Lynda Amel Chaabane

    2018-01-01

    Full Text Available Due to the depletion of conventional aggregate resources, material recycling has become an economic and ecological alternative. In this paper, locally available natural residues such as olive pomace were investigated when partially incorporated into the concrete formulation, since the mechanical characteristics of lightweight aggregate concrete strongly depend on its properties and proportions. Lightweight aggregates are more deformable than the cement matrix because of their high porosity, and their influence on the concrete strength remains complex. The purpose of this paper is to investigate the effect of aggregate properties on the mechanical behaviour of lightweight concrete through an experimental approach. In addition, the different substitution sequences and the W/C ratio were evaluated in order to determine the influence of the W/C ratio on the improvement of the lightweight concrete mechanical properties, knowing that the mixing water quantity governs both the workability of the cement paste and its mechanical strength. The last part of this paper therefore provides a statistical survey for estimating strength and weight reduction through the different natural aggregate substitutions to improve the lightweight concrete properties. The results showed a significantly lower adhesion of the olive pomace with the matrix after cement setting, which weakens the mechanical strength of the lightweight concrete. However, this work opens several perspectives: modeling and correlation of the results with an experimental approach, and determination of the lightweight concrete characteristics when exposed to high temperatures, together with its thermohydric properties.

  8. Energy requirements during sponge cake baking: Experimental and simulated approach

    International Nuclear Information System (INIS)

    Ureta, M. Micaela; Goñi, Sandro M.; Salvadori, Viviana O.; Olivera, Daniela F.

    2017-01-01

    Highlights: • Sponge cake energy consumption during baking was studied. • High oven temperature and forced convection mode favour oven energy savings. • Forced convection produced higher weight loss and thus a higher product energy demand. • Product energy demand was satisfactorily estimated by the baking model applied. • The greatest energy efficiency corresponded to the forced convection mode. - Abstract: Baking is a highly energy demanding process, which requires special attention in order to know and improve its efficiency. In this work, the energy consumption associated with sponge cake baking is investigated. A wide range of operative conditions (two ovens, three convection modes, three oven temperatures) were compared. Experimental oven energy consumption was estimated taking into account the heating resistances power and a usage factor. Product energy demand was estimated from both experimental and modeling approaches, considering sensible and latent heat. Oven energy consumption results showed that high oven temperature and forced convection mode favour energy savings. Regarding product energy demand, forced convection produced faster and higher weight loss, inducing a higher energy demand. Besides, this parameter was satisfactorily estimated by the baking model applied, with an average error between experimental and simulated values in the range of 8.0–10.1%. Finally, the energy efficiency results indicated that it increased linearly with the effective oven temperature and that the greatest efficiency corresponded to the forced convection mode.
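
The sensible-plus-latent split of product energy demand mentioned in the abstract can be sketched in a few lines. All numbers below (batter mass, specific heat, temperatures, water loss, latent heat of vaporization) are hypothetical illustrations, not values from the study.

```python
def product_energy_demand(mass_batter, cp, t_initial, t_final,
                          water_evaporated, latent_heat=2257e3):
    """Energy absorbed by the product (J): sensible heating of the batter
    plus latent heat of the water evaporated (the weight loss)."""
    sensible = mass_batter * cp * (t_final - t_initial)   # J
    latent = water_evaporated * latent_heat               # J
    return sensible + latent

# Hypothetical sponge cake batch: 0.5 kg batter, cp ≈ 2800 J/(kg·K),
# heated from 25 °C to 98 °C, losing 0.06 kg of water during baking.
e = product_energy_demand(0.5, 2800, 25, 98, 0.06)
print(f"{e / 1000:.0f} kJ")  # 238 kJ
```

The latent term dominating at modest weight losses is consistent with the abstract's observation that forced convection (higher weight loss) raises product energy demand.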

  9. Vanadium supersaturated silicon system: a theoretical and experimental approach

    Science.gov (United States)

    Garcia-Hemme, Eric; García, Gregorio; Palacios, Pablo; Montero, Daniel; García-Hernansanz, Rodrigo; Gonzalez-Diaz, Germán; Wahnon, Perla

    2017-12-01

    The effect of high dose vanadium ion implantation and pulsed laser annealing on the crystal structure and sub-bandgap optical absorption features of V-supersaturated silicon samples has been studied through the combination of experimental and theoretical approaches. Interest in V-supersaturated Si focusses on its potential as a material having a new band within the Si bandgap. Rutherford backscattering spectrometry measurements and formation energies computed through quantum calculations provide evidence that V atoms are mainly located at interstitial positions. The response of sub-bandgap spectral photoconductance is extended far into the infrared region of the spectrum. Theoretical simulations (based on density functional theory and many-body perturbation in GW approximation) bring to light that, in addition to V atoms at interstitial positions, Si defects should also be taken into account in explaining the experimental profile of the spectral photoconductance. The combination of experimental and theoretical methods provides evidence that the improved spectral photoconductance up to 6.2 µm (0.2 eV) is due to new sub-bandgap transitions, for which the new band due to V atoms within the Si bandgap plays an essential role. This enables the use of V-supersaturated silicon in the third generation of photovoltaic devices.

  10. Development of a quality-assessment tool for experimental bruxism studies: reliability and validity

    NARCIS (Netherlands)

    Dawson, A.; Raphael, K.G.; Glaros, A.; Axelsson, S.; Arima, T.; Ernberg, M.; Farella, M.; Lobbezoo, F.; Manfredini, D.; Michelotti, A.; Svensson, P.; List, T.

    2013-01-01

    AIMS: To combine empirical evidence and expert opinion in a formal consensus method in order to develop a quality-assessment tool for experimental bruxism studies in systematic reviews. METHODS: Tool development comprised five steps: (1) preliminary decisions, (2) item generation, (3) face-validity

  11. Assessment of anastomotic reliability with pulse oximetry in graded intestinal ischemia: an experimental study in dogs.

    Science.gov (United States)

    Türkyilmaz, Z; Sönmez, K; Başaklar, A C; Demiroğullari, B; Numanoğlu, V; Ekingen, G; Dursun, A; Altin, M A; Kale, N

    1997-12-01

    Pulse oximetry has been proposed as an appropriate and feasible technique for the assessment of intestinal ischemia in recent years. In this study the authors aimed to assess the reliability of anastomoses in the dog small intestine with graded irreversible ischemia as measured by pulse oximeter. In a control group of four dogs, without any devascularization, three small bowel anastomoses were formed in each dog. The study group consisted of 12 dogs. In each animal three intestinal segments with different levels of ischemia were created by ligating the marginal vessels proximally and distally in sequence, beginning from the midpoint of the segmental vascular arcade. Preanastomotic pulse oximeter readings between 80% and 90% were assigned to the mild ischemia group, 70% and 80% to the moderate, and 60% and 70% to the severe ischemia group. Pulse oximetry measurements were obtained from probes applied to the antimesenteric serosal surfaces at the midpoint of the small intestinal segments. A total of 48 intestinal segments (12 nonischemic in the control group and 36 with three different levels of ischemia in the study group) were transected at the midpoint and anastomosed in double layers. Postanastomotic SaO2 values were also noted. The anastomoses were evaluated 48 hours later macroscopically for any leakage, and biopsy specimens were obtained for histopathologic ischemic grading. All results were studied statistically. Histopathologic grades between each group were statistically different (P < .05), worsening as the level of ischemia increased. Pre- and postanastomotic pulse oximetry measurements correlated very well with the histological gradings (r = -0.90, P < .05); leakage occurred in … anastomoses in the severe ischemia group. In the moderate ischemia group, with an average preanastomotic pulse reading of 76.75%, each of the leaking anastomoses had a postanastomotic pulse measurement of lower than 70%. The finding that the difference between histopathologic grades of control and mild ischemia

  12. Features of applying systems approach for evaluating the reliability of cryogenic systems for special purposes

    Directory of Open Access Journals (Sweden)

    E. D. Chertov

    2016-01-01

    Full Text Available Summary. The analysis of cryogenic installations confirms an objective trend: the number of tasks solved by special-purpose systems keeps increasing. One of the most important directions in the development of cryogenics is the creation of installations for producing air separation products, namely oxygen and nitrogen. Modern aviation complexes require these gases in large quantities, both in the gaseous and in the liquid state. The onboard gas systems used in aircraft of the Russian Federation are subdivided into: the oxygen system; the air (nitrogen) system; the neutral gas system; and the fire-protection system. The technological schemes of air separation installations (ADI) are largely determined by the compressed air pressure or, more generally, by the refrigerating cycle. In the majority of ADI, the working body of the refrigerating cycle is the separated air itself, that is, the technological and refrigerating cycles are integrated in the installation. By this principle, installations are classified as: low pressure; medium and high pressure; with an expander (detander); and with preliminary chilling. There is also a small number of ADI types in which the refrigerating and technological cycles are separated; these are installations with external chilling. To monitor the technical condition of the BRV hardware in real time and to estimate reliability indicators, the use of multi-agent technologies is proposed. The multi-agent approach is the most suitable basis for a decision-support system (SPPR) for reliability assessment, as it allows: redistributing the processing of information among the elements of the system, which increases overall performance; accumulating, storing and reusing knowledge, which significantly increases the efficiency of reliability assessment tasks; and considerably reducing human intervention in the functioning of the system, which saves the time of the decision maker (PMD) and does not require special skills from him.

  13. A Hybrid Approach for Reliability Analysis Based on Analytic Hierarchy Process and Bayesian Network

    International Nuclear Information System (INIS)

    Zubair, Muhammad

    2014-01-01

    Using the analytic hierarchy process (AHP) and Bayesian networks (BN), the present research examines the technical and non-technical issues of nuclear accidents. The study revealed that technical faults were one major cause of these accidents. From another point of view, it becomes clear that human behaviors such as dishonesty, insufficient training, and selfishness also play a key role in causing these accidents. In this study, a hybrid approach for reliability analysis based on AHP and BN to increase nuclear power plant (NPP) safety has been developed. Using AHP, the best alternatives to improve safety, design, and operation, and to allocate budget for all technical and non-technical factors related to nuclear safety, have been investigated. We use a special structure of BN based on the AHP method. The graphs of the BN and the probabilities associated with nodes are designed to translate the knowledge of experts on the selection of the best alternative. The results show that improvement in regulatory authorities will decrease failure probabilities and increase safety and reliability in the industrial area.
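
AHP derives priority weights for alternatives from a pairwise-comparison matrix, conventionally via its principal eigenvector. The sketch below uses power iteration on a hypothetical 3x3 comparison of safety-improvement alternatives; the paper's actual hierarchy and expert judgments are not reproduced here.

```python
def ahp_weights(matrix, iterations=100):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix by power iteration; the normalized vector gives the
    priority weights of the alternatives."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_new)
        w = [x / total for x in w_new]
    return w

# Hypothetical comparison of safety-improvement alternatives
# (regulation, training, design) on Saaty's 1-9 scale.
pairwise = [
    [1,     3,     5],
    [1 / 3, 1,     3],
    [1 / 5, 1 / 3, 1],
]
print([round(x, 3) for x in ahp_weights(pairwise)])
```

Here "regulation" receives the largest weight, mirroring the record's conclusion that improving regulatory authorities has the biggest effect on safety.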

  14. Assessment of landslide distribution map reliability in Niigata prefecture - Japan using frequency ratio approach

    Science.gov (United States)

    Rahardianto, Trias; Saputra, Aditya; Gomez, Christopher

    2017-07-01

    Research on landslide susceptibility has evolved rapidly over the last few decades thanks to the availability of large databases. Landslide research used to be focused on discrete events, but the usage of large inventory datasets has become a central pillar of landslide susceptibility, hazard, and risk assessment. Extracting meaningful information from large databases is now at the forefront of geoscientific research, following the big-data research trend. Indeed, the more comprehensive the information on past landslides available in a particular area is, the better the produced map will be in supporting effective decision making, planning, and engineering practice. The landslide inventory data that are freely accessible online give many researchers and decision makers an opportunity to prevent casualties and economic losses caused by future landslides. These data are advantageous especially for areas with poor landslide historical records. Since the construction criteria of landslide inventory maps and their quality evaluation remain poorly defined, an assessment of the reliability of open-source landslide inventory maps is required. The present contribution aims to assess the reliability of open-source landslide inventory data based on the particular topographical setting of the observed area in Niigata prefecture, Japan. A Geographic Information System (GIS) platform and a statistical approach are applied to analyze the data. The frequency ratio method is utilized to model and assess the landslide map. The generated model showed unsatisfactory results, with an AUC value of 0.603, indicating low prediction accuracy and the unreliability of the model.
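
The frequency ratio method referenced above scores each factor class by comparing its share of landslides with its share of the study area. A minimal sketch follows, with hypothetical slope-angle classes rather than the Niigata data.

```python
def frequency_ratio(class_landslides, class_area, total_landslides, total_area):
    """FR for one factor class: proportion of all landslides falling in the
    class divided by the proportion of the study area the class occupies.
    FR > 1 means the class is more landslide-prone than average."""
    return (class_landslides / total_landslides) / (class_area / total_area)

# Hypothetical slope-angle classes over a 10,000-cell study area
# containing 200 landslide cells: (landslide cells, class cells).
classes = {"0-15°": (20, 5000), "15-30°": (80, 3000), ">30°": (100, 2000)}
for name, (ls, area) in classes.items():
    print(name, round(frequency_ratio(ls, area, 200, 10000), 2))
```

Summing the FR values of the classes a cell belongs to (across all conditioning factors) yields the susceptibility index that is then validated against the inventory, e.g. with the AUC.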

  15. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    Directory of Open Access Journals (Sweden)

    Ning-Cong Xiao

    2013-12-01

    Full Text Available In this paper, a combination of the maximum entropy method and Bayesian inference for reliability assessment of deteriorating systems is proposed. Due to various uncertainties, scarce data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have been proved to be useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to calculate the maximum entropy density function of the uncertain parameters more accurately, since it does not need any additional information or assumptions. Finally, two optimization models are presented which can be used to determine the lower and upper bounds of a system's probability of failure under vague environmental conditions. Two numerical examples are investigated to demonstrate the proposed method.
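
The Bayesian small-sample idea can be sketched with a Beta-Binomial update; conveniently, the uniform Beta(1,1) prior is also the maximum entropy density on [0,1]. This is an illustrative simplification: the paper's fuzzy-set treatment and its two optimization models are not reproduced, and the failure counts below are hypothetical.

```python
import random

def failure_prob_bounds(failures, trials, alpha0=1.0, beta0=1.0,
                        level=0.90, draws=100_000, seed=42):
    """Credible bounds on a failure probability from few trials: a
    Beta(alpha0, beta0) prior (uniform = max entropy on [0,1] by default)
    updated with the observed data; the interval is taken from Monte
    Carlo draws of the Beta posterior."""
    rng = random.Random(seed)
    a = alpha0 + failures
    b = beta0 + trials - failures
    samples = sorted(rng.betavariate(a, b) for _ in range(draws))
    lo = samples[int((1 - level) / 2 * draws)]
    hi = samples[int((1 + level) / 2 * draws)]
    return lo, hi

# Hypothetical deteriorating component: 2 failures observed in 15 demands.
lo, hi = failure_prob_bounds(2, 15)
print(f"90% credible interval: [{lo:.3f}, {hi:.3f}]")
```

With only 15 demands the interval stays wide, which is exactly the situation where bounding the probability of failure, rather than reporting a point estimate, is the honest output.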

  16. Systematic approach to integration of a human reliability analysis into an NPP probabilistic risk assessment

    International Nuclear Information System (INIS)

    Fragola, J.R.

    1984-01-01

    This chapter describes the human reliability analysis tasks which were employed in the evaluation of the overall probability of an internal flood sequence and its consequences in terms of disabling vulnerable risk significant equipment. Topics considered include the problem familiarization process, the identification and classification of key human interactions, a human interaction review of potential initiators, a maintenance and operations review, human interaction identification, quantification model selection, the definition of operator-induced sequences, the quantification of specific human interactions, skill- and rule-based interactions, knowledge-based interactions, and the incorporation of human interaction-related events into the event tree structure. It is concluded that an integrated approach to the analysis of human interaction within the context of a Probabilistic Risk Assessment (PRA) is feasible

  17. A Reliable, Non-Invasive Approach to Data Center Monitoring and Management

    Directory of Open Access Journals (Sweden)

    Moises Levy

    2017-08-01

    Full Text Available Recent standards, legislation, and best practices point to data center infrastructure management systems to control and monitor data center performance. This work presents an innovative approach to address some of the challenges that currently hinder data center management. It explains how monitoring and management systems should be envisioned and implemented. Key parameters associated with data center infrastructure and information technology equipment can be monitored in real-time across an entire facility using low-cost, low-power wireless sensors. Given the data centers’ mission critical nature, the system must be reliable and deployable through a non-invasive process. The need for the monitoring system is also presented through a feedback control systems perspective, which allows higher levels of automation. The data center monitoring and management system enables data gathering, analysis, and decision-making to improve performance, and to enhance asset utilization.

  18. Do intensity ratings and skin conductance responses reliably discriminate between different stimulus intensities in experimentally induced pain?

    Science.gov (United States)

    Breimhorst, Markus; Sandrock, Stephan; Fechir, Marcel; Hausenblas, Nadine; Geber, Christian; Birklein, Frank

    2011-01-01

    The present study addresses the question whether pain-intensity ratings and skin conductance responses (SCRs) are able to detect different intensities of phasic painful stimuli and to determine the reliability of this discrimination. For this purpose, 42 healthy participants of both genders were assigned to either electrical, mechanical, or laser heat-pain stimulation (each n = 14). A whole range of single brief painful stimuli were delivered on the right volar forearm of the dominant hand in a randomized order. Pain-intensity ratings and SCRs were analyzed. Using generalizability theory, individual and gender differences were the main contributors to the variability of both intensity ratings and SCRs. Most importantly, we showed that pain-intensity ratings are a reliable measure for the discrimination of different pain stimulus intensities in the applied modalities. The reliability of SCR was adequate when mechanical and heat stimuli were tested but failed for the discrimination of electrical stimuli. Further studies are needed to reveal the reason for this lack of accuracy for SCRs when applying electrical pain stimuli. Our study could help researchers to better understand the relationship between pain and activation of the sympathetic nervous system. Pain researchers are furthermore encouraged to consider individual and gender differences when measuring pain intensity and the concomitant SCRs in experimental settings. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.

  19. An experimental study of assessment of weld quality on fatigue reliability analysis of a nuclear pressure vessel

    International Nuclear Information System (INIS)

    Dai Shuhe

    1993-01-01

    The steam generator in the PWR primary coolant system of the Qinshan Nuclear Power Plant, China, is a crucial unit belonging to the category of nuclear pressure vessels. The purpose of this research work is to examine the weld quality of the steam generator under fatigue loading and to assess its reliability using the experimental results of fatigue tests of the nuclear pressure vessel material S-271 (Chinese Standard) and of qualification tests of welded seams of a simulated prototype of the bottom closure head of the steam generator. A guarantee of weld quality is proposed as a subsequent verification for the China National Nuclear Safety Supervision Bureau. The results of the reliability analysis reported in this work can be taken as supplementary material for the Probabilistic Safety Assessment (PSA) of the Qinshan Nuclear Power Plant. According to the requirements of Provision II-1500, cyclic testing, ASME Boiler and Pressure Vessel Code, Section III, Rules for Construction of Nuclear Power Plant Components, a simulated prototype of the bottom closure head of the steam generator was made for qualification tests. To obtain quantified results of the reliability assessment from the testing data, two proposals are presented

  20. A hybrid load flow and event driven simulation approach to multi-state system reliability evaluation

    International Nuclear Information System (INIS)

    George-Williams, Hindolo; Patelli, Edoardo

    2016-01-01

    Structural complexity of systems, coupled with their multi-state characteristics, renders their reliability and availability evaluation difficult. Notwithstanding the emergence of various techniques dedicated to complex multi-state system analysis, simulation remains the only approach applicable to realistic systems. However, most simulation algorithms are either system specific or limited to simple systems since they require enumerating all possible system states, defining the cut-sets associated with each state and monitoring their occurrence. In addition to being extremely tedious for large complex systems, state enumeration and cut-set definition require a detailed understanding of the system's failure mechanism. In this paper, a simple and generally applicable simulation approach, enhanced for multi-state systems of any topology is presented. Here, each component is defined as a Semi-Markov stochastic process and via discrete-event simulation, the operation of the system is mimicked. The principles of flow conservation are invoked to determine flow across the system for every performance level change of its components using the interior-point algorithm. This eliminates the need for cut-set definition and overcomes the limitations of existing techniques. The methodology can also be exploited to account for effects of transmission efficiency and loading restrictions of components on system reliability and performance. The principles and algorithms developed are applied to two numerical examples to demonstrate their applicability. - Highlights: • A discrete event simulation model based on load flow principles. • Model does not require system path or cut sets. • Applicable to binary and multi-state systems of any topology. • Supports multiple output systems with competing demand. • Model is intuitive and generally applicable.
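
The flow-conservation step described above can be illustrated with a plain max-flow computation. The record itself solves the flow problem with the interior-point algorithm; Edmonds-Karp is used here only as a compact stand-in, and the 4-node capacities are hypothetical component performance levels.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp max flow: the system's deliverable performance level
    given each component's current capacity (flow-conservation view)."""
    n = len(capacity)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual network
        parent = [-1] * n
        parent[source] = source
        q = deque([source])
        while q and parent[sink] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and capacity[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[sink] == -1:
            return total
        # bottleneck along the path, then augment
        v, push = sink, float("inf")
        while v != source:
            u = parent[v]
            push = min(push, capacity[u][v] - flow[u][v])
            v = u
        v = sink
        while v != source:
            u = parent[v]
            flow[u][v] += push
            flow[v][u] -= push
            v = u
        total += push

# Hypothetical 4-node system: source 0, two parallel degraded components, sink 3.
cap = [
    [0, 5, 3, 0],
    [0, 0, 0, 4],
    [0, 0, 0, 3],
    [0, 0, 0, 0],
]
print(max_flow(cap, 0, 3))  # 7
```

Re-running such a flow computation after every simulated performance-level change of a component is what removes the need to enumerate system states and cut-sets in advance.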

  1. Scram reliability under seismic conditions at the Experimental Breeder Reactor II

    International Nuclear Information System (INIS)

    Roglans, J.; Wang, C.Y.; Hill, D.J.

    1993-01-01

    A Probabilistic Risk Assessment of the Experimental Breeder Reactor II has recently been completed. Seismic events are among the external initiating events included in the assessment. As part of the seismic PRA, a detailed study has been performed of the ability to shut down the reactor under seismic conditions. A comprehensive finite element model of the EBR-II control rod drive system has been used to analyze the control rod system response when subjected to input seismic accelerations. The results indicate the control rod drive system has a high seismic capacity. The estimated seismic fragility for the overall reactor shutdown system is dominated by primary tank failure

  2. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

    Composite materials are widely used in the manufacture of aerospace and wind energy structural components. These load carrying structures are subjected to dynamic time-varying loading conditions. Robust structural dynamics identification procedures impose tight constraints on the quality of modal models. This paper aims at a systematic approach for uncertainty quantification of the parameters of the modal models estimated from experimentally obtained data. Statistical analysis of modal parameters is implemented to derive an assessment of the entire modal model uncertainty measure. Investigated structures represent different complexity levels ranging from coupon, through sub-component, up to fully assembled aerospace and wind energy structural components made of composite materials. The proposed method is demonstrated on two application cases of a small and a large wind turbine blade.

  3. Experimental oligopolies modeling: A dynamic approach based on heterogeneous behaviors

    Science.gov (United States)

    Cerboni Baiardi, Lorenzo; Naimzada, Ahmad K.

    2018-05-01

    Among behavioral rules, imitation-based heuristics have received special attention in economics (see [14] and [12]). In particular, imitative behavior is considered in order to understand the evidence arising in experimental oligopolies, which reveals that the Cournot-Nash equilibrium does not emerge as the unique outcome and that an important component of production at the competitive level is observed (see e.g. [1,3,9] or [7,10]). Following the pioneering approach of [2], we build a dynamical model of linear oligopolies where the heterogeneous decision mechanisms of players are made explicit. In particular, we consider two different types of quantity-setting players, characterized by different decision mechanisms, that coexist and operate simultaneously: agents that adaptively adjust their choices in the direction that increases their profit are combined with imitator agents. The latter use a particular form of the proportional imitation rule that takes into account awareness of the presence of strategic interactions. It is noteworthy that the Cournot-Nash outcome is a stationary state of our models. Our thesis is that the chaotic dynamics arising from a dynamical model with heterogeneous players can qualitatively reproduce the outcomes of experimental oligopolies.

  4. A Novel Approach to Experimental Studies of Mineral Dissolution Kinetics

    Energy Technology Data Exchange (ETDEWEB)

    Chen Zhu

    2006-08-31

    Currently, DOE is conducting pilot CO{sub 2} injection tests to evaluate the concept of geological sequestration. One strategy that potentially enhances CO{sub 2} solubility and reduces the risk of CO{sub 2} leak back to the surface is dissolution of indigenous minerals in the geological formation and precipitation of secondary carbonate phases, which increases the brine pH and immobilizes CO{sub 2}. Clearly, the rates at which these dissolution and precipitation reactions occur directly determine the efficiency of this strategy. However, one of the fundamental problems in modern geochemistry is the persistent two to five orders of magnitude discrepancy between laboratory measured and field derived feldspar dissolution rates. To date, there is no real guidance as to how to predict silicate reaction rates for use in quantitative models. Current models for assessment of geological carbon sequestration have generally opted to use laboratory rates, in spite of the dearth of such data for compositionally complex systems, and the persistent disconnect between laboratory and field applications. Therefore, a firm scientific basis for predicting silicate reaction kinetics in CO2 injected geological formations is urgently needed to assure the reliability of the geochemical models used for the assessments of carbon sequestration strategies. The funded experimental and theoretical study attempts to resolve this outstanding scientific issue by novel experimental design and theoretical interpretation to measure silicate dissolution rates and iron carbonate precipitation rates at conditions pertinent to geological carbon sequestration. In the second year of the project, we completed CO{sub 2}-Navajo sandstone interaction batch and flow-through experiments and a Navajo sandstone dissolution experiment without the presence of CO{sub 2} at 200 C and 250-300 bars, and initiated dawsonite dissolution and solubility experiments. We also performed additional 5-day experiments at the

  5. Interactions among biotic and abiotic factors affect the reliability of tungsten microneedles puncturing in vitro and in vivo peripheral nerves: A hybrid computational approach

    Energy Technology Data Exchange (ETDEWEB)

    Sergi, Pier Nicola, E-mail: p.sergi@sssup.it [Translational Neural Engineering Laboratory, The Biorobotics Institute, Scuola Superiore Sant' Anna, Viale Rinaldo Piaggio 34, Pontedera, 56025 (Italy); Jensen, Winnie [Department of Health Science and Technology, Fredrik Bajers Vej 7, 9220 Aalborg (Denmark); Yoshida, Ken [Department of Biomedical Engineering, Indiana University - Purdue University Indianapolis, 723 W. Michigan St., SL220, Indianapolis, IN 46202 (United States)

    2016-02-01

    Tungsten is an elective material for producing slender, stiff microneedles able to enter soft tissues and minimize puncture wounds. In particular, tungsten microneedles are used to puncture peripheral nerves and insert neural interfaces, bridging the gap between the nervous system and robotic devices (e.g., hand prostheses). Unfortunately, microneedles fail during the puncture process, and this failure does not depend on the stiffness or fracture toughness of the constituent material. In addition, the microneedles' performance decreases during in vivo trials with respect to in vitro ones. This further effect is independent of internal biotic effects, and instead seems to be related to external biotic causes. Since the exact synergy of phenomena decreasing in vivo reliability is still not known, this work explored the connection between the in vitro and in vivo behavior of tungsten microneedles through the study of interactions between biotic and abiotic factors. A hybrid computational approach, simultaneously using theoretical relationships and in silico models of nerves, was implemented to model how reliability changes with microneedle diameter, and to predict in vivo performance from in vitro reliability and local differences between the in vivo and in vitro mechanical response of nerves. - Highlights: • We provide phenomenological Finite Element (FE) models of peripheral nerves to study their interactions with W microneedles • We provide a general interaction-based approach to model the reliability of slender microneedles • We evaluate the reliability of W microneedles to puncture in vivo nerves • We provide a novel synergistic hybrid approach (theory + simulations) involving interactions among biotic and abiotic factors • We validate the hybrid approach by using experimental data from the literature.

  6. Interactions among biotic and abiotic factors affect the reliability of tungsten microneedles puncturing in vitro and in vivo peripheral nerves: A hybrid computational approach

    International Nuclear Information System (INIS)

    Sergi, Pier Nicola; Jensen, Winnie; Yoshida, Ken

    2016-01-01

    Tungsten is an elective material for producing slender, stiff microneedles able to enter soft tissues and minimize puncture wounds. In particular, tungsten microneedles are used to puncture peripheral nerves and insert neural interfaces, bridging the gap between the nervous system and robotic devices (e.g., hand prostheses). Unfortunately, microneedles fail during the puncture process, and this failure does not depend on the stiffness or fracture toughness of the constituent material. In addition, the microneedles' performance decreases during in vivo trials with respect to in vitro ones. This further effect is independent of internal biotic effects, and instead seems to be related to external biotic causes. Since the exact synergy of phenomena decreasing in vivo reliability is still not known, this work explored the connection between the in vitro and in vivo behavior of tungsten microneedles through the study of interactions between biotic and abiotic factors. A hybrid computational approach, simultaneously using theoretical relationships and in silico models of nerves, was implemented to model how reliability changes with microneedle diameter, and to predict in vivo performance from in vitro reliability and local differences between the in vivo and in vitro mechanical response of nerves. - Highlights: • We provide phenomenological Finite Element (FE) models of peripheral nerves to study their interactions with W microneedles • We provide a general interaction-based approach to model the reliability of slender microneedles • We evaluate the reliability of W microneedles to puncture in vivo nerves • We provide a novel synergistic hybrid approach (theory + simulations) involving interactions among biotic and abiotic factors • We validate the hybrid approach by using experimental data from the literature.

  7. Effectiveness of different approaches to disseminating traveler information on travel time reliability.

    Science.gov (United States)

    2014-01-01

    The second Strategic Highway Research Program (SHRP 2) Reliability program aims to improve trip time reliability by reducing the frequency and effects of events that cause travel times to fluctuate unpredictably. Congestion caused by unreliable, or n...

  8. Experimental aspects of buoyancy correction in measuring reliable highpressure excess adsorption isotherms using the gravimetric method.

    Science.gov (United States)

    Nguyen, Huong Giang T; Horn, Jarod C; Thommes, Matthias; van Zee, Roger D; Espinal, Laura

    2017-12-01

    Addressing reproducibility issues in adsorption measurements is critical to accelerating the path to discovery of new industrial adsorbents and to understanding adsorption processes. A National Institute of Standards and Technology Reference Material, RM 8852 (ammonium ZSM-5 zeolite), and two gravimetric instruments with asymmetric two-beam balances were used to measure high-pressure adsorption isotherms. This work demonstrates how common approaches to buoyancy correction, a key factor in obtaining the mass change due to surface-excess gas uptake from the apparent mass change, can impact the adsorption isotherm data. Three different approaches to buoyancy correction were investigated and applied to subcritical CO2 and supercritical N2 adsorption isotherms at 293 K. It was observed that measuring a collective volume for all balance components for the buoyancy correction (helium method) introduces an inherent bias in temperature partitioning when there is a temperature gradient (i.e., the analysis temperature is not equal to the instrument air bath temperature). We demonstrate that a blank subtraction is effective in mitigating the biases associated with temperature partitioning, instrument calibration, and the determined volumes of the balance components. In general, the manual and subtraction methods allow for better treatment of the temperature gradient during buoyancy correction. From the study, best practices specific to asymmetric two-beam balances and more general recommendations for measuring isotherms far from critical temperatures using gravimetric instruments are offered.
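    The buoyancy correction described above can be sketched as a simple mass balance. This is an illustrative reconstruction, not the authors' procedure: the function name and all numbers (apparent mass change, gas density, displaced volume, blank term) are hypothetical.

    ```python
    # Hedged sketch of a gravimetric buoyancy correction. The balance reads
    # low by rho_gas * v_disp because the sample and holder displace gas;
    # an optional blank run subtracts instrument bias.

    def excess_uptake(dm_app_mg, rho_gas_mg_cm3, v_disp_cm3, dm_blank_mg=0.0):
        """Surface-excess mass uptake after buoyancy (and optional blank)
        correction, from the apparent mass change dm_app_mg."""
        return (dm_app_mg - dm_blank_mg) + rho_gas_mg_cm3 * v_disp_cm3

    # Illustrative numbers only: 5.0 mg apparent change, gas density
    # 1.8 mg/cm3 at elevated pressure, 1.2 cm3 displaced volume.
    print(excess_uptake(5.0, 1.8, 1.2))  # 5.0 + 1.8*1.2 = 7.16 mg
    ```

    The "helium method" versus "subtraction method" debate in the abstract amounts to how `v_disp_cm3` (and its temperature partition) is determined; the arithmetic of the correction itself is the same.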

  9. A Bayesian reliability approach to the performance assessment of a geological waste repository

    International Nuclear Information System (INIS)

    Flueck, J.A.; Singh, A.K.

    1992-01-01

    This paper discusses the task of selecting a suitable site for a high-level waste disposal repository (HLWR), a complex task in which one must address engineering and economic factors of the proposed facility and site as well as environmental, public health, safety, and sociopolitical factors. Acknowledging the complexity of the siting problem leads one to readily conclude that a formal analysis, including the use of a performance assessment model (PAM), is needed to assist the designated decision makers in selecting a suitable site. The overall goal of a PAM is to aid the decision makers in making the best possible technical siting decision. For a number of reasons, the authors believe that combining Bayesian decision theory and reliability methodology provides the best approach to constructing a useful PAM for assisting in the siting of a HLWR. This combination allows one to formally integrate existing relevant information, professional judgement, and component model outputs to produce conditionally estimated probabilities for a decision tree approach to the radionuclide release problem of a proposed HLWR. If loss functions are available, this also allows one to calculate the expected costs or losses from possible radionuclide releases. This latter calculation may be very important in selecting the final site from among a number of alternative sites.

  10. A unified approach to validation, reliability, and education study design for surgical technical skills training.

    Science.gov (United States)

    Sweet, Robert M; Hananel, David; Lawrenz, Frances

    2010-02-01

    To present modern educational psychology theory and apply these concepts to the validity and reliability of surgical skills training and assessment. In a series of cross-disciplinary meetings, we applied a unified approach of behavioral science principles and theory to medical technical skills education, given recent advances in behavioral psychology and statistics. While validation of the individual simulation tools is important, it is only one piece of a multimodal curriculum that in and of itself deserves examination and study. We propose concurrent validation throughout the design of a simulation-based curriculum rather than once it is complete. We embrace the concept that validity and curriculum development are interdependent, ongoing processes that are never truly complete. Individual predictive, construct, content, and face validity aspects should not be considered separately but as interdependent and complementary toward an end application. Such an approach could help guide our acceptance and appropriate application of these exciting new training and assessment tools for technical skills training in medicine.

  11. Experimental support of WWER-440 fuel reliability and serviceability at high burnup

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, A; Ivanov, V; Pnyushkin, A [Nauchno-Issledovatel'skij Inst. Atomnykh Reaktorov, Dimitrovgrad (Russian Federation); Tzibulya, V [AO Mashinostroitelnij Zavod Electrostal (Russian Federation); Kolosovsky, V; Bibilashvili, Yu [Vsesoyuznyj Nauchno-Issledovatel'skij Inst. Neorganicheskikh Materialov, Moscow (Russian Federation); Dubrovin, K [Russian Research Centre Kurchatov Inst., Moscow (Russian Federation)

    1994-12-31

    Results from post-irradiation examination of two WWER-440 fuel assemblies operated at Kola NPP Unit 3 for 4 and 5 fuel cycles, respectively, are presented. The state of the fuel assemblies and their remaining serviceability are estimated experimentally at the RIAR hot laboratory and studied by non-destructive and destructive methods. The following parameters are examined: changes in fuel assembly overall dimensions; changes in fuel element diameter; fuel element cladding corrosion and hydriding; fuel element cladding mechanical properties; fission gas release from the fuel and gas pressure; and fuel macro- and microstructure. It has been found that the maximum fuel burnup achieved in fuel assemblies No. 1 and No. 2 is 58.3 and 64.0 MWd/kg, respectively. Mechanical pellet-cladding interaction has been observed at average fuel burnups above 45 MWd/kg, occurring as an increase in local cladding diameter at the areas where pellet ends are located (the 'bamboo' effect). Fission gas release increases linearly at a rate of 2.7% per 10 MWd/kg over the burnup range of 43-60 MWd/kg. 9 figs., 3 refs.

  12. Science and society: different bioethical approaches towards animal experimentation.

    Science.gov (United States)

    Brom, Frans W A

    2002-01-01

    The use of live animals for experiments plays an important role in many forms of research. This gives rise to an ethical dilemma. On the one hand, most of the animals used are sentient beings who may be harmed by the experiments. The research, on the other hand, may be vital for preventing, curing or alleviating human diseases. There is no consensus on how to tackle this dilemma. One extreme is the view taken by adherents of the so-called animal rights view. According to this view, we are never justified in harming animals for human purposes - however vital these purposes may be. The other extreme is the ruthless view, according to which animals are there to be used at our discretion. However, most people have a view situated somewhere between these two extremes. It is accepted that animals may be used for research - contrary to the animal rights view. However, contrary to the ruthless view, this is accepted only under certain conditions. The aim of this presentation is to outline different ethical views which may serve as a foundation for specifying the circumstances under which it is acceptable to use animals for research. Three views serving this role are contractarianism, utilitarianism and a deontological approach. According to contractarianism, the key ethical issue is concern for the sentiments of other human beings in society, on whose co-operation those responsible for research depend. Thus it is acceptable to use animals as long as most people can see the point of the experiment and are not offended by the way it is done. According to utilitarianism, the key ethical issue is about the consequences for humans and animals. Thus it is justified to use animals for research if enough good comes out of it in terms of preventing suffering and creating happiness, and if there is no better alternative.
In the deontological approach the prima facie duty of beneficence towards human beings has to be weighed against the prima facie duties not to harm animals and to

  13. Approaches to safety, environment and regulatory approval for the International Thermonuclear Experimental Reactor

    International Nuclear Information System (INIS)

    Saji, G.; Bartels, H.W.; Chuyanov, V.; Holland, D.; Kashirski, A.V.; Morozov, S.I.; Piet, S.J.; Poucet, A.; Raeder, J.; Rebut, P.H.; Topilski, L.N.

    1995-01-01

    International Thermonuclear Experimental Reactor (ITER) Engineering Design Activities (EDA) in safety and environment are approaching the point where conceptual safety design, topic studies and research will give way to project-oriented engineering design activities. The Joint Central Team (JCT) is promoting the safety design and analysis necessary for siting and regulatory approval. Scoping studies are underway at the general level, laying out the safety and environmental design framework for ITER. ITER must follow the nuclear regulations of the host country in which it will be constructed; that is, regulatory approval is required before construction of ITER. Thus, during the EDA, some preparations are necessary for the future application for regulatory approval. Notwithstanding the future host country's jurisdictional framework of nuclear regulations, the primary responsibility for the safety and reliability of ITER rests with the legally responsible body which will operate ITER. Since scientific utilization of ITER and protection of the large investment depend on safe and reliable operation, we are highly motivated to achieve maximum levels of operability, maintainability, and safety. ITER will be the first fusion facility in which overall 'nuclear safety' provisions need to be integrated into the facility; for example, it will be the first fusion facility with significant decay heat and structural radiation damage. Since ITER is an experimental facility, it is also important that necessary experiments can be performed within safety design limits without requiring extensive regulatory procedures. ITER will be designed with a robust safety envelope compatible with its fusion power and energy inventories. The basic approach to safety will be realized by 'defense-in-depth'. (orig.)

  14. Reliability of mechanisms with periodic random modal frequencies using an extreme value-based approach

    International Nuclear Information System (INIS)

    Savage, Gordon J.; Zhang, Xufang; Son, Young Kap; Pandey, Mahesh D.

    2016-01-01

    Resonance in a dynamic system is to be avoided since it often leads to impaired performance, overstressing, fatigue fracture and adverse human reactions. Thus, it is necessary to know the modal frequencies and ensure they do not coincide with any applied periodic loadings. For a rotating planar mechanism, the coefficients in the mass and stiffness matrices are periodically varying, and if the underlying geometry and material properties are treated as random variables then the modal frequencies are both position-dependent and probabilistic. The avoidance of resonance is now a complex problem. Herein, free vibration analysis helps determine ranges of modal frequencies that in turn, identify the running speeds of the mechanism to be avoided. This paper presents an efficient and accurate sample-based approach to determine probabilistic minimum and maximum extremes of the fundamental frequencies and the angular positions of their occurrence. Then, given critical lower and upper frequency constraints it is straightforward to determine reliability in terms of probability of exceedance. The novelty of the proposed approach is that the original expensive and implicit mechanistic model is replaced by an explicit meta-model that captures the tolerances of the design variables over the entire range of angular positions: position-dependent eigenvalues can be found easily and quickly. Extreme-value statistics of the modal frequencies and extreme-value statistics of the angular positions are readily computed through MCS. Limit-state surfaces that connect the frequencies to the design variables may be easily constructed. Error analysis identifies three errors and the paper presents ways to control them so the methodology can be sufficiently accurate. A numerical example of a flexible four-bar linkage shows the proposed methodology has engineering applications. The impact of the proposed methodology is two-fold: it presents a safe-side analysis based on free vibration methods to
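    The sample-based extreme-value idea in this record can be illustrated with a toy Monte Carlo sketch. The surrogate position-dependent frequency, the stiffness and mass distributions, and the frequency limits below are all assumptions for illustration, not the paper's meta-model or mechanism data.

    ```python
    import numpy as np

    # Hedged sketch: treat stiffness k and mass m as random, sweep a
    # surrogate periodic fundamental frequency over crank angle, record the
    # per-sample extremes, and estimate the probability of exceeding
    # hypothetical lower/upper frequency limits.
    rng = np.random.default_rng(0)
    n = 10_000
    theta = np.linspace(0.0, 2 * np.pi, 181)          # angular positions

    k = rng.normal(1.0e4, 3.0e2, size=n)              # hypothetical stiffness [N/m]
    m = rng.normal(2.0, 0.05, size=n)                 # hypothetical mass [kg]
    # Surrogate position-dependent modal frequency f(theta) [Hz]:
    f = np.sqrt(k[:, None] * (1.0 + 0.1 * np.cos(theta)) / m[:, None]) / (2 * np.pi)

    f_min, f_max = f.min(axis=1), f.max(axis=1)       # extremes per sample
    f_lo, f_hi = 10.0, 12.5                           # hypothetical limits [Hz]
    p_fail = np.mean((f_min < f_lo) | (f_max > f_hi)) # probability of exceedance
    print(p_fail)
    ```

    The paper replaces the expensive mechanistic eigenvalue model with an explicit meta-model; the surrogate `f(theta)` above merely stands in for that step so the extreme-value bookkeeping is visible.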

  15. The development of a nuclear chemical plant human reliability management approach: HRMS and JHEDI

    International Nuclear Information System (INIS)

    Kirwan, Barry

    1997-01-01

    In the late 1980s, amidst the qualitative and quantitative validation of certain Human Reliability Assessment (HRA) techniques, there was a desire for a new technique specifically for a nuclear reprocessing plant being designed. The technique was to have the following attributes: it should be data-based rather than relying on pure expert judgement; it was to be flexible, allowing both relatively rapid screening and more detailed assessment; and it was to offer sensitivity analysis, so that Human Factors design-related parameters, albeit at a gross level, could be brought into the risk assessment equation. The techniques and literature were surveyed, and since no single technique fulfilled these requirements, a new approach was developed. Two techniques were devised, the Human Reliability Management System (HRMS) and the Justification of Human Error Data Information (JHEDI) technique, the latter being essentially a quicker screening version of the former. Both techniques carry out task analysis, error analysis, and Performance Shaping Factor-based quantification, but JHEDI involves less detailed assessment than HRMS. Additionally, HRMS can be utilised to determine error reduction mechanisms, based on the way the Performance Shaping Factors contribute to the assessed error probabilities. Both techniques are fully computerised, and assessments are highly documentable and auditable, which was seen as a useful feature both by the company developing the techniques and by the regulatory authorities assessing the final risk assessments into which these two techniques fed data. This paper focuses in particular on the quantification process used by these techniques. The quantification approach for both techniques was principally one of extrapolation from real data to the desired Human Error Probability (HEP), based on a comparison between Performance Shaping Factor (PSF) profiles for the real, and the to

  16. Combined computational and experimental approach to improve the assessment of mitral regurgitation by echocardiography.

    Science.gov (United States)

    Sonntag, Simon J; Li, Wei; Becker, Michael; Kaestner, Wiebke; Büsen, Martin R; Marx, Nikolaus; Merhof, Dorit; Steinseifer, Ulrich

    2014-05-01

    Mitral regurgitation (MR) is one of the most frequent valvular heart diseases. To assess MR severity, color Doppler imaging (CDI) is the clinical standard. However, inadequate reliability, poor reproducibility and heavy user-dependence are known limitations. A novel approach combining computational and experimental methods is currently under development aiming to improve the quantification. A flow chamber for a circulatory flow loop was developed. Three different orifices were used to mimic variations of MR. The flow field was recorded simultaneously by a 2D Doppler ultrasound transducer and Particle Image Velocimetry (PIV). Computational Fluid Dynamics (CFD) simulations were conducted using the same geometry and boundary conditions. The resulting computed velocity field was used to simulate synthetic Doppler signals. Comparison between PIV and CFD shows a high level of agreement. The simulated CDI exhibits the same characteristics as the recorded color Doppler images. The feasibility of the proposed combination of experimental and computational methods for the investigation of MR is shown and the numerical methods are successfully validated against the experiments. Furthermore, it is discussed how the approach can be used in the long run as a platform to improve the assessment of MR quantification.

  17. Reliability Evaluation and Improvement Approach of Chemical Production Man - Machine - Environment System

    Science.gov (United States)

    Miao, Yongchun; Kang, Rongxue; Chen, Xuefeng

    2017-12-01

    In recent years, with the gradual extension of reliability research, the study of production system reliability has become a hot topic in various industries. A man-machine-environment system is a complex system composed of human factors, machinery equipment and environment. The reliability of each individual factor must be analyzed in order to gradually transition to the study of three-factor reliability. Meanwhile, the dynamic relationships among man, machine and environment should be considered to establish an effective fuzzy evaluation mechanism that truly and effectively analyzes the reliability of such systems. In this paper, based on systems engineering, fuzzy theory, reliability theory, human error, environmental impact and machinery equipment failure theory, the reliabilities of the human factor, machinery equipment and environment of a chemical production system were studied by the method of fuzzy evaluation. Finally, the reliability of the man-machine-environment system was calculated to obtain the weighted result, which indicated that the reliability value of this chemical production system was 86.29. From the given evaluation domain it can be seen that the reliability of the integrated man-machine-environment system is in good status, and effective measures for further improvement were proposed according to the fuzzy calculation results.
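    The weighted aggregation step named in this record can be sketched in a few lines. The abstract does not give the membership functions, weights, or subsystem scores, so everything numeric below is hypothetical and does not reproduce the reported value of 86.29.

    ```python
    # Hedged sketch of weighted aggregation of subsystem reliability scores
    # (the final step of a fuzzy evaluation). Scores live on a 0-100 domain;
    # the system value is the weighted sum of the subsystem values.

    def system_reliability(scores, weights):
        """Weighted aggregate of subsystem reliability scores.
        Both arguments map subsystem name -> value; weights must sum to 1."""
        assert abs(sum(weights.values()) - 1.0) < 1e-9
        return sum(scores[name] * weights[name] for name in scores)

    # Illustrative numbers only, not taken from the abstract:
    scores  = {"human": 82.0, "machine": 90.0, "environment": 87.0}
    weights = {"human": 0.40, "machine": 0.35, "environment": 0.25}
    print(system_reliability(scores, weights))  # 82*0.4 + 90*0.35 + 87*0.25 = 86.05
    ```

    In a full fuzzy evaluation, each subsystem score would itself come from a fuzzy membership calculation over expert ratings; the weighted sum above is only the final defuzzified combination.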

  18. Surface laser marking optimization using an experimental design approach

    Science.gov (United States)

    Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.

    2017-04-01

    Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τpulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (output), using Design of Experiment (DOE) methods: the Taguchi methodology and response surface methodology (RSM). A design is first created using the MINITAB program, and the laser marking process is then performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed, and the RSM model is developed and verified for predicting the process output for a given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
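    The RSM step described in this record amounts to fitting a second-order polynomial response surface by least squares. The sketch below is a generic illustration of that fit, not the authors' MINITAB model; the coded factor names and the synthetic data are assumptions.

    ```python
    import numpy as np

    # Hedged sketch of a second-order response-surface (RSM) fit:
    # y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    # with x1, x2 as coded factors (e.g., scan speed, pumping intensity).
    def fit_rsm(x1, x2, y):
        X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coef

    # Synthetic data for illustration only (true surface: 2 + 1.5*x1 - 0.8*x2 + 0.5*x1^2):
    rng = np.random.default_rng(1)
    x1 = rng.uniform(-1, 1, 30)   # coded scan speed
    x2 = rng.uniform(-1, 1, 30)   # coded pumping intensity
    y = 2.0 + 1.5 * x1 - 0.8 * x2 + 0.5 * x1**2 + rng.normal(0, 0.01, 30)
    print(fit_rsm(x1, x2, y))
    ```

    With the fitted coefficients in hand, the most influential factor is the one with the largest standardized effect, which is how the abstract's ranking of scan speed over pumping intensity would be read off.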

  19. Experimental approach for adhesion strength of ATF cladding

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Donghyun; Kim, Hyochan; Yang, Yongsik; In, Wangkee [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Haksung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    The quality of a coating depends on the quality of the adhesion bond between the coating and the underlying substrate. Therefore, it is essential to evaluate the adhesion properties of the coating. Many test methods are available for evaluating coating adhesion bond strength; considering the restrictions imposed by the geometry of coated cladding, the scratch test is useful for evaluating adhesion properties compared to other methods. The purpose of the present study is to analyze the possibility of evaluating the adhesion bond strength of ATF coated cladding by scratch testing on coating cross sections. An experimental approach to the adhesion strength of ATF coated cladding was investigated in the present study, with scratch testing chosen as the test method. Uncoated zircaloy-4 tube was employed as a reference, and plasma spray and arc ion coatings were selected as ATF coated claddings for comparison. As a result, the adhesion strengths of the specimens affect the measured normal and tangential forces. In the future, the test will be conducted for CrAl-coated cladding produced by laser coating, which is the most promising ATF cladding. Computational analysis with the finite element method will also be conducted to analyze the stress distribution in the cladding tube.

  20. An experimental approach of decoupling Seebeck coefficient and electrical resistivity

    Science.gov (United States)

    Muhammed Sabeer N., A.; Paulson, Anju; Pradyumnan, P. P.

    2018-04-01

    Thermoelectrics (TE) has drawn increased attention among renewable energy technologies. The performance of a thermoelectric material is quantified by the dimensionless thermoelectric figure of merit, ZT = S²σT/κ, where S and σ vary inversely with each other. Thus, improving ZT is not an easy task, and researchers have been trying different parameter variations during thin film processing to improve TE properties. In this work, tin nitride (Sn3N4) thin films were deposited on glass substrates by reactive RF magnetron sputtering, and their thermoelectric response was investigated. To decouple the covariant nature of the Seebeck coefficient and electrical resistivity and enhance the power factor (S²σ), the nitrogen gas pressure during sputtering was reduced. Reducing the nitrogen gas pressure lowered both the sputtering pressure and the amount of nitrogen available for reaction during sputtering. This combined effect introduced preferred orientation and stoichiometric variations simultaneously in the sputtered Sn3N4 thin films, and the scattering mechanisms associated with these variations enhanced TE properties by independently driving the Seebeck coefficient and electrical resistivity.
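    The two metrics named in this record, the power factor S²σ and the figure of merit ZT = S²σT/κ, can be written out directly. The numerical values below are generic illustrative inputs, not measurements from the Sn3N4 films.

    ```python
    # Sketch of the thermoelectric metrics defined above.

    def power_factor(S_V_per_K, sigma_S_per_m):
        """Power factor S^2 * sigma, in W m^-1 K^-2."""
        return S_V_per_K**2 * sigma_S_per_m

    def figure_of_merit(S, sigma, kappa, T):
        """Dimensionless figure of merit ZT = S^2 * sigma * T / kappa."""
        return power_factor(S, sigma) * T / kappa

    # Illustrative inputs: S = 200 uV/K, sigma = 1e5 S/m,
    # kappa = 1.5 W/(m K), T = 300 K.
    print(figure_of_merit(200e-6, 1e5, 1.5, 300.0))  # (4e-8 * 1e5 * 300) / 1.5 = 0.8
    ```

    The inverse coupling the abstract mentions is visible here: processing changes that raise σ typically lower S, so S²σ can stagnate unless, as in this work, a scattering mechanism drives the two parameters independently.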

  1. Acting like a physicist: Student approach study to experimental design

    Science.gov (United States)

    Karelina, Anna; Etkina, Eugenia

    2007-12-01

    National studies of science education have unanimously concluded that preparing our students for the demands of the 21st century workplace is one of the major goals. This paper describes a study of student activities in introductory college physics labs, which were designed to help students acquire abilities that are valuable in the workplace. In these labs [called Investigative Science Learning Environment (ISLE) labs], students design their own experiments. Our previous studies have shown that students in these labs acquire scientific abilities such as the ability to design an experiment to solve a problem, the ability to collect and analyze data, the ability to evaluate assumptions and uncertainties, and the ability to communicate. These studies mostly concentrated on analyzing students’ writing, evaluated by specially designed scientific ability rubrics. Recently, we started to study whether the ISLE labs make students not only write like scientists but also engage in discussions and act like scientists while doing the labs. For example, do students plan an experiment, validate assumptions, evaluate results, and revise the experiment if necessary? A brief report of some of our findings that came from monitoring students’ activity during ISLE and nondesign labs was presented in the Physics Education Research Conference Proceedings. We found differences in student behavior and discussions that indicated that ISLE labs do in fact encourage a scientist-like approach to experimental design and promote high-quality discussions. This paper presents a full description of the study.

  2. Acting like a physicist: Student approach study to experimental design

    Directory of Open Access Journals (Sweden)

    Anna Karelina

    2007-10-01

    Full Text Available National studies of science education have unanimously concluded that preparing our students for the demands of the 21st century workplace is one of the major goals. This paper describes a study of student activities in introductory college physics labs, which were designed to help students acquire abilities that are valuable in the workplace. In these labs [called Investigative Science Learning Environment (ISLE) labs], students design their own experiments. Our previous studies have shown that students in these labs acquire scientific abilities such as the ability to design an experiment to solve a problem, the ability to collect and analyze data, the ability to evaluate assumptions and uncertainties, and the ability to communicate. These studies mostly concentrated on analyzing students’ writing, evaluated by specially designed scientific ability rubrics. Recently, we started to study whether the ISLE labs make students not only write like scientists but also engage in discussions and act like scientists while doing the labs. For example, do students plan an experiment, validate assumptions, evaluate results, and revise the experiment if necessary? A brief report of some of our findings that came from monitoring students’ activity during ISLE and nondesign labs was presented in the Physics Education Research Conference Proceedings. We found differences in student behavior and discussions that indicated that ISLE labs do in fact encourage a scientist-like approach to experimental design and promote high-quality discussions. This paper presents a full description of the study.

  3. Experimental Approach on the Cognitive Perception of Historical Urban Skyline

    Directory of Open Access Journals (Sweden)

    Seda H. Bostancı

    2017-12-01

    display of a skyline can be discussed through this experimental approach. This study aims to conduct experimental research among a group of architecture students who are strong at drawing and schematic expression. The selected sample group will be asked to (1) draw schematic skyline images of the city they live in and a city they have visited, as far as they remember; (2) draw a skyline after being shown the skyline of a chosen historical city for a set time, with the drawing process and the time it takes examined; and (3) watch a video of the streets of two different cities they may or may not have seen before, and draw a skyline of each city based on what they have watched. Finally, these different situations will be analyzed. In the experimental study, after 3 days, students will be asked to draw the skyline image they remember best, and what the sample group thought of this selection in terms of aesthetics will be measured with the semantic differential and adjective pairs. Participants will be asked to draw the catchy image of the skyline shown in order to compare the experimental methods and the subjective aesthetic evaluation methods. Observation-based determinations will be realized by analyzing these drawings and the adjective pairs. In this way, the relation between skyline perception and aesthetic experience in urban life will be discussed.

  4. Trading Water Conservation Credits: A Coordinative Approach for Enhanced Urban Water Reliability

    Science.gov (United States)

    Gonzales, P.; Ajami, N. K.

    2016-12-01

    Water utilities in arid and semi-arid regions are increasingly relying on water use efficiency and conservation to extend the availability of supplies. Despite spatial and institutional inter-dependency of many service providers, these demand-side management initiatives have traditionally been tackled by individual utilities operating in silos. In this study, we introduce a new approach to water conservation that addresses regional synergies—a novel system of tradable water conservation credits. Under the proposed approach, utilities have the flexibility to invest in water conservation measures that are appropriate for their specific service area. When utilities have insufficient capacity for local cost-effective measures, they may opt to purchase credits, contributing to fund subsidies for utilities that do have that capacity and can provide the credits, while the region as a whole benefits from more reliable water supplies. While similar programs have been used to address water quality concerns, to our knowledge this is one of the first studies proposing tradable credits for incentivizing water conservation. Through mathematical optimization, this study estimates the potential benefits of a trading program and demonstrates the institutional and economic characteristics needed for such a policy to be viable, including a proposed web platform to facilitate transparent regional planning, data-driven decision-making, and enhanced coordination of utilities. We explore the impacts of defining conservation targets tailored to local realities of utilities, setting credit prices, and different policy configurations. We apply these models to the case study of water utility members of the Bay Area Water Supply and Conservation Agency. Preliminary work shows that the diverse characteristics of these utilities present opportunities for the region to achieve conservation goals while maximizing the benefits to individual utilities through more flexible coordinative efforts.
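The trading mechanism described above can be sketched with a toy greedy allocation. The utility names, targets, capacities, and unit costs below are invented for illustration; the paper itself uses mathematical optimization over real utility data.

```python
def allocate_with_trading(utilities):
    """utilities: dict name -> (target, capacity, unit_cost), volumes in MG.
    Fills the regional target from the cheapest capacity first; utilities
    conserving less than their own target buy credits (positive values),
    those conserving more sell them (negative values)."""
    total_target = sum(t for t, _, _ in utilities.values())
    order = sorted(utilities, key=lambda u: utilities[u][2])  # cheapest first
    conservation, remaining = {}, total_target
    for u in order:
        _, capacity, _ = utilities[u]
        conservation[u] = min(capacity, remaining)
        remaining -= conservation[u]
    # Credit position: own target minus what was conserved locally.
    credits = {u: utilities[u][0] - conservation[u] for u in utilities}
    return conservation, credits

utils = {
    "A": (10, 5, 800),   # little cheap capacity -> buys credits
    "B": (10, 25, 300),  # abundant cheap capacity -> sells credits
    "C": (10, 12, 500),
}
conservation, credits = allocate_with_trading(utils)
```

The regional target is still met in aggregate, but the conservation happens where it is cheapest, which is the core argument for trading over silo-by-silo targets.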

  5. Personalized translational epilepsy research - Novel approaches and future perspectives: Part II: Experimental and translational approaches.

    Science.gov (United States)

    Bauer, Sebastian; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Rosenow, Felix

    2017-11-01

    Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics, and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. This Part II includes the experimental and translational approaches and a discussion of the future perspectives, while the diagnostic methods, EEG network analysis, biomarkers, and personalized treatment approaches were addressed in Part I [1]. Copyright © 2017

  6. A GC/MS-based metabolomic approach for reliable diagnosis of phenylketonuria.

    Science.gov (United States)

    Xiong, Xiyue; Sheng, Xiaoqi; Liu, Dan; Zeng, Ting; Peng, Ying; Wang, Yichao

    2015-11-01

    Although the phenylalanine/tyrosine ratio in blood has been the gold standard for diagnosis of phenylketonuria (PKU), the disadvantages of invasive sample collection and false positive error limited the application of this discriminator in the diagnosis of PKU to some extent. The aim of this study was to develop a new standard with high sensitivity and specificity in a less invasive manner for diagnosing PKU. In this study, an improved oximation-silylation method together with GC/MS was utilized to obtain the urinary metabolomic information in 47 PKU patients compared with 47 non-PKU controls. Compared with conventional oximation-silylation methods, the present approach possesses the advantages of shorter reaction time and higher reaction efficiency at a considerably lower temperature, which is beneficial to the derivatization of some thermally unstable compounds, such as phenylpyruvic acid. Ninety-seven peaks in the chromatograms were identified as endogenous metabolites by the National Institute of Standards and Technology (NIST) mass spectra library, including amino acids, organic acids, carbohydrates, amides, and fatty acids. After normalization of data using creatinine as internal standard, 19 differentially expressed compounds with p values of <0.05 were selected by independent-sample t test for the separation of the PKU group and the control group. A principal component analysis (PCA) model constructed by these differentially expressed compounds showed that the PKU group can be discriminated from the control group. Receiver-operating characteristic (ROC) analysis with area under the curve (AUC), specificity, and sensitivity of each PKU marker obtained from these differentially expressed compounds was used to evaluate the possibility of using these markers for diagnosing PKU. 
The largest value of AUC (0.987) with high specificity (0.936) and sensitivity (1.000) was obtained by the ROC curve of phenylacetic acid at its cutoff value (17.244 mmol/mol creatinine).
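The ROC-based marker evaluation described above can be sketched in a few lines. The concentrations below are invented stand-ins, not the study's data, and the AUC is computed with the tie-aware pairwise (Mann-Whitney) method, which equals the trapezoidal ROC AUC.

```python
def auc(pos, neg):
    """Probability that a random case scores above a random control,
    counting ties as one half (equivalent to the trapezoidal ROC AUC)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sens_spec_at(pos, neg, cutoff):
    """Sensitivity and specificity when scores >= cutoff are called positive."""
    sensitivity = sum(p >= cutoff for p in pos) / len(pos)
    specificity = sum(n < cutoff for n in neg) / len(neg)
    return sensitivity, specificity

# Illustrative phenylacetic acid levels (mmol/mol creatinine), not study data.
pku      = [25.1, 30.4, 19.8, 44.0, 22.7]
controls = [2.3, 5.1, 8.9, 1.4, 12.0]

marker_auc = auc(pku, controls)
sens, spec = sens_spec_at(pku, controls, 17.244)
```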

  7. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Science.gov (United States)

    2018-01-01

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction. PMID:29495599

  8. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Directory of Open Access Journals (Sweden)

    Agustín Zaballos

    2018-02-01

    Full Text Available Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction.

  9. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.

    Science.gov (United States)

    Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon

    2018-02-28

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid's data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction.

  10. A flexible latent class approach to estimating test-score reliability

    NARCIS (Netherlands)

    van der Palm, D.W.; van der Ark, L.A.; Sijtsma, K.

    2014-01-01

    The latent class reliability coefficient (LCRC) is improved by using the divisive latent class model instead of the unrestricted latent class model. This results in the divisive latent class reliability coefficient (DLCRC), which unlike LCRC avoids making subjective decisions about the best solution

  11. Management of reliability and maintainability; a disciplined approach to fleet readiness

    Science.gov (United States)

    Willoughby, W. J., Jr.

    1981-01-01

    Material acquisition fundamentals were reviewed and include: mission profile definition, stress analysis, derating criteria, circuit reliability, failure modes, and worst case analysis. Military system reliability was examined with emphasis on the sparing of equipment. The Navy's organizational strategy for 1980 is presented.

  12. Reliability Prediction Approaches For Domestic Intelligent Electric Energy Meter Based on IEC62380

    Science.gov (United States)

    Li, Ning; Tong, Guanghua; Yang, Jincheng; Sun, Guodong; Han, Dongjun; Wang, Guixian

    2018-01-01

    The reliability of the intelligent electric energy meter is a crucial issue considering its large-scale application and the safety of the national intelligent grid. This paper develops a reliability prediction procedure for domestic intelligent electric energy meters according to IEC 62380, focusing in particular on determining the model parameters under domestic working conditions. A case study is provided to demonstrate the effectiveness and validity of the approach.
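As a rough illustration of how a parts-count reliability prediction works: component failure rates add up for a series system, and survival follows the constant-failure-rate exponential model. The actual IEC 62380 models add mission-profile, thermal-cycling, and package terms, and the FIT values below are invented, not taken from the standard.

```python
import math

def system_failure_rate(fit_rates):
    """Series system: component failure rates (FIT, failures per 1e9 h) add."""
    return sum(fit_rates)

def reliability(fit_total, hours):
    """R(t) = exp(-lambda * t) under the constant-failure-rate assumption."""
    lam = fit_total * 1e-9          # FIT -> failures per hour
    return math.exp(-lam * hours)

# Hypothetical meter bill of materials: MCU, metering IC, PSU, EEPROM.
meter_parts = [120.0, 45.0, 80.0, 15.0]
lam = system_failure_rate(meter_parts)   # total failure rate in FIT
r_10y = reliability(lam, 10 * 8760)      # probability of surviving 10 years
```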

  13. New experimental approaches to the biology of flight control systems.

    Science.gov (United States)

    Taylor, Graham K; Bacic, Marko; Bomphrey, Richard J; Carruthers, Anna C; Gillies, James; Walker, Simon M; Thomas, Adrian L R

    2008-01-01

    Here we consider how new experimental approaches in biomechanics can be used to attain a systems-level understanding of the dynamics of animal flight control. Our aim in this paper is not to provide detailed results and analysis, but rather to tackle several conceptual and methodological issues that have stood in the way of experimentalists in achieving this goal, and to offer tools for overcoming these. We begin by discussing the interplay between analytical and empirical methods, emphasizing that the structure of the models we use to analyse flight control dictates the empirical measurements we must make in order to parameterize them. We then provide a conceptual overview of tethered-flight paradigms, comparing classical 'open-loop' and 'closed-loop' setups, and describe a flight simulator that we have recently developed for making flight dynamics measurements on tethered insects. Next, we provide a conceptual overview of free-flight paradigms, focusing on the need to use system identification techniques in order to analyse the data they provide, and describe two new techniques that we have developed for making flight dynamics measurements on freely flying birds. First, we describe a technique for obtaining inertial measurements of the orientation, angular velocity and acceleration of a steppe eagle Aquila nipalensis in wide-ranging free flight, together with synchronized measurements of wing and tail kinematics using onboard instrumentation and video cameras. Second, we describe a photogrammetric method to measure the 3D wing kinematics of the eagle during take-off and landing. In each case, we provide demonstration data to illustrate the kinds of information available from each method. We conclude by discussing the prospects for systems-level analyses of flight control using these techniques and others like them.

  14. A non-invasive experimental approach for surface temperature measurements on semi-crystalline thermoplastics

    Science.gov (United States)

    Boztepe, Sinan; Gilblas, Remi; de Almeida, Olivier; Le Maoult, Yannick; Schmidt, Fabrice

    2017-10-01

    Most thermoforming processes for thermoplastic polymers and their composites combine heating and forming stages, in which a precursor is heated prior to forming. This step improves formability by softening the thermoplastic polymer. Due to the low thermal conductivity and semi-transparency of polymers, infrared (IR) heating is widely used for thermoforming such materials. Predictive radiation heat transfer models for temperature distributions are therefore critical for optimizing the thermoforming process. One key challenge is to build a predictive model that captures the physics of radiation heat transfer in semi-crystalline thermoplastics, as their microcrystalline structure constitutes an optically heterogeneous medium. In addition, the accuracy of a predictive model must be validated experimentally, and IR thermography is one of the suitable methods for such validation as it provides a non-invasive, full-field surface temperature measurement. Although IR cameras provide a non-invasive measurement, obtaining a reliable measurement depends on the optical characteristics of the heated material and the operating spectral band of the IR camera: the surface of the measured material should have a spectral band in which it behaves as opaque, and the employed IR camera should operate in the corresponding band. In this study, the optical characteristics of a PO-based polymer are discussed, and an experimental approach is proposed for measuring its surface temperature via IR thermography. Preliminary analyses showed that IR thermographic measurements cannot simply be performed on PO-based polymers and require a correction method, as their semi-transparent medium makes reliable surface temperature measurements challenging.

  15. Experimental and Modeling Approaches for Understanding the Effect of Gene Expression Noise in Biological Development

    Directory of Open Access Journals (Sweden)

    David M. Holloway

    2018-04-01

    Full Text Available Biological development involves numerous chemical and physical processes which must act in concert to reliably produce a cell, a tissue, or a body. To be successful, the developing organism must be robust to variability at many levels, such as the environment (e.g., temperature, moisture), upstream information (such as long-range positional information gradients), or intrinsic noise due to the stochastic nature of low concentration chemical kinetics. The latter is especially relevant to the regulation of gene expression in cell differentiation. The temporal stochasticity of gene expression has been studied in single celled organisms for nearly two decades, but only recently have techniques become available to gather temporally-resolved data across spatially-distributed gene expression patterns in developing multicellular organisms. These demonstrate temporal noisy “bursting” in the number of gene transcripts per cell, raising the question of how the transcript number defining a particular cell type is produced, such that one cell type can reliably be distinguished from a neighboring cell of different type along a tissue boundary. Stochastic spatio-temporal modeling of tissue-wide expression patterns can identify signatures for specific types of gene regulation, which can be used to extract regulatory mechanism information from experimental time series. This Perspective focuses on using this type of approach to study gene expression noise during the anterior-posterior segmentation of the fruit fly embryo. Advances in experimental and theoretical techniques will lead to an increasing quantification of expression noise that can be used to understand how regulatory mechanisms contribute to embryonic robustness across a range of developmental processes.
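The transcriptional "bursting" mentioned above is commonly modeled with a two-state (telegraph) gene that switches stochastically between on and off. A minimal Gillespie-style simulation with illustrative rate constants (not fitted to any organism) might look like this:

```python
import random

def telegraph(k_on=0.05, k_off=0.2, k_tx=2.0, k_deg=0.05,
              t_end=500.0, seed=1):
    """Exact stochastic simulation of the two-state telegraph model.
    Returns a list of (time, mRNA count) pairs, one per reaction event."""
    rng = random.Random(seed)
    t, gene_on, mrna, trace = 0.0, False, 0, []
    while t < t_end:
        rates = [k_on * (not gene_on),   # 0: gene switches on
                 k_off * gene_on,        # 1: gene switches off
                 k_tx * gene_on,         # 2: transcription
                 k_deg * mrna]           # 3: mRNA degradation
        total = sum(rates)
        t += rng.expovariate(total)      # time to next reaction
        r, acc, choice = rng.random() * total, 0.0, 0
        for i, rate in enumerate(rates): # pick reaction proportional to rate
            acc += rate
            if r < acc:
                choice = i
                break
        if choice == 0:   gene_on = True
        elif choice == 1: gene_on = False
        elif choice == 2: mrna += 1
        else:             mrna -= 1
        trace.append((t, mrna))
    return trace

history = telegraph()
```

Plotting `history` shows the bursty trajectories the abstract refers to: long quiet stretches while the gene is off, punctuated by rapid runs of transcription while it is on.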

  16. Myth of the Master Detective: Reliability of Interpretations for Kaufman's "Intelligent Testing" Approach to the WISC-III.

    Science.gov (United States)

    Macmann, Gregg M.; Barnett, David W.

    1997-01-01

    Used computer simulation to examine the reliability of interpretations for Kaufman's "intelligent testing" approach to the Wechsler Intelligence Scale for Children (3rd ed.) (WISC-III). Findings indicate that factor index-score differences and other measures could not be interpreted with confidence. Argues that limitations of IQ testing…

  17. New approach for high reliability, low loss splicing between silica and ZBLAN fibers

    Science.gov (United States)

    Carbonnier, Robin; Zheng, Wenxin

    2018-02-01

    In the past decade, ZBLAN (ZrF4-BaF2-LaF3-NaF) fibers have drawn increasing interest for laser operations at wavelengths where Fused Silica-based (SiO2) fibers do not perform well. One limitation to the expansion of ZBLAN fiber lasers today is the difficulty to efficiently inject and extract light in/from the guiding medium using SiO2 fibers. Although free space and butt coupling have provided acceptable results, consistent and long lasting physical joints between SiO2 and ZBLAN fibers will allow smaller, cheaper, and more robust component manufacturing. While low loss splices have been reported using a traditional splicing approach, the very low mechanical strength of the joint makes it difficult to scale. Difficulties in achieving a strong bond are mainly due to the large difference in transition temperature between ZBLAN and SiO2 fibers (~260 °C vs 1175 °C). This paper presents results obtained by using the high thermal expansion coefficient of the ZBLAN fiber to encapsulate a smaller SiO2 fiber. A CO2 laser glass processing system was used to control the expansion and contraction of the ZBLAN material during the splicing process for optimum reliability. This method produced splices between 125 μm ZBLAN and 80 μm SiO2 fibers with an average transmission loss of 0.225 dB (measured at 1550 nm) and an average ultimate tensile strength of 121.4 gf. The resulting splices can be durably packaged without excessive care. Other combinations using 125 μm SiO2 fibers tapered to 80 μm are also discussed.

  18. Reliable Portfolio Selection Problem in Fuzzy Environment: An mλ Measure Based Approach

    Directory of Open Access Journals (Sweden)

    Yuan Feng

    2017-04-01

    Full Text Available This paper investigates a fuzzy portfolio selection problem with guaranteed reliability, in which the fuzzy variables are used to capture the uncertain returns of different securities. To effectively handle the fuzziness in a mathematical way, a new expected value operator and variance of fuzzy variables are defined based on the mλ measure that is a linear combination of the possibility measure and necessity measure to balance the pessimism and optimism in the decision-making process. To formulate the reliable portfolio selection problem, we particularly adopt the expected total return and standard variance of the total return to evaluate the reliability of the investment strategies, producing three risk-guaranteed reliable portfolio selection models. To solve the proposed models, an effective genetic algorithm is designed to generate the approximate optimal solution to the considered problem. Finally, the numerical examples are given to show the performance of the proposed models and algorithm.
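The mλ measure named in the title is a λ-weighted combination of the possibility and necessity measures. A minimal sketch for the event {return ≥ r} under a triangular fuzzy return (a, b, c); the return numbers are illustrative, not from the paper:

```python
def possibility_geq(a, b, c, r):
    """Pos{xi >= r} for a triangular fuzzy variable (a, b, c)."""
    if r <= b:
        return 1.0
    if r >= c:
        return 0.0
    return (c - r) / (c - b)

def necessity_geq(a, b, c, r):
    """Nec{xi >= r} = 1 - Pos{xi < r} for a triangular fuzzy variable."""
    if r <= a:
        return 1.0
    if r >= b:
        return 0.0
    return (b - r) / (b - a)

def m_lambda_geq(a, b, c, r, lam):
    """m_lambda = lam * Pos + (1 - lam) * Nec, with lam in [0, 1]
    trading off optimism (possibility) against pessimism (necessity)."""
    return (lam * possibility_geq(a, b, c, r)
            + (1 - lam) * necessity_geq(a, b, c, r))

# Chance that a security with triangular return (-5%, 2%, 10%) is non-negative,
# weighting optimism and pessimism equally.
chance = m_lambda_geq(-0.05, 0.02, 0.10, 0.0, lam=0.5)
```

Setting lam=1 recovers the purely optimistic possibility measure and lam=0 the purely pessimistic necessity measure, which is the balancing role the abstract describes.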

  19. Developing a certifiable UAS reliability assessment approach through algorithmic redundancy, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Manned aircraft, civilian or military, are required to meet certain reliability standards specified by the FAA in order to operate in the US national airspace. These...

  20. Reliability Analysis of Retaining Walls Subjected to Blast Loading by Finite Element Approach

    Science.gov (United States)

    GuhaRay, Anasua; Mondal, Stuti; Mohiuddin, Hisham Hasan

    2018-02-01

    Conventional design methods adopt factors of safety based on practice and experience, which are deterministic in nature. The limit state method, though not completely deterministic, does not take into account the effect of design parameters that are inherently variable for soil, such as cohesion and angle of internal friction. Reliability analysis provides a means to incorporate these variations into the analysis and hence results in a more realistic design. Several studies have been carried out on the reliability of reinforced concrete walls and masonry walls under explosions. Reliability analysis of retaining structures against various kinds of failure has also been done. However, very few research works are available on reliability analysis of retaining walls subjected to blast loading. Thus, the present paper considers the effect of variation of geotechnical parameters when a retaining wall is subjected to blast loading. It is found that the variation of geotechnical random variables does not have a significant effect on the stability of retaining walls subjected to blast loading.
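A reliability analysis of the kind described treats the soil parameters as random variables and estimates a probability of failure. A Monte Carlo sketch with a hypothetical limit-state function and invented distributions follows; the paper itself evaluates the limit state with a finite-element model, not the algebraic stand-in used here.

```python
import math
import random

def failure_probability(n=20_000, seed=42):
    """Crude Monte Carlo estimate of P(resistance < blast demand).
    All distributions and the capacity formula are illustrative only."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        c = rng.gauss(25.0, 5.0)                    # cohesion, kPa
        phi = math.radians(rng.gauss(30.0, 3.0))    # friction angle, rad
        blast = rng.lognormvariate(5.5, 0.3)        # blast pressure, kPa
        # Hypothetical stand-in for the wall's resisting capacity:
        resistance = c * 8.0 + 150.0 * math.tan(phi)
        if resistance < blast:
            failures += 1
    return failures / n

pf = failure_probability()
```

Repeating the estimate while shrinking the standard deviations of `c` and `phi` (and comparing `pf`) is the kind of sensitivity check behind the paper's conclusion that the geotechnical variability has little effect under blast loading.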

  1. Validity and reliability of three definitions of hip osteoarthritis: cross sectional and longitudinal approach

    NARCIS (Netherlands)

    M. Reijman (Max); J.M.W. Hazes (Mieke); H.A.P. Pols (Huib); R.M.D. Bernsen (Roos); B.W. Koes (Bart); S.M. Bierma-Zeinstra (Sita)

    2004-01-01

    OBJECTIVES: To compare the reliability and validity in a large open population of three frequently used radiological definitions of hip osteoarthritis (OA): Kellgren and Lawrence grade, minimal joint space (MJS), and Croft grade; and to investigate whether the

  2. Effectiveness of different approaches to disseminating traveler information on travel time reliability. [supporting datasets

    Science.gov (United States)

    2013-11-30

    Travel time reliability information includes static data about traffic speeds or trip times that capture historic variations from day to day, and it can help individuals understand the level of variation in traffic. Unlike real-time travel time infor...

  3. Herbal Medicines: from Traditional Medicine to Modern Experimental Approaches

    Directory of Open Access Journals (Sweden)

    Bahram Rasoulian

    2017-03-01

    Full Text Available Academic writings indicate that the medicinal use of plants dates back to 4000-5000 B.C. (1). Utilization of medicinal herbs has indeed a long history not only in human life, but also in animals, and there is some interesting evidence of animals' self-medication, in both the prevention and treatment of diseases (2-5). The World Health Organization (WHO) has recognized the importance of traditional medicines and created strategies, guidelines and standards for botanical medicines (6, 7). A significant part of those traditional texts dealing with medicine, which were appreciated by ancient scientific communities worldwide, such as The Canon of Medicine by the Persian physician-philosopher Ibn Sina (or Avicenna, 980 to 1032 AD), is allocated to herbal medicines. The Canon explores nearly 500 medicinal plants and herbal drugs. It should be noted that this book was used as a medical textbook in Europe until the 17th century AD (8, 9). Although there is important evidence of some kinds of experimental approaches being used in traditional medicine (8), the efficacy of such approaches is in doubt because it is generally agreed that they might have been part of physicians' personal experiences. Not only is the demand for herbal drugs growing in developing countries, but there is also some evidence that consumers in developed countries are becoming disillusioned with modern healthcare; hence, the demand for traditional alternatives including herbal medicines is increasing in developing countries (10). On the one hand, the increased interest in herbal medicines throughout the world (10, 11), and on the other hand, the need for direct empirical evidence about the effectiveness of herbal medicines in a proper statistical population with appropriate numbers and methods, denote the significance of new studies of medicinal plants and the publication of their results. Herbal Medicines Journal (eISSN: 2538-2144) reports valuable research results for researchers all

  4. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. Here, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment; the nature of human error; classification of errors in man-machine systems; practical aspects; human reliability modelling in complex situations; quantification and examination of human reliability; judgement-based approaches; holistic techniques; and decision-analytic approaches. (UK)

  5. "A Comparison of Consensus, Consistency, and Measurement Approaches to Estimating Interrater Reliability"

    OpenAIRE

    Steven E. Stemler

    2004-01-01

    This article argues that the general practice of describing interrater reliability as a single, unified concept is at best imprecise, and at worst potentially misleading. Rather than representing a single concept, different statistical methods for computing interrater reliability can be more accurately classified into one of three categories based upon the underlying goals of analysis. The three general categories introduced and described in this paper are: 1) consensus estimates, 2) cons...
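The distinction between consensus and consistency estimates can be made concrete with two raters whose scores differ by a constant offset; the scores below are invented for illustration.

```python
def consensus(a, b):
    """Percent exact agreement: a consensus estimate of interrater
    reliability (do the raters assign the same score?)."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def consistency(a, b):
    """Pearson correlation: a consistency estimate (do the raters
    rank-order examinees the same way, regardless of offset?)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

rater1 = [3, 4, 2, 5, 3, 4]
rater2 = [4, 5, 3, 6, 4, 5]   # always one point higher than rater1

agreement = consensus(rater1, rater2)    # 0.0: they never match exactly
correlation = consistency(rater1, rater2)  # 1.0: perfectly consistent
```

The two raters never agree exactly, yet covary perfectly, which is exactly the situation where reporting a single "interrater reliability" number without naming its category misleads.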

  6. Validity and reliability of three definitions of hip osteoarthritis: cross sectional and longitudinal approach

    OpenAIRE

    Reijman, Max; Hazes, Mieke; Pols, Huib; Bernsen, Roos; Koes, Bart; Bierma-Zeinstra, Sita

    2004-01-01

    OBJECTIVES: To compare the reliability and validity in a large open population of three frequently used radiological definitions of hip osteoarthritis (OA): Kellgren and Lawrence grade, minimal joint space (MJS), and Croft grade; and to investigate whether the validity of the three definitions of hip OA is sex dependent. METHODS: Subjects from the Rotterdam study (aged ≥ 55 years, n = 3585) were evaluated. The inter-rater reliability was tested in a random set of 148 X-rays. ...

  7. Reliable Gait Recognition Using 3D Reconstructions and Random Forests - An Anthropometric Approach

    DEFF Research Database (Denmark)

    Sandau, Martin; Heimbürger, Rikke V.; Jensen, Karl E.

    2016-01-01

    reliable recognition. Sixteen participants performed normal walking where 3D reconstructions were obtained continually. Segment lengths and kinematics from the extremities were manually extracted by eight expert observers. The results showed that all the participants were recognized, assuming the same ... expert annotated the data. Recognition based on data annotated by different experts was less reliable, achieving 72.6% correct recognitions, as some parameters were heavily affected by interobserver variability. This study verified that 3D reconstructions are feasible for forensic gait analysis...

  8. LMFBR subassembly response to local pressure loadings: an experimental approach

    International Nuclear Information System (INIS)

    Marciniak, T.J.; Ash, J.E.; Marchertas, A.H.; Cagliostro, D.J.

    1975-01-01

    An experimental program to determine the response of LMFBR-type subassemblies to local subassembly accidents caused by pressure loadings is described. Some results are presented and compared with computer calculations

  9. Experimental and computational approaches to electrical conductor loading characteristics

    International Nuclear Information System (INIS)

    Vary, M.; Goga, V.; Paulech, J.

    2012-01-01

    This article describes cooling analyses of a horizontally arranged bare electric conductor using analytical and numerical methods. The results of these analyses are compared to results obtained from experimental measurement. (Authors)

  10. A reliability-based approach of fastest routes planning in dynamic traffic network under emergency management situation

    Directory of Open Access Journals (Sweden)

    Ye Sun

    2011-12-01

    In order to establish an effective emergency management system, it is important to conduct evacuations with reliable, real-time optimal route plans. This paper aims at creating a route-finding strategy that considers time-dependent factors as well as uncertainties that may be encountered during emergency management. To combine dynamic features with a level of reliability in the process of fastest-route planning, the speed distribution of typical intercity roads is studied in depth, and a strategy of modifying real-time speed to a more reliable value based on the speed distribution is proposed. Two route-planning algorithms have been developed to find three optimal routes with the shortest travel time at a reliability level of 0.9. In order to validate the new strategy, an experimental implementation of the route-planning method is conducted based on road speed information acquired by field study. The results show that the proposed strategy can provide more reliable routes in dynamic traffic networks by conservatively treating roads with large speed dispersion or with relatively extreme real-time speed values.
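
    The conservative-speed strategy described in this record can be sketched as a reliability-adjusted shortest-path search. A minimal sketch, assuming a normal speed model and a one-sided 90% quantile; the graph structure, field names, and z-score below are illustrative assumptions, not the authors' implementation:

```python
import heapq

def reliable_fastest_route(graph, source, target):
    # graph: {node: [(neighbor, mean_speed, speed_std, length_km), ...]}
    # Conservative speed: the 90% one-sided quantile of an assumed normal
    # speed distribution, i.e. mean - z*std, so the planned travel time
    # is met with reliability ~0.9 on each edge.
    z = 1.2816  # ~90th percentile one-sided z-score (assumption)
    dist = {source: 0.0}
    prev = {}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, mean_speed, std, length in graph.get(u, []):
            safe_speed = max(mean_speed - z * std, 1e-6)
            nd = d + length / safe_speed  # travel time in hours
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path, node = [], target
    while node in prev:
        path.append(node)
        node = prev[node]
    path.append(source)
    return list(reversed(path)), dist.get(target)
```

    Roads with large speed dispersion are penalized through the `std` term, so the search prefers slower but steadier links, mirroring the paper's conservative treatment.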

  11. Fracture reduction and primary ankle arthrodesis: a reliable approach for severely comminuted tibial pilon fracture.

    Science.gov (United States)

    Beaman, Douglas N; Gellman, Richard

    2014-12-01

    Posttraumatic arthritis and prolonged recovery are typical after a severely comminuted tibial pilon fracture, and ankle arthrodesis is a common salvage procedure. However, few reports discuss the option of immediate arthrodesis, which may be a potentially viable approach to accelerate overall recovery in patients with severe fracture patterns. (1) How long does it take the fracture to heal and the arthrodesis to fuse when primary ankle arthrodesis is a component of initial fracture management? (2) How do these patients fare clinically in terms of modified American Orthopaedic Foot and Ankle Society (AOFAS) scores and activity levels after this treatment? (3) Does primary ankle arthrodesis heal in an acceptable position when anterior ankle arthrodesis plates are used? During a 2-year period, we performed open fracture reduction and internal fixation in 63 patients. Eleven patients (12 ankles) with severely comminuted high-energy tibial pilon fractures were retrospectively reviewed after surgical treatment with primary ankle arthrodesis and fracture reduction. Average patient age was 58 years, and minimum followup was 6 months (average, 14 months; range, 6-22 months). Anatomically designed anterior ankle arthrodesis plates were used in 10 ankles. Ring external fixation was used in nine ankles with concomitant tibia fracture or in instances requiring additional fixation. Clinical evaluation included chart review, interview, the AOFAS ankle-hindfoot score, and radiographic evaluation. All of the ankle arthrodeses healed at an average of 4.4 months (range, 3-5 months). One patient had a nonunion at the metaphyseal fracture, which healed with revision surgery. The average AOFAS ankle-hindfoot score was 83 with 88% having an excellent or good result. Radiographic and clinical analysis confirmed a plantigrade foot without malalignment. No patients required revision surgery for malunion. 
Primary ankle arthrodesis combined with fracture reduction for the severely comminuted

  12. Contribution to the damping identification: experimental and numerical approaches

    International Nuclear Information System (INIS)

    Crambuer, R.

    2013-01-01

    Since earthquakes are a natural threat in France, it seems reasonable to construct buildings capable of resisting them. Since 1955, the A.S. 55 recommendations have taken this risk into account for all new constructions. The rules were created following the earthquake in Orleansville (Algeria) on 9 September 1954 and have since been modified in the aftermath of several significant earthquakes. As it stands now, the regulations require that energy dissipation during earthquakes be accounted for in an effective manner. However, at present this is a great challenge, especially where reinforced concrete structures are concerned. The reason for this is the many different sources of energy dissipation, which can be material, such as steel yielding, cracking of the concrete or deterioration of the steel/concrete interface, or environmental, such as interactions with neighbouring structures or radiative damping. These dissipations are typically lumped into the structural model as a uniform, slight damping that is difficult to quantify, such as modal or Rayleigh damping. The challenge is therefore to model damping in a way that relies more closely on the underlying physics. This study aims at bringing some clarification to this problem. To achieve this, two objectives were targeted: the first consisted in experimentally qualifying and quantifying the sources of damping in concrete; the second aims at developing a method which models both the overall behaviour and the damping in a realistic way at low computational cost. A series of reverse 3-point bending tests were carried out to determine and quantify the mechanisms responsible for damping. This approach was innovative in that the tests were carried out not only on sound beams, but also on pre-damaged beams. When processing the results of these experiments, we focused on the overall

  13. Force-based and displacement-based reliability assessment approaches for highway bridges under multiple hazard actions

    Directory of Open Access Journals (Sweden)

    Chao Huang

    2015-08-01

    The strength limit state of the American Association of State Highway and Transportation Officials (AASHTO) Load and Resistance Factor Design (LRFD) Bridge Design Specifications is developed based on the failure probabilities of the combination of non-extreme loads. The proposed design limit state equation (DLSE) has been fully calibrated for dead load and live load by using the reliability-based approach. On the other hand, most DLSEs for other limit states, including extreme events Ⅰ and Ⅱ, have not been developed and calibrated, though certain probability-based concepts were taken into account. This paper presents an assessment procedure for highway bridge reliability under the limit state of extreme event Ⅰ, i.e., the combination of dead load, live load and earthquake load. A force-based approach and a displacement-based approach are proposed and implemented on a set of nine simplified bridge models. Results show that the displacement-based approach yields more convergent and accurate reliability estimates for the selected models, and can be applied to other hazards.
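
    Reliability calibration of the kind this record builds on reduces, in its simplest form, to a reliability index for a linear limit state g = R - S with independent normal resistance R and load effect S. A minimal sketch of that classical (Cornell) index; the function names and numbers are illustrative, and the paper's force- and displacement-based formulations are more elaborate:

```python
import math

def reliability_index(mu_r, sig_r, mu_s, sig_s):
    # Cornell reliability index for g = R - S with independent
    # normal resistance R and load effect S.
    return (mu_r - mu_s) / math.sqrt(sig_r ** 2 + sig_s ** 2)

def failure_probability(beta):
    # Pf = Phi(-beta), evaluated via the complementary error function.
    return 0.5 * math.erfc(beta / math.sqrt(2))
```

    A larger beta means a smaller failure probability; LRFD-style calibration tunes load and resistance factors until beta meets a target value.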

  14. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    International Nuclear Information System (INIS)

    Iskandar, Ismed; Gondokaryono, Yudi Satria

    2016-01-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described simply as functioning or failed. In many real situations, failures may arise from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the value of the standard deviation in the opposite direction. For perfect information on the prior distribution, the Bayesian estimation methods are better than those of maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
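
    The competing-risk setup with independent Weibull causes can be illustrated by inverse-CDF simulation, where each unit fails from whichever cause fires first. A minimal sketch under assumed shape/scale values; the cause names and parameters are illustrative, and the Bayesian estimation step of the paper is omitted:

```python
import math
import random

def weibull_sample(shape, scale, rng):
    # Inverse-CDF sampling: T = scale * (-ln U)^(1/shape)
    return scale * (-math.log(rng.random())) ** (1.0 / shape)

def simulate_competing_risks(n, causes, seed=42):
    # causes: {name: (shape, scale)}; independent Weibull failure times,
    # each unit fails from the earliest cause (competing risks).
    rng = random.Random(seed)
    records = []
    for _ in range(n):
        draws = {c: weibull_sample(k, lam, rng) for c, (k, lam) in causes.items()}
        cause = min(draws, key=draws.get)
        records.append((draws[cause], cause))
    return records
```

    With equal shapes, the fraction failing from each cause follows the ratio of the scale parameters raised to the negative shape, which gives a quick sanity check on the simulator.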

  15. A comprehensive approach to identify reliable reference gene candidates to investigate the link between alcoholism and endocrinology in Sprague-Dawley rats.

    Directory of Open Access Journals (Sweden)

    Faten A Taki

    Gender and hormonal differences are often correlated with alcohol dependence and related complications like addiction and breast cancer. Estrogen (E2) is an important sex hormone because it serves as a key protein involved in organism-level signaling pathways. Alcoholism has been reported to affect estrogen receptor signaling; however, identifying the players involved in such a multi-faceted syndrome is complex and requires an interdisciplinary approach. In many situations, preliminary investigations include straightforward yet informative biotechniques such as gene expression analysis using quantitative real-time PCR (qRT-PCR). The validity of qRT-PCR-based conclusions is affected by the choice of reliable internal controls. With this in mind, we compiled a list of 15 commonly used housekeeping genes (HKGs) as potential reference gene candidates in rat biological models. A comprehensive comparison among five statistical approaches (geNorm, dCt method, NormFinder, BestKeeper, and RefFinder) was performed to identify the minimal number as well as the most stable reference genes required for reliable normalization in experimental rat groups that comprised sham operated (SO) rats and ovariectomized rats in the absence (OVX) or presence of E2 (OVXE2). These rat groups were subdivided into subgroups that received alcohol in a liquid diet or an isocaloric control liquid diet for 12 weeks. Our results showed that U87, 5S rRNA, GAPDH, and U5a were the most reliable gene candidates for reference genes in heart and brain tissue. However, a different gene stability ranking was specific to each tissue input combination. The present preliminary findings highlight the variability in reference gene rankings across different experimental conditions and analytic methods and constitute a fundamental step for gene expression assays.
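
    The stability-ranking step that tools such as geNorm and NormFinder perform can be illustrated with a toy score. The sketch below ranks candidates by the coefficient of variation of their Ct values, a deliberately simplified stand-in for the pairwise stability measures those tools actually use; the gene names and values are illustrative:

```python
import statistics

def rank_reference_genes(expression):
    # expression: {gene: [Ct values across samples]}.
    # Toy stability score: coefficient of variation of each gene's Ct
    # values; lower means more stable. geNorm/NormFinder use more
    # elaborate pairwise measures, so treat this as a first-pass filter.
    scores = {g: statistics.stdev(v) / statistics.mean(v)
              for g, v in expression.items()}
    return sorted(scores, key=scores.get)
```
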

  16. A Markovian Approach Applied to Reliability Modeling of Bidirectional DC-DC Converters Used in PHEVs and Smart Grids

    Directory of Open Access Journals (Sweden)

    M. Khalilzadeh

    2016-12-01

    In this paper, a stochastic approach is proposed for reliability assessment of bidirectional DC-DC converters, including fault-tolerant ones. This type of converter can be used in a smart DC grid, feeding DC loads such as home appliances and plug-in hybrid electric vehicles (PHEVs). The reliability of bidirectional DC-DC converters is of particular importance due to the expected, increasing utilization of DC grids in the modern smart grid. Markov processes are suggested for reliability modeling and, consequently, for calculating the expected effective lifetime of bidirectional converters. A three-leg bidirectional interleaved converter using data from the Toyota Prius 2012 hybrid electric vehicle is used as a case study. Besides this, the influence of environment and ambient temperature on converter lifetime is studied. The impact of modeling the reliability of the converter and of adding reliability constraints to the technical design procedure of the converter is also investigated. In order to investigate the effect of increasing the number of legs on converter lifetime, single-leg to five-leg interleaved DC-DC converters are studied considering economic aspects, and the results are extrapolated to six- and seven-leg converters. The proposed method can be generalized so that the number of legs and of input and output capacitors can be arbitrary.
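
    The Markov lifetime calculation underlying this kind of approach can be sketched for a chain of transient degradation states. The mean time to failure vector t solves Q t = -1, where Q is the CTMC generator restricted to the transient states; the two-state chain and rates in the test are illustrative assumptions, not the converter data from the paper:

```python
def mttf_from_generator(Q):
    # Mean time to absorption from state 0 for a continuous-time Markov
    # chain. Q: square matrix (list of lists) of rates among the
    # TRANSIENT states only, with Q[i][i] = -(total exit rate of state i,
    # including transitions to the absorbing "failed" state).
    # Solves Q @ t = -1 by Gauss-Jordan elimination with partial pivoting.
    n = len(Q)
    A = [row[:] + [-1.0] for row in Q]  # augment with RHS vector of -1s
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(n):
            if r != col and A[r][col] != 0:
                f = A[r][col] / A[col][col]
                for c in range(col, n + 1):
                    A[r][c] -= f * A[col][c]
    t = [A[i][n] / A[i][i] for i in range(n)]
    return t[0]
```

    For a fault-tolerant converter, the degraded states would model operation with one or more legs lost, each with its own (typically higher) failure rate.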

  17. Experimental typography: reviewing the modernist and the current approaches

    OpenAIRE

    Makal, Eray

    1993-01-01

    Ankara : The Department of Graphic Design and Institute of Fine Arts, Bilkent Univ., 1993. Thesis (Master's) -- Bilkent University, 1993. Includes bibliographical references (leaves 65-66). The intention of this study is to evaluate experimental typography within the history of graphic design by taking into consideration two epochs: the Modernist and the Current. Makal, Eray M.S.

  18. Disruptions in large value payment systems: an experimental approach

    NARCIS (Netherlands)

    Abbink, K.; Bosman, R.; Heijmans, R.; van Winden, F.

    2010-01-01

    This experimental study investigates the behaviour of banks in a large value payment system. More specifically, we look at 1) the reactions of banks to disruptions in the payment system, 2) the way in which the history of disruptions affects the behaviour of banks (path dependency) and 3) the effect

  19. Disruptions in large value payment systems: An experimental approach

    NARCIS (Netherlands)

    Abbink, K.; Bosman, R.; Heijmans, R.; van Winden, F.; Hellqvist, M.; Laine, T.

    2012-01-01

    This experimental study investigates the behaviour of banks in a large value payment system. More specifically, we look at 1) the reactions of banks to disruptions in the payment system, 2) the way in which the history of disruptions affects the behaviour of banks (path dependency) and 3) the effect

  20. In-Store Experimental Approach to Pricing and Consumer Behavior

    Science.gov (United States)

    Sigurdsson, Valdimar; Foxall, Gordon; Saevarsson, Hugi

    2010-01-01

    This study assessed how, and to what extent, it is possible to use behavioral experimentation and relative sales analysis to study the effects of price on consumers' brand choices in the store environment. An in-store experiment was performed in four stores to investigate the effects of different prices of a target brand on consumers' relative…

  1. Estimating reliability coefficients with heterogeneous item weightings using Stata: A factor based approach

    NARCIS (Netherlands)

    Boermans, M.A.; Kattenberg, M.A.C.

    2011-01-01

    We show how to estimate a Cronbach's alpha reliability coefficient in Stata after running a principal component or factor analysis. Alpha evaluates to what extent items measure the same underlying content when the items are combined into a scale or used for a latent variable. Stata allows for testing
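
    The record computes Cronbach's alpha in Stata; the coefficient itself is simple enough to sketch directly. A minimal Python sketch of the standard unweighted formula alpha = k/(k-1) * (1 - sum of item variances / variance of totals); the factor-based weighting discussed in the paper is not reproduced here:

```python
def cronbach_alpha(items):
    # items: list of item-score lists, one inner list per item,
    # aligned across respondents (same respondent order in every list).
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))
```
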

  2. Validity and reliability of three definitions of hip osteoarthritis: Cross sectional and longitudinal approach

    NARCIS (Netherlands)

    M. Reijman (Max); J.M.W. Hazes (Mieke); H.A.P. Pols (Huib); R.M.D. Bernsen (Roos); B.W. Koes (Bart); S.M. Bierma-Zeinstra (Sita)

    2004-01-01

    textabstractObjectives: To compare the reliability and validity in a large open population of three frequently used radiological definitions of hip osteoarthritis (OA): Kellgren and Lawrence grade, minimal joint space (MJS), and Croft grade; and to investigate whether the validity of the three

  3. A new approach for interexaminer reliability data analysis on dental caries calibration

    Directory of Open Access Journals (Sweden)

    Andréa Videira Assaf

    2007-12-01

    Objectives: a) to evaluate the interexaminer reliability in caries detection considering different diagnostic thresholds, and b) to indicate, by using Kappa statistics, the best way of measuring interexaminer agreement during the calibration process in dental caries surveys. Methods: Eleven dentists participated in the initial training, which was divided into theoretical discussions and practical activities, and in calibration exercises performed at baseline and at 3 and 6 months after the initial training. For the examinations of 6-7-year-old schoolchildren, the World Health Organization (WHO) recommendations were followed and different diagnostic thresholds were used: WHO (decayed/missing/filled teeth - DMFT index) and WHO + IL (initial lesion) diagnostic thresholds. The interexaminer reliability was calculated by Kappa statistics, according to the WHO and WHO+IL thresholds, considering: a) the entire dentition; b) upper/lower jaws; c) sextants; d) each tooth individually. Results: Interexaminer reliability was high for both diagnostic thresholds; nevertheless, it decreased in all calibration sessions when considering teeth individually. Conclusion: Interexaminer reliability was possible during the period of 6 months under both caries diagnosis thresholds. However, great disagreement was observed for posterior teeth, especially using the WHO+IL criteria. Analysis considering dental elements individually was the best way of detecting interexaminer disagreement during the calibration sessions.
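
    The agreement measure used here is Cohen's kappa, which corrects observed agreement for the agreement expected by chance. A minimal sketch for two raters over the same set of items; the multi-examiner, threshold-specific analyses in the study are more involved:

```python
def cohens_kappa(rater_a, rater_b):
    # Chance-corrected agreement between two raters on the same items.
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    cats = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)
```

    Kappa of 1 means perfect agreement; 0 means agreement no better than chance, which is why it is preferred over raw percent agreement in calibration exercises.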

  4. How Reliable Are Students' Evaluations of Teaching Quality? A Variance Components Approach

    Science.gov (United States)

    Feistauer, Daniela; Richter, Tobias

    2017-01-01

    The inter-rater reliability of university students' evaluations of teaching quality was examined with cross-classified multilevel models. Students (N = 480) evaluated lectures and seminars over three years with a standardised evaluation questionnaire, yielding 4224 data points. The total variance of these student evaluations was separated into the…

  5. Towards reliable multi-hop broadcast in VANETs : An analytical approach

    NARCIS (Netherlands)

    Gholibeigi, M.; Baratchi, M.; Berg, J.L. van den; Heijenk, G.

    2017-01-01

    Intelligent Transportation Systems in the domain of vehicular networking, have recently been subject to rapid development. In vehicular ad hoc networks, data broadcast is one of the main communication types and its reliability is crucial for high performance applications. However, due to the lack of

  6. Towards Reliable Multi-Hop Broadcast in VANETs: An Analytical Approach

    NARCIS (Netherlands)

    Gholibeigi, Mozhdeh; Baratchi, Mitra; van den Berg, Hans Leo; Heijenk, Geert

    2016-01-01

    Intelligent Transportation Systems in the domain of vehicular networking, have recently been subject to rapid development. In vehicular ad hoc networks, data broadcast is one of the main communication types and its reliability is crucial for high performance applications. However, due to the lack of

  7. A generic Approach for Reliability Predictions considering non-uniformly Deterioration Behaviour

    International Nuclear Information System (INIS)

    Krause, Jakob; Kabitzsch, Klaus

    2012-01-01

    Predictive maintenance offers the possibility to forecast the remaining time until a maintenance action on a machine has to be scheduled. Unfortunately, current predictive maintenance solutions are only suitable for very specific use cases, like reliability predictions based on vibration monitoring. Furthermore, they do not consider the fact that machines may deteriorate non-uniformly, depending on external influences (e.g., the workpiece material in a milling machine or the changing fruit acid concentration in a bottling plant). In this paper two concepts for a generic predictive maintenance solution which also considers non-uniform aging behaviour are introduced. The first concept is based on system models representing the health state of a technical system. As these models are usually static (i.e., without a time dimension), their coefficients are determined periodically and the resulting time series is used as an aging indicator. The second concept focuses on external influences (contexts) which change the behaviour of the previously mentioned aging indicators, in order to increase the accuracy of reliability predictions. For this purpose, context-dependent time series models are determined and used to predict machine reliability. Both concepts were evaluated on data from an air ventilation system. Thereby, it could be shown that they are suitable for determining aging indicators in a generic way and for incorporating external influences in the reliability prediction. Through this, the quality of reliability predictions can be significantly increased; in practice this leads to more accurate scheduling of maintenance actions. Furthermore, the generic character of the solutions makes the concepts suitable for a wide range of aging processes.
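
    The first concept, periodically refitting a static model and using the coefficient time series as an aging indicator, can be sketched with a windowed regression slope. The window size and signal are illustrative assumptions, not the air-ventilation data from the paper:

```python
def aging_indicator(signal, window):
    # Fit y = a*t + b on each consecutive non-overlapping window of
    # samples and return the per-window slope `a` as a time series:
    # the "aging indicator" tracked by the first concept.
    out = []
    for start in range(0, len(signal) - window + 1, window):
        ys = signal[start:start + window]
        ts = list(range(window))
        tm = sum(ts) / window
        ym = sum(ys) / window
        a = (sum((t - tm) * (y - ym) for t, y in zip(ts, ys))
             / sum((t - tm) ** 2 for t in ts))
        out.append(a)
    return out
```

    A steadily rising slope across windows would signal accelerating deterioration; the second concept would then condition a time-series model of these slopes on external context variables.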

  8. A cyber-physical approach to experimental fluid mechanics

    Science.gov (United States)

    Mackowski, Andrew Williams

    This Thesis documents the design, implementation, and use of a novel type of experimental apparatus, termed Cyber-Physical Fluid Dynamics (CPFD). Unlike traditional fluid mechanics experiments, CPFD is a general-purpose technique that allows one to impose arbitrary forces on an object submerged in a fluid. By combining fluid mechanics with robotics, we can perform experiments that would otherwise be incredibly difficult or time-consuming. More generally, CPFD allows a high degree of automation and control of the experimental process, allowing for much more efficient use of experimental facilities. Examples of CPFD's capabilities include imposing a gravitational force in the horizontal direction (allowing a test object to "fall" sideways in a water channel), simulating nonlinear springs for a vibrating fluid-structure system, or allowing a self-propelled body to move forward under its own force. Because experimental parameters (including forces and even the mass of the test object) are defined in software, one can define entire ensembles of experiments to run autonomously. CPFD additionally integrates related systems such as water channel speed control, LDV flow speed measurements, and PIV flowfield measurements. The end result is a general-purpose experimental system that opens the door to a vast array of fluid-structure interaction problems. We begin by describing the design and implementation of CPFD, the heart of which is a high-performance force-feedback control system. Precise measurement of time-varying forces (including removing effects of the test object's inertia) is more critical here than in typical robotic force-feedback applications. CPFD is based on an integration of ideas from control theory, fluid dynamics, computer science, electrical engineering, and solid mechanics. We also describe experiments using the CPFD experimental apparatus to study vortex-induced vibration (VIV) and oscillating-airfoil propulsion.
We show how CPFD can be used to simulate
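
    The core of a force-feedback loop like CPFD's can be sketched as an admittance scheme: the measured fluid force drives a simulated (virtual) mass-spring-damper, and the rig is commanded to track the simulated motion. This is a schematic of the general technique, not the thesis's actual controller; all parameters are illustrative assumptions:

```python
def cpfd_step(x, v, f_measured, m_virtual, c, k, dt):
    # One step of a virtual-dynamics (admittance) loop: the measured
    # fluid force f_measured drives a simulated mass-spring-damper
    # m*a = f - c*v - k*x, and the returned position is the command
    # the motion stage should track. Virtual mass, damping, and
    # stiffness are software parameters, as in CPFD.
    a = (f_measured - c * v - k * x) / m_virtual
    v_new = v + a * dt
    x_new = x + v_new * dt  # semi-implicit Euler for stability
    return x_new, v_new
```

    Setting k to a nonlinear function of x would give the nonlinear-spring experiments mentioned above; a constant horizontal f_measured offset would emulate sideways gravity.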

  9. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in the analysis of very…
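
    The systems-reliability side, Monte Carlo simulation of component states through a structure function, can be sketched as follows. The component set, reliabilities, and structure function are illustrative assumptions, not the program described in the record:

```python
import random

def mc_system_reliability(component_rel, structure, n=100_000, seed=1):
    # component_rel: {component: probability of working}.
    # structure: function mapping {component: bool working} to
    # True if the system as a whole works.
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        states = {c: rng.random() < p for c, p in component_rel.items()}
        ok += structure(states)
    return ok / n
```

    The same loop handles arbitrary system logic (series, parallel, k-out-of-n), which is why Monte Carlo is favoured for very complex systems where analytic enumeration becomes intractable.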

  10. Fairness Concerns and Corrupt Decisions: An Experimental Approach

    OpenAIRE

    Epp, Lena; Leszczynska, Nastassia

    2017-01-01

    This study investigates the impact of public officials' fairness considerations towards citizens in a petty corruption situation. Other-regarding preferences, and more particularly fairness concerns, are widely acknowledged as crucial elements of individual economic decision-making. In petty corruption contexts, public officials are to a large extent aware of differences between citizens. Here, we experimentally investigate how fairness considerations may impact corrupt behaviour. Our n...

  11. A novel experimental rat model of peripheral nerve scarring that reliably mimics post-surgical complications and recurring adhesions

    Directory of Open Access Journals (Sweden)

    Angela Lemke

    2017-08-01

    Inflammation, fibrosis and perineural adhesions with the surrounding tissue are common pathological processes following nerve injury and surgical interventions on peripheral nerves in human patients. These features can reoccur following external neurolysis, currently the most common surgical treatment for peripheral nerve scarring, thus leading to renewed nerve function impairment and chronic pain. To enable a successful evaluation of new therapeutic approaches, it is crucial to use a reproducible animal model that mimics the main clinical symptoms occurring in human patients. However, a clinically relevant model combining both histological and functional alterations has not been published to date. We therefore developed a reliable rat model that exhibits the essential pathological processes of peripheral nerve scarring. In our study, we present a novel method for the induction of nerve scarring by applying glutaraldehyde-containing glue that is known to cause nerve injury in humans. After a 3-week contact period with the sciatic nerve in female Sprague Dawley rats, we could demonstrate severe intra- and perineural scarring that resulted in grade 3 adhesions and major impairments in the electrophysiological peak amplitude compared with sham controls (P=0.0478). Immunohistochemical analysis of the nerve structure revealed vigorous nerve inflammation and recruitment of T cells and macrophages. Distinct nerve degeneration was also determined by immunostaining. These pathological alterations were further reflected in significant functional deficiencies, as determined by the analysis of relevant gait parameters as well as the quantification of the sciatic functional index starting at week 1 post-operation (P<0.01). Moreover, with this model we could, for the first time, demonstrate not only the primary formation, but also the recurrence, of severe adhesions 1 week after glue removal, imitating a major clinical challenge. As a comparison, we tested a

  12. Experimentation on accuracy of non functional requirement prioritization approaches for different complexity projects

    OpenAIRE

    Raj Kumar Chopra; Varun Gupta; Durg Singh Chauhan

    2016-01-01

    Non-functional requirements must be selected for implementation together with functional requirements to enhance the success of software projects. Three approaches exist for performing the prioritization of non-functional requirements using a suitable prioritization technique. This paper performs experimentation on three different-complexity versions of an industrial software project using the cost-value prioritization technique, employing the three approaches. Experimentation is conducted to analy...

  13. The REPAS approach to the evaluation of passive safety systems reliability

    International Nuclear Information System (INIS)

    Bianchi, F.; Burgazzi, L.; D'Auria, F.; Ricotti, M.E.

    2002-01-01

    The scope of this research, carried out by ENEA in collaboration with the University of Pisa and the Polytechnic of Milano since 1999, is the identification of a methodology allowing the evaluation of the reliability of passive systems as a whole, in a more physical and phenomenological way. The paper describes the study, named REPAS (Reliability Evaluation of Passive Safety systems), carried out by the partners and aimed at the development and validation of such a procedure. The strategy moves from the consideration that a passive system should in theory be more reliable than an active one. In fact it does not need any external input or energy to operate and it relies only upon natural physical laws (e.g. gravity, natural circulation, internally stored energy, etc.) and/or 'intelligent' use of the energy inherently available in the system (e.g. chemical reaction, decay heat, etc.). Nevertheless the passive system may fail its mission not only as a consequence of classical mechanical failure of components, but also through deviation from the expected behaviour, due to physical phenomena mainly related to thermal-hydraulics or to different boundary and initial conditions. The main sources of physical failure are identified and a probability of occurrence is assigned. The reliability analysis is performed on a passive system which operates in two-phase, natural circulation. The selected system is a loop including a heat source and a heat sink where condensation occurs. The system behaviour under different configurations has been simulated via a best-estimate code (Relap5 mod3.2). The results are shown and can be treated in such a way as to give qualitative and quantitative information on the system reliability. Main routes for development of the methodology are also depicted. The analysis of the results shows that the procedure is suitable for evaluating the performance of a passive system on a probabilistic/deterministic basis. Important information can also be

  14. From integrated control to integrated farming, an experimental approach

    NARCIS (Netherlands)

    Vereijken, P.H.

    1989-01-01

    Integrated control or integrated pest management (IPM), as envisaged originally, is not being practised to any large extent in arable farming, notwithstanding considerable research efforts. The reasons for this are discussed. A more basic approach called integrated farming is suggested. Preliminary

  15. Substation design improvement with a probabilistic reliability approach using the TOPASE program

    Energy Technology Data Exchange (ETDEWEB)

    Bulot, M.; Heroin, G.; Bergerot, J-L.; Le Du, M. [Electricite de France (France)

    1997-12-31

    TOPASE (the French acronym for Probabilistic Tools and Data Processing for the Analysis of Electric Systems), developed by Electricite de France (EDF) to perform reliability studies on transmission substations, was described. TOPASE serves the dual objective of assisting in the automation of HV substation studies and of enabling electrical systems experts who are not necessarily specialists in reliability studies to perform such studies. The program is capable of quantifying the occurrence rate of undesirable events and of identifying critical equipment and the main incident scenarios. The program can be used to improve an existing substation, to choose an HV structure during the design stage, or to choose a system of protective devices. Data collected during 1996 and 1997 will be analyzed to identify useful experiences and to validate the basic concepts of the program. 4 figs.

  16. A framework for the analysis of cognitive reliability in complex systems: a recovery centred approach

    International Nuclear Information System (INIS)

    Kontogiannis, Tom

    1997-01-01

    Managing complex industrial systems requires reliable performance of cognitive tasks undertaken by operating crews. The infrequent practice of cognitive skills and the reliance on operator performance in novel situations have raised cognitive reliability to an urgent and essential aspect of system design and risk analysis. The aim of this article is to contribute to the development of methods for the analysis of cognitive tasks in complex man-machine interactions. A practical framework is proposed for analysing cognitive errors and enhancing error recovery through interface design. Cognitive errors are viewed as failures in problem solving which are difficult to recover from under the task constraints imposed by complex systems. In this sense, the interaction between context and cognition, on the one hand, and the process of error recovery, on the other, become the focal points of the proposed framework, which is illustrated in an analysis of a simulated emergency

  17. Scenario based approach to structural damage detection and its value in a risk and reliability perspective

    DEFF Research Database (Denmark)

    Hovgaard, Mads Knude; Hansen, Jannick Balleby; Brincker, Rune

    2013-01-01

    A scenario- and vibration-based structural damage detection method is demonstrated through simulation. The method is Finite Element (FE) based. The value of the monitoring is calculated using structural reliability theory. A high cycle fatigue crack propagation model is assumed as the damage mechanism, and the probabilities of failure are estimated with and without monitoring. Monte Carlo Sampling (MCS) is used to estimate the probabilities, and the tower of an onshore NREL 5MW wind turbine is given as a calculation case.
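The Monte Carlo estimation of a failure probability mentioned in this abstract can be illustrated with a minimal sketch. The limit-state function, distributions and sample size below are invented toy values, not the paper's FE-based fatigue model:

```python
import random

def mc_failure_probability(limit_state, sample, n=200_000, seed=1):
    """Estimate P(failure) = P(g(X) <= 0) by plain Monte Carlo sampling."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if limit_state(sample(rng)) <= 0)
    return fails / n

# Toy limit state g = R - S: resistance R ~ N(5, 0.5), load effect S ~ N(3, 0.8).
def sample(rng):
    return rng.gauss(5.0, 0.5), rng.gauss(3.0, 0.8)

def g(x):
    r, s = x
    return r - s

pf = mc_failure_probability(g, sample)
print(f"estimated probability of failure: {pf:.4f}")
```

For this toy case the exact answer is the normal tail probability of R − S, so the estimate can be checked analytically; real applications replace `g` with an FE-based limit state.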

  18. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    OpenAIRE

    Ludfi Pratiwi Bowo; Wanginingastuti Mutmainnah; Masao Furusho

    2017-01-01

    Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors in accidents. Two generations of Human Reliability Assessment (HRA) have been developed; these methodologies are classified as first or second generation by their differing viewpoints on problem-solving. The accident analysis can be performed using three techniques of analysis: sequen...

  19. Reliability calculation of cracked components using probabilistic fracture mechanics and a Markovian approach

    International Nuclear Information System (INIS)

    Schmidt, T.

    1988-01-01

    The numerical reliability calculation of cracked structural components under cyclic fatigue stress can be done with the help of probabilistic fracture mechanics models. An alternative to the Monte Carlo simulation method is examined, based on describing the failure process as a Markov process. The Markov method is traced back directly to the stochastic parameters of a two-dimensional fracture mechanics model, with the effects of inspections and repairs also being considered. The probability of failure and the expected failure frequency can be determined as functions of time from the transition and conditional probabilities of the original or derived Markov process. For concrete calculation, an approximative Markov chain is designed which, under certain conditions, gives a sufficient approximation of the original Markov process and the reliability characteristics determined by it. The MARKOV program code developed from this algorithm shows sufficient agreement with the Monte Carlo reference results. The starting point of the investigation was the 'Deutsche Risikostudie B (DWR)' ('German Risk Study B (DWR)'), specifically the reliability of the main coolant line. (orig./HP) [de]
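The Markov-chain idea can be sketched as a small absorbing chain with an inspection/repair step. The crack states, per-cycle transition probabilities and detection probability below are illustrative assumptions, not the parameters of the paper's two-dimensional fracture mechanics model:

```python
def step(dist, P):
    """One Markov-chain transition: new distribution = dist * P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# States: 0 = small crack, 1 = large crack, 2 = failed (absorbing).
# Per-cycle transition probabilities are toy values.
P = [
    [0.995, 0.005, 0.000],
    [0.000, 0.990, 0.010],
    [0.000, 0.000, 1.000],
]

def failure_probability(cycles, inspect_every=1000, detect=0.9):
    """P(failed by 'cycles'); inspections repair detected large cracks."""
    dist = [1.0, 0.0, 0.0]
    for c in range(1, cycles + 1):
        dist = step(dist, P)
        if c % inspect_every == 0:
            repaired = dist[1] * detect  # detected large cracks return to state 0
            dist = [dist[0] + repaired, dist[1] - repaired, dist[2]]
    return dist[2]
```

The failure probability is the mass absorbed in state 2; running the function with and without inspections shows how repairs lower it, mirroring the role of inspections in the abstract.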

  20. Rating the raters in a mixed model: An approach to deciphering the rater reliability

    Science.gov (United States)

    Shang, Junfeng; Wang, Yougui

    2013-05-01

    Rating the raters has attracted extensive attention in recent years. Ratings are complex in that subjective assessment and a number of criteria are involved in a rating system. Whenever human judgment is part of a rating, the inconsistency of ratings is a source of variance in scores, so it is natural to want to verify the trustworthiness of ratings. Accordingly, estimation of the rater reliability is of great interest. To facilitate the evaluation of rater reliability in a rating system, we propose a mixed model in which the scores of the ratees offered by a rater are described by fixed effects determined by the ability of the ratees and random effects produced by the disagreement of the raters. In this mixed model, we derive the posterior distribution of the rater random effects for their prediction. To identify unreliable raters quantitatively, the predictive influence function (PIF) serves as a criterion comparing the posterior distributions of random effects between the full data and rater-deleted data sets. The benchmark for this criterion is also discussed. The proposed methodology for deciphering rater reliability is investigated in multiple simulated data sets and two real data sets.
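As an illustration of the idea of a rater random effect (not the paper's PIF criterion, which compares posterior distributions under rater deletion), each rater's effect can be sketched as a mean residual shrunk toward zero; the variance components `sigma2` and `tau2` are assumed known here and the scores are invented:

```python
from statistics import mean

def rater_effects(scores, sigma2=1.0, tau2=0.5):
    """Shrinkage estimate of each rater's random effect.

    scores[rater][ratee] = ratee ability (fixed) + rater effect (random) + noise.
    Ability is estimated by the across-rater mean per ratee; each rater's
    mean residual is shrunk by n / (n + sigma2/tau2), the normal-model
    posterior-mean factor for n ratees.
    """
    raters = list(scores)
    ratees = list(next(iter(scores.values())))
    ability = {j: mean(scores[r][j] for r in raters) for j in ratees}
    n = len(ratees)
    shrink = n / (n + sigma2 / tau2)
    return {r: shrink * mean(scores[r][j] - ability[j] for j in ratees)
            for r in raters}

# Toy data: rater C systematically scores higher than A and B.
scores = {"A": {1: 5, 2: 3}, "B": {1: 5, 2: 3}, "C": {1: 8, 2: 6}}
effects = rater_effects(scores)
print(effects)
```

A rater whose estimated effect is far from zero is a candidate "unreliable rater" in the sense the abstract describes.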

  1. Validity and reliability of three definitions of hip osteoarthritis: cross sectional and longitudinal approach.

    Science.gov (United States)

    Reijman, M; Hazes, J M W; Pols, H A P; Bernsen, R M D; Koes, B W; Bierma-Zeinstra, S M A

    2004-11-01

    To compare the reliability and validity in a large open population of three frequently used radiological definitions of hip osteoarthritis (OA): Kellgren and Lawrence grade, minimal joint space (MJS), and Croft grade; and to investigate whether the validity of the three definitions of hip OA is sex dependent. Participants from the Rotterdam study (aged ≥55 years, n = 3585) were evaluated. The inter-rater reliability was tested in a random set of 148 x rays. The validity was expressed as the ability to identify patients who show clinical symptoms of hip OA (construct validity) and as the ability to predict total hip replacement (THR) at follow up (predictive validity). Inter-rater reliability was similar for the Kellgren and Lawrence grade and MJS (kappa statistics 0.68 and 0.62, respectively) but lower for Croft's grade (kappa statistic 0.51). The Kellgren and Lawrence grade and MJS showed the strongest associations with clinical symptoms of hip OA. Sex appeared to be an effect modifier for the Kellgren and Lawrence and MJS definitions, women showing a stronger association between grading and symptoms than men. However, the sex dependency was attributable to differences in height between women and men. The Kellgren and Lawrence grade showed the highest predictive value for THR at follow up. Based on these findings, Kellgren and Lawrence still appears to be a useful OA definition for epidemiological studies focusing on the presence of hip OA.
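Inter-rater kappa statistics of the kind quoted above (0.68, 0.62, 0.51) are observed agreement corrected for chance agreement. A minimal Cohen's kappa sketch with made-up gradings (not the study's data):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters grading the same items."""
    n = len(r1)
    cats = set(r1) | set(r2)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n                    # observed agreement
    p_e = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)   # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical OA grades (0-3) assigned by two readers to 8 x-rays.
reader1 = [0, 0, 1, 1, 2, 2, 3, 3]
reader2 = [0, 0, 1, 2, 2, 2, 3, 1]
print(cohens_kappa(reader1, reader2))
```

Kappa is 1.0 for perfect agreement and 0 when agreement is no better than chance, which is why it is preferred over raw percent agreement for grading scales.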

  2. A probabilistic approach to safety/reliability of space nuclear power systems

    International Nuclear Information System (INIS)

    Medford, G.; Williams, K.; Kolaczkowski, A.

    1989-01-01

    An ongoing effort is investigating the feasibility of using probabilistic risk assessment (PRA) modeling techniques to construct a living model of a space nuclear power system. This is being done in conjunction with a traditional reliability and survivability analysis of the SP-100 space nuclear power system. The initial phase of the project consists of three major parts with the overall goal of developing a top-level system model and defining initiating events of interest for the SP-100 system. The three major tasks were performing a traditional survivability analysis, performing a simple system reliability analysis, and constructing a top-level system fault-tree model. Each of these tasks and their interim results are discussed in this paper. Initial results from the study support the conclusion that PRA modeling techniques can provide a valuable design and decision-making tool for space reactors. The ability of the model to rank and calculate relative contributions from various failure modes allows design optimization for maximum safety and reliability. Future efforts in the SP-100 program will see data development and quantification of the model to allow parametric evaluations of the SP-100 system. Current efforts have shown the need for formal data development and test programs within such a modeling framework

  3. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology context. In the first section, the fundamental terms are defined according to the international standards. Some theoretical concepts and reliability models are then presented in Chapters 2 and 3, with the aim of evaluating performance for components and systems and reliability growth. Chapter 4, by introducing laboratory tests, highlights the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  4. Are Children the Better Placebo Analgesia Responders? An Experimental Approach.

    Science.gov (United States)

    Wrobel, Nathalie; Fadai, Tahmine; Sprenger, Christian; Hebebrand, Johannes; Wiech, Katja; Bingel, Ulrike

    2015-10-01

    There is little information regarding changes in placebo responsiveness with age, although first predictors of placebo responders such as psychological and physiological processes have been identified. Reviews and meta-analyses indicate that placebo response rates in randomized controlled trials (RCTs) are higher in children and adolescents compared with adults. As these studies cannot control for age-dependent differences in the natural course of the disease, biases might contribute to different placebo rates in RCTs. To avoid these biases, this study investigated age-related differences in placebo responsiveness between children and adults in a well-established experimental model of placebo analgesia combining classic conditioning and expectation. Our data confirm placebo analgesic responses in children, which did not differ in magnitude from those of adults. The influence of previous experience on subsequent treatment outcome was stronger in children than in adults, indicating an increased relevance of learning processes for treatment outcomes in children. Further studies are needed to understand the influence of treatment-related learning processes in children and adolescents, which might critically determine treatment responsiveness during adulthood. This study is the first to experimentally explore placebo analgesia and influences of previous experience on placebo responses in children compared with adults. We found comparable placebo responses in both groups and an increased relevance of learning processes for treatment outcomes in children. Copyright © 2015 American Pain Society. Published by Elsevier Inc. All rights reserved.

  5. Methodological Approaches to Experimental Teaching of Mathematics to University Students

    Directory of Open Access Journals (Sweden)

    Nikolay I.

    2018-03-01

    Introduction: the article imparts the authors' thoughts on a new teaching methodology for mathematical education in universities. The aim of the study is to substantiate the efficiency of the comprehensive use of electronic mathematics courses, computer tests, original textbooks and methodologies when teaching mathematics to future agrarian engineers. The authors consider this implementation a unified educational process. Materials and Methods: the synthesis of international and domestic pedagogical experience of teaching university students and the following methods of empirical research were used: pedagogical experiment, pedagogical measurements and experimental teaching of mathematics. The authors applied the methodology of revealing interdisciplinary links on the continuum of mathematical problems using key examples and exercises. Results: the online course “Mathematics” was designed and developed on the Learning Management System Moodle platform. The article presents the results of test assignments assessing students’ intellectual abilities and an analysis of students’ solutions of various types of mathematical problems. The pedagogical experiment substantiated the integrated selection of textbooks, online course and online tests using the methodology of determining the key examples and exercises. Discussion and Conclusions: the analysis of the experimental work suggests that the new methodology can have a positive effect on the learning process. The learning programme identified the problem points for each student. The findings of this study have a number of important implications for future educational practice.

  6. A New Two-Step Approach for Hands-On Teaching of Gene Technology: Effects on Students' Activities During Experimentation in an Outreach Gene Technology Lab

    Science.gov (United States)

    Scharfenberg, Franz-Josef; Bogner, Franz X.

    2011-08-01

    Emphasis on improving higher level biology education continues. A new two-step approach to the experimental phases within an outreach gene technology lab, derived from cognitive load theory, is presented. We compared our approach with the conventional one-step mode using a quasi-experimental design. The difference consisted of additional focused discussions combined with students writing down their ideas (step one) prior to starting any experimental procedure (step two). We monitored students' activities during the experimental phases by continuously videotaping 20 work groups within each approach (N = 131). Subsequent classification of students' activities yielded 10 categories (with well-fitting intra- and inter-observer reliability scores). Based on the students' individual time budgets, we evaluated students' roles during experimentation from their prevalent activities (by independently using two cluster analysis methods). Independently of the approach, two common clusters emerged, which we labeled 'all-rounders' and 'passive students', and two clusters specific to each approach: 'observers' and 'high-experimenters' were identified only within the one-step approach, whereas under the two-step conditions 'managers' and 'scribes' were identified. Potential changes in group-leadership style during experimentation are discussed, and conclusions for optimizing science teaching are drawn.

  7. Unilateral robotic hybrid mini-maze: a novel experimental approach.

    Science.gov (United States)

    Moslemi, Mohammad; Rawashdeh, Badi; Meyer, Mark; Nguyen, Duy; Poston, Robert; Gharagozloo, Farid

    2016-03-01

    A complete Cox maze IV procedure is difficult to accomplish using current endoscopic and minimally invasive techniques, which are hampered by an inability to adequately dissect the posterior structures of the heart and place all necessary lesions. We present a novel approach, using robotic technology, that achieves placement of all the lesions of the complete maze procedure. In three cadaveric human models, the technical feasibility of using robotic instruments through the right chest to dissect the posterior structures of the heart and place all Cox maze lesions was assessed. The entire posterior aspect of the heart was dissected in the cadaveric model, facilitating successful placement of all Cox maze IV lesions with robotic assistance through minimally invasive incisions. The robotic Cox maze IV procedure through the novel right thoracic approach is feasible. It obviates the need for sternotomy and avoids the associated morbidity of the conventional Cox maze procedure. Copyright © 2015 John Wiley & Sons, Ltd.

  8. New approaches for the reliable in vitro assessment of binding affinity based on high-resolution real-time data acquisition of radioligand-receptor binding kinetics.

    Science.gov (United States)

    Zeilinger, Markus; Pichler, Florian; Nics, Lukas; Wadsak, Wolfgang; Spreitzer, Helmut; Hacker, Marcus; Mitterhauser, Markus

    2017-12-01

    Resolving the kinetic mechanisms of biomolecular interactions has become increasingly important in early-phase drug development. Since traditional in vitro methods are dose-dependent assessments, binding kinetics is usually overlooked. The present study aimed at the establishment of two novel experimental approaches for the assessment of the binding affinity of both radiolabelled and non-labelled compounds targeting the A3R, based on high-resolution real-time data acquisition of radioligand-receptor binding kinetics. A novel time-resolved competition assay was developed and applied to determine the Ki of eight different A3R antagonists, using CHO-K1 cells stably expressing the hA3R. In addition, a new kinetic real-time cell-binding approach was established to quantify the rate constants kon and koff, as well as the dedicated Kd, of the A3R agonist [125I]-AB-MECA. Furthermore, lipophilicity measurements were conducted to control for influences due to the physicochemical properties of the compounds used. Two novel real-time cell-binding approaches were successfully developed and established. Both experimental procedures were found to visualize the kinetic binding characteristics with high spatial and temporal resolution, resulting in reliable affinity values which are in good agreement with values previously reported with traditional methods. Taking into account the lipophilicity of the A3R antagonists, no influence on the experimental performance or the resulting affinity was observed. Both kinetic binding approaches comprise tracer administration and subsequent binding to living cells expressing the dedicated target protein. The experiments therefore better resemble true in vivo physiological conditions and provide important markers of cellular feedback and biological response.

  9. Experimental Approach of Fault Movement on an Engineered Barrier System

    International Nuclear Information System (INIS)

    Lee, Minsoo; Choi, Heuijoo; Kim, Heuna

    2012-01-01

    The safety of an engineered barrier system (EBS) against fault movement in an underground disposal region for high-level waste (HLW) was evaluated using a miniature bore-shear apparatus. For this purpose, a miniature bore-shear apparatus simulating an EBS was manufactured at 1/30 scale. Using the developed apparatus, bore-shear tests were performed twice. During the tests, pressure variations were monitored at six points around the buffer zone, together with the rotational angle of the test vessel. The pressure data obtained were compared with those from analytical modelling based on the Drucker-Prager model. At the initial shearing step, high pressure was recorded at some points but decreased rapidly. For a better understanding of fault movement, modification of the analytical model and the accumulation of experimental experience are required

  10. Experimental Approach of Fault Movement on an Engineered Barrier System

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Minsoo; Choi, Heuijoo; Kim, Heuna [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-03-15

    The safety of an engineered barrier system (EBS) against fault movement in an underground disposal region for high-level waste (HLW) was evaluated using a miniature bore-shear apparatus. For this purpose, a miniature bore-shear apparatus simulating an EBS was manufactured at 1/30 scale. Using the developed apparatus, bore-shear tests were performed twice. During the tests, pressure variations were monitored at six points around the buffer zone, together with the rotational angle of the test vessel. The pressure data obtained were compared with those from analytical modelling based on the Drucker-Prager model. At the initial shearing step, high pressure was recorded at some points but decreased rapidly. For a better understanding of fault movement, modification of the analytical model and the accumulation of experimental experience are required.

  11. Flowability of granular materials with industrial applications - An experimental approach

    Science.gov (United States)

    Torres-Serra, Joel; Romero, Enrique; Rodríguez-Ferran, Antonio; Caba, Joan; Arderiu, Xavier; Padullés, Josep-Manel; González, Juanjo

    2017-06-01

    Designing bulk material handling equipment requires a thorough understanding of the mechanical behaviour of powders and grains. An experimental characterization of granular materials is introduced, focusing on flowability. A new prototype is presented which performs granular column collapse tests. The device consists of a channel whose design allows test inspection using visualization techniques and load measurements. A reservoir is attached in which the packing state of the granular material can be adjusted before release, to simulate actual handling conditions by fluidisation and deaeration of the pile. Bulk materials on the market, with a wide range of particle sizes, can be tested with the prototype and the results used for classification in terms of flowability, improving industrial equipment selection processes.

  12. Experimental approaches to predict allergenic potential of novel food

    DEFF Research Database (Denmark)

    Madsen, Charlotte Bernhard; Kroghsbo, Stine; Bøgh, Katrine Lindholm

    2013-01-01

    There are many unanswered questions relating to food allergy sensitization in humans. We don't know under what circumstances sensitization takes place, i.e. route (oral, dermal, respiratory), age, dose, frequency of exposure, infection or bystander effect of other allergens. In addition, we don't know under what circumstances oral tolerance develops. With all these unanswered questions, it is a big challenge to design an animal model that, with relatively few animals, is able to predict if a food protein is a potential allergen. An even larger challenge is to predict its potency, a prerequisite for risk evaluation. Attempts have been made to rank proteins according to their allergenic potency based on the magnitude of the IgE response in experimental animals. This ranking has not included abundance as a parameter. We may be able to predict potential allergenicity, i.e. hazard, but our lack...

  13. An Indication of Reliability of the Two-Level Approach of the AWIN Welfare Assessment Protocol for Horses

    Directory of Open Access Journals (Sweden)

    Irena Czycholl

    2018-01-01

    To enhance feasibility, the Animal Welfare Indicators (AWIN) assessment protocol for horses consists of two levels: the first is a visual inspection of a sample of horses performed from a distance, the second a close-up inspection of all horses. The aim was to analyse whether information would be lost if only the first level were performed. In this study, 112 first-level and 112 second-level assessments, carried out on subsequent days by one observer, were compared by calculating Spearman's Rank Correlation Coefficients (RS), Intraclass Correlation Coefficients (ICC), Smallest Detectable Changes (SDC) and Limits of Agreement (LoA). Most indicators demonstrated sufficient reliability between the two levels. Exceptions were the Horse Grimace Scale, the Avoidance Distance Test and the Voluntary Human Approach Test (e.g., Voluntary Human Approach Test: RS: 0.38, ICC: 0.38, SDC: 0.21, LoA: −0.25–0.17), which could, however, also be interpreted as a lack of test-retest reliability. Further disagreement was found for the indicator consistency of manure (RS: 0.31, ICC: 0.38, SDC: 0.36, LoA: −0.38–0.36). For these indicators, an adaptation of the first level would be beneficial. Overall, in this study, the division into two levels was reliable and might therefore have the potential to enhance feasibility in other welfare assessment schemes.
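Two of the agreement statistics used above, Spearman's RS and Bland-Altman limits of agreement, can be sketched for paired score lists. Tie handling in the ranking is omitted for brevity, and the data are invented:

```python
from statistics import mean, stdev

def limits_of_agreement(x, y):
    """Bland-Altman 95% limits of agreement for paired measurements."""
    d = [a - b for a, b in zip(x, y)]
    bias, sd = mean(d), stdev(d)
    return bias - 1.96 * sd, bias + 1.96 * sd

def spearman_rho(x, y):
    """Spearman rank correlation (no tie correction, for brevity)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1.0
        return r
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical paired scores from a first- and second-level assessment.
level1 = [1, 2, 3, 4, 5]
level2 = [1.1, 1.9, 3.2, 3.8, 5.1]
rho = spearman_rho(level1, level2)
lo, hi = limits_of_agreement(level1, level2)
print(rho, lo, hi)
```

Narrow limits of agreement around zero and a high RS are what "sufficient reliability between the two levels" means in the abstract.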

  14. The possibilities of applying a risk-oriented approach to the NPP reliability and safety enhancement problem

    Science.gov (United States)

    Komarov, Yu. A.

    2014-10-01

    An analysis and some generalizations of approaches to risk assessment are presented. The interconnection between different interpretations of the notion of "risk" is shown, and the possibility of applying fuzzy set theory to risk assessment is demonstrated. A generalized formulation of the risk assessment notion is proposed for applying risk-oriented approaches to the problem of enhancing reliability and safety in nuclear power engineering. The solution of problems using the developed risk-oriented approaches, aimed at achieving more reliable and safe operation of NPPs, is described. The results of studies aimed at determining the need (advisability) to modernize or replace NPP elements and systems are presented, together with the results obtained from elaborating the methodical principles of introducing a repair concept based on the technical state of the equipment. The possibility of reducing the scope of tests and altering the NPP systems maintenance strategy is substantiated using the risk-oriented approach. A probabilistic model for estimating the validity of boric acid concentration measurements is developed.

  15. Pipeline integrity model-a formative approach towards reliability and life assessment

    International Nuclear Information System (INIS)

    Sayed, A.M.; Jaffery, M.A.

    2005-01-01

    Pipe forms an integral part of the transmission medium in the oil and gas industry, in both the upstream and downstream segments of this global energy business. With the aging of this asset base, its operational aspects have come under close consideration from operators and regulators. Moreover, the information age and growth in global trade have lifted barriers and pushed towards better utilization of resources, making optimized solutions a priority for business and technical managers worldwide. There is a paradigm shift from the mere development of 'smart materials' to 'low life-cycle-cost materials'. The force inducing this change is a rational one: the recovery of development costs is no longer a problem in a global community; rather, it is the pay-off time which matters most to end users of materials. Decision makers thus evaluate not just the price offered but the entire life-cycle cost of a product. The integrity of pipes is affected by factors such as corrosion, fatigue crack growth, stress-corrosion cracking, and mechanical damage. Extensive research in the area of reliability and life assessment has been carried out, and a number of models concerned with the reliability of pipes have been developed and are being used by pipeline operators worldwide. Yet it must be emphasised that there is no substitute for sound engineering judgment and allowance for factors of safety. The ability of a laid pipe network to transport the intended fluid under pre-defined conditions for the entire envisaged project life is referred to as the reliability of the system. Reliability is built into the product through extensive benchmarking against industry-standard codes. The construction of pipes for oil and gas service is regulated through the American Petroleum Institute's Specification for Line Pipe. Subsequently, specific programs have been

  16. Fault-tolerant design approach for reliable offshore multi-megawatt variable frequency converters

    Directory of Open Access Journals (Sweden)

    N. Vedachalam

    2016-09-01

    Inverters play a key role in realizing reliable multi-megawatt power electronic converters used in offshore applications, as their failure leads to production losses and impairs safety. The performance of high-power-handling semiconductor devices with high-speed control capabilities and redundant configurations helps in realizing a fault-tolerant design. This paper describes the reliability modeling done for an industry-standard 3-level neutral point clamped multi-megawatt inverter and the significance of semiconductor redundancy in reducing inverter failure rates, and proposes methods for achieving static and dynamic redundancy in series-connected press-pack type insulated gate bipolar transistors (IGBTs). It is identified that, with the multi-megawatt inverter having 3+2 IGBTs in each half leg and dynamic redundancy incorporated, it is possible to reduce the failure rate of the inverter from 53.8% to 15% over 5 years of continuous operation. The simulation results indicate that with dynamic redundancy it is possible to force an untriggered press-pack IGBT to short circuit in less than 1 s when operated with a pulse width modulation frequency of 1 kHz.
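The benefit of a 3+2 arrangement (only 3 of the 5 series devices need to work) can be illustrated with a k-out-of-n survival calculation under the common textbook assumption of independent, exponentially distributed device lifetimes. The hazard rate below is an invented placeholder, not a figure from the paper:

```python
from math import comb, exp

def k_of_n_reliability(n, k, lam, t):
    """P(at least k of n identical devices survive to time t);
    each device has an exponential lifetime with hazard rate lam (per year)."""
    p = exp(-lam * t)  # single-device survival probability
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

lam = 0.02  # assumed failures per device-year (placeholder)
t = 5.0     # 5 years of continuous operation
r_redundant = k_of_n_reliability(5, 3, lam, t)  # 3-of-5: two spare devices
r_minimal = k_of_n_reliability(3, 3, lam, t)    # 3-of-3: no redundancy
print(r_redundant, r_minimal)
```

Even with this modest per-device hazard rate the two spare devices raise the 5-year survival probability substantially, which is the qualitative effect the abstract quantifies for the actual inverter.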

  17. Placebo and Nocebo Effects in Sexual Medicine: An Experimental Approach.

    Science.gov (United States)

    Kruger, Tillmann H C; Grob, Carolin; de Boer, Claas; Peschel, Thomas; Hartmann, Uwe; Tenbergen, Gilian; Schedlowski, Manfred

    2016-11-16

    Few studies have investigated placebo and nocebo effects in a human sexuality context. Studying placebo and nocebo responses in this context may provide insight into their potential to modulate sexual drive and function. To examine such effects in sexual medicine, 48 healthy, male heterosexual participants were divided into four groups. Each group received instruction to expect stimulating effects, no effect, or an inhibitory effect on sexual functions. Only one group received the dopamine agonist cabergoline; all other groups received placebo or nocebo. Modulations in sexual experience were examined through an established experimental paradigm of sexual arousal and masturbation-induced orgasm during erotic film sequences with instruction to induce placebo or nocebo effects. Endocrine data, appetitive, consummatory, and refractory sexual behavior parameters were assessed using the Arizona Sexual Experience Scale (ASEX) and the Acute Sexual Experience Scale (ASES). Results showed increased levels of sexual function after administration of cabergoline with significant effects for several parameters. Placebo effects were induced only to a small degree. No negative effects on sexual parameters in the nocebo condition were noted. This paradigm could induce only small placebo and nocebo effects. This supports the view that healthy male sexual function seems relatively resistant to negative external influences.

  18. Charge Transport in LDPE Nanocomposites Part I—Experimental Approach

    Directory of Open Access Journals (Sweden)

    Anh T. Hoang

    2016-03-01

    This work presents results of bulk conductivity and surface potential decay measurements on low-density polyethylene and its nanocomposites filled with uncoated MgO and Al2O3, with the aim of highlighting the effect of the nanofillers on charge transport processes. Material samples at various filler contents, up to 9 wt %, were prepared in the form of thin films. The performed measurements show a significant impact of the nanofillers in reducing the material's direct current (dc) conductivity. The investigations thus focused on the nanocomposites having the lowest dc conductivity. Various mechanisms of charge generation and transport in solids, including space-charge-limited current, the Poole-Frenkel effect and Schottky injection, were utilized in examining the experimental results. The mobilities of charge carriers were deduced from the measured surface potential decay characteristics and were found to be at least two times lower for the nanocomposites. The temperature dependencies of the mobilities were compared for the different materials.

  19. Thermochemical study of cyanopyrazines: Experimental and theoretical approaches

    International Nuclear Information System (INIS)

    Miranda, Margarida S.; Morais, Victor M.F.; Matos, M. Agostinha R.

    2006-01-01

    The standard (p° = 0.1 MPa) molar energy of combustion, at T = 298.15 K, of crystalline 2,3-dicyanopyrazine was measured by static bomb calorimetry, in oxygen atmosphere. The standard molar enthalpy of sublimation, at T = 298.15 K, was obtained by Calvet microcalorimetry, allowing the calculation of the standard molar enthalpy of formation of the compound, in the gas phase, at T = 298.15 K: Δ_f H°_m(g) = (518.7 ± 3.4) kJ·mol⁻¹. In addition, the geometries of all cyanopyrazines were obtained using density functional theory with the B3LYP functional and two basis sets: 6-31G* and 6-311G**. These calculations were then used for a better understanding of the relation between structure and energetics of the cyanopyrazine systems. They also reproduce the measured standard molar enthalpies of formation with some accuracy and provide estimates of this thermochemical parameter for the compounds that could not be studied experimentally, namely the tri- and tetracyanopyrazines: the strongly electron-withdrawing cyano groups on the pyrazine ring make the cyanopyrazines highly destabilized compounds.
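The chain of quantities behind the reported gas-phase value can be sketched symbolically. This is the standard thermochemical route (a hedged sketch, with Δn_g the change in moles of gas upon combustion), not the paper's own notation:

```latex
% From bomb-calorimetric combustion energy to the gas-phase enthalpy of formation:
\begin{align*}
\Delta_{\mathrm{c}}H_{\mathrm{m}}^{\circ} &= \Delta_{\mathrm{c}}U_{\mathrm{m}}^{\circ} + \Delta n_{\mathrm{g}} R T
  && \text{(bomb calorimetry, ideal-gas correction)} \\
\Delta_{\mathrm{f}}H_{\mathrm{m}}^{\circ}(\mathrm{cr}) &=
  \textstyle\sum \nu\, \Delta_{\mathrm{f}}H_{\mathrm{m}}^{\circ}(\text{combustion products})
  - \Delta_{\mathrm{c}}H_{\mathrm{m}}^{\circ}
  && \text{(Hess's law)} \\
\Delta_{\mathrm{f}}H_{\mathrm{m}}^{\circ}(\mathrm{g}) &=
  \Delta_{\mathrm{f}}H_{\mathrm{m}}^{\circ}(\mathrm{cr})
  + \Delta_{\mathrm{cr}}^{\mathrm{g}}H_{\mathrm{m}}^{\circ}
  && \text{(Calvet sublimation enthalpy)}
\end{align*}
```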

  20. Functional complexity and ecosystem stability: an experimental approach

    Energy Technology Data Exchange (ETDEWEB)

    Van Voris, P.; O' Neill, R.V.; Shugart, H.H.; Emanuel, W.R.

    1978-01-01

    The complexity-stability hypothesis was experimentally tested using intact terrestrial microcosms. Functional complexity was defined as the number and significance of component interactions (i.e., population interactions, physical-chemical reactions, biological turnover rates) influenced by nonlinearities, feedbacks, and time delays. It was postulated that functional complexity could be nondestructively measured through analysis of a signal generated by the system. Power spectra of hourly CO₂ efflux from eleven old-field microcosms were analyzed for the number of low-frequency peaks and used to rank the functional complexity of each system. Ranking of ecosystem stability was based on the capacity of the system to retain essential nutrients and was measured by the net loss of Ca after the system was stressed. Rank correlation supported the hypothesis that increasing ecosystem functional complexity leads to increasing ecosystem stability. The results indicated that complex functional dynamics can serve to stabilize the system, and demonstrated that microcosms are useful tools for system-level investigations.
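The spectral ranking step can be sketched as follows: estimate the power spectrum of an hourly CO₂-efflux series and count its low-frequency peaks. The synthetic series below (a diurnal cycle plus a slower multi-day oscillation plus noise) stands in for microcosm data; the band limit and peak threshold are illustrative choices, not the study's.

```python
# Count low-frequency peaks in the power spectrum of an hourly CO2 series.
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24 * 60)                      # 60 days of hourly samples
co2 = (np.sin(2 * np.pi * hours / 24)           # diurnal cycle
       + 0.5 * np.sin(2 * np.pi * hours / 240)  # slower 10-day oscillation
       + 0.2 * rng.standard_normal(hours.size)) # measurement noise

power = np.abs(np.fft.rfft(co2 - co2.mean())) ** 2
freq = np.fft.rfftfreq(hours.size, d=1.0)       # cycles per hour

# Count local maxima below the diurnal frequency that rise well above the
# noise floor (threshold is an illustrative heuristic).
low = freq < 1 / 24
band = power[low]
threshold = 20 * np.median(power)
peaks = [i for i in range(1, band.size - 1)
         if band[i] > band[i - 1] and band[i] > band[i + 1] and band[i] > threshold]
print(f"low-frequency peaks: {len(peaks)}")
```

Here the slow 10-day oscillation shows up as the single sub-diurnal peak; systems whose dynamics contain more such slow modes would yield more low-frequency peaks and hence a higher complexity rank.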

  1. Organophosphorous pesticides research in Mexico: epidemiological and experimental approaches.

    Science.gov (United States)

    Sánchez-Guerra, M; Pérez-Herrera, N; Quintanilla-Vega, B

    2011-11-01

    Non-persistent pesticides such as organophosphorous (OP) insecticides have been extensively used in Mexico and have become a public health problem. This review presents data on OP use and related toxicity from epidemiological and experimental studies conducted in Mexico. Studies in agricultural workers from several regions of the country reported moderate to severe cholinergic symptoms, including decreased acetylcholinesterase (AChE) activity (the main acute OP toxic effect, which causes an over-accumulation of the neurotransmitter acetylcholine), revealing the potential risk of intoxication of Mexican farmers. OP exposure in occupational settings has been associated with decreased semen quality, sperm DNA damage and endocrine disruption, particularly in agricultural workers. Alterations in female reproductive function have also been observed, as well as adverse effects on embryo development from prenatal exposure in agricultural communities. This illustrates that OP exposure represents a risk for reproduction and offspring well-being in Mexico. The genotoxic effects of this group of pesticides in somatic and sperm cells are also documented. Lastly, we present data on gene-environment interactions regarding OP-metabolizing enzymes, such as paraoxonase-1 (PON1), and their role in modulating OP toxicity, particularly on semen quality and sperm DNA integrity. In summary, readers will see the important health problems associated with OP exposure in Mexican populations, and thus the need for training programs that teach farmers the proper handling of agrochemicals to prevent toxic effects, as well as for better-designed human studies to document the current situation of workers and communities dedicated to agricultural activities.

  2. Experimental Economics on Firm’s Behavior: Entry Game Approach

    Directory of Open Access Journals (Sweden)

    I Wayan Sukadana

    2015-11-01

    Full Text Available This paper analyzes subjects' behavior in an evolutionary entry game. The experiment was designed to analyze subject behavior in sequential entry games conducted under asymmetric information, uncertainty, payoff perturbation and random matching. The subjects were students of Universitas Udayana, Bali, Indonesia. Subjects playing as new-entrant firms tended to choose the "stay out" strategy when uncertainty and the size of potential losses increased. Meanwhile, subjects playing as the incumbent firm, which was set up to have more information about the game (market) than the new entrants, most of the time abused their position by choosing a "threat" strategy, even when that threat was not credible. The experiment shows that new entrants tend to weigh losses more heavily as risk increases (from a risk-averse setting to a risk-seeking setting), and tend to choose a sure value over a lottery even when the lottery's expected value is equal or higher. These findings support the view that Indonesian youngsters tend to choose jobs as civil servants (PNS) or as employees of existing firms. The results also suggest that Indonesian businessmen are more willing to open a new business if they have a guarantee against their losses.

  3. Identification of energy storage rate components. Theoretical and experimental approach

    International Nuclear Information System (INIS)

    Oliferuk, W; Maj, M

    2010-01-01

    The subject of the present paper is the decomposition of the energy storage rate into terms related to different modes of deformation. The stored energy is the change in internal energy due to plastic deformation, measured after specimen unloading; this energy describes the state of the cold-worked material. The ratio of the stored energy increment to the corresponding increment of plastic work, called the energy storage rate, is the measure of the energy conversion process. Experimental results show that the energy storage rate depends on plastic strain, and that this dependence is influenced by different microscopic deformation mechanisms. It has been shown that the energy storage rate can be presented as a sum of components, each related to a separate internal microscopic mechanism. Two of the components are identified. One is the storage rate of statistically stored dislocation energy, related to uniform deformation. The other is connected with non-uniform deformation at the grain level: the storage rate of long-range stress energy and geometrically necessary dislocation energy. The maximum of the energy storage rate that appears at the initial stage of plastic deformation is discussed in terms of internal micro-stresses.

  4. Electrochemistry of moexipril: experimental and computational approach and voltammetric determination.

    Science.gov (United States)

    Taşdemir, Hüdai I; Kiliç, E

    2014-09-01

    The electrochemistry of moexipril (MOE) was studied by electrochemical methods, with theoretical calculations performed at the B3LYP/6-31+G(d)//AM1 level. Cyclic voltammetric studies were based on a reversible, adsorption-controlled reduction peak at -1.35 V on a hanging mercury drop electrode (HMDE); an irreversible, diffusion-controlled oxidation peak at 1.15 V on a glassy carbon electrode (GCE) was also employed. Potentials are given versus Ag/AgCl (3.0 M KCl), and measurements were performed in Britton-Robinson buffer of pH 5.5. Tentative electrode mechanisms were proposed from the experimental results and ab initio calculations. Square-wave adsorptive stripping voltammetric methods were developed and validated for quantification of MOE in pharmaceutical preparations. The linear working range was established as 0.03-1.35 µM for the HMDE and 0.2-20.0 µM for the GCE. The limit of quantification (LOQ) was calculated to be 0.032 and 0.47 µM for the HMDE and GCE, respectively. The methods were successfully applied to assay the drug in tablets by calibration and standard addition, with good recoveries between 97.1% and 106.2% and relative standard deviations below 10%.
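An LOQ of the kind reported above is typically derived from the calibration slope and the scatter of blank responses. A minimal sketch, assuming the common ICH-style estimate LOQ = 10·s/m (s: standard deviation of blank responses, m: calibration slope); all numbers are hypothetical, not the paper's data:

```python
# Hypothetical voltammetric calibration and blank data.
import statistics

concentrations = [0.2, 0.5, 1.0, 2.0, 5.0]        # uM
peak_currents = [0.41, 1.02, 2.05, 4.02, 10.04]   # uA (hypothetical)

# Ordinary least-squares slope of current vs. concentration.
n = len(concentrations)
mean_x = sum(concentrations) / n
mean_y = sum(peak_currents) / n
slope = (sum((x - mean_x) * (y - mean_y)
             for x, y in zip(concentrations, peak_currents))
         / sum((x - mean_x) ** 2 for x in concentrations))

blank_responses = [0.010, 0.013, 0.008, 0.011, 0.009]  # uA (hypothetical blanks)
s_blank = statistics.stdev(blank_responses)

loq = 10 * s_blank / slope
print(f"slope = {slope:.3f} uA/uM, LOQ = {loq:.3f} uM")
```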

  5. Reliable fault detection and diagnosis of photovoltaic systems based on statistical monitoring approaches

    KAUST Repository

    Harrou, Fouzi; Sun, Ying; Taghezouit, Bilal; Saidi, Ahmed; Hamlati, Mohamed-Elkarim

    2017-01-01

    This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of one

  6. Photodynamic therapy: Theoretical and experimental approaches to dosimetry

    Science.gov (United States)

    Wang, Ken Kang-Hsin

    Singlet oxygen (¹O₂) is the major cytotoxic species generated during photodynamic therapy (PDT), and ¹O₂ reactions with biological targets define the photodynamic dose at the most fundamental level. We have developed a theoretical model for rigorously describing the spatial and temporal dynamics of oxygen (³O₂) consumption and transport and microscopic ¹O₂ dose deposition during PDT in vivo. Using experimentally established physiological and photophysical parameters, the mathematical model allows computation of the dynamic variation of hemoglobin-³O₂ saturation within vessels, irreversible photosensitizer degradation due to photobleaching, therapy-induced blood flow decrease, and the microscopic distributions of ³O₂ and ¹O₂ dose deposition under various irradiation conditions. mTHPC, a promising photosensitizer for PDT, is approved in Europe for the palliative treatment of head and neck cancer. Using the theoretical model and informed by intratumor sensitizer concentrations and distributions, we calculated photodynamic dose depositions for mTHPC-PDT. Our results demonstrate that the ¹O₂ dose to the tumor volume does not track even qualitatively with long-term tumor responses. Thus, in this evaluation of mTHPC-PDT, any PDT dose metric that is proportional to singlet oxygen creation and/or deposition would fail to predict the tumor response. In situations like this one, other reporters of biological response to therapy would be necessary. In addition to the case study of mTHPC-PDT, we also use the mathematical model to simulate clinical photobleaching data, informed by a possible blood flow reduction during treatment. In a recently completed clinical trial at Roswell Park Cancer Institute, patients with superficial basal cell carcinoma received topical application of 5-aminolevulinic acid (ALA) and were irradiated with 633 nm light at 10-150 mW cm⁻².
Protoporphyrin IX (PpIX) photobleaching in the lesion and the adjacent perilesion normal margin was monitored by

  7. Solidification effects on sill formation: An experimental approach

    Science.gov (United States)

    Chanceaux, L.; Menand, T.

    2014-10-01

    Sills represent a major mechanism for constructing Earth's continental crust because these intrusions can amalgamate to form magma reservoirs and plutons. As a result, numerous field, laboratory and numerical studies have investigated the conditions that lead to sill emplacement. However, previous studies have neglected the potential effect magma solidification could have on sill formation. Here, the effects of solidification on the formation of sills are studied and quantified with scaled analogue laboratory experiments. The experiments involved the injection of hot vegetable oil (a magma analogue) that solidified while propagating as a dyke through a colder, layered gelatine solid (a host rock analogue). The gelatine solid had two layers of different stiffness, to create a priori favourable conditions for sill formation. Several behaviours were observed depending on the injection temperature and the injection rate: no intrusion (extreme solidification effects), dykes stopping at the interface (high solidification effects), sills (moderate solidification effects), and dykes passing through the interface (low solidification effects). All these results can be explained quantitatively as a function of a dimensionless temperature θ, which describes the experimental thermal conditions, and a dimensionless flux ϕ, which describes their dynamical conditions. The experiments reveal that sills can only form within a restricted domain of the (θ, ϕ) parameter space. They demonstrate that, in contrast with isothermal experiments in which cooling cannot affect sill formation, the presence of an a priori mechanically favourable interface is not a sufficient condition for sill formation; solidification restricts it. The results are consistent with field observations and help explain why some dykes form sills when others do not under seemingly similar geological conditions.

  8. An optimization-based control approach for reliable microgrid energy management under uncertainties

    OpenAIRE

    Prodan , Ionela; Zio , Enrico

    2013-01-01

    International audience; This paper proposes an optimization-based control approach for microgrid energy management. The approach is exemplified on a microgrid system connected to an external grid via a transformer and containing a local consumer, a renewable generator (wind turbine) and a storage facility (battery). The objective of minimizing costs is achieved through a predictive control framework for scheduling the battery usage in the microgrid system. The proposed contro...

  9. A System Approach to Advanced Practice Clinician Standardization and High Reliability.

    Science.gov (United States)

    Okuno-Jones, Susan; Siehoff, Alice; Law, Jennifer; Juarez, Patricia

    Advanced practice clinicians (APCs) are an integral part of the health care team. Opportunities exist within Advocate Health Care to standardize and optimize APC practice across the system. To enhance the role and talents of APCs, an approach to role definition and optimization of practice and a structured approach to orientation and evaluation are shared. Although in the early stages of development, definition and standardization of accountabilities in a framework to support system changes are transforming the practice of APCs.

  10. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

    Full Text Available Humans are one of the important factors in the assessment of accidents, particularly marine accidents; hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed, classified as first and second generation according to their differing viewpoints on problem-solving. Accident analysis can be conducted with three families of techniques: sequential, epidemiological and systemic, with marine accidents falling under the epidemiological techniques. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model, as applied to marine accidents. The MOP model can effectively describe the relationships among the other factors that affect accidents, whereas the HEART methodology focuses on human factors only.

  11. A least squares approach for efficient and reliable short-term versus long-term optimization

    DEFF Research Database (Denmark)

    Christiansen, Lasse Hjuler; Capolei, Andrea; Jørgensen, John Bagterp

    2017-01-01

    The uncertainties related to long-term forecasts of oil prices impose significant financial risk on ventures of oil production. To minimize risk, oil companies are inclined to maximize profit over short-term horizons ranging from months to a few years. In contrast, conventional production...... optimization maximizes long-term profits over horizons that span more than a decade. To address this challenge, the oil literature has introduced short-term versus long-term optimization. Ideally, this problem is solved by a posteriori multi-objective optimization methods that generate an approximation...... the balance between the objectives, leaving an unfulfilled potential to increase profits. To promote efficient and reliable short-term versus long-term optimization, this paper introduces a natural way to characterize desirable Pareto points and proposes a novel least squares (LS) method. Unlike hierarchical...

  12. Alternative approaches to reliability modeling of a multiple engineered barrier system

    International Nuclear Information System (INIS)

    Ananda, M.M.A.; Singh, A.K.

    1994-01-01

    The lifetime of the engineered barrier system used for containment of high-level radioactive waste will significantly impact the total performance of a geological repository facility. Currently, two types of designs are under consideration: a single engineered barrier system and a multiple engineered barrier system. A multiple engineered barrier system consists of several metal barriers and the waste form (cladding). Recent work shows that a significant improvement in performance can be achieved by utilizing multiple engineered barrier systems. Considering sequential failures of the barriers, we model the reliability of the multiple engineered barrier system. Weibull and exponential lifetime distributions are used throughout the analysis. Furthermore, the number of failed engineered barrier systems in a repository at a given time is modeled using a Poisson approximation.
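The sequential-failure idea can be sketched with a small Monte Carlo: if the inner barrier is exposed only after the outer one fails, the containment lifetime is the sum of the individual barrier lifetimes. The Weibull parameters below are hypothetical placeholders, not repository data:

```python
# Monte Carlo comparison of single- vs. multiple-barrier survival probability
# under sequential failures with Weibull barrier lifetimes.
import random

random.seed(42)

def weibull_lifetime(scale_years: float, shape: float) -> float:
    # random.weibullvariate(alpha, beta): alpha = scale, beta = shape
    return random.weibullvariate(scale_years, shape)

def survival_probability(n_barriers: int, t_years: float, trials: int = 100_000) -> float:
    """P(containment lifetime > t) when barriers fail one after another."""
    survived = 0
    for _ in range(trials):
        lifetime = sum(weibull_lifetime(300.0, 1.5) for _ in range(n_barriers))
        if lifetime > t_years:
            survived += 1
    return survived / trials

p_single = survival_probability(1, 500.0)
p_triple = survival_probability(3, 500.0)
print(f"P(survive 500 y): single barrier {p_single:.3f}, three barriers {p_triple:.3f}")
```

With exponential lifetimes the sum becomes Erlang-distributed, which is what makes the Poisson approximation for the number of failed systems in a repository natural.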

  13. Experimental Approaches to Study Genome Packaging of Influenza A Viruses

    Directory of Open Access Journals (Sweden)

    Catherine Isel

    2016-08-01

    Full Text Available The genome of influenza A viruses (IAV) consists of eight single-stranded negative-sense viral RNAs (vRNAs) encapsidated into viral ribonucleoproteins (vRNPs). It is now well established that genome packaging (i.e., the incorporation of a set of eight distinct vRNPs into budding viral particles) follows a specific pathway guided by segment-specific cis-acting packaging signals on each vRNA. However, the precise nature and function of the packaging signals, and the mechanisms underlying the assembly of vRNPs into sub-bundles in the cytoplasm and their selective packaging at the viral budding site, remain largely unknown. Here, we review the diverse and complementary methods currently being used to elucidate these aspects of the viral cycle. They range from conventional and competitive reverse genetics, single-molecule imaging of vRNPs by fluorescence in situ hybridization (FISH) and high-resolution electron microscopy and tomography of budding viral particles, to purely in vitro approaches investigating vRNA-vRNA interactions at the molecular level.

  14. Distinguishing between forensic science and forensic pseudoscience: testing of validity and reliability, and approaches to forensic voice comparison.

    Science.gov (United States)

    Morrison, Geoffrey Stewart

    2014-05-01

    In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather, one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation, using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four and a half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle.

  15. A New Approach for Analyzing the Reliability of the Repair Facility in a Series System with Vacations

    Directory of Open Access Journals (Sweden)

    Renbin Liu

    2012-01-01

    Full Text Available Based on renewal process theory, we develop a decomposition method to analyze the reliability of the repair facility in an n-unit series system with vacations. Using this approach, we study the unavailability and the mean replacement number during (0, t] of the repair facility. The proposed method is novel and concise, and it exposes the structures of the facility indices of a series system with an unreliable repair facility as two convolution relations. Special cases and numerical examples are given to show the validity of the method.

  16. Reliability-based approaches for safety margin assessment in the French nuclear industry

    International Nuclear Information System (INIS)

    Ardillon, E.; Barthelet, B.; Meister, E.; Cambefort, P.; Hornet, P.; Le Delliou, P.

    2003-01-01

    The prevention of fast fracture damage of the mechanical equipment important for the safety of the nuclear islands of French PWRs relies on deterministic rules. These rules include flaw acceptance criteria involving safety factors applied to characteristic values (implicit margins) of the physical variables. The sets of safety factors currently applied in industrial analyses, with the agreement of the Safety Authority, are distributed across the two main physical parameters and have been partly based on a semi-probabilistic approach. After presenting the generic probabilistic pro-codification approach, this paper shows its application to the evaluation of the performance of the existing regulatory flaw acceptance criteria. This application can be carried out in a realistic manner or in a more simplified one. These two approaches are applied to representative mechanical components, and their results are consistent. (author)

  17. Reliable experimental setup to test the pressure modulation of Baerveldt Implant tubes for reducing post-operative hypotony

    Science.gov (United States)

    Ramani, Ajay

    future to evaluate custom inserts and their effects on the pressure drop over 4-6 weeks. The design requirements were: simulate physiological conditions [flow rate between 1.25 and 2.5 µl/min], evaluate small-inner-diameter tubes [50 and 75 µm] and annuli, and demonstrate reliability and repeatability. The current study was focused on benchmarking the experimental setup for the IOP range of 15-20 mm Hg. Repeated experiments were conducted using distilled water with configurations [diameter of tube, insert diameter, lengths of insert and tube, and flow rate] that produce pressure variations which include the 15-20 mm Hg range. Two similar setups were assembled and evaluated for repeatability between the two. Experimental measurements of pressure drop were validated using theoretical calculations. Theory predicted a range of expected values by considering manufacturing and performance tolerances of the apparatus components: tube diameter, insert diameter, and the flow rate and pressure [controlled by the pump]. In addition, preliminary experiments evaluated the dissolution of suture samples in a balanced salt solution and in distilled water. The balanced salt solution approximates the properties of the eye's aqueous humor, and it was expected that the salt and acid would help hydrolyze sutures much faster than distilled water. Suture samples in the balanced salt solution showed signs of deterioration [flaking] within 23 days, whereas distilled-water samples showed only slight signs of deterioration after about 30 days. These preliminary studies indicate that future dissolution and flow experiments should be conducted using the balanced salt solution. Also, the absorbable sutures showed signs of bulk erosion/deterioration in the balanced salt solution after 14 days, which indicates that they may not be suitable as inserts in the implant tubes because flakes could block the tube entrance.
Further long term studies should be performed in order to understand the effects of
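The theoretical validation of measured pressure drops described above is typically a Hagen-Poiseuille calculation. A minimal sketch for laminar flow through a thin tube at aqueous-humor-like flow rates; the tube length and water viscosity are assumed values for illustration only:

```python
# Hagen-Poiseuille pressure drop through a circular tube, reported in mmHg.
import math

def pressure_drop_mmhg(flow_ul_per_min: float, diameter_um: float,
                       length_mm: float = 10.0, viscosity_pa_s: float = 1.0e-3) -> float:
    """Laminar pressure drop (mmHg): dP = 128 * mu * L * Q / (pi * d^4)."""
    q = flow_ul_per_min * 1e-9 / 60.0          # ul/min -> m^3/s
    d = diameter_um * 1e-6                     # um -> m
    length = length_mm * 1e-3                  # mm -> m
    dp_pa = 128.0 * viscosity_pa_s * length * q / (math.pi * d ** 4)
    return dp_pa / 133.322                     # Pa -> mmHg

dp_50 = pressure_drop_mmhg(2.0, 50.0)   # 50 um tube near the upper flow rate
dp_75 = pressure_drop_mmhg(2.0, 75.0)   # wider 75 um tube, same flow
print(f"50 um: {dp_50:.1f} mmHg, 75 um: {dp_75:.1f} mmHg")
```

The d⁴ dependence is the key design lever: under these assumed dimensions a 50 µm bore lands in the targeted 15-20 mm Hg range, while widening the bore to 75 µm cuts the pressure drop by a factor of about five.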

  18. Toward formal analysis of ultra-reliable computers: A total systems approach

    International Nuclear Information System (INIS)

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.

    1986-01-01

    This paper describes the application of modeling and analysis techniques to software designed to execute on a four-channel version of the Charles Stark Draper Laboratory (CSDL) Fault-Tolerant Processor, referred to as the Draper FTP. The software performs sensor validation of four independent measurements (signals) from the primary pumps of the Experimental Breeder Reactor-II operated by Argonne National Laboratory-West, and from the validated signals formulates a flow trip signal for the reactor safety system. 11 refs., 4 figs

  19. Empirical evidence of bias in the design of experimental stroke studies - A metaepidemiologic approach

    NARCIS (Netherlands)

    Crossley, Nicolas A.; Sena, Emily; Goehler, Jos; Horn, Jannekke; van der Worp, Bart; Bath, Philip M. W.; Macleod, Malcolm; Dirnagl, Ulrich

    2008-01-01

    Background and Purpose - At least part of the failure in the transition from experimental to clinical studies in stroke has been attributed to the imprecision introduced by problems in the design of experimental stroke studies. Using a metaepidemiologic approach, we addressed the effect of

  20. DG Allocation Based on Reliability, Losses and Voltage Sag Considerations: an expert system approach

    Directory of Open Access Journals (Sweden)

    Sahar Abdel Moneim Moussa

    2017-03-01

    Full Text Available Expert Systems (ES), a branch of Artificial Intelligence (AI), can potentially help in solving complicated power system problems, and may be a more appropriate methodology than conventional optimization techniques when contradictions between objectives prevent traditional methods from reaching the required system operation. In this paper, a knowledge-based ES technique is proposed to reach a near-optimum solution, which is then refined toward the optimum solution through particle swarm optimization (PSO); this combination is known as a Hybrid Expert System (HES). The proposed approach is used to obtain the optimum allocation of a number of distributed generation (DG) units on distribution system (DS) busbars, taking into consideration three issues: reliability, voltage sag, and line losses. Optimality is assessed on an economic basis by calculating the monetary benefits (or losses) resulting from DG addition with respect to the three aforementioned issues. The effectiveness of the proposed technique is ascertained through an example.

  1. Computational approaches to standard-compliant biofilm data for reliable analysis and integration.

    Science.gov (United States)

    Sousa, Ana Margarida; Ferreira, Andreia; Azevedo, Nuno F; Pereira, Maria Olivia; Lourenço, Anália

    2012-12-01

    The study of microorganism consortia, also known as biofilms, is associated with a number of applications in the biotechnology, ecotechnology and clinical domains. Nowadays, biofilm studies are heterogeneous and data-intensive, encompassing different levels of analysis. Computational modelling of biofilm studies has thus become a requirement to make sense of these vast and ever-expanding biofilm data volumes. The rationale of the present work is a machine-readable format for representing biofilm studies and supporting biofilm data interchange and data integration. This format is supported by the Biofilm Science Ontology (BSO), the first ontology on biofilm information. The ontology is decomposed into a number of areas of interest, namely: the Experimental Procedure Ontology (EPO), which describes biofilm experimental procedures; the Colony Morphology Ontology (CMO), which characterises microorganism colonies morphologically; and other modules concerning biofilm phenotype, antimicrobial susceptibility and virulence traits. The overall objective behind the BSO is to develop semantic resources to capture, represent and share data on biofilms and related experiments in a regularized fashion. Furthermore, the present work also introduces a framework to assist biofilm data interchange and analysis - BiofOmics (http://biofomics.org) - and a public repository on colony morphology signatures - MorphoCol (http://stardust.deb.uminho.pt/morphocol).

  2. A Reliable, Efficient, Affordable and User-friendly Approach for Online Assessment in Distance Education

    OpenAIRE

    Mardanian, Haleh; Mozelius, Peter

    2011-01-01

    In the assessment of students in higher education, cheating and plagiarism have always been a major problem, and one that is growing rapidly in Sweden: the number of students suspended from courses in tertiary education increased by 56% in 2010, with plagiarism the most common violation. Online distance courses with students spread out geographically need online assessment approaches to save time and avoid travel expenses. E-learning and distance education ha...

  3. A two-stage approach for multi-objective decision making with applications to system reliability optimization

    International Nuclear Information System (INIS)

    Li Zhaojun; Liao Haitao; Coit, David W.

    2009-01-01

    This paper proposes a two-stage approach for solving multi-objective system reliability optimization problems. In this approach, a Pareto optimal solution set is initially identified at the first stage by applying a multiple objective evolutionary algorithm (MOEA). Quite often there are a large number of Pareto optimal solutions, and it is difficult, if not impossible, to effectively choose the representative solutions for the overall problem. To overcome this challenge, an integrated multiple objective selection optimization (MOSO) method is utilized at the second stage. Specifically, a self-organizing map (SOM), with the capability of preserving the topology of the data, is applied first to classify those Pareto optimal solutions into several clusters with similar properties. Then, within each cluster, the data envelopment analysis (DEA) is performed, by comparing the relative efficiency of those solutions, to determine the final representative solutions for the overall problem. Through this sequential solution identification and pruning process, the final recommended solutions to the multi-objective system reliability optimization problem can be easily determined in a more systematic and meaningful way.
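The two-stage idea can be sketched in simplified form: cluster a Pareto set of (reliability, cost) solutions, then keep one representative per cluster. In this sketch a plain 1-D k-means stands in for the SOM and a simple reliability/cost ratio stands in for the full DEA efficiency score; both are deliberate simplifications of the paper's method, and all data are hypothetical.

```python
# Two-stage pruning of a Pareto set: cluster, then pick one solution per cluster.
import random

random.seed(1)

# Hypothetical Pareto front: higher reliability costs more.
pareto = [(0.90 + 0.01 * i, 100 + 15 * i + random.uniform(-3, 3)) for i in range(10)]

def kmeans_1d(values, k=3, iters=20):
    """Plain 1-D k-means (a stand-in for the SOM clustering stage)."""
    centers = sorted(random.sample(values, k))
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            groups[min(range(k), key=lambda j: abs(v - centers[j]))].append(v)
        centers = [sum(g) / len(g) if g else centers[j] for j, g in enumerate(groups)]
    return centers, groups

# Stage 1: cluster solutions along the cost axis.
costs = [c for _, c in pareto]
centers, groups = kmeans_1d(costs)

# Stage 2: within each cluster, keep the solution with the best output/input
# ratio (reliability per unit cost), a crude stand-in for DEA efficiency.
representatives = []
for g in groups:
    members = [s for s in pareto if s[1] in g]
    if members:
        representatives.append(max(members, key=lambda s: s[0] / s[1]))

print(representatives)
```

The point of the second stage is that the decision maker is handed a handful of structurally distinct, locally most efficient solutions instead of the full Pareto set.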

  4. Development of a morphology-based modeling technique for tracking solid-body displacements: examining the reliability of a potential MRI-only approach for joint kinematics assessment

    International Nuclear Information System (INIS)

    Mahato, Niladri K.; Montuelle, Stephane; Cotton, John; Williams, Susan; Thomas, James; Clark, Brian

    2016-01-01

    Single or biplanar video radiography and Roentgen stereophotogrammetry (RSA) techniques used for the assessment of in-vivo joint kinematics involve the application of ionizing radiation, which is a limitation for clinical research involving human subjects. To overcome this limitation, our long-term goal is to develop a magnetic resonance imaging (MRI)-only, three dimensional (3-D) modeling technique that permits dynamic imaging of joint motion in humans. Here, we present our initial findings, as well as reliability data, for an MRI-only protocol and modeling technique. We developed a morphology-based motion-analysis technique that uses MRI of custom-built solid-body objects to animate and quantify experimental displacements between them. The technique involved four major steps. First, the imaging volume was calibrated using a custom-built grid. Second, 3-D models were segmented from axial scans of two custom-built solid-body cubes. Third, these cubes were positioned at pre-determined relative displacements (translation and rotation) in the magnetic resonance coil and scanned with T1 and fast contrast-enhanced pulse sequences. The digital imaging and communications in medicine (DICOM) images were then processed for animation. The fourth step involved importing these processed images into an animation software, where they were displayed as background scenes. In the same step, 3-D models of the cubes were imported into the animation software, where the user manipulated the models to match their outlines in the scene (rotoscoping) and registered the models into an anatomical joint system. Measurements of displacements obtained from two different rotoscoping sessions were tested for reliability using coefficients of variation (CV), intraclass correlation coefficients (ICC), Bland-Altman plots, and Limits of Agreement analyses. Between-session reliability was high for both the T1 and the contrast-enhanced sequences. 
Specifically, the average CVs for translation were 4

  5. Development of a morphology-based modeling technique for tracking solid-body displacements: examining the reliability of a potential MRI-only approach for joint kinematics assessment.

    Science.gov (United States)

    Mahato, Niladri K; Montuelle, Stephane; Cotton, John; Williams, Susan; Thomas, James; Clark, Brian

    2016-05-18

    Single or biplanar video radiography and Roentgen stereophotogrammetry (RSA) techniques used for the assessment of in-vivo joint kinematics involve the application of ionizing radiation, which is a limitation for clinical research involving human subjects. To overcome this limitation, our long-term goal is to develop a magnetic resonance imaging (MRI)-only, three dimensional (3-D) modeling technique that permits dynamic imaging of joint motion in humans. Here, we present our initial findings, as well as reliability data, for an MRI-only protocol and modeling technique. We developed a morphology-based motion-analysis technique that uses MRI of custom-built solid-body objects to animate and quantify experimental displacements between them. The technique involved four major steps. First, the imaging volume was calibrated using a custom-built grid. Second, 3-D models were segmented from axial scans of two custom-built solid-body cubes. Third, these cubes were positioned at pre-determined relative displacements (translation and rotation) in the magnetic resonance coil and scanned with T1 and fast contrast-enhanced pulse sequences. The digital imaging and communications in medicine (DICOM) images were then processed for animation. The fourth step involved importing these processed images into an animation software, where they were displayed as background scenes. In the same step, 3-D models of the cubes were imported into the animation software, where the user manipulated the models to match their outlines in the scene (rotoscoping) and registered the models into an anatomical joint system. Measurements of displacements obtained from two different rotoscoping sessions were tested for reliability using coefficients of variation (CV), intraclass correlation coefficients (ICC), Bland-Altman plots, and Limits of Agreement analyses. Between-session reliability was high for both the T1 and the contrast-enhanced sequences. 
Specifically, the average CVs for translation were 4

  6. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book starts with the question of what reliability is: the origin of reliability problems, the definition of reliability, and the uses of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions of reliability, estimation of MTBF, processes of probability distributions, downtime, maintainability and availability, breakdown maintenance and preventive maintenance, design for reliability, reliability prediction and statistics, reliability testing, reliability data, and the design and management of reliability.
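    The basic quantities listed above (reliability function, failure rate, MTBF, availability) can be illustrated with the standard constant-failure-rate model; the rates below are assumed values for illustration, not taken from the book:

    ```python
    import math

    # Illustrative sketch of the standard exponential-model relationships:
    # R(t) = exp(-lambda * t), MTBF = 1/lambda, and steady-state availability.
    lam = 2e-4          # assumed constant failure rate (failures per hour)
    mtbf = 1 / lam      # mean time between failures for the exponential model

    def reliability(t, lam):
        """Probability that the item survives without failure to time t."""
        return math.exp(-lam * t)

    # Steady-state availability from MTBF and an assumed mean time to repair.
    mttr = 8.0
    availability = mtbf / (mtbf + mttr)

    print(round(reliability(1000, lam), 4))  # survival probability at 1000 h
    print(round(availability, 4))
    ```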

  7. Personal Reflections on Observational and Experimental Research Approaches to Childhood Psychopathology

    Science.gov (United States)

    Rapoport, Judith L.

    2009-01-01

    The past 50 years have seen dramatic changes in childhood psychopathology research. The goal of this overview is to contrast observational and experimental research approaches; both have grown more complex such that the boundary between these approaches may be blurred. Both are essential. Landmark observational studies with long-term follow-up…

  8. An approach for the condensed presentation of intuitive citation impact metrics which remain reliable with very few publications

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, D.; Tippett, Ch.; Côté, G.; Roberge, G.; Archambault, E.

    2016-07-01

    An approach is presented for displaying citation data in a condensed and intuitive manner that allows their reliable interpretation by policy analysts even in cases where the number of peer-reviewed publications produced by a given entity remains small. The approach is described using country level data in Agronomy & Agriculture (2004–2013), an area of specialisation for many developing countries with a small output size. Four citation impact metrics, and a synthesis graph that we call the distributional micro-charts of relative citation counts, are considered in building our "preferred" presentation layout. These metrics include two indicators that have long been used by Science-Metrix in its bibliometric reports, the Average of Relative Citations (ARC) and the percentage of publications in the 10% most cited publications in the database (HCP), as well as two newer metrics, the Median of Relative Citations (MRC) and the Relative Integration Score (RIS). The findings reveal that the proposed approach, combining the MRC and HCP with the distributional micro-charts, qualifies the citation impact of entities more effectively (in terms of central location, density of the upper citation tail, and overall distribution) than Science-Metrix's former approach based on the ARC and HCP. This is especially true of cases with small population sizes, where a strong presence of outliers (denoted by strong HCP scores) can have a significant effect on the central location of the citation data when estimated with an average. (Author)

  9. An approach to evaluating system well-being in engineering reliability applications

    International Nuclear Information System (INIS)

    Billinton, Roy; Fotuhi-Firuzabad, Mahmud; Aboreshaid, Saleh

    1995-01-01

    This paper presents an approach to evaluating the degree of system well-being of an engineering system. The functionality of the system is identified by healthy, marginal and risk states. The state definitions permit the inclusion of deterministic considerations in the probabilistic indices used to monitor the system well-being. A technique is developed to determine the three operating state probabilities based on minimal path concepts. The identified indices provide system engineers with additional information on the degree of system well-being in the form of system health and margin state probabilities. A basic planning objective should be to design a system such that the probabilities of the health and risk states are acceptable. The application of the technique is illustrated in this paper using a relatively simple network

  10. Selected examples of practical approaches for the assessment of model reliability - parameter uncertainty analysis

    International Nuclear Information System (INIS)

    Hofer, E.; Hoffman, F.O.

    1987-02-01

    The uncertainty analysis of model predictions has to discriminate between two fundamentally different types of uncertainty. The presence of stochastic variability (Type 1 uncertainty) necessitates the use of a probabilistic model instead of the much simpler deterministic one. Lack of knowledge (Type 2 uncertainty), however, applies to deterministic as well as to probabilistic model predictions and often dominates over uncertainties of Type 1. The term "probability" is interpreted differently in the probabilistic analysis of either type of uncertainty. After these discriminations have been explained, the discussion centers on the propagation of parameter uncertainties through the model, the derivation of quantitative uncertainty statements for model predictions, and the presentation and interpretation of the results of a Type 2 uncertainty analysis. Various alternative approaches are compared for a very simple deterministic model

  11. Reliable fault detection and diagnosis of photovoltaic systems based on statistical monitoring approaches

    KAUST Repository

    Harrou, Fouzi

    2017-09-18

    This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of the one-diode model and those of the univariate and multivariate exponentially weighted moving average (EWMA) charts to better detect faults. Specifically, we generate the array's residuals of current, voltage and power using measured temperature and irradiance. These residuals capture the difference between the measurements and the one-diode model's predictions at the maximum power point (MPP) for the current, voltage and power, and are used as fault indicators. Then, we apply the multivariate EWMA (MEWMA) monitoring chart to the residuals to detect faults. However, a MEWMA scheme cannot identify the type of fault. Once a fault is detected by the MEWMA chart, the univariate EWMA chart based on the current and voltage indicators is used to identify the type of fault (e.g., short-circuit, open-circuit and shading faults). We applied this strategy to real data from the grid-connected PV system installed at the Renewable Energy Development Center, Algeria. Results show the capacity of the proposed strategy to monitor the DC side of PV systems and detect partial shading faults.
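    The univariate EWMA charting step can be sketched as follows; the residual stream, smoothing constant and control-limit width are invented for illustration, and the paper's MEWMA and one-diode details are omitted:

    ```python
    # Sketch of a univariate EWMA monitoring chart applied to a residual stream.
    # A mean shift is injected at index 100 to mimic a fault; all numbers are
    # illustrative assumptions, not the paper's data or tuning.
    import random

    def ewma_alarms(residuals, lam=0.2, mu0=0.0, sigma=1.0, L=3.0):
        """Return indices where the EWMA statistic exceeds its control limits."""
        # asymptotic control limit for the EWMA statistic
        limit = L * sigma * (lam / (2 - lam)) ** 0.5
        z, alarms = mu0, []
        for i, r in enumerate(residuals):
            z = lam * r + (1 - lam) * z   # exponentially weighted update
            if abs(z - mu0) > limit:
                alarms.append(i)
        return alarms

    random.seed(0)
    clean = [random.gauss(0, 1) for _ in range(100)]      # fault-free residuals
    faulty = clean + [random.gauss(3, 1) for _ in range(20)]  # injected shift
    print(ewma_alarms(faulty))  # alarms should appear after index ~100
    ```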

  12. A reliable approach to the closure of large acquired midline defects of the back

    International Nuclear Information System (INIS)

    Casas, L.A.; Lewis, V.L. Jr.

    1989-01-01

    A systematic regionalized approach for the reconstruction of acquired thoracic and lumbar midline defects of the back is described. Twenty-three patients with wounds resulting from pressure necrosis, radiation injury, and postoperative wound infection and dehiscence were successfully reconstructed. The latissimus dorsi, trapezius, gluteus maximus, and paraspinous muscles are utilized individually or in combination as advancement, rotation, island, unipedicle, turnover, or bipedicle flaps. All flaps are designed so that their vascular pedicles are out of the field of injury. After thorough debridement, large, deep wounds are closed with two layers of muscle, while smaller, more superficial wounds are reconstructed with one layer. The trapezius muscle is utilized in the high thoracic area for the deep wound layer, while the paraspinous muscle is used for this layer in the thoracic and lumbar regions. Superficial layer and small wounds in the high thoracic area are reconstructed with either latissimus dorsi or trapezius muscle. Corresponding wounds in the thoracic and lumbar areas are closed with latissimus dorsi muscle alone or in combination with gluteus maximus muscle. The rationale for systematic regionalized reconstruction of acquired midline back wounds is described

  13. Development and reliability testing of a Health Action Process Approach inventory for physical activity participation among individuals with schizophrenia

    Directory of Open Access Journals (Sweden)

    Kelly eArbour-Nicitopoulos

    2014-06-01

    Individuals with schizophrenia tend to have high levels of cardiovascular disease and lower physical activity (PA) levels than the general population. Research is urgently required in developing evidence-based behavioral interventions for increasing PA in this population. One model that has been increasingly used to understand the mechanisms underlying PA is the Health Action Process Approach (HAPA). The purpose of this study was to adapt and pilot-test a HAPA-based inventory that reliably captures salient, modifiable PA determinants for individuals with schizophrenia. Initially, twelve outpatients with schizophrenia reviewed the inventory and provided verbal feedback regarding comprehension, item relevance, and potential new content. A content analysis framework was used to inform modifications to the inventory. The resultant inventory underwent a quantitative assessment of internal consistency and test-retest reliability. Twenty-five outpatients (Mage = 41.5 ± 13.5 years; 64% male) completed the inventory on two separate occasions, one week apart. All but two scales showed good internal consistency (Cronbach's α = 0.62–0.98) and test-retest correlations (rs = .21–.96). Preliminary assessment of criterion validity of the HAPA inventory showed significant, large-sized correlations between behavioural intentions and both affective outcome expectancies and task self-efficacy, and small-to-moderate correlations between self-reported minutes of moderate-to-vigorous PA and the volitional constructs of the HAPA model. These findings provide preliminary support for the reliability and validity of the first-ever inventory for examining theory-based predictors of moderate-to-vigorous PA intentions and behavior among individuals with schizophrenia. Further validation research with this inventory using an objective measure of PA behavior will provide additional support for its psychometric properties within the schizophrenia population.
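    The internal-consistency statistic reported above (Cronbach's α) can be computed from a respondent-by-item score matrix as sketched below; the item scores are fabricated illustration data, not the study's:

    ```python
    # Illustrative sketch: Cronbach's alpha for a small set of scale items.
    # alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)

    def cronbach_alpha(items):
        """items: list of per-item score lists, all with one entry per respondent."""
        k = len(items)             # number of items
        n = len(items[0])          # number of respondents
        def var(xs):               # sample variance (n - 1 denominator)
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
        item_vars = sum(var(it) for it in items)
        totals = [sum(it[j] for it in items) for j in range(n)]
        return k / (k - 1) * (1 - item_vars / var(totals))

    # Fabricated responses: 3 items scored by 5 respondents
    items = [[3, 4, 2, 5, 4], [3, 5, 2, 4, 4], [2, 4, 3, 5, 5]]
    print(round(cronbach_alpha(items), 3))  # prints 0.886
    ```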

  14. A Step by Step Approach for Evaluating the Reliability of the Main Engine Lube Oil System for a Ship's Propulsion System

    Directory of Open Access Journals (Sweden)

    Mohan Anantharaman

    2014-09-01

    Effective and efficient maintenance is essential to ensure the reliability of a ship's main propulsion system, which in turn is interdependent on the reliability of a number of associated sub-systems. A primary step in evaluating the reliability of the ship's propulsion system is to evaluate the reliability of each sub-system. This paper discusses the methodology adopted to quantify the reliability of one of the vital sub-systems, viz. the lubricating oil system, and the development of a model based on Markov analysis thereof. Having developed the model, means to improve the reliability of the system should be considered. The cost of the incremental reliability should be measured to evaluate cost benefits. A maintenance plan can then be devised to achieve the higher level of reliability. A similar approach could be used to evaluate the reliability of all other sub-systems. This will finally lead to the development of a model to evaluate and improve the reliability of the main propulsion system.
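    The Markov-analysis idea can be sketched with the simplest case, a two-state (up/down) repairable system; the failure and repair rates below are assumed values, not the paper's:

    ```python
    # Minimal sketch of a repairable sub-system as a two-state continuous-time
    # Markov chain. Rates are illustrative assumptions only.
    import math

    lam = 1 / 2000.0   # assumed failure rate (per hour)
    mu = 1 / 24.0      # assumed repair rate (per hour)

    # Steady-state availability from the balance equation lam * P_up = mu * P_down
    availability = mu / (lam + mu)
    print(round(availability, 5))

    def availability_t(t):
        """Transient availability A(t) for the two-state model, starting up."""
        s = lam + mu
        return mu / s + (lam / s) * math.exp(-s * t)

    print(round(availability_t(100.0), 5))  # approaches the steady-state value
    ```

    Extending this to the full lube oil system would mean enumerating its component states and solving the corresponding (larger) transition-rate matrix.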

  15. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    (Extraction-damaged record: list-of-figures residue removed, e.g. "inverters connected in a chain" and "typical graph showing frequency versus square root of ...".) The recoverable text describes developing an experimental reliability-estimating methodology that could illuminate the lifetime reliability of advanced devices and circuits; an accurate estimate of the device lifetime, and thus its failure rate (FIT), was found.

  16. The Threat of Uncertainty: Why Using Traditional Approaches for Evaluating Spacecraft Reliability are Insufficient for Future Human Mars Missions

    Science.gov (United States)

    Stromgren, Chel; Goodliff, Kandyce; Cirillo, William; Owens, Andrew

    2016-01-01

    Through the Evolvable Mars Campaign (EMC) study, the National Aeronautics and Space Administration (NASA) continues to evaluate potential approaches for sending humans beyond low Earth orbit (LEO). A key aspect of these missions is the strategy that is employed to maintain and repair the spacecraft systems, ensuring that they continue to function and support the crew. Long duration missions beyond LEO present unique and severe maintainability challenges due to a variety of factors, including: limited to no opportunities for resupply, the distance from Earth, mass and volume constraints of spacecraft, high sensitivity of transportation element designs to variation in mass, the lack of abort opportunities to Earth, limited hardware heritage information, and the operation of human-rated systems in a radiation environment with little to no experience. The current approach to maintainability, as implemented on ISS, which includes a large number of spares pre-positioned on ISS, a larger supply sitting on Earth waiting to be flown to ISS, and on-demand delivery of logistics from Earth, is not feasible for future deep space human missions. For missions beyond LEO, significant modifications to the maintainability approach will be required. Through the EMC evaluations, several key findings related to the reliability and safety of the Mars spacecraft have been made. The nature of random and induced failures presents significant issues for deep space missions. Because spare parts cannot be flown as needed for Mars missions, all required spares must be flown with the mission or pre-positioned. These spares must cover all anticipated failure modes and provide a level of overall reliability and safety that is satisfactory for human missions. This will require a large amount of mass and volume be dedicated to storage and transport of spares for the mission. Further, there is, and will continue to be, a significant amount of uncertainty regarding failure rates for spacecraft

  17. A Type-2 fuzzy data fusion approach for building reliable weighted protein interaction networks with application in protein complex detection.

    Science.gov (United States)

    Mehranfar, Adele; Ghadiri, Nasser; Kouhsar, Morteza; Golshani, Ashkan

    2017-09-01

    Detecting protein complexes is an important task in analyzing protein interaction networks. Although many algorithms predict protein complexes in different ways, surveys on the interaction networks indicate that about 50% of detected interactions are false positives. Consequently, the accuracy of existing methods needs to be improved. In this paper we propose a novel algorithm to detect protein complexes in 'noisy' protein interaction data. First, we integrate several biological data sources to determine the reliability of each interaction and determine more accurate weights for the interactions. A data fusion component is used for this step, based on the interval type-2 fuzzy voter, which provides an efficient combination of the information sources. This fusion component detects errors and diminishes their effect on the detection of protein complexes. Thus, in the first step, reliability scores are assigned to every interaction in the network. In the second step, we propose a general protein complex detection algorithm by exploiting and adapting the strong points of other algorithms and existing hypotheses regarding real complexes. Finally, the proposed method has been applied to yeast interaction datasets for predicting the interactions. The results show that our framework has a better performance regarding precision and F-measure than the existing approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
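    The importance-sampling variance reduction mentioned above can be sketched on a toy rare-event problem, estimating a small normal tail probability; this is illustrative only and unrelated to the report's specific programs:

    ```python
    # Sketch of importance sampling as a variance-reduction technique:
    # estimate P(X > threshold) for X ~ N(0, 1) by crude Monte Carlo and by
    # sampling from a shifted normal with likelihood-ratio reweighting.
    import math
    import random

    random.seed(42)
    threshold, n = 3.5, 50_000
    exact = 0.5 * math.erfc(threshold / math.sqrt(2))  # true tail probability

    # Crude Monte Carlo: almost no samples land in the rare failure region.
    crude = sum(random.gauss(0, 1) > threshold for _ in range(n)) / n

    def likelihood_ratio(x):
        # phi(x) / phi(x - threshold) for the standard-normal density phi
        return math.exp(-threshold * x + threshold ** 2 / 2)

    # Importance sampling: draw from N(threshold, 1), so about half the samples
    # hit the failure region, then reweight each hit by the likelihood ratio.
    hits = 0.0
    for _ in range(n):
        x = random.gauss(threshold, 1)
        if x > threshold:
            hits += likelihood_ratio(x)
    imp = hits / n

    print(f"exact={exact:.3e} crude={crude:.3e} importance={imp:.3e}")
    ```

    With the same sample budget, the importance-sampling estimate typically lands within about one percent of the exact value, while the crude estimate rests on only a handful of hits.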

  19. Experimentation on accuracy of non functional requirement prioritization approaches for different complexity projects

    Directory of Open Access Journals (Sweden)

    Raj Kumar Chopra

    2016-09-01

    Non functional requirements must be selected for implementation together with functional requirements to enhance the success of software projects. Three approaches exist for performing the prioritization of non functional requirements using a suitable prioritization technique. This paper performs experimentation on three different complexity versions of an industrial software project using the cost-value prioritization technique, employing the three approaches. Experimentation is conducted to analyze the accuracy of the individual approaches and the variation of accuracy with the complexity of the software project. The results indicate that selecting non functional requirements separately, but in accordance with functionality, has the highest accuracy among the three approaches. Further, like the other approaches, its accuracy decreases as software complexity increases, although the decrease is minimal.

  20. Química geral experimental: uma nova abordagem didática Experimental general chemistry: a new teaching approach

    Directory of Open Access Journals (Sweden)

    Geraldo Eduardo da Luz Júnior

    2004-02-01

    This essay describes a new didactic approach, in accordance with the national curriculum guidelines for chemistry undergraduate courses in Brazil, employed during the one-semester course "Experimental General Chemistry" for chemistry undergraduate students at the Federal University of Piauí. The new approach has positively helped students' training by improving their reading skills and their understanding of scientific reports, by developing the use of electronic tools to search and retrieve the knowledge required for their learning activities, and by improving their skills in understanding published texts and dealing with digital sources. At the same time, the students are strongly encouraged to enter the research program for undergraduate students available at the University.

  1. A practical approach for solving multi-objective reliability redundancy allocation problems using extended bare-bones particle swarm optimization

    International Nuclear Information System (INIS)

    Zhang, Enze; Wu, Yifei; Chen, Qingwei

    2014-01-01

    This paper proposes a practical approach, combining bare-bones particle swarm optimization and sensitivity-based clustering for solving multi-objective reliability redundancy allocation problems (RAPs). A two-stage process is performed to identify promising solutions. Specifically, a new bare-bones multi-objective particle swarm optimization algorithm (BBMOPSO) is developed and applied in the first stage to identify a Pareto-optimal set. This algorithm mainly differs from other multi-objective particle swarm optimization algorithms in the parameter-free particle updating strategy, which is especially suitable for handling the complexity and nonlinearity of RAPs. Moreover, by utilizing an approach based on the adaptive grid to update the global particle leaders, a mutation operator to improve the exploration ability and an effective constraint handling strategy, the integrated BBMOPSO algorithm can generate excellent approximation of the true Pareto-optimal front for RAPs. This is followed by a data clustering technique based on difference sensitivity in the second stage to prune the obtained Pareto-optimal set and obtain a small, workable sized set of promising solutions for system implementation. Two illustrative examples are presented to show the feasibility and effectiveness of the proposed approach

  2. The reliability analysis of cutting tools in the HSM processes

    OpenAIRE

    W.S. Lin

    2008-01-01

    Purpose: This article mainly describes the reliability of cutting tools in high speed turning by a normal distribution model. Design/methodology/approach: A series of experimental tests have been done to evaluate the reliability variation of the cutting tools. From the experimental results, the tool wear distribution and the tool life are determined, and the tool life distribution and the reliability function of cutting tools are derived. Further, the reliability of cutting tools at anytime for h...

  3. A novel approach to the experimental study on methane/steam reforming kinetics using the Orthogonal Least Squares method

    Science.gov (United States)

    Sciazko, Anna; Komatsu, Yosuke; Brus, Grzegorz; Kimijima, Shinji; Szmyd, Janusz S.

    2014-09-01

    For a mathematical model based on the results of physical measurements, it becomes possible to determine their influence on the final solution and its accuracy. In classical approaches, however, the influence of different model simplifications on the reliability of the obtained results is usually not comprehensively discussed. This paper presents a novel approach to the study of methane/steam reforming kinetics based on an advanced methodology called the Orthogonal Least Squares method. Previously published kinetics of the reforming process are divergent among themselves. To obtain the most probable values of the kinetic parameters and enable direct and objective model verification, an appropriate calculation procedure needs to be proposed. The applied Generalized Least Squares (GLS) method includes all the experimental results in the mathematical model, which becomes internally contradictory (overdetermined), as the number of equations is greater than the number of unknown variables. The GLS method is adopted to select the most probable values of the results and simultaneously determine the uncertainty coupled with all the variables in the system. In this paper, the reaction rate was evaluated after its pre-determination by preliminary calculations based on the experimental results obtained over a Nickel/Yttria-stabilized Zirconia catalyst.
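    The overdetermined-system situation (more measurements than unknowns, reconciled by least squares) can be sketched with an ordinary least-squares fit of an Arrhenius-type rate law to fabricated data; the GLS weighting and uncertainty propagation of the paper are omitted:

    ```python
    # Illustrative sketch: fit ln k = ln A - E/(R*T) to fabricated rate data.
    # Four measurements, two unknowns: the system is overdetermined, so the
    # parameters are chosen by minimizing the sum of squared residuals.
    import math

    R = 8.314  # gas constant, J/(mol K)
    data = [(900, 0.8), (950, 1.9), (1000, 4.1), (1050, 8.2)]  # (T in K, rate k)

    # Linearize: y = a + b*x with x = 1/T, y = ln k; solve the normal equations.
    xs = [1 / T for T, _ in data]
    ys = [math.log(k) for _, k in data]
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n

    E = -b * R          # apparent activation energy, J/mol
    A = math.exp(a)     # pre-exponential factor
    print(round(E / 1000, 1), "kJ/mol")
    ```

    A GLS treatment would additionally weight each equation by the measurement uncertainties and propagate those uncertainties to E and A.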

  4. Fracture in quasi-brittle materials: experimental and numerical approach for the determination of an incremental model with generalized variables

    International Nuclear Information System (INIS)

    Morice, Erwan

    2014-01-01

    Fracture in quasi-brittle materials, such as ceramics or concrete, can be represented schematically by a series of events of nucleation and coalescence of micro-cracks. Modeling this process is an important challenge for the reliability and life prediction of concrete structures, in particular the prediction of the permeability of damaged structures. A multi-scale approach is proposed. The global behavior is modeled within the fracture mechanics framework and the local behavior is modeled by the discrete element method. An approach was developed to condense the nonlinear behavior of the mortar. A model reduction technique is used to extract the relevant information from the discrete element method. To do so, the velocity field is partitioned into mode I, mode II, linear and non-linear components, each component being characterized by an intensity factor and a fixed spatial distribution. The response of the material is hence condensed in the evolution of the intensity factors, used as non-local variables. A model was also proposed to predict the behavior of the crack for proportional and non-proportional mixed-mode I+II loadings. An experimental campaign was finally conducted to characterize the fatigue and fracture behavior of mortar. The results show that fatigue crack growth can be of significant importance. The experimental velocity fields determined in the crack tip region by DIC were analyzed using the same technique as that used for the fields obtained by the discrete element method, showing consistent results. (author)

  5. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry

    2014-01-01

    Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative to the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have quantitative failure rates or failure probabilities. However, it is difficult to obtain such failure data due to insufficient data, changing environments or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees whose precise failure probability distributions of lifetime to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and effectiveness of the proposed approach, actual basic event failure probabilities collected from the operational experience of the Davis-Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach is a suitable alternative to the conventional probabilistic reliability approach when basic events do not have the corresponding quantitative historical failure data for determining their reliability characteristics. Hence, it overcomes the limitation of conventional fault tree analysis for nuclear power plant probabilistic safety assessment
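    The triangular-fuzzy-number ingredient can be sketched as below; the word-to-fuzzy-number scale and the centroid defuzzification are illustrative assumptions, not the paper's exact conversion:

    ```python
    # Hedged sketch: represent an expert's qualitative failure possibility as a
    # triangular fuzzy number (a, m, b) and defuzzify it to a single score.
    # The linguistic scale and centroid method here are invented illustrations.

    def centroid(tfn):
        """Centroid defuzzification of a triangular fuzzy number (a, m, b)."""
        a, m, b = tfn
        return (a + m + b) / 3

    # Assumed mapping from qualitative words to triangular fuzzy numbers on [0, 1]
    scale = {
        "very low": (0.0, 0.1, 0.2),
        "low":      (0.1, 0.25, 0.4),
        "medium":   (0.3, 0.5, 0.7),
        "high":     (0.6, 0.75, 0.9),
    }
    print(round(centroid(scale["medium"]), 3))  # prints 0.5
    ```

    In the paper's setting, a further conversion step maps such possibility scores to failure probabilities for use in the fault tree.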

  6. Reliability and validity of neurobehavioral function on the Psychology Experimental Building Language test battery in young adults

    Directory of Open Access Journals (Sweden)

    Brian J. Piper

    2015-12-01

Full Text Available Background. The Psychology Experiment Building Language (PEBL) software consists of over one-hundred computerized tests based on classic and novel cognitive neuropsychology and behavioral neurology measures. Although the PEBL tests are becoming more widely utilized, there is currently very limited information about the psychometric properties of these measures. Methods. Study I examined inter-relationships among nine PEBL tests including indices of motor-function (Pursuit Rotor and Dexterity), attention (Test of Attentional Vigilance and Time-Wall), working memory (Digit Span Forward), and executive-function (PEBL Trail Making Test, Berg/Wisconsin Card Sorting Test, Iowa Gambling Test, and Mental Rotation) in a normative sample (N = 189, ages 18–22). Study II evaluated test–retest reliability with a two-week intertest interval between administrations in a separate sample (N = 79, ages 18–22). Results. Moderate intra-test, but low inter-test, correlations were observed and ceiling/floor effects were uncommon. Sex differences were identified on the Pursuit Rotor (Cohen's d = 0.89) and Mental Rotation (d = 0.31) tests. The correlation between the test and retest was high for tests of motor learning (Pursuit Rotor time on target r = .86) and attention (Test of Attentional Vigilance response time r = .79), intermediate for memory (digit span r = .63) but lower for the executive function indices (Wisconsin/Berg Card Sorting Test perseverative errors = .45, Tower of London moves = .15). Significant practice effects were identified on several indices of executive function. Conclusions. These results are broadly supportive of the reliability and validity of individual PEBL tests in this sample. These findings indicate that the freely downloadable, open-source PEBL battery (http://pebl.sourceforge.net) is a versatile research tool to study individual differences in neurocognitive performance.

  7. Reliability and validity of neurobehavioral function on the Psychology Experimental Building Language test battery in young adults.

    Science.gov (United States)

    Piper, Brian J; Mueller, Shane T; Geerken, Alexander R; Dixon, Kyle L; Kroliczak, Gregory; Olsen, Reid H J; Miller, Jeremy K

    2015-01-01

Background. The Psychology Experiment Building Language (PEBL) software consists of over one-hundred computerized tests based on classic and novel cognitive neuropsychology and behavioral neurology measures. Although the PEBL tests are becoming more widely utilized, there is currently very limited information about the psychometric properties of these measures. Methods. Study I examined inter-relationships among nine PEBL tests including indices of motor-function (Pursuit Rotor and Dexterity), attention (Test of Attentional Vigilance and Time-Wall), working memory (Digit Span Forward), and executive-function (PEBL Trail Making Test, Berg/Wisconsin Card Sorting Test, Iowa Gambling Test, and Mental Rotation) in a normative sample (N = 189, ages 18-22). Study II evaluated test-retest reliability with a two-week intertest interval between administrations in a separate sample (N = 79, ages 18-22). Results. Moderate intra-test, but low inter-test, correlations were observed and ceiling/floor effects were uncommon. Sex differences were identified on the Pursuit Rotor (Cohen's d = 0.89) and Mental Rotation (d = 0.31) tests. The correlation between the test and retest was high for tests of motor learning (Pursuit Rotor time on target r = .86) and attention (Test of Attentional Vigilance response time r = .79), intermediate for memory (digit span r = .63) but lower for the executive function indices (Wisconsin/Berg Card Sorting Test perseverative errors = .45, Tower of London moves = .15). Significant practice effects were identified on several indices of executive function. Conclusions. These results are broadly supportive of the reliability and validity of individual PEBL tests in this sample. These findings indicate that the freely downloadable, open-source PEBL battery (http://pebl.sourceforge.net) is a versatile research tool to study individual differences in neurocognitive performance.
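The two statistics this record leans on — the Pearson test–retest correlation r and Cohen's d for sex differences — can be sketched in a few lines (illustrative data only, not the study's scores):

```python
import statistics as st

def pearson_r(x, y):
    """Pearson correlation, e.g. between test and retest scores."""
    n = len(x)
    mx, my = st.fmean(x), st.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / ((n - 1) * st.stdev(x) * st.stdev(y))

def cohens_d(group1, group2):
    """Cohen's d effect size with pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    s1, s2 = st.stdev(group1), st.stdev(group2)
    pooled = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (st.fmean(group1) - st.fmean(group2)) / pooled
```

By the conventional benchmarks used in such studies, d = 0.89 (Pursuit Rotor) is a large effect and d = 0.31 (Mental Rotation) a small-to-medium one.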

  8. Discussion about the use of the volume specific surface area (VSSA) as a criterion to identify nanomaterials according to the EU definition. Part two: experimental approach.

    Science.gov (United States)

    Lecloux, André J; Atluri, Rambabu; Kolen'ko, Yury V; Deepak, Francis Leonard

    2017-10-12

The first part of this study was dedicated to modelling the influence of particle shape, porosity and particle size distribution on volume specific surface area (VSSA) values, in order to check the applicability of this concept to the identification of nanomaterials according to the European Commission Recommendation. In this second part, experimental VSSA values are obtained for various samples from nitrogen adsorption isotherms, and these values are used as a screening tool to identify and classify nanomaterials. The identification results are compared with identification based on the criterion of 50% of particles with a size below 100 nm, applied to the experimental particle size distributions obtained by analysis of electron microscopy images of the same materials. It is concluded that experimental VSSA values are able to identify nanomaterials, without false negative identifications, provided the material has a mono-modal particle size, the adsorption data cover the relative pressure range from 0.001 to 0.65, and a simple, qualitative transmission or scanning electron microscopy image of the particles is available to define their shape. The experimental conditions for obtaining reliable adsorption data, as well as the way to analyse the adsorption isotherms, are described and discussed in some detail to help the reader use the experimental VSSA criterion. To obtain experimental VSSA values, the BET surface area can be used for non-porous particles; for porous, nanostructured or coated nanoparticles, only the external surface of the particles, obtained by a modified t-plot approach, should be considered, since only the external surface area is related to particle size, and this avoids false positive identification of nanomaterials. Finally, the availability of experimental VSSA values together with particle size distributions obtained by electron microscopy gave the opportunity to check the
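The arithmetic behind the VSSA screening criterion is simple: multiply a mass-specific surface area (m²/g) by the skeletal density (g/cm³) and compare against the EC Recommendation's 60 m²/cm³ threshold. A minimal sketch (the function names are illustrative, not from the paper):

```python
def vssa(ssa_m2_per_g, skeletal_density_g_per_cm3):
    """Volume specific surface area in m^2/cm^3, from a mass-specific
    surface area (e.g. BET or external t-plot area) and skeletal density."""
    return ssa_m2_per_g * skeletal_density_g_per_cm3

def nano_by_vssa(ssa_m2_per_g, skeletal_density_g_per_cm3, threshold=60.0):
    """Screen against the EC Recommendation's VSSA threshold of 60 m^2/cm^3
    (strictly valid for dense, roughly spherical, mono-modal particles)."""
    return vssa(ssa_m2_per_g, skeletal_density_g_per_cm3) > threshold
```

For example, a powder with an external surface area of 30 m²/g and a skeletal density of 4 g/cm³ has a VSSA of 120 m²/cm³ and would screen as a nanomaterial.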

  9. Electronic properties of Fe charge transfer complexes – A combined experimental and theoretical approach

    International Nuclear Information System (INIS)

    Ferreira, Hendrik; Eschwege, Karel G. von; Conradie, Jeanet

    2016-01-01

Highlights: • Experimental and computational study of Fe(II)-phen, -bpy and -tpy complexes. • Close correlations between experimental redox and spectral data and computational data. • Computational methods fast-track DSSC research. - Abstract: Dye-sensitized solar cell technology holds huge potential for renewable electricity generation in the future. Due to the urgency of demand, ways need to be explored to reduce research time and cost. Against this background, quantum computational chemistry is illustrated to be a reliable tool at the onset of studies in this field, simulating charge transfer, spectral (solar energy absorbed) and electrochemical (ease by which electrons may be liberated) tuning of related photo-responsive dyes. Comparative experimental and theoretical DFT studies were done under similar conditions, involving an extended series of electrochemically altered phenanthroline, bipyridyl and terpyridyl complexes of Fe(II). Fe(II/III) oxidation waves vary from 0.363 V for tris(3,6-dimethoxybipyridyl)Fe(II) to 0.894 V (versus Fc/Fc⁺) for the 5-nitrophenanthroline complex. Theoretical DFT-computed ionization potentials in the bipyridyl sub-series achieved an almost 100% linear correlation with experimental electrochemical oxidation potentials, while the phenanthroline sub-series gave R² = 0.95. Apart from the terpyridyl complex, which accorded an almost perfect match, TDDFT oscillators were in general computed at slightly lower energies than observed experimentally, while molecular HOMO and LUMO renderings reveal desired complexes with directional charge transfer propensities.

  10. AN ANALYTICAL FRAMEWORK FOR ASSESSING RELIABLE NUCLEAR FUEL SERVICE APPROACHES: ECONOMIC AND NON-PROLIFERATION MERITS OF NUCLEAR FUEL LEASING

    International Nuclear Information System (INIS)

    Kreyling, Sean J.; Brothers, Alan J.; Short, Steven M.; Phillips, Jon R.; Weimar, Mark R.

    2010-01-01

The goal of international nuclear policy since the dawn of nuclear power has been the peaceful expansion of nuclear energy while controlling the spread of enrichment and reprocessing technology. Numerous initiatives undertaken in the intervening decades to develop international agreements on providing nuclear fuel supply assurances, or reliable nuclear fuel services (RNFS), have attempted to control the spread of sensitive nuclear materials and technology. In order to inform the international debate and the development of government policy, PNNL has been developing an analytical framework to holistically evaluate the economic and non-proliferation merits of alternative approaches to managing the nuclear fuel cycle (i.e., cradle-to-grave). This paper provides an overview of the analytical framework and discusses preliminary results of an economic assessment of one RNFS approach: full-service nuclear fuel leasing. The specific focus of this paper is the metrics under development to systematically evaluate the non-proliferation merits of fuel-cycle management alternatives. Also discussed is the utility of an integrated assessment of the economics and non-proliferation merits of nuclear fuel leasing.

  11. Some study on radiation resistance and reliability of piston ring of waste gas compressor for fast breeder experimental reactor

    International Nuclear Information System (INIS)

    Muramatsu, Takio; Hidaka, Tsukasa

    1976-01-01

In the fast breeder experimental reactor ''Joyo'', gaseous wastes such as reactor cover argon, reactor seal nitrogen gas and fuel handling waste gas are collected, compressed and stored to allow their activity to decay. The compressors used in this process have a new type of oil-less piston ring made of graphite-filled Teflon, which might be affected by the radioactivity of the waste gases. This report presents a study of gamma irradiation effects on these plastic piston rings, covering tensile strength, elongation, shock resistance and hardness at several irradiation doses, together with a durability test of the irradiated piston rings at the same compression ratio. (auth.)

  12. Cortical projection of the inferior choroidal point as a reliable landmark to place the corticectomy and reach the temporal horn through a middle temporal gyrus approach

    Directory of Open Access Journals (Sweden)

    Thomas Frigeri

    2014-10-01

Full Text Available Objective To establish preoperatively the localization of the cortical projection of the inferior choroidal point (ICP) and use it as a reliable landmark when approaching the temporal horn through a middle temporal gyrus access, and to review relevant anatomical features regarding selective amygdalohippocampectomy (AH) for treatment of mesial temporal lobe epilepsy (MTLE). Method The cortical projection of the inferior choroidal point was used in more than 300 surgeries by one of the authors as a reliable landmark to reach the temporal horn. In the laboratory, forty cerebral hemispheres were examined. Conclusion The cortical projection of the ICP is a reliable landmark for reaching the temporal horn.

  13. Cortical projection of the inferior choroidal point as a reliable landmark to place the corticectomy and reach the temporal horn through a middle temporal gyrus approach.

    Science.gov (United States)

    Frigeri, Thomas; Rhoton, Albert; Paglioli, Eliseu; Azambuja, Ney

    2014-10-01

To establish preoperatively the localization of the cortical projection of the inferior choroidal point (ICP) and use it as a reliable landmark when approaching the temporal horn through a middle temporal gyrus access, and to review relevant anatomical features regarding selective amygdalohippocampectomy (AH) for treatment of mesial temporal lobe epilepsy (MTLE). The cortical projection of the inferior choroidal point was used in more than 300 surgeries by one of the authors as a reliable landmark to reach the temporal horn. In the laboratory, forty cerebral hemispheres were examined. The cortical projection of the ICP is a reliable landmark for reaching the temporal horn.

  14. A semantic web approach applied to integrative bioinformatics experimentation: a biological use case with genomics data.

    NARCIS (Netherlands)

    Post, L.J.G.; Roos, M.; Marshall, M.S.; van Driel, R.; Breit, T.M.

    2007-01-01

    The numerous public data resources make integrative bioinformatics experimentation increasingly important in life sciences research. However, it is severely hampered by the way the data and information are made available. The semantic web approach enhances data exchange and integration by providing

  15. An Experimental Approach to the Joint Effects of Relations with Partner, Friends and Parents on Happiness

    Science.gov (United States)

    Theuns, P.; Verresen, N.; Mairesse, O.; Goossens, R.; Michiels, L.; Peeters, E.; Wastiau, M.

    2010-01-01

    Personal relations constitute an important life domain and satisfaction therein affects happiness in people. In an experimental approach with a 3x3x3 vignettes study in which 103 first year psychology students participated, the contribution of the quality of relationships with parents, friends, and a partner are studied. It is found that the…

  16. Experimental semiotics: a new approach for studying communication as a form of joint action.

    Science.gov (United States)

    Galantucci, Bruno

    2009-04-01

    In the last few years, researchers have begun to investigate the emergence of novel forms of human communication in the laboratory. I survey this growing line of research, which may be called experimental semiotics, from three distinct angles. First, I situate the new approach in its theoretical and historical context. Second, I review a sample of studies that exemplify experimental semiotics. Third, I present an empirical study that illustrates how the new approach can help us understand the socio-cognitive underpinnings of human communication. The main conclusion of the paper will be that, by reproducing micro samples of historical processes in the laboratory, experimental semiotics offers new powerful tools for investigating human communication as a form of joint action. Copyright © 2009 Cognitive Science Society, Inc.

  17. Reliability of transpulmonary pressure-time curve profile to identify tidal recruitment/hyperinflation in experimental unilateral pleural effusion.

    Science.gov (United States)

    Formenti, P; Umbrello, M; Graf, J; Adams, A B; Dries, D J; Marini, J J

    2017-08-01

The stress index (SI) is a parameter that characterizes the shape of the airway pressure-time (P/t) profile. It indicates the slope progression of the curve, reflecting both lung and chest wall properties. The presence of pleural effusion alters the mechanical properties of the respiratory system, decreasing transpulmonary pressure (Ptp). We investigated whether the SI computed from the Ptp tracing would provide reliable insight into tidal recruitment/overdistension during the tidal cycle in the presence of unilateral effusion. Unilateral pleural effusion was simulated in anesthetized, mechanically ventilated pigs. Respiratory system mechanics and thoracic computed tomography (CT) were studied to assess P/t curve shape and changes in global lung aeration. The SI derived from airway pressure (Paw) was compared with that calculated from Ptp under the same conditions, and both were compared with quantitative CT analysis as the gold standard for tidal recruitment/hyperinflation. Despite marked changes in tidal recruitment, mean values of the SI computed either from Paw or from Ptp were remarkably insensitive to variations in PEEP or condition. After instillation of the effusion, the SI indicated a predominant overdistension effect that was not detected by CT. After the increase in PEEP, the extent of CT-determined tidal recruitment indicated a large recruitment effect of PEEP, also reflected by lung compliance; both SI values were again unaffected. We showed that the ability of the SI to predict tidal recruitment and overdistension was significantly reduced in a model of altered chest wall-lung relationship, even when the parameter was computed from the Ptp curve profile.
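The stress index is conventionally defined as the exponent b of the power-law fit P(t) = a·t^b + c to the inspiratory pressure-time curve during constant-flow inflation (b < 1 suggesting tidal recruitment, b > 1 overdistension). A rough sketch of that fit — a grid search over b with linear least squares for a and c, not the authors' actual fitting procedure — might look like:

```python
def fit_stress_index(t, p, b_grid=None):
    """Fit P(t) = a * t**b + c; return (b, a, c) with minimal squared error.
    b is the stress index: < 1 tidal recruitment, > 1 overdistension."""
    if b_grid is None:
        b_grid = [0.5 + 0.01 * i for i in range(101)]  # candidate exponents 0.5..1.5
    n = len(t)
    best = None
    for b in b_grid:
        x = [ti ** b for ti in t]          # with b fixed, the model is linear in (a, c)
        sx, sy = sum(x), sum(p)
        sxx = sum(v * v for v in x)
        sxy = sum(v * w for v, w in zip(x, p))
        denom = n * sxx - sx * sx
        if denom == 0:
            continue
        a = (n * sxy - sx * sy) / denom
        c = (sy - a * sx) / n
        sse = sum((a * xi + c - pi) ** 2 for xi, pi in zip(x, p))
        if best is None or sse < best[0]:
            best = (sse, b, a, c)
    return best[1], best[2], best[3]
```

On a synthetic curve generated with b = 1.2 the grid search recovers the exponent to within the grid resolution.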

  18. Experimental aspects of buoyancy correction in measuring reliable high-pressure excess adsorption isotherms using the gravimetric method

    Science.gov (United States)

    Nguyen, Huong Giang T.; Horn, Jarod C.; Thommes, Matthias; van Zee, Roger D.; Espinal, Laura

    2017-12-01

    Addressing reproducibility issues in adsorption measurements is critical to accelerating the path to discovery of new industrial adsorbents and to understanding adsorption processes. A National Institute of Standards and Technology Reference Material, RM 8852 (ammonium ZSM-5 zeolite), and two gravimetric instruments with asymmetric two-beam balances were used to measure high-pressure adsorption isotherms. This work demonstrates how common approaches to buoyancy correction, a key factor in obtaining the mass change due to surface excess gas uptake from the apparent mass change, can impact the adsorption isotherm data. Three different approaches to buoyancy correction were investigated and applied to the subcritical CO2 and supercritical N2 adsorption isotherms at 293 K. It was observed that measuring a collective volume for all balance components for the buoyancy correction (helium method) introduces an inherent bias in temperature partition when there is a temperature gradient (i.e. analysis temperature is not equal to instrument air bath temperature). We demonstrate that a blank subtraction is effective in mitigating the biases associated with temperature partitioning, instrument calibration, and the determined volumes of the balance components. In general, the manual and subtraction methods allow for better treatment of the temperature gradient during buoyancy correction. From the study, best practices specific to asymmetric two-beam balances and more general recommendations for measuring isotherms far from critical temperatures using gravimetric instruments are offered.
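The core arithmetic of the buoyancy correction discussed in this record is: surface-excess mass = apparent mass change + (gas density × displaced volume of sample, holder and balance components). A minimal sketch with an ideal-gas density (illustrative only; high-accuracy gravimetric work near the critical temperature uses a real equation of state, as the record stresses):

```python
R = 8.314462618  # J/(mol*K)

def gas_density(p_pa, t_k, molar_mass_kg_mol):
    """Ideal-gas bulk density in kg/m^3."""
    return p_pa * molar_mass_kg_mol / (R * t_k)

def surface_excess_mass(delta_m_apparent_kg, displaced_volume_m3,
                        p_pa, t_k, molar_mass_kg_mol):
    """Buoyancy-corrected excess uptake: apparent mass change plus the
    buoyant mass of gas displaced by sample, holder and balance parts."""
    rho = gas_density(p_pa, t_k, molar_mass_kg_mol)
    return delta_m_apparent_kg + rho * displaced_volume_m3
```

For N2 at 293 K and 1 bar the density is about 1.15 kg/m³, so even a small displaced volume can flip the sign of an apparent mass change — which is why the partitioning of volumes and temperatures matters.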

  19. A simple and reliable approach to docking protein-protein complexes from very sparse NOE-derived intermolecular distance restraints

    International Nuclear Information System (INIS)

    Tang, Chun; Clore, G. Marius

    2006-01-01

A simple and reliable approach for docking protein-protein complexes from very sparse NOE-derived intermolecular distance restraints (as few as three from a single point), in combination with a novel representation for an attractive potential between mapped interaction surfaces, is described. Unambiguous assignments of very sparse intermolecular NOEs are obtained using a reverse labeling strategy in which one of the components is fully deuterated with the exception of selective protonation of the δ-methyl groups of isoleucine, while the other component is uniformly 13C-labeled. This labeling strategy can be readily extended to selective protonation of Ala, Leu, Val or Met. The attractive potential is described by a 'reduced' radius of gyration potential applied specifically to a subset of interfacial residues (those with an accessible surface area ≥ 50% in the free proteins) that have been delineated by chemical shift perturbation. Docking is achieved by rigid body minimization on the basis of a target function comprising the sparse NOE distance restraints, a van der Waals repulsion potential and the 'reduced' radius of gyration potential. The method is demonstrated for two protein-protein complexes (EIN-HPr and IIA(Glc)-HPr) from the bacterial phosphotransferase system. In both cases, starting from 100 different random orientations of the X-ray structures of the free proteins, 100% convergence is achieved to a single cluster (with near identical atomic positions) with an overall backbone accuracy of ∼2 Å. The approach described is not limited to NMR, since interfaces can also be mapped by alanine scanning mutagenesis, and sparse intermolecular distance restraints can be derived from double cycle mutagenesis, cross-linking combined with mass spectrometry, or fluorescence energy transfer

  20. A simple and reliable approach to docking protein-protein complexes from very sparse NOE-derived intermolecular distance restraints

    Energy Technology Data Exchange (ETDEWEB)

    Tang, Chun; Clore, G. Marius [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States)], E-mail: mariusc@intra.niddk.nih.gov

    2006-09-15

A simple and reliable approach for docking protein-protein complexes from very sparse NOE-derived intermolecular distance restraints (as few as three from a single point), in combination with a novel representation for an attractive potential between mapped interaction surfaces, is described. Unambiguous assignments of very sparse intermolecular NOEs are obtained using a reverse labeling strategy in which one of the components is fully deuterated with the exception of selective protonation of the δ-methyl groups of isoleucine, while the other component is uniformly 13C-labeled. This labeling strategy can be readily extended to selective protonation of Ala, Leu, Val or Met. The attractive potential is described by a 'reduced' radius of gyration potential applied specifically to a subset of interfacial residues (those with an accessible surface area ≥ 50% in the free proteins) that have been delineated by chemical shift perturbation. Docking is achieved by rigid body minimization on the basis of a target function comprising the sparse NOE distance restraints, a van der Waals repulsion potential and the 'reduced' radius of gyration potential. The method is demonstrated for two protein-protein complexes (EIN-HPr and IIA(Glc)-HPr) from the bacterial phosphotransferase system. In both cases, starting from 100 different random orientations of the X-ray structures of the free proteins, 100% convergence is achieved to a single cluster (with near identical atomic positions) with an overall backbone accuracy of ∼2 Å. The approach described is not limited to NMR, since interfaces can also be mapped by alanine scanning mutagenesis, and sparse intermolecular distance restraints can be derived from double cycle mutagenesis, cross-linking combined with mass spectrometry, or fluorescence energy transfer.

  1. Improving the reliability of POD curves in NDI methods using a Bayesian inversion approach for uncertainty quantification

    Science.gov (United States)

    Ben Abdessalem, A.; Jenson, F.; Calmon, P.

    2016-02-01

This contribution provides an example of the possible advantages of adopting a Bayesian inversion approach to uncertainty quantification in nondestructive inspection methods. In such problems, the uncertainty associated with the random parameters is not always known and needs to be characterised from scattering signal measurements. The uncertainties may then be correctly propagated in order to determine a reliable probability of detection (POD) curve. To this end, we establish a general Bayesian framework based on a non-parametric maximum likelihood function formulation and priors from expert knowledge. However, the resulting inverse problem is time-consuming and computationally intensive. To cope with this difficulty, we replace the real model by a surrogate in order to speed up model evaluation and make the problem computationally feasible. Least squares support vector regression is adopted as the metamodelling technique due to its robustness in dealing with non-linear problems. We illustrate the usefulness of this methodology through the inspection of a tube with an enclosed defect using the ultrasonic inspection method.
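For orientation, a POD curve in its simplest form — the classical Berens "â versus a" signal-response model, not the Bayesian treatment proposed in this record — fits ln(â) = b0 + b1·ln(a) and converts the residual scatter into a detection probability against a decision threshold:

```python
import math

def berens_pod(a_vals, ahat_vals, a_threshold):
    """Classical Berens a-hat-versus-a POD model:
    ln(ahat) = b0 + b1*ln(a) + eps, eps ~ N(0, tau^2);
    POD(a) = Phi((b0 + b1*ln(a) - ln(a_threshold)) / tau)."""
    x = [math.log(a) for a in a_vals]
    y = [math.log(ah) for ah in ahat_vals]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    b0 = my - b1 * mx
    resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
    tau = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    def pod(a):
        z = (b0 + b1 * math.log(a) - math.log(a_threshold)) / tau
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return pod
```

The Bayesian inversion in the record effectively replaces these point estimates of (b0, b1, tau) with posterior distributions, yielding credible bands on the POD curve.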

  2. Human reliability analysis approach to level 1 PSA - shutdown and low power operation of Mochovce NPP, Unit 1, Slovakia

    International Nuclear Information System (INIS)

    Stojka, Tibor; Holy, Jaroslav

    2003-01-01

The paper presents the general approach, the methods used and the form of documentation of the results as applied within the Human Reliability Analysis (HRA) task of the shutdown and low power PSA (SPSA) study for the Mochovce nuclear power plant, Unit 1, Slovakia. The paper describes the main goals of the HRA task within the SPSA project, the applied methods and the data sources. Basic steps of the HRA task and the human error (HE) classification are also specified in its first part. The main part of the paper deals with pre-initiator human errors, human-induced initiators and response-to-initiator human errors. Since the expert judgment method SLIM was used to assess the probability of the last type of human error, the related activities are also described, including preparatory work (selection of performance shaping factors (PSFs), development of PSF classification tables, preparation of aid tools for interviews with plant experts), the qualitative analysis (sources of information and basic steps) and the quantitative analysis itself (classification of human errors for final quantification, including the criteria used for the classification, the structure of the spreadsheet used for quantification, and the treatment of dependencies). The last part of the paper describes the form of documentation of the final results and provides some findings. (author)
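The SLIM method mentioned here reduces, numerically, to two steps: form a success likelihood index (SLI) as a weighted sum of performance-shaping-factor ratings, then calibrate log10(HEP) = a·SLI + b from two anchor tasks with known human error probabilities. A minimal sketch with invented weights and anchors (not the study's values):

```python
import math

def success_likelihood_index(weights, ratings):
    """SLI: weighted sum of PSF ratings with normalized weights."""
    total = sum(weights)
    return sum(w * r for w, r in zip(weights, ratings)) / total

def slim_hep(sli, anchors):
    """Calibrate log10(HEP) = a*SLI + b from two anchor tasks
    (sli, hep), then convert an SLI to a human error probability."""
    (s1, h1), (s2, h2) = anchors
    a = (math.log10(h1) - math.log10(h2)) / (s1 - s2)
    b = math.log10(h1) - a * s1
    return 10.0 ** (a * sli + b)
```

With anchors HEP = 0.1 at SLI = 0 and HEP = 0.001 at SLI = 10, an SLI of 5 calibrates to an HEP of 0.01.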

  3. Port-of-entry safety via the reliability optimization of container inspection strategy through an evolutionary approach

    International Nuclear Information System (INIS)

    Ramirez-Marquez, Jose Emmanuel

    2008-01-01

Up to now, roughly 2% to 5% of all containers received at US ports are scrutinized to determine whether they could pose some type of danger or contain suspicious goods. Recently, concerns have been raised regarding the type of attack that could happen via container cargo, leading to devastating economic, psychological and sociological effects. Overall, this paper is concerned with developing an inspection strategy that minimizes the total cost of inspection while maintaining a user-specified detection rate for 'suspicious' containers. In this respect, a general model for describing an inspection strategy is proposed. The strategy is regarded as an (n+1)-echelon decision tree where at each echelon a decision has to be taken regarding which sensor to use, if any. Second, based on the general decision-tree model, this paper presents a minimum-cost container inspection strategy that conforms to a pre-specified user detection rate under the assumption that different sensors with different reliability and cost characteristics can be used. To generate an optimal inspection strategy, an evolutionary optimization approach known as the probabilistic solution discovery algorithm is used.
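The cost-versus-detection trade-off at the heart of this record can be illustrated with a deliberately simplified series-inspection model (a toy stand-in for the paper's (n+1)-echelon decision tree and evolutionary search): a container moves to the next sensor only if the current one alarms, and small instances can be solved by brute force rather than the probabilistic solution discovery algorithm:

```python
from itertools import permutations

def best_sequence(sensors, target_detection):
    """sensors: list of (cost, detection_prob) for a truly suspicious container.
    Series rule: a container is passed to the next sensor only if the current
    one alarms, and is flagged only if every sensor in the sequence alarms.
    Returns (expected_cost, sequence) meeting the detection target at minimum
    expected cost, or None if no sequence meets it."""
    best = None
    for r in range(1, len(sensors) + 1):
        for seq in permutations(sensors, r):
            cost = 0.0
            reach = 1.0  # probability the container reaches this sensor
            for c, d in seq:
                cost += reach * c
                reach *= d
            detect = reach
            if detect >= target_detection and (best is None or cost < best[0]):
                best = (cost, seq)
    return best
```

With a cheap 90%-detection sensor and an expensive 99% one, the cheap sensor alone is optimal for an 85% target, while a 95% target forces the expensive sensor; no series of the two reaches 99.9%.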

  4. An innovative approach for planning and execution of pre-experimental runs for Design of Experiments

    Directory of Open Access Journals (Sweden)

    Muhammad Arsalan Farooq

    2016-09-01

Full Text Available This paper addresses the study of the pre-experimental planning phase of Design of Experiments (DoE) in order to improve final product quality. The pre-experimental planning phase includes a clear identification of the problem statement and the selection of control factors and their respective levels and ranges. To improve production quality based on DoE, a new approach for the pre-experimental planning phase, called the Non-Conformity Matrix (NCM), is presented. This article also addresses the key steps of the pre-experimental runs considering a consumer goods manufacturing process. Results of the application to an industrial case show that this methodology can support a clear definition of the problem and a correct identification of the factor ranges in particular situations. The proposed new approach allows modeling the entire manufacturing system holistically and correctly defining the factor ranges and respective levels for a more effective application of DoE. This new approach can be a useful resource for both researchers and industrial practitioners who are dedicated to large DoE projects with unknown factor interactions, when the operational levels and ranges are not completely defined.

  5. Study types and reliability of Real World Evidence compared with experimental evidence used in Polish reimbursement decision-making processes.

    Science.gov (United States)

    Wilk, N; Wierzbicka, N; Skrzekowska-Baran, I; Moćko, P; Tomassy, J; Kloc, K

    2017-04-01

The aim of this study was to identify the relationship between, and impact of, Real World Evidence (RWE) and experimental evidence (EE) in Polish decision-making processes for drugs from selected Anatomical Therapeutic Chemical (ATC) groups. Descriptive study. A detailed analysis was performed for 58 processes from five ATC code groups in which RWE for effectiveness, or effectiveness and safety, was cited in documents of the Agency for Health Technology Assessment and Tariff System (AOTMiT) published between January 2012 and September 2015: Verification Analyses of AOTMiT, Statements of the Transparency Council of AOTMiT, and Recommendations of the President of AOTMiT. In 62% of the cases, RWE supported the EE and confirmed its main conclusions. The majority of studies in the EE group were RCTs (97%), while the RWE group consisted mainly of cohort studies (89%). There were more studies without a control group within the RWE group compared with the EE group (10% vs 1%). Our results showed that EE is more often assessed by AOTMiT using the Jadad, NICE or NOS scale than RWE (93% vs 48%). When the best evidence within a given decision-making process is analysed, half of RWE and two-thirds of EE are considered high-quality evidence. RWE plays an important role in the decision-making processes on public funding of drugs in Poland, contributing nearly half (45%) of all the evidence considered. In some processes the proportion of RWE was dominant, and in one process RWE was the only evidence presented. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  6. WWER-440 reactor thermal power increase. Up-to-date approaches to substantiation of the core heat-engineering reliability

    International Nuclear Information System (INIS)

    Vasilchenko, I.; Lushin, V.; Zubtsov, D.

    2006-01-01

Increasing unit power is a pressing task for nuclear power plants with WWER-440 reactors, and improved fuel assembly designs and calculation codes create the prerequisites for accomplishing it. Core power peaking is reduced by using profiled fuel assemblies, burnable absorbers integrated into the fuel, fuel assemblies with a modernized interface attachment, and modern calculation codes that reduce the conservatism of the reactor plant safety substantiation. A wide spectrum of experimental studies of the behaviour of fuel that had reached burn-ups of 50-60 MW·days/kg U under transient and accident conditions was carried out; post-irradiation examinations of fuel assemblies, fuel rods and fuel pellets after four- and five-year operating fuel cycles were also performed, confirming the high reliability of the fuel and the large margins in the fuel stack state, which supports reactor operation at increased power. The results of the experiments on implementing five- and six-year fuel cycles show that the limiting fuel state with respect to serviceability in WWER-440 reactors is far from being reached. There is already experience of increased-power operation at Kola NPP, Units 1, 2 and 4 and Rovno NPP, Unit 2, and the Loviisa NPP units are operated at 109% power. Russian experts have also gained experience in substantiating core operation at 108% power for Paks NPP, Unit 4. In this paper the additional conditions for increasing the power of Kola NPP, Units 1 and 2, and the main results of the substantiation of the power increase of Paks NPP, Unit 4, up to 1485 MW are presented in detail

  7. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book covers reliability engineering: the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; CFR and the exponential distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and failure analysis by FTA.
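
    The distribution families listed above can be sketched in a few lines; the failure rate and time values below are illustrative only, not taken from the book:

```python
import math

def reliability_exponential(t, lam):
    """Constant failure rate (CFR): R(t) = exp(-lambda*t)."""
    return math.exp(-lam * t)

def reliability_weibull(t, beta, eta):
    """Weibull reliability R(t) = exp(-(t/eta)**beta).
    beta < 1: infant mortality (DFR); beta = 1: CFR; beta > 1: wear-out (IFR)."""
    return math.exp(-((t / eta) ** beta))

lam = 1e-4                                   # failures per hour (assumed)
print(1 / lam)                               # MTBF of a CFR item is 1/lambda: 10000.0 h
print(reliability_exponential(10000, lam))   # exp(-1), about 0.368
print(reliability_weibull(5000, 1.0, 10000)) # beta = 1 reduces to the exponential case
```

    Note how the Weibull shape parameter interpolates between the infant-mortality, constant-rate, and wear-out regimes the book treats separately.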

  8. Interaction of CREDO [Centralized Reliability Data Organization] with the EBR-II [Experimental Breeder Reactor II] PRA [probabilistic risk assessment] development

    International Nuclear Information System (INIS)

    Smith, M.S.; Ragland, W.A.

    1989-01-01

    The National Academy of Sciences review of US Department of Energy (DOE) class 1 reactors recommended that the Experimental Breeder Reactor II (EBR-II), operated by Argonne National Laboratory (ANL), develop a level 1 probabilistic risk assessment (PRA) and make provisions for level 2 and level 3 PRAs based on the results of the level 1 PRA. The PRA analysis group at ANL will utilize the Centralized Reliability Data Organization (CREDO) at Oak Ridge National Laboratory to support the PRA data needs. CREDO contains many years of empirical liquid-metal reactor component data from EBR-II. CREDO is a mutual data- and cost-sharing system sponsored by DOE and the Power Reactor and Nuclear Fuels Development Corporation of Japan. CREDO is a component based data system; data are collected on components that are liquid-metal specific, associated with a liquid-metal environment, contained in systems that interface with liquid-metal environments, or are safety related for use in reliability/availability/maintainability (RAM) analyses of advanced reactors. The links between the EBR-II PRA development effort and the CREDO data collection at EBR-II extend beyond the sharing of data. The PRA provides a measure of the relative contribution to risk of the various components. This information can be used to prioritize future CREDO data collection activities at EBR-II and other sites

  9. Study of Photovoltaic Energy Storage by Supercapacitors through Both Experimental and Modelling Approaches

    Directory of Open Access Journals (Sweden)

    Pierre-Olivier Logerais

    2013-01-01

    Full Text Available The storage of photovoltaic energy by supercapacitors is studied using two approaches. An overview of the integration of supercapacitors in solar energy conversion systems is provided beforehand. First, an experimental setup for the charge/discharge of supercapacitors fed by a photovoltaic array was built and operated with fine data acquisition. The second approach consists of simulating photovoltaic energy storage by supercapacitors with a faithful and accessible model composed of solar irradiance evaluation, an equivalent electrical circuit for photovoltaic conversion, and a multibranch circuit for the supercapacitor. The experimental and calculated results are compared, and an error of 1% on the stored energy is found once the transmission-line capacitance is corrected for temperature, the correction remaining well within ±10%.
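
    As a hedged illustration of the electrical modelling idea, a single RC branch (rather than the paper's multibranch circuit) charged at constant current can be coded directly; all component values are invented:

```python
C = 100.0    # farads (assumed supercapacitor capacitance)
R = 0.02     # ohms, equivalent series resistance (assumed)
I = 10.0     # amperes from the photovoltaic array (illustrative)

def terminal_voltage(t):
    """Constant-current charging from 0 V: v(t) = I*R + I*t/C."""
    return I * R + I * t / C

def stored_energy(t):
    """Energy in the capacitance (the ESR drop stores nothing): E = 0.5*C*vc^2."""
    vc = I * t / C
    return 0.5 * C * vc * vc

t = 120.0   # seconds of charging
print(terminal_voltage(t), stored_energy(t))   # 12.2 V, 7200.0 J
```

    A multibranch model refines this by adding parallel RC branches with different time constants and a voltage-dependent capacitance, which is what makes the ±10% temperature correction mentioned above matter.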

  10. Comparing Conventional Bank Credit Vis A Vis Shariah Bank Musharakah: Experimental Economic Approach

    Directory of Open Access Journals (Sweden)

    Muhamad Abduh

    2008-01-01

    Full Text Available The Central Bank of Indonesia, with its dual banking system (i.e. Shariah and conventional banks), keeps developing a system considered an answer to generating national economic growth. One of the banking activities emphasized by the Central Bank of Indonesia is fund distribution, through either conventional bank credit or shariah bank financing. Using an experimental economic approach based on Induced Value Theory and employing ANOVA, this paper finds that the shariah bank musharakah financing system offers a higher profit opportunity than the conventional credit system. One main reason is that musharakah financing in shariah banks applies a profit and loss sharing (PLS) scheme, so that it does not burden the customer when profit is low. Keywords: Credit Loan, Musharakah Financing, Induced Value Theory, Experimental Economic Approach, Analysis of Variance (ANOVA).

  11. Experimental design techniques in statistical practice a practical software-based approach

    CERN Document Server

    Gardiner, W P

    1998-01-01

    Provides an introduction to the diverse subject area of experimental design, with many practical and applicable exercises to help the reader understand, present and analyse the data. The pragmatic approach offers technical training in the use of designs, and teaches statistical and non-statistical skills in the design and analysis of project studies throughout science and industry. Discusses one-factor designs and blocking designs, factorial experimental designs, Taguchi methods and response surface methods, among other topics.

  12. Análisis de confiabilidad y riesgo de una instalación experimental para el tratamiento de aguas residuales//Reliability and risk analysis of an experimental set-up for wastewater treatment

    Directory of Open Access Journals (Sweden)

    María Adelfa Abreu‐Zamora

    2014-01-01

    Full Text Available Una de las exigencias modernas para el uso de equipos en todas las ramas de la economía, la ciencia y la educación es su explotación segura. En este trabajo se realizó el análisis de confiabilidad y riesgo de una instalación experimental para el tratamiento de aguas residuales con radiación ultravioleta. Se empleó la técnica del árbol de fallos y se analizaron dos variantes de cálculo. La primera variante consideró fuentes no confiables de suministro de energía eléctrica y la segunda consideró la existencia de fuentes confiables. Como resultado se identificaron 20 conjuntos mínimos de corte, 12 de primer orden y 8 de tercer orden. Además, se infirió la necesidad de contar con una fuente alternativa de electricidad y que es importante establecer redundancia de grupos de componentes para instalaciones a escala industrial. El análisis demostró que la instalación es segura para su uso en la experimentación a escala de laboratorio. Palabras claves: confiabilidad, riesgo, instalación experimental, tratamiento de aguas residuales, radiación ultravioleta, árbol de fallos.______________________________________________________________________________AbstractOne of the modern requirements for using equipment in all areas of the economy, science and education is its safe operation. In this work, a reliability and risk analysis of an experimental set-up for wastewater treatment with ultraviolet radiation was carried out. The fault tree technique was used and two variants of calculation were analyzed. The first variant considered unreliable sources of electricity supply and the second considered the existence of reliable sources. As a result, 20 minimal cut sets were identified: 12 of first order and 8 of third order. In addition, the need for an alternative electrical power source was inferred, and it is important to establish redundancy of component groups for industrial-scale facilities. The analysis demonstrated that the set-up is safe for use in laboratory-scale experimentation.
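
    The way fault tree results like these translate into a top-event probability can be sketched with the rare-event approximation; the component names and failure probabilities below are invented for illustration, not taken from the study:

```python
def cutset_probability(cut, p):
    """Probability that every basic event in one minimal cut set occurs."""
    prob = 1.0
    for comp in cut:
        prob *= p[comp]
    return prob

def top_event_upper_bound(cut_sets, p):
    """Rare-event approximation: sum of minimal cut set probabilities
    (an upper bound on the exact top-event probability)."""
    return sum(cutset_probability(c, p) for c in cut_sets)

# Invented failure probabilities; one first-order and one third-order
# minimal cut set, echoing the orders reported in the study.
p = {"grid": 1e-2, "lamp": 5e-3, "pump": 2e-3, "valve": 1e-3}
cut_sets = [("grid",), ("lamp", "pump", "valve")]
print(top_event_upper_bound(cut_sets, p))   # dominated by the first-order set
```

    The numeric dominance of first-order cut sets is exactly why the study singles out the electricity supply for redundancy.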

  13. Treatment of secondary burn wound progression in contact burns-a systematic review of experimental approaches.

    Science.gov (United States)

    Schmauss, Daniel; Rezaeian, Farid; Finck, Tom; Machens, Hans-Guenther; Wettstein, Reto; Harder, Yves

    2015-01-01

    After a burn injury, superficial partial-thickness burn wounds may progress to deep partial-thickness or full-thickness burn wounds if left untreated. This phenomenon is called secondary burn wound progression or conversion. Burn wound depth is an important determinant of patient morbidity and mortality. Therefore, reducing or even preventing secondary burn wound progression is one goal of the acute care of burned patients. The objective of this study was to review preclinical approaches evaluating therapies to reduce burn wound progression. A systematic review of experimental approaches in animals that aim at reducing or preventing secondary burn wound progression was performed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The selected references consist of all the peer-reviewed studies performed in vivo in animals, and review articles, published in English, German, Italian, Spanish, or French and relevant to the topic of secondary burn wound progression. We searched MEDLINE, the Cochrane Library, and Google Scholar, including all the articles published from the beginning of the databases to the present. The search was conducted between May 3, 2012 and December 26, 2013. We included 29 experimental studies in this review, investigating agents that maintain or increase local perfusion, as well as agents with anti-coagulatory, anti-inflammatory, or anti-apoptotic properties. Warm water, simvastatin, EPO, or cerium nitrate may represent particularly promising approaches for translation into clinical use in the near future. This review presents promising experimental approaches that might reduce secondary burn wound progression. Nevertheless, translation into clinical application will need to confirm the results compiled in experimental animal studies.

  14. Wave Energy Converters : An experimental approach to onshore testing, deployments and offshore monitoring

    OpenAIRE

    Ulvgård, Liselotte

    2017-01-01

    The wave energy converter (WEC) concept developed at Uppsala University consists of a point absorbing buoy, directly connected to a permanent magnet linear generator. Since 2006, over a dozen full scale WECs have been deployed at the Lysekil Research Site, on the west coast of Sweden. Beyond the development of the WEC concept itself, the full scale approach enables, and requires, experimental and multidisciplinary research within several peripheral areas, such as instrumentation, offshore ope...

  15. An experimental approach to improve the basin type solar still using an integrated natural circulation loop

    International Nuclear Information System (INIS)

    Rahmani, Ahmed; Boutriaa, Abdelouahab; Hadef, Amar

    2015-01-01

    Highlights: • A new experimental approach to improve the performance of the conventional solar still is proposed. • A passive natural circulation loop is integrated into the conventional solar still. • Natural circulation of humid air in a closed loop is studied. • The capability of natural circulation to drive air convection in the still was demonstrated. • The air convection created inside the still increases evaporative heat and mass transfer. - Abstract: In this paper, a new experimental approach is proposed to enhance the performance of the conventional solar still using the natural circulation effect inside the still. The idea consists of generating air flow by means of a rectangular natural circulation loop appended to the rear side of the still. The proposed still was tested during the summer period, and the experimental data presented in this paper concern four typical days. The convective heat transfer coefficient is evaluated and compared with Dunkle's model. The comparison shows that convective heat transfer is considerably improved by the air convection created inside the still. The natural circulation phenomenon in the still is studied and good agreement between the experimental data and Vijayan's laminar correlation is found. Natural circulation is thus found to have a beneficial effect on still performance: the daily productivity is 3.72 kg/m² and the maximum efficiency is 45.15%
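
    For reference, the Dunkle model used as the comparison baseline can be coded directly. The correlation and saturation-pressure fit below are the forms commonly quoted in the solar-still literature, assumed here rather than taken from this paper; temperatures are in °C and pressures in Pa:

```python
import math

def p_sat(T):
    """Saturation vapour pressure (Pa) at temperature T (deg C),
    using the exponential fit commonly paired with Dunkle's model."""
    return math.exp(25.317 - 5144.0 / (T + 273.0))

def dunkle_hc(Tw, Tg):
    """Dunkle's convective heat transfer coefficient (W/m^2.K) between the
    water surface at Tw and the glass cover at Tg."""
    Pw, Pg = p_sat(Tw), p_sat(Tg)
    return 0.884 * ((Tw - Tg)
                    + (Pw - Pg) * (Tw + 273.0) / (268.9e3 - Pw)) ** (1.0 / 3.0)

print(round(dunkle_hc(60.0, 40.0), 2))   # a few W/m^2.K, typical for passive stills
```

    The vapour-pressure difference term is what distinguishes Dunkle's correlation from dry free convection, and it is this term that forced air circulation enhances.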

  16. An enhanced unified uncertainty analysis approach based on first order reliability method with single-level optimization

    International Nuclear Information System (INIS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; Tooren, Michel van

    2013-01-01

    In engineering, there exist both aleatory uncertainties due to the inherent variation of the physical system and its operational environment, and epistemic uncertainties due to lack of knowledge, which can be reduced with the collection of more data. To analyze the uncertain distribution of the system performance under both aleatory and epistemic uncertainties, combined probability and evidence theory can be employed to quantify the compound effects of the mixed uncertainties. The existing First Order Reliability Method (FORM) based Unified Uncertainty Analysis (UUA) approach nests the optimization-based interval analysis in the improved Hasofer–Lind–Rackwitz–Fiessler (iHLRF) algorithm based Most Probable Point (MPP) searching procedure, which is computationally prohibitive for complex systems and may encounter convergence problems as well. Therefore, in this paper it is proposed to use general optimization solvers to search for the MPP in the outer loop and then reformulate the double-loop optimization problem into an equivalent single-level optimization (SLO) problem, so as to simplify the uncertainty analysis process, improve the robustness of the algorithm, and alleviate the computational complexity. The effectiveness and efficiency of the proposed method are demonstrated with two numerical examples and one practical satellite conceptual design problem. -- Highlights: ► Uncertainty analysis under mixed aleatory and epistemic uncertainties is studied. ► A unified uncertainty analysis method is proposed with combined probability and evidence theory. ► The traditional nested analysis method is converted to single-level optimization for efficiency. ► The effectiveness and efficiency of the proposed method are demonstrated with three examples
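
    The MPP search at the heart of FORM can be sketched with the basic HLRF iteration (the iHLRF variant mentioned above adds a merit-function line search). The limit state below is an invented linear example with a known closed-form answer, used only to show the mechanics:

```python
import math

def hlrf(g, grad, u0, tol=1e-8, max_iter=100):
    """Basic Hasofer-Lind-Rackwitz-Fiessler iteration in standard normal space.
    Returns the most probable point (MPP) and the reliability index beta."""
    u = list(u0)
    for _ in range(max_iter):
        gval = g(u)
        gv = grad(u)
        norm2 = sum(gi * gi for gi in gv)
        # HLRF update: project the origin onto the linearized limit state
        factor = (sum(gi * ui for gi, ui in zip(gv, u)) - gval) / norm2
        u_new = [factor * gi for gi in gv]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            u = u_new
            break
        u = u_new
    beta = math.sqrt(sum(ui * ui for ui in u))
    return u, beta

# Linear limit state g(u) = 3 + u1 + u2; the exact result is beta = 3/sqrt(2).
mpp, beta = hlrf(lambda u: 3.0 + u[0] + u[1], lambda u: [1.0, 1.0], [0.0, 0.0])
print(beta)   # ~2.1213
```

    The nested UUA approach criticized in the paper runs an interval-analysis optimization inside every such iteration, which is why collapsing the loops into a single-level problem pays off.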

  17. A practical approach for calculating reliable cost estimates from observational data: application to cost analyses in maternal and child health.

    Science.gov (United States)

    Salemi, Jason L; Comins, Meg M; Chandler, Kristen; Mogos, Mulubrhan F; Salihu, Hamisu M

    2013-08-01

    Comparative effectiveness research (CER) and cost-effectiveness analysis are valuable tools for informing health policy and clinical care decisions. Despite the increased availability of rich observational databases with economic measures, few researchers have the skills needed to conduct valid and reliable cost analyses for CER. The objectives of this paper are to (i) describe a practical approach for calculating cost estimates from hospital charges in discharge data using publicly available hospital cost reports, and (ii) assess the impact of using different methods for cost estimation in maternal and child health (MCH) studies by conducting economic analyses on gestational diabetes (GDM) and pre-pregnancy overweight/obesity. In Florida, we have constructed a clinically enhanced, longitudinal, encounter-level MCH database covering over 2.3 million infants (and their mothers) born alive from 1998 to 2009. Using this as a template, we describe a detailed methodology to use publicly available data to calculate hospital-wide and department-specific cost-to-charge ratios (CCRs), link them to the master database, and convert reported hospital charges to refined cost estimates. We then conduct an economic analysis as a case study on women by GDM and pre-pregnancy body mass index (BMI) status to compare the impact of using different methods on cost estimation. Over 60 % of inpatient charges for birth hospitalizations came from the nursery/labor/delivery units, which have very different cost-to-charge markups (CCR = 0.70) than the commonly substituted hospital average (CCR = 0.29). Using estimated mean, per-person maternal hospitalization costs for women with GDM as an example, unadjusted charges ($US14,696) grossly overestimated actual cost, compared with hospital-wide ($US3,498) and department-level ($US4,986) CCR adjustments. However, the refined cost estimation method, although more accurate, did not alter our conclusions that infant/maternal hospitalization costs
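
    The charge-to-cost conversion itself is a multiplication per department; a minimal sketch with invented charges and CCRs (the 0.70 and 0.29 ratios echo the magnitudes quoted above, the dollar amounts do not reproduce the paper's Florida data):

```python
def estimate_cost(charges_by_dept, ccr_by_dept, hospital_ccr=None):
    """Convert reported hospital charges to cost estimates.
    Department-level cost-to-charge ratios (CCRs) are used where available;
    otherwise the hospital-wide ratio is applied as a fallback."""
    total = 0.0
    for dept, charge in charges_by_dept.items():
        ccr = ccr_by_dept.get(dept, hospital_ccr)
        total += charge * ccr
    return total

# Hypothetical birth-hospitalization charges (illustrative values only)
charges = {"nursery_labor_delivery": 9000.0, "pharmacy": 3000.0}
dept_ccrs = {"nursery_labor_delivery": 0.70, "pharmacy": 0.25}
print(estimate_cost(charges, dept_ccrs))              # department-level: 7050.0
print(estimate_cost(charges, {}, hospital_ccr=0.29))  # hospital-wide: 3480.0
```

    The gap between the two printed totals mirrors the paper's point: substituting the hospital-wide ratio for high-markup departments materially distorts the cost estimate.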

  18. Study of Monte Carlo approach to experimental uncertainty propagation with MSTW 2008 PDFs

    CERN Document Server

    Watt, G.

    2012-01-01

    We investigate the Monte Carlo approach to propagation of experimental uncertainties within the context of the established 'MSTW 2008' global analysis of parton distribution functions (PDFs) of the proton at next-to-leading order in the strong coupling. We show that the Monte Carlo approach using replicas of the original data gives PDF uncertainties in good agreement with the usual Hessian approach using the standard Delta(chi^2) = 1 criterion, then we explore potential parameterisation bias by increasing the number of free parameters, concluding that any parameterisation bias is likely to be small, with the exception of the valence-quark distributions at low momentum fractions x. We motivate the need for a larger tolerance, Delta(chi^2) > 1, by making fits to restricted data sets and idealised consistent or inconsistent pseudodata. Instead of using data replicas, we alternatively produce PDF sets randomly distributed according to the covariance matrix of fit parameters including appropriate tolerance values,...
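
    The data-replica idea can be illustrated on a one-parameter toy fit (nothing like the full MSTW PDF machinery). For a linear model, the Monte Carlo replica spread and the Hessian Delta(chi^2) = 1 uncertainty should agree, as the paper finds; all data below are synthetic:

```python
import random, math

def fit_slope(xs, ys):
    """Least-squares slope for the zero-intercept model y = a*x."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

random.seed(0)
xs = [0.1 * i for i in range(1, 21)]
sigma = 0.05                      # assumed uniform measurement uncertainty
ys = [2.0 * x + random.gauss(0, sigma) for x in xs]   # pseudodata, true slope 2

a_central = fit_slope(xs, ys)

# Monte Carlo approach: smear the data by its uncertainty and refit each replica
replica_fits = []
for _ in range(2000):
    ys_rep = [y + random.gauss(0, sigma) for y in ys]
    replica_fits.append(fit_slope(xs, ys_rep))
mc_err = math.sqrt(sum((a - a_central) ** 2 for a in replica_fits) / len(replica_fits))

# Hessian approach: analytic uncertainty for this linear model
hessian_err = sigma / math.sqrt(sum(x * x for x in xs))
print(mc_err, hessian_err)   # the two estimates agree closely
```

    For a linear model the agreement is exact in expectation; the interesting question the paper addresses is how far it survives the nonlinear, many-parameter PDF fit.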

  19. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.

  20. Approach to Operational Experimental Estimation of Static Stresses of Elements of Mechanical Structures

    Science.gov (United States)

    Sedov, A. V.; Kalinchuk, V. V.; Bocharova, O. V.

    2018-01-01

    The evaluation of static stresses and the strength of units and components is a crucial task for increasing the operational reliability of vehicles and equipment and for preventing emergencies, especially in structures made of metal and composite materials. At the stage of creation and commissioning of structures, diagnostic methods are widely used to control the manufacturing quality of individual elements and components: acoustic, ultrasonic, X-ray, radiation and other methods. Using these methods to monitor the residual life and the degree of static stress of units and parts during operation is fraught with great difficulties, both methodological and instrumental. In this paper, the authors propose an effective approach for the operational monitoring of the degree of static stress of units and parts of mechanical structures in service, based on recording the change in the surface wave properties of a system consisting of a sensor and a monitored medium (unit, part). The proposed approach to the low-frequency diagnostics of static stresses relies on a new adaptive spectral analysis of a surface wave created by an external action (impact). This approach makes it possible to estimate implicit structural stresses experimentally.

  1. Improved microbial conversion of de-oiled Jatropha waste into biohydrogen via inoculum pretreatment: process optimization by experimental design approach

    Directory of Open Access Journals (Sweden)

    Gopalakrishnan Kumar

    2015-03-01

    Full Text Available In this study, various pretreatment methods for sewage sludge inoculum and the statistical process optimization of de-oiled jatropha waste are reported. A peak hydrogen production rate (HPR) and hydrogen yield (HY) of 0.36 L H2/L-d and 20 mL H2/g volatile solid (VS) were obtained when heat shock pretreatment (95 °C, 30 min) was employed. Afterwards, an experimental design was applied to find the optimal conditions for H2 production using the heat-pretreated seed culture. The optimal substrate concentration, pH and temperature were determined by response surface methodology as 205 g/L, 6.53 and 55.1 °C, respectively. Under these conditions, a peak HPR of 1.36 L H2/L-d was predicted. Verification tests proved the reliability of the statistical approach. As a result of the heat pretreatment and fermentation optimization, a significant (~4-fold) increase in HPR was achieved. PCR-DGGE results revealed that Clostridium sp. predominated under the optimal conditions.
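
    The response surface step can be illustrated with a one-factor quadratic fit and its stationary point; the concentration/HPR pairs below are invented for illustration and are not the study's measurements:

```python
import numpy as np

# Hypothetical single-factor response surface: HPR vs substrate concentration
conc = np.array([100.0, 150.0, 200.0, 250.0, 300.0])   # g/L (assumed design points)
hpr  = np.array([0.60, 1.05, 1.33, 1.30, 0.95])        # L H2/L-d (invented responses)

b2, b1, b0 = np.polyfit(conc, hpr, 2)    # fit y = b2*x^2 + b1*x + b0
x_opt = -b1 / (2.0 * b2)                 # stationary point of the fitted surface
y_opt = np.polyval([b2, b1, b0], x_opt)
print(round(x_opt, 1), round(y_opt, 2))  # predicted optimum and peak response
```

    The real study fits a second-order model in three factors (concentration, pH, temperature) simultaneously, but the optimum is located the same way: by setting the gradient of the fitted polynomial to zero.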

  2. Experimental and analytical combined thermal approach for local tribological understanding in metal cutting

    International Nuclear Information System (INIS)

    Artozoul, Julien; Lescalier, Christophe; Dudzinski, Daniel

    2015-01-01

    Metal cutting is a highly complex thermo-mechanical process. The knowledge of temperature in the chip forming zone is essential to understand it. Conventional experimental methods such as thermocouples only provide global information which is incompatible with the high stress and temperature gradients met in the chip forming zone. Field measurements are essential to understand the localized thermo-mechanical problem. An experimental protocol has been developed using advanced infrared imaging in order to measure temperature distribution in both the tool and the chip during an orthogonal or oblique cutting operation. It also provides several information on the chip formation process such as some geometrical characteristics (tool-chip contact length, chip thickness, primary shear angle) and thermo-mechanical information (heat flux dissipated in deformation zone, local interface heat partition ratio). A study is carried out on the effects of cutting conditions i.e. cutting speed, feed and depth of cut on the temperature distribution along the contact zone for an elementary operation. An analytical thermal model has been developed to process experimental data and access more information i.e. local stress or heat flux distribution. - Highlights: • A thermal analytical model is proposed for orthogonal cutting process. • IR thermography is used during cutting tests. • Combined experimental and modeling approaches are applied. • Heat flux and stress distribution at the tool-chip interface are determined. • The decomposition into sticking and sliding zones is defined.

  3. Experimental approach towards shell structure at 100Sn and 78Ni

    International Nuclear Information System (INIS)

    Grawe, H.; Gorska, M.; Fahlander, C.

    2000-07-01

    The status of the experimental approach to 100Sn and 78Ni is reviewed. Revised single-particle energies for neutrons are deduced for the N=Z=50 shell closure and evidence for low-lying Iπ = 2+ and 3− states is presented. Moderate E2 polarisation charges of 0.1 e and 0.6 e are found to reproduce the experimental data when core excitation of 100Sn is properly accounted for in the shell model. For the neutron-rich Ni region no conclusive evidence for an N=40 subshell is found, whereas firm evidence for the persistence of the N=50 shell at 78Ni is inferred from the existence of seniority isomers. The disappearance of this isomerism in the mid-νg9/2 shell is discussed. (orig.)

  4. Paul Baillon presents the book "Differential manifolds: a basic approach for experimental physicists" | 25 March

    CERN Multimedia

    CERN Library

    2014-01-01

    Tuesday 25 March 2014 at 4 p.m. in the Library, bldg. 52-1-052 "Differential manifolds: a basic approach for experimental physicists" by Paul Baillon,  World Scientific, 2013, ISBN 978-981-4449-56-4. Differential manifold is the framework of particle physics and astrophysics nowadays. It is important for all research physicists to be accustomed to it, and even experimental physicists should be able to manipulate equations and expressions in this framework. This book gives a comprehensive description of the basics of differential manifold with a full proof of elements. A large part of the book is devoted to the basic mathematical concepts, which are all necessary for the development of the differential manifold. This book is self-consistent; it starts from first principles. The mathematical framework is the set theory with its axioms and its formal logic. No special knowledge is needed. Coffee will be served from 3.30 p.m.

  5. Deformation behavior of dragonfly-inspired nodus structured wing in gliding flight through experimental visualization approach.

    Science.gov (United States)

    Zhang, Sheng; Sunami, Yuta; Hashimoto, Hiromu

    2018-04-10

    The dragonfly has excellent flight performance and maneuverability due to the complex vein structure of its wing. In this research, the nodus, an important structural element of the dragonfly wing, is investigated through an experimental visualization approach. Three vein structures were fabricated: an open-nodus structure, a closed-nodus structure (with a flex-limiter) and a rigid wing. The samples were tested in a wind tunnel with a high-speed camera to visualize the deformation of the wing structure, in order to study the function of the nodus-structured wing in gliding flight. According to the experimental results, the nodus has a great influence on the flexibility of the wing structure. Moreover, the closed-nodus wing (with a flex-limiter) enables the vein structure to be flexible without losing the strength and rigidity of the joint. These findings enhance the knowledge of insect-inspired nodus-structured wings and facilitate the application of Micro Air Vehicles (MAV) in gliding flight.

  6. Forming Limits in Sheet Metal Forming for Non-Proportional Loading Conditions - Experimental and Theoretical Approach

    International Nuclear Information System (INIS)

    Ofenheimer, Aldo; Buchmayr, Bruno; Kolleck, Ralf; Merklein, Marion

    2005-01-01

    The influence of strain paths (loading history) on material formability is well known in sheet forming processes. Sophisticated experimental methods are used to determine the entire shape of the strain paths of forming limits for the aluminum alloy AA6016-T4. Forming limits are presented for sheet metal in the as-received condition as well as after different pre-deformations. A theoretical approach based on Arrieux's intrinsic Forming Limit Stress Curve (FLSC) concept is employed to predict numerically the influence of loading history on forming severity. The detailed experimental strain paths are used in the theoretical study, instead of simplified linear or bilinear loading histories, to demonstrate the predictive quality of forming limits in the state of stress

  7. Experimental approaches for the development of gamma spectroscopy well logging system

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jehyun; Hwang, Seho; Kim, Jongman [Korea Institute of Geoscience and Mineral Resources (124 Gwahang-no, Yuseong-gu, Daejeon, Korea) (Korea, Republic of); Won, Byeongho [Heesong Geotek Co., Ltd (146-8 Sangdaewon-dong, Jungwon-gu, Seongnam-si, Gyeonggi-do, Korea) (Korea, Republic of)

    2015-03-10

    This article discusses experimental approaches for the development of a gamma spectroscopy well logging system. Considering the size of the borehole sonde, we customized 2 × 2 inch inorganic scintillators and a system including the high-voltage supply, preamplifier, amplifier and multichannel analyzer (MCA). A calibration chart was made from tests using standard radioactive sources, so that the measured count rates can be expressed as an energy spectrum. The optimum high-voltage supply and measurement parameters of each detector were established by experimental investigation. The responses of the scintillation detectors were also examined as a function of the distance between source and detector. Because gamma spectroscopy well logging requires a broad spectrum with high sensitivity and resolution, the energy resolution and sensitivity as a function of gamma-ray energy were investigated by analyzing the gamma-ray activities of the radioactive sources.
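
    A typical MCA calibration chart reduces to a channel-to-energy mapping. The sketch below uses the well-known 661.7 keV Cs-137 and 1332.5 keV Co-60 lines for a two-point linear calibration; the channel positions are assumed for illustration:

```python
# Two-point energy calibration for a multichannel analyzer (MCA).
# Gamma-line energies are standard; channel numbers are hypothetical.
ch1, E1 = 331.0, 661.7    # Cs-137 photopeak (keV)
ch2, E2 = 666.0, 1332.5   # Co-60 photopeak (keV)

gain = (E2 - E1) / (ch2 - ch1)    # keV per channel
offset = E1 - gain * ch1          # keV at channel 0

def energy(ch):
    """Map an MCA channel number to gamma-ray energy in keV."""
    return gain * ch + offset

print(round(energy(500), 1))   # energy of an arbitrary channel under this calibration
```

    In practice more lines are used and a low-order polynomial is fitted, but the principle — anchoring the channel axis to known source energies — is the same.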

  8. Development of an experimental approach to study coupled soil-plant-atmosphere processes using plant analogs

    Science.gov (United States)

    Trautz, Andrew C.; Illangasekare, Tissa H.; Rodriguez-Iturbe, Ignacio; Heck, Katharina; Helmig, Rainer

    2017-04-01

    The atmosphere, soils, and vegetation near the land-atmosphere interface are in a state of continuous dynamic interaction via a myriad of complex interrelated feedback processes which collectively, remain poorly understood. Studying the fundamental nature and dynamics of such processes in atmospheric, ecological, and/or hydrological contexts in the field setting presents many challenges; current experimental approaches are an important factor given a general lack of control and high measurement uncertainty. In an effort to address these issues and reduce overall complexity, new experimental design considerations (two-dimensional intermediate-scale coupled wind tunnel-synthetic aquifer testing using synthetic plants) for studying soil-plant-atmosphere continuum soil moisture dynamics are introduced and tested in this study. Validation of these experimental considerations, particularly the adoption of synthetic plants, is required prior to their application in future research. A comparison of three experiments with bare soil surfaces or transplanted with a Stargazer lily/limestone block was used to evaluate the feasibility of the proposed approaches. Results demonstrate that coupled wind tunnel-porous media experimentation, used to simulate field conditions, reduces complexity, and enhances control while allowing fine spatial-temporal resolution measurements to be made using state-of-the-art technologies. Synthetic plants further help reduce system complexity (e.g., airflow) while preserving the basic hydrodynamic functions of plants (e.g., water uptake and transpiration). The trends and distributions of key measured atmospheric and subsurface spatial and temporal variables (e.g., soil moisture, relative humidity, temperature, air velocity) were comparable, showing that synthetic plants can be used as simple, idealized, nonbiological analogs for living vegetation in fundamental hydrodynamic studies.

  9. A task analysis-linked approach for integrating the human factor in reliability assessments of nuclear power plants

    International Nuclear Information System (INIS)

    Ryan, T.G.

    1988-01-01

    This paper describes an emerging Task Analysis-Linked Evaluation Technique (TALENT) for assessing the contributions of human error to nuclear power plant systems unreliability and risk. Techniques such as TALENT are emerging from the recognition that human error is a primary contributor to plant risk, yet has remained a peripheral consideration in plant reliability evaluations. TALENT also recognizes that the involvement of persons with behavioral science expertise is required to support plant reliability and risk analyses. A number of state-of-knowledge human reliability analysis tools which support the TALENT process are also discussed. The core of TALENT is comprised of task, timeline and interface analysis data, which provide the technology base for event and fault tree development, serve as criteria for selecting and evaluating performance shaping factors, and provide a basis for auditing TALENT results. Finally, programs and case studies used to refine the TALENT process are described, along with future research needs in the area. (author)

  10. Feasibility Study of a Simulation Driven Approach for Estimating Reliability of Wind Turbine Fluid Power Pitch Systems

    DEFF Research Database (Denmark)

    Liniger, Jesper; Pedersen, Henrik Clemmensen; N. Soltani, Mohsen

    2018-01-01

    Recent field data indicate that pitch systems account for a substantial part of a wind turbine's downtime. Reducing downtime increases the total amount of energy produced during the turbine's lifetime. Both electrical and fluid power pitch systems are employed, with a roughly 50/50 distribution. Fluid power pitch systems generally show higher reliability and have been favored on larger offshore wind turbines. Still, general issues such as leakage, contamination and electrical faults make current systems work sub-optimally. Current field data for wind turbines present overall pitch system reliability and the reliability of component groups (valves, accumulators, pumps etc.). However, the failure modes of the components and, more importantly, the root causes are not evident. The root causes and failure mode probabilities are central for changing current pitch system designs and operational concepts to increase...

  11. A comparative study of the probabilistic fracture mechanics and the stochastic Markovian process approaches for structural reliability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Stavrakakis, G.; Lucia, A.C.; Solomos, G. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1990-01-01

    The two computer codes COVASTOL and RELIEF, developed for the modeling of cumulative damage processes in the framework of probabilistic structural reliability, are compared. They are based, respectively, on the randomisation of a differential crack growth law and on the theory of discrete Markov processes. The codes are applied for fatigue crack growth predictions using two sets of crack propagation curves measured on specimens. The results are critically analyzed and an extensive discussion follows on the merits and limitations of each code. Their transferability for the reliability assessment of real structures is investigated. (author).

  12. Impact of Climate Change on Natural Snow Reliability, Snowmaking Capacities, and Wind Conditions of Ski Resorts in Northeast Turkey: A Dynamical Downscaling Approach

    Directory of Open Access Journals (Sweden)

    Osman Cenk Demiroglu

    2016-04-01

    Many ski resorts worldwide are going through deteriorating snow cover conditions due to anthropogenic warming trends. As the natural and the artificially supported, i.e., technical, snow reliability of ski resorts diminish, the industry approaches a deadlock. For this reason, impact assessment studies have become vital for understanding the vulnerability of ski tourism. This study considers three resorts at one of the rapidly emerging ski destinations, Northeast Turkey, for snow reliability analyses. Initially, one global circulation model is dynamically downscaled by using the regional climate model RegCM4.4 for the 1971–2000 and 2021–2050 periods along the RCP4.5 greenhouse gas concentration pathway. Next, the projected climate outputs are converted into indicators of natural snow reliability, snowmaking capacity, and wind conditions. The results show an overall decline in the frequencies of naturally snow-reliable days and snowmaking capacities between the two periods. Despite the decrease, only the lower altitudes of one ski resort would face the risk of losing natural snow reliability, and snowmaking could still compensate for forming the base layer before the critical New Year’s week. On the other hand, adverse high wind conditions improve so as to reduce the number of lift closure days at all resorts. Overall, this particular region seems to be relatively resilient against climate change.

  13. An experimental approach to validating a theory of human error in complex systems

    Science.gov (United States)

    Morris, N. M.; Rouse, W. B.

    1985-01-01

    The problem of 'human error' is pervasive in engineering systems in which the human is involved. In contrast to the common engineering approach of dealing with error probabilistically, the present research seeks to alleviate problems associated with error by gaining a greater understanding of causes and contributing factors from a human information processing perspective. The general approach involves identifying conditions which are hypothesized to contribute to errors, and experimentally creating the conditions in order to verify the hypotheses. The conceptual framework which serves as the basis for this research is discussed briefly, followed by a description of upcoming research. Finally, the potential relevance of this research to design, training, and aiding issues is discussed.

  14. Advances in the indirect, descriptive, and experimental approaches to the functional analysis of problem behavior.

    Science.gov (United States)

    Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier

    2014-05-01

    Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.

  15. A Statistical Approach for Selecting Buildings for Experimental Measurement of HVAC Needs

    Directory of Open Access Journals (Sweden)

    Malinowski Paweł

    2017-03-01

    This article presents a statistical methodology for selecting representative buildings for experimentally evaluating the performance of HVAC systems, especially in terms of energy consumption. The proposed approach is based on the k-means method. The algorithm for this method is conceptually simple, allowing it to be easily implemented. The method can be applied to large quantities of data with unknown distributions. The method was tested using numerical experiments to determine the hourly, daily, and yearly heat values and the domestic hot water demands of residential buildings in Poland. Due to its simplicity, the proposed approach is very promising for use in engineering applications and is applicable to testing the performance of many HVAC systems.
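    As an illustration of the selection step described in this record (a generic sketch, not the authors' implementation; the function names and the toy feature matrix are invented here), buildings can be clustered with plain k-means and the "representative" of each cluster taken as the building nearest its centroid:

    ```python
    import numpy as np

    def kmeans(X, k, iters=100, seed=0):
        """Plain k-means: returns (centroids, per-point cluster labels)."""
        rng = np.random.default_rng(seed)
        centroids = X[rng.choice(len(X), size=k, replace=False)]
        labels = np.zeros(len(X), dtype=int)
        for _ in range(iters):
            # assign each point to its nearest centroid
            dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centroids[j] for j in range(k)])
            if np.allclose(new, centroids):
                break
            centroids = new
        return centroids, labels

    def representatives(X, centroids, labels):
        """For each cluster, the index of the building nearest its centroid."""
        reps = []
        for j in range(len(centroids)):
            members = np.where(labels == j)[0]
            dists = np.linalg.norm(X[members] - centroids[j], axis=1)
            reps.append(int(members[dists.argmin()]))
        return reps
    ```

    Here each row of `X` would hold normalized features of one building (e.g., hourly/daily/yearly heat and hot-water demands); the selected representatives are then the candidates for experimental measurement.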

  16. ESTIMATION OF PARAMETERS AND RELIABILITY FUNCTION OF EXPONENTIATED EXPONENTIAL DISTRIBUTION: BAYESIAN APPROACH UNDER GENERAL ENTROPY LOSS FUNCTION

    Directory of Open Access Journals (Sweden)

    Sanjay Kumar Singh

    2011-06-01

    In this paper we propose Bayes estimators of the parameters of the Exponentiated Exponential distribution and its reliability function under the General Entropy loss function for Type II censored samples. The proposed estimators have been compared with the corresponding Bayes estimators obtained under the Squared Error loss function and with maximum likelihood estimators in terms of their simulated risks (average loss over the sample space).
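    For readers unfamiliar with the loss functions compared in this record: under the general entropy loss L(d, θ) ∝ (d/θ)^c − c·ln(d/θ) − 1, the Bayes estimator is the standard result d* = (E[θ^(−c) | data])^(−1/c), while under squared error loss it is the posterior mean. Both are easily approximated from posterior draws; the sketch below (function names invented here) is generic, not the paper's estimator for the Exponentiated Exponential model:

    ```python
    import numpy as np

    def bayes_general_entropy(theta_draws, c):
        """Bayes estimator under general entropy loss from posterior draws:
        d* = (E[theta^-c])^(-1/c).  c > 0 penalizes overestimation more
        heavily; c < 0 penalizes underestimation."""
        theta = np.asarray(theta_draws, dtype=float)
        return float(np.mean(theta ** (-c)) ** (-1.0 / c))

    def bayes_squared_error(theta_draws):
        """Bayes estimator under squared error loss: the posterior mean."""
        return float(np.mean(theta_draws))
    ```

    For c = 1 the general entropy estimator reduces to the harmonic mean of the posterior draws, which is always below the posterior mean — the usual conservative shift for reliability quantities.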

  17. Computational-experimental approach to drug-target interaction mapping: A case study on kinase inhibitors.

    Directory of Open Access Journals (Sweden)

    Anna Cichonska

    2017-08-01

    Due to the relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process in terms of predicting the most potent compound-target interactions for subsequent verification. However, most of the model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to a more challenging task of predicting target interactions for a new candidate drug compound that lacks prior binding profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with a currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications. These results demonstrate that the kernel
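    The record names a "kernel-based regression algorithm" as the prediction model. As a hedged illustration only — generic kernel ridge regression with an RBF kernel, not the authors' exact model or features — affinity prediction from compound/target descriptors can be sketched as:

    ```python
    import numpy as np

    def rbf_kernel(A, B, gamma=1.0):
        """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-gamma * d2)

    def krr_fit(X, y, lam=1e-3, gamma=1.0):
        """Solve (K + lam*I) alpha = y for the dual coefficients."""
        K = rbf_kernel(X, X, gamma)
        return np.linalg.solve(K + lam * np.eye(len(X)), y)

    def krr_predict(X_train, alpha, X_new, gamma=1.0):
        """Predicted affinities at new points: k(X_new, X_train) @ alpha."""
        return rbf_kernel(X_new, X_train, gamma) @ alpha
    ```

    In the drug-target setting, each row of `X` would encode a compound-kinase pair (e.g., concatenated chemical and protein descriptors) and `y` the measured binding affinity; pairs with high predicted affinity are then prioritized for laboratory verification.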

  18. Tau-U: A Quantitative Approach for Analysis of Single-Case Experimental Data in Aphasia.

    Science.gov (United States)

    Lee, Jaime B; Cherney, Leora R

    2018-03-01

    Tau-U is a quantitative approach for analyzing single-case experimental design (SCED) data. It combines nonoverlap between phases with intervention phase trend and can correct for a baseline trend (Parker, Vannest, & Davis, 2011). We demonstrate the utility of Tau-U by comparing it with the standardized mean difference approach (Busk & Serlin, 1992) that is widely reported within the aphasia SCED literature. Repeated writing measures from 3 participants with chronic aphasia who received computer-based writing treatment are analyzed visually and quantitatively using both Tau-U and the standardized mean difference approach. Visual analysis alone was insufficient for determining an effect between the intervention and writing improvement. The standardized mean difference yielded effect sizes ranging from 4.18 to 26.72 for trained items and 1.25 to 3.20 for untrained items. Tau-U yielded statistically significant results for data from 2 of 3 participants. Tau-U has the unique advantage of allowing for the correction of an undesirable baseline trend. Although further study is needed, Tau-U shows promise as a quantitative approach to augment visual analysis of SCED data in aphasia.
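    The basic A-versus-B nonoverlap component of Tau — the building block that Tau-U extends with trend terms — can be sketched as follows (the function name is invented here, and the baseline-trend correction of full Tau-U is deliberately omitted):

    ```python
    def tau_ab(baseline, intervention):
        """Basic Tau (A-vs-B nonoverlap): the proportion of improving pairs
        minus deteriorating pairs over all baseline x intervention
        comparisons.  Ranges from -1 to 1; ties contribute zero."""
        pos = neg = 0
        for a in baseline:
            for b in intervention:
                if b > a:
                    pos += 1
                elif b < a:
                    neg += 1
        return (pos - neg) / (len(baseline) * len(intervention))
    ```

    A value of 1.0 means every intervention-phase score exceeds every baseline score (complete nonoverlap); values near 0 indicate chance-level overlap.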

  19. Breaking Through the Glass Ceiling: Recent Experimental Approaches to Probe the Properties of Supercooled Liquids near the Glass Transition.

    Science.gov (United States)

    Smith, R Scott; Kay, Bruce D

    2012-03-15

    Experimental measurements of the properties of supercooled liquids at temperatures near their glass transition temperatures, Tg, are requisite for understanding the behavior of glasses and amorphous solids. Unfortunately, many supercooled molecular liquids rapidly crystallize at temperatures far above their Tg, making such measurements difficult to nearly impossible. In this Perspective, we discuss some recent alternative approaches to obtain experimental data in the temperature regime near Tg. These new approaches may yield the additional experimental data necessary to test current theoretical models of the dynamical slowdown that occurs in supercooled liquids approaching the glass transition.

  20. Electrochemical production and use of free chlorine for pollutant removal: an experimental design approach.

    Science.gov (United States)

    Antonelli, Raissa; de Araújo, Karla Santos; Pires, Ricardo Francisco; Fornazari, Ana Luiza de Toledo; Granato, Ana Claudia; Malpass, Geoffroy Roger Pointer

    2017-10-28

    The present paper presents the study of (1) the optimization of electrochemical free chlorine production using an experimental design approach, and (2) the application of the optimum conditions obtained to the photo-assisted electrochemical degradation of simulated textile effluent. In the experimental design, the influence of inter-electrode gap, pH, NaCl concentration and current was considered. It was observed that the four variables studied are significant for the process, with NaCl concentration and current being the most significant variables for free chlorine production. The maximum free chlorine production was obtained at a current of 2.33 A and an NaCl concentration of 0.96 mol dm-3. The application of the optimized conditions with simultaneous UV irradiation resulted in up to 83.1% Total Organic Carbon removal and 100% colour removal over 180 min of electrolysis. The results indicate that a systematic (statistical) approach to the electrochemical treatment of pollutants can save time and reagents.

  1. Reliability of neural encoding

    DEFF Research Database (Denmark)

    Alstrøm, Preben; Beierholm, Ulrik; Nielsen, Carsten Dahl

    2002-01-01

    The reliability with which a neuron is able to create the same firing pattern when presented with the same stimulus is of critical importance to the understanding of neuronal information processing. We show that reliability is closely related to the process of phaselocking. Experimental results f...

  2. Filling Source Feedthrus with Alumina/Molybdenum CND50 Cermet: Experimental, Theoretical, and Computational Approaches

    International Nuclear Information System (INIS)

    STUECKER, JOHN N.; CESARANO III, JOSEPH; CORRAL, ERICA LORRANE; SHOLLENBERGER, KIM ANN; ROACH, R. ALLEN; TORCZYNSKI, JOHN R.; THOMAS, EDWARD V.; VAN ORNUM, DAVID J.

    2001-01-01

    This report is a summary of the work completed in FY00 for science-based characterization of the processes used to fabricate cermet vias in source feedthrus. In particular, studies were completed to characterize the CND50 cermet slurry, characterize solvent imbibition, and identify critical via-filling variables. These three areas of interest are important to several processes pertaining to the production of neutron generator tubes. Rheological characterizations of CND50 slurry prepared with 94ND2 and Sandi94 primary powders were also compared. The 94ND2 powder was formerly produced at the GE Pinellas Plant and Sandi94 is the new replacement powder produced at CeramTec. Processing variables that may affect the via-filling process were also studied and include: the effect of solids loading in the CND50 slurry; the effect of milling time; and the effect of Nuosperse (a slurry "conditioner"). Imbibition characterization included a combination of experimental, theoretical, and computational strategies to determine solvent migration through complex shapes, specifically vias in the source feedthru component. Critical factors were determined using a controlled set of experiments designed to identify those variables that influence the occurrence of defects within the cermet-filled via. These efforts were pursued to increase part production reliability, understand selected fundamental issues that impact the production of slurry-filled parts, and validate the ability of the computational fluid dynamics code, GOMA, to simulate these processes. Suggestions are made for improving the slurry filling of source feedthru vias.

  3. Experimental study of brachial plexus and vessel compression: evaluation of combined central and peripheral electrodiagnostic approach.

    Science.gov (United States)

    Yang, Chaoqun; Xu, Jianguang; Chen, Jie; Li, Shulin; Cao, Yu; Zhu, Yi; Xu, Lei

    2017-08-01

    We sought to investigate the reliability of a new electrodiagnostic method for identifying Brachial Plexus and Vessel Compression Syndrome (BPVCS) in rats that involves the application of transcranial electrical stimulation motor evoked potentials (TES-MEPs) combined with peripheral nerve stimulation compound muscle action potentials (PNS-CMAPs). Superior and inferior trunk models of BPVCS were created in 72 male Sprague Dawley (SD) rats that were divided into six experimental groups. The latencies, amplitudes and isolateral amplitude ratios of the TES-MEPs and PNS-CMAPs were recorded at different postoperative intervals. The latencies of the TES-MEP and PNS-CMAP were initially elongated in the 8-week group, and the amplitudes of the TES-MEP and PNS-CMAP were initially attenuated in the 16-week group. The isolateral amplitude ratio of the TES-MEP to the PNS-CMAP was apparently decreased, and spontaneous activities emerged at 16 weeks postoperatively. Electrophysiological and histological examinations of the rats' compressed brachial plexus nerves were utilized to establish preliminary electrodiagnostic criteria for BPVCS.

  4. New approaches for the reliability-oriented structural optimization considering time-variant aspects; Neue Ansaetze fuer die zuverlaessigkeitsorientierte Strukturoptimierung unter Beachtung zeitvarianter Aspekte

    Energy Technology Data Exchange (ETDEWEB)

    Kuschel, N.

    2000-07-01

    The optimization of structures with respect to cost, weight or performance is a well-known application of nonlinear optimization. Reliability-based structural optimization, however, has been the subject of only very few studies. The approaches suggested up to now have been unsatisfactory regarding their general applicability or their ease of use. The objective of this thesis is the development of general approaches to solve both optimization problems: the minimization of cost subject to reliability constraints, and the maximization of reliability under cost constraints. The extended approach of a one-level method is introduced in detail for time-invariant problems. Here, the reliability of the structure is analysed in the framework of the First-Order Reliability Method (FORM). The use of time-variant reliability analysis is necessary for a realistic modelling of many practical problems. Therefore, several generalizations of the new approaches are derived for time-variant reliability-based structural optimization. Some important properties of the optimization problems are proved. In addition, some interesting extensions of the one-level method, for example the cost optimization of structural series systems and cost optimization in the framework of the Second-Order Reliability Method (SORM), are presented in the thesis. (orig.)
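    FORM, the reliability analysis named in this record, computes the reliability index beta as the distance from the origin to the design point on the limit-state surface g(u) = 0 in standard normal space. A minimal sketch of the standard Hasofer-Lind/Rackwitz-Fiessler iteration (generic textbook form, not the thesis's one-level formulation; function names invented here):

    ```python
    import numpy as np

    def form_beta(g, grad_g, u0, tol=1e-8, iters=100):
        """HL-RF iteration: find the design point u* minimizing ||u|| on
        g(u) = 0 in standard normal space; beta = ||u*||.
        Update: u_new = [(grad.u - g(u)) / ||grad||^2] * grad."""
        u = np.asarray(u0, dtype=float)
        for _ in range(iters):
            gv, gr = g(u), grad_g(u)
            u_new = (gr @ u - gv) / (gr @ gr) * gr
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        return float(np.linalg.norm(u))
    ```

    For a linear limit state, e.g. resistance R ~ N(5, 1) minus load S ~ N(3, 1), the standard-space function is g(u) = 2 + u1 - u2 and the iteration converges in one step to beta = 2/sqrt(2); the failure probability is then Phi(-beta).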

  5. Random Fuzzy Extension of the Universal Generating Function Approach for the Reliability Assessment of Multi-State Systems Under Aleatory and Epistemic Uncertainties

    DEFF Research Database (Denmark)

    Li, Yan-Fu; Ding, Yi; Zio, Enrico

    2014-01-01

    In this work, we extend the traditional universal generating function (UGF) approach for multi-state system (MSS) availability and reliability assessment to account for both aleatory and epistemic uncertainties. First, a theoretical extension, named hybrid UGF (HUGF), is made to introduce the use of random fuzzy variables (RFVs) in the approach. Second, the composition operator of HUGF is defined by considering simultaneously the probabilistic convolution and the fuzzy extension principle. Finally, an efficient algorithm is designed to extract probability boxes ($p$-boxes) from the system HUGF, which...
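    For reference, the classical, purely probabilistic UGF composition that HUGF extends can be sketched as follows (a generic illustration with an invented two-component example; it omits the random fuzzy variable machinery of the record):

    ```python
    def compose(u1, u2, op):
        """Combine two u-functions {performance: probability} with a
        structure operator op (min for series capacity, sum for parallel)."""
        out = {}
        for g1, p1 in u1.items():
            for g2, p2 in u2.items():
                g = op(g1, g2)
                out[g] = out.get(g, 0.0) + p1 * p2
        return out

    def reliability(u, demand):
        """P(system performance >= demand) for a composed u-function."""
        return sum(p for g, p in u.items() if g >= demand)
    ```

    For example, a two-state pump {0: 0.1, 1: 0.9} in series with a valve {0: 0.2, 2: 0.8} (performance = capacity) yields system reliability 0.9 x 0.8 = 0.72 for a demand of 1.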

  6. An approach based on defense-in-depth and diversity (3D) for the reliability assessment of digital instrument and control systems of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Paulo Adriano da; Saldanha, Pedro L.C., E-mail: pasilva@cnen.gov.b, E-mail: Saldanha@cnen.gov.b [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil). Coord. Geral de Reatores Nucleares; Melo, Paulo F. Frutuoso e, E-mail: frutuoso@nuclear.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao em Engenharia. Programa de Engenharia Nuclear; Araujo, Ademir L. de [Associacao Brasileira de Ensino Universitario (UNIABEU), Angra dos Reis, RJ (Brazil)

    2011-07-01

    The adoption of digital instrumentation and control (I and C) technology has been slower in nuclear power plants. The reason has been the unfruitful efforts to obtain evidence proving that digital I and C systems can be used in nuclear safety systems, for example the Reactor Protection System (RPS), while ensuring the proper operation of all their functions. This technology offers a potential improvement in safety and reliability. However, there is still no consensus about the software model to be adopted for digital systems in reliability studies. This paper presents the 3D methodology approach to assess digital I and C reliability. It is based on the study of operational events occurring in NPPs. It makes it easy to identify, in general, the level of I and C system reliability, showing its key vulnerabilities and enabling regulatory actions to be traced to minimize or avoid them. This approach makes it possible to identify the main types of digital I and C system failure, with the potential for common cause failures, as well as to evaluate the dominant failure modes. The MAFIC-D software was developed to assist the implementation of the relationships between the reliability criteria, the analysis of relationships and data collection. The results obtained through this tool proved satisfactory and complete the process of regulatory decision-making for licensing digital I and C in NPPs; the tool can still be used to monitor the performance of digital I and C post-licensing during the lifetime of the system, providing the basis for the elaboration of checklists for regulatory inspections. (author)

  7. An approach based on defense-in-depth and diversity (3D) for the reliability assessment of digital instrument and control systems of nuclear power plants

    International Nuclear Information System (INIS)

    Silva, Paulo Adriano da; Saldanha, Pedro L.C.

    2011-01-01

    The adoption of digital instrumentation and control (I and C) technology has been slower in nuclear power plants. The reason has been the unfruitful efforts to obtain evidence proving that digital I and C systems can be used in nuclear safety systems, for example the Reactor Protection System (RPS), while ensuring the proper operation of all their functions. This technology offers a potential improvement in safety and reliability. However, there is still no consensus about the software model to be adopted for digital systems in reliability studies. This paper presents the 3D methodology approach to assess digital I and C reliability. It is based on the study of operational events occurring in NPPs. It makes it easy to identify, in general, the level of I and C system reliability, showing its key vulnerabilities and enabling regulatory actions to be traced to minimize or avoid them. This approach makes it possible to identify the main types of digital I and C system failure, with the potential for common cause failures, as well as to evaluate the dominant failure modes. The MAFIC-D software was developed to assist the implementation of the relationships between the reliability criteria, the analysis of relationships and data collection. The results obtained through this tool proved satisfactory and complete the process of regulatory decision-making for licensing digital I and C in NPPs; the tool can still be used to monitor the performance of digital I and C post-licensing during the lifetime of the system, providing the basis for the elaboration of checklists for regulatory inspections. (author)

  8. A reliability index for assessment of crack profile reconstructed from ECT signals using a neural-network approach

    International Nuclear Information System (INIS)

    Yusa, Noritaka; Chen, Zhenmao; Miya, Kenzo; Cheng, Weiying

    2002-01-01

    This paper proposes a reliability parameter to enhance an inversion scheme developed by the authors. The scheme is based upon an artificial neural network that simulates the mapping between eddy current signals and crack profiles. One of the biggest advantages of the scheme is that it can deal with conductive cracks, which is necessary to reconstruct natural cracks. However, it has one significant disadvantage: the reliability of reconstructed profiles was unknown. The proposed parameter provides an index for assessment of the crack profile and overcomes this disadvantage. After the parameter is validated by reconstruction of simulated cracks, it is applied to the reconstruction of natural cracks that occurred in steam generator tubes of a pressurized water reactor. It is revealed that the parameter is applicable not only to simulated cracks but also to natural ones. (author)

  9. An enhanced reliability-oriented workforce planning model for process industry using combined fuzzy goal programming and differential evolution approach

    Science.gov (United States)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2018-03-01

    This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resource and time allocations. This concept offers a worthwhile point of departure to encompass three elegant adjustments to a literature model in terms of maintenance time, workforce performance and return-on-workforce investments, which fully explain the results of our analysis. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, we have successfully implemented fuzzy goal programming (FGP) and differential evolution (DE) techniques for the solution of an optimisation problem in the maintenance of a process plant for the first time. The results obtained in this work showed better quality of solution from the DE algorithm compared with those of the genetic algorithm and particle swarm optimisation algorithm, thus demonstrating the superiority of the proposed procedure. Second, the analytical discourse, which was framed on stochastic theory and focused on a specific application to a process plant in Nigeria, is a novelty. The work provides more insights into maintenance workforce planning during overhaul rework and overtime maintenance activities in manufacturing systems and demonstrates the capacity to generate substantially helpful information for practice.
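    Differential evolution, one of the two solution techniques named in this record, can be sketched in its canonical DE/rand/1/bin form (a generic minimizer on a toy objective, not the paper's tuned implementation or its fuzzy goal programming formulation; all names are invented here):

    ```python
    import numpy as np

    def differential_evolution(f, bounds, pop=20, F=0.8, CR=0.9,
                               gens=200, seed=1):
        """Minimal DE/rand/1/bin minimizer over box bounds."""
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds, dtype=float).T
        dim = len(lo)
        X = rng.uniform(lo, hi, size=(pop, dim))
        fit = np.array([f(x) for x in X])
        for _ in range(gens):
            for i in range(pop):
                # mutation: three distinct partners, none equal to i
                idx = [j for j in range(pop) if j != i]
                a, b, c = X[rng.choice(idx, size=3, replace=False)]
                mutant = np.clip(a + F * (b - c), lo, hi)
                # binomial crossover, forcing at least one mutant gene
                cross = rng.random(dim) < CR
                cross[rng.integers(dim)] = True
                trial = np.where(cross, mutant, X[i])
                f_trial = f(trial)
                if f_trial <= fit[i]:  # greedy one-to-one selection
                    X[i], fit[i] = trial, f_trial
        best = fit.argmin()
        return X[best], float(fit[best])
    ```

    In the workforce-planning setting, `f` would be the (scalarized) maintenance objective and the decision vector the workforce allocation; here a simple sphere function stands in for it.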

  10. Frequency-Dependent Streaming Potential of Porous Media—Part 1: Experimental Approaches and Apparatus Design

    Directory of Open Access Journals (Sweden)

    P. W. J. Glover

    2012-01-01

    Electrokinetic phenomena link fluid flow and electrical flow in porous and fractured media such that a hydraulic flow will generate an electrical current and vice versa. Such a link is likely to be extremely useful, especially in the development of the electroseismic method. However, surprisingly few experimental measurements have been carried out, particularly as a function of frequency, because of their difficulty. Here we have considered six different approaches to making laboratory determinations of the frequency-dependent streaming potential coefficient. In each case, we have analyzed the mechanical, electrical, and other technical difficulties involved in each method. We conclude that the electromagnetic drive is currently the only approach that is practicable, while the piezoelectric drive may be useful for low-permeability samples and at specified high frequencies. We have used the electromagnetic drive approach to design, build, and test an apparatus for measuring the streaming potential coefficient of unconsolidated and disaggregated samples such as sands, gravels, and soils with a diameter of 25.4 mm and lengths between 50 mm and 300 mm.

  11. Three experimental approaches to measure the social context dependence of prejudice communication and discriminatory behavior.

    Science.gov (United States)

    Beyer, Heiko; Liebe, Ulf

    2015-01-01

    Empirical research on discrimination is faced with crucial problems stemming from the specific character of its object of study. In democratic societies the communication of prejudices and other forms of discriminatory behavior is considered socially undesirable and depends on situational factors such as whether a situation is considered private or whether a discriminatory consensus can be assumed. Regular surveys thus can only offer a blurred picture of the phenomenon. But even survey experiments intended to decrease the social desirability bias (SDB) have so far failed to systematically implement situational variables. This paper introduces three experimental approaches to improve the study of discrimination and other topics of social (un-)desirability. First, we argue in favor of cognitive context framing in surveys in order to operationalize the salience of situational norms. Second, factorial surveys offer a way to take situational contexts and substitute behavior into account. And third, choice experiments - a rather new method in sociology - offer a more valid method of measuring behavioral characteristics compared to simple items in surveys. All three approaches - which may be combined - are easy to implement in large-scale surveys. Results of empirical studies demonstrate the fruitfulness of each of these approaches. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Experimental Validation of a Differential Variational Inequality-Based Approach for Handling Friction and Contact in Vehicle

    Science.gov (United States)

    2015-11-20

    This report presents an experimental validation of a differential variational inequality (DVI)-based approach for handling friction and contact in vehicle mobility simulation, with terrain modeled using the discrete element method (DEM). Frictional contact is modeled via differential variational inequalities for a three-dimensional (3D) system of bodies, and the approach is validated against sinkage and single wheel tests.

  13. Humidity adsorption and transfer in hygroscopic materials. Percolation-type approach and experimentation

    International Nuclear Information System (INIS)

    Quenard, Daniel

    1989-01-01

    Water vapor adsorption and transfer in microporous media are studied by using a 3-level hierarchical approach. At the microscopic level (pore size), we describe the basic phenomena (adsorption/desorption, capillary condensation, molecular and Knudsen diffusion, Hagen-Poiseuille flow) that occur during isothermal water vapor transport in a single cylindrical pore at the steady state. The transport through a condensed pore is taken into account by its 'vapor equivalent flow', and we underline that capillary condensation may cause a vapor flow amplification of several orders of magnitude. We suggest using an electrical analogy between a cylindrical pore and a Zener diode. Then, at the mesoscopic level (material size), we introduce pore networks to provide us with a simplified description of the microstructure. Three types of networks are studied: square, triangular and honeycomb. By using a random distribution of the single cylindrical pores on the 2D networks, we are able to estimate the sorption isotherms and the water vapor permeability, which are the two essential characteristics for understanding the behaviour of materials towards humidity. To develop this approach we refer to the percolation concept and we use most of its principal results. To estimate the adsorption isotherms we introduce a surface adsorption model and we use the KELVIN-LAPLACE equation. Hysteresis appears naturally thanks to the 'ink-bottle' phenomenon, and it is all the more pronounced as the network is poorly connected. The water vapor permeability is calculated thanks to the electrical analogy (cylindrical pore-Zener diode). We emphasize an important amplification of the equivalent permeability when the relative humidity reaches a threshold value. This phenomenon provides us with a possible explanation of numerous experimental results. The respective effects of pore size distribution and temperature on sorption isotherms and permeability are presented. We present several
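    The spanning-cluster idea underlying the percolation concept invoked in this record can be sketched with a simple site-percolation check on a square lattice (a generic illustration, not the author's 2D pore-network model with distributed pore radii; names are invented here):

    ```python
    import random

    def percolates(grid):
        """Depth-first search for a top-to-bottom spanning cluster of open
        sites (4-neighbor connectivity) on a square grid of booleans."""
        n = len(grid)
        stack = [(0, j) for j in range(n) if grid[0][j]]
        seen = set(stack)
        while stack:
            i, j = stack.pop()
            if i == n - 1:
                return True            # reached the bottom row
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                a, b = i + di, j + dj
                if 0 <= a < n and 0 <= b < n and grid[a][b] \
                        and (a, b) not in seen:
                    seen.add((a, b))
                    stack.append((a, b))
        return False

    def spanning_fraction(n, p, trials, seed=0):
        """Fraction of random n x n grids (site open with prob. p) that
        contain a spanning cluster."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(trials):
            g = [[rng.random() < p for _ in range(n)] for _ in range(n)]
            hits += percolates(g)
        return hits / trials
    ```

    Above the percolation threshold the spanning fraction jumps toward 1; this sharp transition is the analog of the permeability amplification at a threshold relative humidity described in the record.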

  14. Reinforced concrete structures loaded by snow avalanches : numerical and experimental approaches.

    Science.gov (United States)

    Ousset, I.; Bertrand, D.; Brun, M.; Limam, A.; Naaim, M.

    2012-04-01

    Today, due to the extension of occupied areas in mountainous regions, new strategies for risk mitigation have to be developed. In the framework of risk analysis, these strategies have to take into account not only the description of the natural hazard but also the physical vulnerability of the exposed structures. From a civil engineering point of view, the dynamic behavior of columns and porticos has been widely investigated, especially for reinforced concrete and steel. This is not the case for reinforced concrete walls, for which only the in-plane dynamic (shear) behavior has been studied in detail, in the field of earthquake engineering. The aim of this project is therefore to study the behavior of reinforced concrete civil engineering structures subjected to out-of-plane dynamic loading from snow avalanche interaction. Numerical simulations in 2D and 3D by the finite element method (FEM) are presented. The approach solves dynamic mechanical problems involving nonlinearities (especially nonlinear materials), so the mechanical response of the structure can be explored under controlled conditions. First, a reinforced concrete wall with an L-like shape is considered. The structure is representative of a French defense structure dedicated to protecting people against snow avalanches. Experimental pushover tests were performed on a physical model; they consisted of applying a uniform pressure distribution until total collapse of the wall. A 2D numerical model was developed to simulate the mechanical response of the structure under quasi-static loading. The simulations were compared to the experimental data, and the results gave a better understanding of the failure mode of the wall. Moreover, the influence of several parameters (geometry and mechanical properties) is also presented. Secondly, punching shear experimental tests have also been carried out. Reinforced concrete slabs simply supported have

  15. Approaches to enhance the teaching quality of experimental biochemistry for MBBS students in TSMU, China.

    Science.gov (United States)

    Yu, Lijuan; Yi, Shuying; Zhai, Jing; Wang, Zhaojin

    2017-07-08

    With the internationalization of medical education in China, the importance of international students' education in medical schools is also increasing. Apart from foreign students majoring in Chinese language, English Bachelor of Medicine, Bachelor of Surgery (MBBS) students are the largest group of international students. Based on problems observed in the teaching of experimental biochemistry, we designed teaching models adapted to the background of international students and strengthened teachers' teaching ability at Taishan Medical University. Several approaches were used in combination to improve teaching effectiveness and to benefit the teachers as well. Preliminary data showed an increased passion for basic medical biochemistry and an improved theoretical background among MBBS students, which will be helpful for their later clinical medicine studies. © 2017 by The International Union of Biochemistry and Molecular Biology, 45(4):360-364, 2017.

  16. Electromagnetic scattering problems - Numerical issues and new experimental approaches of validation

    Energy Technology Data Exchange (ETDEWEB)

    Geise, Robert; Neubauer, Bjoern; Zimmer, Georg [University of Braunschweig, Institute for Electromagnetic Compatibility, Schleinitzstrasse 23, 38106 Braunschweig (Germany)

    2015-03-10

    Electromagnetic scattering problems, i.e. the question of how radiated energy spreads when impinging on an object, are an essential part of wave propagation. Although Maxwell's differential equations, the starting point, are actually quite simple, the integral formulation of an object's boundary conditions, i.e. the solution for the unknown induced currents, can in most cases only be obtained numerically. As a timely topic of practical importance, the scattering from rotating wind turbines is discussed, whose numerical description is still based on strong approximations with as yet unspecified accuracy. In this context, the issue of validating numerical solutions is addressed, both with reference simulations and, in particular, with the experimental approach of scaled measurements. For the latter, the idea of an incremental validation is proposed, allowing a step-by-step validation of the new mathematical models required in scattering theory.

  17. A Sustainable, Reliable Mission-Systems Architecture that Supports a System of Systems Approach to Space Exploration

    Science.gov (United States)

    Watson, Steve; Orr, Jim; O'Neil, Graham

    2004-01-01

    A mission-systems architecture based on a highly modular "systems of systems" infrastructure utilizing open-standards hardware and software interfaces as the enabling technology is absolutely essential for an affordable and sustainable space exploration program. This architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimum sustaining engineering. This paper proposes such an architecture. Lessons learned from the space shuttle program are applied to help define and refine the model.

  18. Asymmetrical Responses of Ecosystem Processes to Positive Versus Negative Precipitation Extremes: a Replicated Regression Experimental Approach

    Science.gov (United States)

    Felton, A. J.; Smith, M. D.

    2016-12-01

    Heightened climatic variability due to atmospheric warming is forecast to increase the frequency and severity of climate extremes. In particular, changes to interannual variability in precipitation, characterized by increases in extreme wet and dry years, are likely to impact virtually all terrestrial ecosystem processes. However, to date experimental approaches have yet to explicitly test how ecosystem processes respond to multiple levels of climatic extremity, limiting our understanding of how ecosystems will respond to forecast increases in the magnitude of climate extremes. Here we report the results of a replicated regression experimental approach, in which we imposed 9 and 11 levels of growing season precipitation amount and extremity in mesic grassland during 2015 and 2016, respectively. Each level corresponded to a specific percentile of the long-term record, which produced a large gradient of soil moisture conditions that ranged from extreme wet to extreme dry. In both 2015 and 2016, asymptotic responses to water availability were observed for soil respiration. This asymmetry was driven in part by transitions between soil moisture versus temperature constraints on respiration as conditions became increasingly dry versus increasingly wet. In 2015, aboveground net primary production (ANPP) exhibited asymmetric responses to precipitation that largely mirrored those of soil respiration. In total, our results suggest that in this mesic ecosystem, these two carbon cycle processes were more sensitive to extreme drought than to extreme wet years. Future work will assess ANPP responses for 2016, soil nutrient supply and physiological responses of the dominant plant species. Future efforts are needed to compare our findings across a diverse array of ecosystem types, and in particular how the timing and magnitude of precipitation events may modify the response of ecosystem processes to increasing magnitudes of precipitation extremes.

  19. Thermodynamic study of 2-aminothiazole and 2-aminobenzothiazole: Experimental and computational approaches

    International Nuclear Information System (INIS)

    Silva, Ana L.R.; Monte, Manuel J.S.; Morais, Victor M.F.; Ribeiro da Silva, Maria D.M.C.

    2014-01-01

    Highlights: • Combustion of 2-aminothiazole and 2-aminobenzothiazole by rotating bomb calorimetry. • Enthalpies of sublimation of 2-aminothiazole and 2-aminobenzothiazole. • Gaseous enthalpies of formation of 2-aminothiazole and 2-aminobenzothiazole. • Gaseous enthalpies of formation calculated from high-level MO calculations. • Gas-phase enthalpies of formation estimated from the G3(MP2)//B3LYP approach. - Abstract: This work reports an experimental and computational thermochemical study of two aminothiazole derivatives, namely 2-aminothiazole and 2-aminobenzothiazole. The standard (p° = 0.1 MPa) molar energies of combustion of these compounds were measured by rotating bomb combustion calorimetry. The standard molar enthalpies of sublimation, at T = 298.15 K, were derived from the temperature dependence of the vapor pressures of these compounds, measured by the Knudsen effusion technique, and from high-temperature Calvet microcalorimetry. The combination of these experimental results enabled the calculation of the standard molar enthalpies of formation in the gaseous state, at T = 298.15 K, for the compounds studied. The corresponding standard Gibbs free energies of formation in the crystalline and gaseous phases were also derived, allowing the analysis of their stability in these phases. We have also estimated the gas-phase enthalpies of formation from high-level molecular orbital calculations at the G3(MP2)//B3LYP level of theory, the estimates revealing very good agreement with the experimental ones. The importance of some stabilizing electronic interactions occurring in the title molecules has been studied and quantitatively evaluated through Natural Bond Orbital (NBO) analysis of the corresponding wavefunctions, and their Nucleus Independent Chemical Shift (NICS) parameters have been calculated in order to rationalize the effect of electronic delocalization upon stability.

  20. Interfacial separation of a mature biofilm from a glass surface - A combined experimental and cohesive zone modelling approach.

    Science.gov (United States)

    Safari, Ashkan; Tukovic, Zeljko; Cardiff, Philip; Walter, Maik; Casey, Eoin; Ivankovic, Alojz

    2016-02-01

    A good understanding of the mechanical stability of biofilms is essential for biofouling management, particularly when mechanical forces are used. Previous biofilm studies lack a damage-based theoretical model describing biofilm separation from a surface. The purpose of the current study was to investigate the interfacial separation of a mature biofilm from a rigid glass substrate using a combined experimental and numerical modelling approach. The biofilm-glass interfacial separation process was investigated under tensile and shear stresses at the macroscale, known as mode I and mode II failure mechanisms, respectively. The numerical simulations were performed using a Finite Volume (FV)-based simulation package (OpenFOAM®) to predict the onset of separation using a cohesive zone model (CZM). Atomic force microscopy (AFM)-based retraction curves were used to obtain the separation properties between the biofilm and a glass colloid at the microscale, with the CZM parameters estimated using the Johnson-Kendall-Roberts (JKR) model. In this study, the CZM is introduced as a reliable method for investigating interfacial separation between a biofilm and a rigid substrate, in which a high local stress at the interface edge acts as an ultimate stress at the crack tip. The study demonstrated that the total interfacial failure energy measured at the macroscale was significantly higher than the pure interfacial separation energy obtained by AFM at the microscale, indicating highly ductile deformation behaviour within the bulk biofilm matrix. The results of this study can contribute significantly to the understanding of biofilm detachment. Copyright © 2015 Elsevier Ltd. All rights reserved.
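
    The JKR step mentioned above reduces, for a sphere-on-flat contact, to extracting the work of adhesion from the AFM pull-off force via F_pull = (3/2) * pi * R * W. A minimal sketch; the probe radius and pull-off force below are hypothetical, not the study's measured values:

```python
import math

def jkr_work_of_adhesion(pull_off_force_n, probe_radius_m):
    """Interfacial work of adhesion W (J/m^2) from the JKR pull-off
    force F = (3/2) * pi * R * W of a sphere-on-flat contact."""
    return 2.0 * pull_off_force_n / (3.0 * math.pi * probe_radius_m)

# Hypothetical AFM retraction result: 2.5 um glass colloid, 40 nN pull-off
w = jkr_work_of_adhesion(40e-9, 2.5e-6)
print(f"W = {w * 1e3:.2f} mJ/m^2")
```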

  1. A multi-scale experimental and simulation approach for fractured subsurface systems

    Science.gov (United States)

    Viswanathan, H. S.; Carey, J. W.; Frash, L.; Karra, S.; Hyman, J.; Kang, Q.; Rougier, E.; Srinivasan, G.

    2017-12-01

    Fractured systems play an important role in numerous subsurface applications including hydraulic fracturing, carbon sequestration, geothermal energy and underground nuclear test detection. Fractures that range in scale from microns to meters and their structure control the behavior of these systems which provide over 85% of our energy and 50% of US drinking water. Determining the key mechanisms in subsurface fractured systems has been impeded due to the lack of sophisticated experimental methods to measure fracture aperture and connectivity, multiphase permeability, and chemical exchange capacities at the high temperature, pressure, and stresses present in the subsurface. In this study, we developed and use microfluidic and triaxial core flood experiments required to reveal the fundamental dynamics of fracture-fluid interactions. In addition we have developed high fidelity fracture propagation and discrete fracture network flow models to simulate these fractured systems. We also have developed reduced order models of these fracture simulators in order to conduct uncertainty quantification for these systems. We demonstrate an integrated experimental/modeling approach that allows for a comprehensive characterization of fractured systems and develop models that can be used to optimize the reservoir operating conditions over a range of subsurface conditions.

  2. Experimental instruction in photonics for high school students: approaches to managing problems faced

    Science.gov (United States)

    Choong, Zhengyang

    2017-08-01

    Student research projects are increasingly common at the K-12 level. However, students often face difficulties in the course of their school research projects, such as setting realistic timelines and expectations, handling problems stemming from a lack of self-confidence, and staying sufficiently disciplined for sustained communication and experimentation. In this work, we explore manifestations of these problems in the context of a photonics project: characterising the spectrum of the breakdown flash from Silicon Avalanche Photodiodes. We report on the process of planning and building the setup, data collection, analysis and troubleshooting, as well as the technical and human problems at each step. Approaches found helpful in managing the aforementioned problems are discussed, including attention to detail during experimental work and communicating in a forthcoming manner. The former allowed for clearer planning and the setting of quantifiable proximal goals; the latter helped in motivating discipline, and also in the understanding of research as an iterative learning process without a clear definition of success or failure.

  3. A hybrid computational-experimental approach for automated crystal structure solution

    Science.gov (United States)

    Meredig, Bryce; Wolverton, C.

    2013-02-01

    Crystal structure solution from diffraction experiments is one of the most fundamental tasks in materials science, chemistry, physics and geology. Unfortunately, numerous factors render this process labour intensive and error prone. Experimental conditions, such as high pressure or structural metastability, often complicate characterization. Furthermore, many materials of great modern interest, such as batteries and hydrogen storage media, contain light elements such as Li and H that only weakly scatter X-rays. Finally, structural refinements generally require significant human input and intuition, as they rely on good initial guesses for the target structure. To address these many challenges, we demonstrate a new hybrid approach, first-principles-assisted structure solution (FPASS), which combines experimental diffraction data, statistical symmetry information and first-principles-based algorithmic optimization to automatically solve crystal structures. We demonstrate the broad utility of FPASS to clarify four important crystal structure debates: the hydrogen storage candidates MgNH and NH3BH3; Li2O2, relevant to Li-air batteries; and high-pressure silane, SiH4.

  4. Experimental study on distributed optical fiber-based approach monitoring saturation line in levee engineering

    Science.gov (United States)

    Su, Huaizhi; Li, Hao; Kang, Yeyuan; Wen, Zhiping

    2018-02-01

    Seepage is one of the key factors affecting levee safety. Seepage hazards that are not detected promptly and responded to rapidly may lead to severe accidents such as seepage failure, slope instability, and even levee breach; more than 90 percent of levee breaches are caused by seepage. Accurate determination of the saturation line is therefore very important for identifying seepage behavior in levee engineering. Furthermore, the location of the saturation line has a major impact on slope stability. Considering the structural characteristics and service conditions of levees, distributed optical fiber sensing technology is introduced to implement real-time observation of the saturation line. The monitoring principle of the distributed optical fiber temperature sensor system (DTS) for the saturation line in levee engineering is investigated. An experimental platform, consisting of the DTS, a heating system, a water-supply system, an auxiliary analysis system and a levee model, is designed and constructed, and the monitoring experiment is carried out on this platform. From the experimental results, the numerical relationship between moisture content and thermal conductivity in a porous medium is identified. A line-heat-source-based distributed optical fiber method for obtaining the thermal conductivity of a porous medium is developed, and a DTS-based approach is proposed to monitor the saturation line in levee engineering. The embedment pattern of the optical fiber for monitoring the saturation line is also presented.
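
    The heated-fiber (DTS) principle rests on the late-time line-heat-source model, in which the temperature rise grows linearly with ln(t) and the slope gives the thermal conductivity. A sketch with synthetic data; the heating power and target conductivity are assumed values, not the paper's measurements:

```python
import math

def conductivity_from_heating(times_s, delta_t_k, q_w_per_m):
    """Estimate thermal conductivity (W/m/K) from a heated-fiber test
    using the late-time line-heat-source model
        dT(t) ~ (q / (4*pi*lambda)) * ln(t) + C,
    i.e. lambda = q / (4*pi*slope), slope fitted from dT vs ln(t)."""
    x = [math.log(t) for t in times_s]
    n = len(x)
    mx, my = sum(x) / n, sum(delta_t_k) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, delta_t_k))
             / sum((xi - mx) ** 2 for xi in x))
    return q_w_per_m / (4.0 * math.pi * slope)

# Synthetic check: data generated with lambda = 1.2 W/m/K, q = 10 W/m
q, lam = 10.0, 1.2
times = [30.0, 60.0, 120.0, 240.0, 480.0]
dT = [q / (4 * math.pi * lam) * math.log(t) + 0.5 for t in times]
print(f"recovered lambda = {conductivity_from_heating(times, dT, q):.3f} W/m/K")
```

    Wetter soil conducts heat better, so a smaller temperature rise along the heated fiber marks the saturated zone; this moisture-conductivity link is what lets the DTS locate the saturation line.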

  5. Experimental and numerical approaches to studying hot cracking in stainless steel welds

    International Nuclear Information System (INIS)

    Le, Minh

    2014-01-01

    This work concerns experimental and numerical approaches to studying hot cracking in stainless steel welds. The metallurgical weldability of two filler products used for welding an AISI-316L(N) austenitic stainless steel grade is evaluated. These filler metals are distinguished by their solidification microstructures: austeno-ferritic for the 19Cr-12Ni-2Mo grade and austenitic for the 19-15H Thermanit grade. The weldability study covers the assessment of the susceptibility of these three alloys to hot cracking, the proposal of a hot cracking criterion, and the evaluation of its transferability to structure-scale tests. Hot cracks are material separations that occur at high temperature along grain boundaries (dendrite boundaries) when the strain and strain rate exceed certain levels. The hot cracks studied here form during solidification from the liquid phase of the weld metal. The literature review highlights the complexity of the initiation and propagation mechanisms of these material separations. Three types of tests are studied in this work: hot cracking tests, such as trapezoidal and Varestraint tests, which initiate the phenomenon under controlled experimental conditions, and tests on a Gleeble thermomechanical simulator for the thermomechanical (material behavior laws, fracture properties) and metallurgical (brittle temperature range (BTR), evolution of delta ferrite) characterization of the alloys. All these tests on the three materials were analyzed via numerical modeling and simulations implemented in the Cast3M finite element code in order to derive a thermomechanical hot cracking criterion. (author) [fr

  6. A Unified Experimental Approach for Estimation of Irrigationwater and Nitrate Leaching in Tree Crops

    Science.gov (United States)

    Hopmans, J. W.; Kandelous, M. M.; Moradi, A. B.

    2014-12-01

    Groundwater quality is particularly vulnerable in irrigated agricultural lands in California and many other (semi-)arid regions of the world. The routine application of nitrogen fertilizers with irrigation water in California is likely responsible for the high nitrate concentrations in the groundwater underlying much of its main agricultural areas. To optimize irrigation and fertigation practices, it is essential that irrigation water and fertilizers are applied at the optimal concentration, place, and time to ensure maximum root uptake and minimize leaching losses to the groundwater. The applied irrigation water and dissolved fertilizer, as well as root growth and the associated nitrate and water uptake, interact with soil properties and fertilizer source(s) in a complex manner that cannot easily be resolved. Coupled experimental-modeling studies are therefore required to unravel the complexities that result from typical field-wide spatial variations of soil texture and layering across farmer-managed fields. We present experimental approaches across a network of tree crop orchards in the San Joaquin Valley that provide the necessary data on soil moisture, water potential and nitrate concentration to evaluate and optimize irrigation water management practices. Specifically, deep tensiometers were used to monitor in situ continuous soil water potential gradients, in order to compute leaching fluxes of water and nitrate at both the individual-tree and field scale.
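
    The leaching-flux computation from paired tensiometers reduces to Darcy's law applied between two depths. A minimal sketch; the function, readings and the unsaturated conductivity K below are illustrative assumptions, not the study's data:

```python
def downward_flux(h_shallow, h_deep, depth_shallow, depth_deep, k):
    """Vertical Darcy flux between two tensiometer depths, positive
    downward (toward the water table). Depths are positive (m) and
    matric heads h are negative (m); total head H = h - depth, so
    q_down = K * (H_shallow - H_deep) / (depth_deep - depth_shallow)."""
    H_shallow = h_shallow - depth_shallow
    H_deep = h_deep - depth_deep
    return k * (H_shallow - H_deep) / (depth_deep - depth_shallow)

# Hypothetical readings: matric head -3.0 m at both 1 m and 2 m depth
# (a unit hydraulic gradient), with an assumed K = 0.01 m/day
q = downward_flux(-3.0, -3.0, 1.0, 2.0, 0.01)
# Nitrate leaching load = flux * concentration (20 g/m^3 assumed)
load = q * 20.0  # g per m^2 per day
print(f"drainage flux = {q:.4f} m/day, nitrate leaching ~ {load:.2f} g/m^2/day")
```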

  7. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis, incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject within the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach.

  8. A unified approach to linking experimental, statistical and computational analysis of spike train data.

    Directory of Open Access Journals (Sweden)

    Liang Meng

    Full Text Available A fundamental issue in neuroscience is how to identify the multiple biophysical mechanisms through which neurons generate observed patterns of spiking activity. In previous work, we proposed a method for linking observed patterns of spiking activity to specific biophysical mechanisms based on a state space modeling framework and a sequential Monte Carlo, or particle filter, estimation algorithm. We have shown, in simulation, that this approach is able to identify a space of simple biophysical models that are consistent with observed spiking data (and that include the model that generated the data), but we had yet to demonstrate the application of the method to identify realistic currents from real spike train data. Here, we apply the particle filter to spiking data recorded from rat layer V cortical neurons and correctly identify the dynamics of a slow intrinsic current. The underlying intrinsic current is successfully identified in four distinct neurons, even though the cells exhibit two distinct classes of spiking activity: regular spiking and bursting. This approach, linking statistical, computational, and experimental neuroscience, provides an effective technique to constrain detailed biophysical models to specific mechanisms consistent with observed spike train data.
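
    The sequential Monte Carlo machinery referred to can be sketched on a toy one-dimensional state-space model. This is far simpler than the biophysical neuron models in the study, and every parameter below is illustrative:

```python
import random, math

def bootstrap_particle_filter(observations, n_particles=500,
                              a=0.95, q_sd=0.5, r_sd=1.0, seed=0):
    """Minimal bootstrap particle filter for the toy model
        x_t = a * x_{t-1} + N(0, q_sd^2),   y_t = x_t + N(0, r_sd^2).
    Returns the filtered posterior mean of x_t at each step."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in observations:
        # propagate each particle through the state dynamics
        particles = [a * x + rng.gauss(0.0, q_sd) for x in particles]
        # weight by the Gaussian observation likelihood
        w = [math.exp(-0.5 * ((y - x) / r_sd) ** 2) for x in particles]
        total = sum(w)
        w = [wi / total for wi in w]
        means.append(sum(wi * xi for wi, xi in zip(w, particles)))
        # multinomial resampling to avoid weight degeneracy
        particles = rng.choices(particles, weights=w, k=n_particles)
    return means

obs = [0.2, 0.7, 1.1, 1.4, 1.2, 0.9]
est = bootstrap_particle_filter(obs)
print([round(m, 2) for m in est])
```

    In the study the latent state is a vector of conductance-model variables and the likelihood comes from the spiking observation model; the propagate-weight-resample loop is the same.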

  9. Experimental Validation of Various Temperature Models for Semi-Physical Tyre Model Approaches

    Science.gov (United States)

    Hackl, Andreas; Scherndl, Christoph; Hirschberg, Wolfgang; Lex, Cornelia

    2017-10-01

    With the increasing level of complexity and automation in automotive engineering, the simulation of safety-relevant Advanced Driver Assistance Systems (ADAS) leads to increasing accuracy demands in the description of tyre contact forces. In recent years, with improvements in tyre simulation, the need to account for tyre temperatures and the resulting changes in tyre characteristics has risen significantly. Therefore, an experimental validation of three different temperature model approaches is carried out, discussed and compared in this article. To evaluate the range of application of the presented approaches, with a view to further implementation in semi-physical tyre models, the main focus lies on a physical parameterisation. Aside from good modelling accuracy, attention is paid to computational time and the complexity of the parameterisation process. To evaluate this process and discuss the results, measurements of a Hoosier racing tyre 6.0 / 18.0 10 LCO C2000 from an industrial flat test bench are used. Finally, the simulation results are compared with the measurement data.
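
    For flavour, the simplest member of the semi-physical family being parameterised is a first-order lumped energy balance. This generic model is not one of the three approaches validated in the article, and all parameter values below are assumed:

```python
def simulate_tyre_temp(p_friction_w, t_amb_c, t0_c, h_w_per_k,
                       c_j_per_k, dt_s, n_steps):
    """Forward-Euler integration of a generic lumped tyre temperature
    model  C * dT/dt = P_friction - h * (T - T_amb),  where P_friction
    is the heat input from sliding, h the heat-loss coefficient and
    C the thermal capacity of the tread."""
    T = t0_c
    out = [T]
    for _ in range(n_steps):
        T += dt_s * (p_friction_w - h_w_per_k * (T - t_amb_c)) / c_j_per_k
        out.append(T)
    return out

# Hypothetical parameters: 800 W friction power, h = 20 W/K, C = 8000 J/K
temps = simulate_tyre_temp(800.0, 25.0, 25.0, 20.0, 8000.0, 1.0, 600)
print(f"T after 600 s: {temps[-1]:.1f} C (steady state {25 + 800 / 20:.1f} C)")
```

    The physical parameterisation the article focuses on amounts to identifying coefficients like h and C from test-bench measurements instead of curve-fitting a black box.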

  10. An experimental MOSFET approach to characterize (192)Ir HDR source anisotropy.

    Science.gov (United States)

    Toye, W C; Das, K R; Todd, S P; Kenny, M B; Franich, R D; Johnston, P N

    2007-09-07

    The dose anisotropy around a (192)Ir HDR source in a water phantom has been measured using MOSFETs as relative dosimeters. In addition, modeling using the EGSnrc code has been performed to provide a complete dose distribution consistent with the MOSFET measurements. Doses around the Nucletron 'classic' (192)Ir HDR source were measured for radial distances from 5 to 30 mm within a 40 x 30 x 30 cm(3) water phantom, using a TN-RD-50 MOSFET dosimetry system with an active area of 0.2 mm by 0.2 mm. For each successive measurement, a linear stepper capable of movement in intervals of 0.0125 mm re-positioned the MOSFET at the required radial distance, while a rotational stepper enabled angular displacement of the source at intervals of 0.9 degrees. The source-dosimeter arrangement within the water phantom was modeled using the standardized cylindrical geometry of the DOSRZnrc user code. In general, the measured relative anisotropy at each radial distance from 5 mm to 30 mm is in good agreement with the EGSnrc simulations, benchmark Monte Carlo simulations and TLD measurements where they exist. The experimental approach, employing a MOSFET detection system of small size, high spatial resolution and fast readout capability, allowed a practical determination of the dose anisotropy around an HDR source.
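
    Using the MOSFET as a relative dosimeter means normalizing each angular reading by the dose at a reference angle (commonly 90 degrees, on the source's perpendicular bisector). A minimal sketch with hypothetical readings:

```python
def relative_anisotropy(dose_by_angle, ref_angle_deg=90.0):
    """Normalize doses measured at one radial distance by the dose at
    the reference angle, yielding a relative anisotropy profile.
    `dose_by_angle` maps angle (degrees) -> measured dose (arb. units)."""
    ref = dose_by_angle[ref_angle_deg]
    return {ang: d / ref for ang, d in sorted(dose_by_angle.items())}

# Hypothetical MOSFET readings at a single radius (arbitrary units)
profile = relative_anisotropy(
    {0.0: 0.82, 45.0: 0.97, 90.0: 1.04, 135.0: 0.96, 180.0: 0.80})
for ang, rel in profile.items():
    print(f"{ang:5.1f} deg : {rel:.3f}")
```

    Because only ratios enter, detector calibration factors cancel, which is what makes a MOSFET usable here without absolute dose calibration.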

  11. Experimental/analytical approaches to modeling, calibrating and optimizing shaking table dynamics for structural dynamic applications

    Science.gov (United States)

    Trombetti, Tomaso

    This thesis presents an experimental/analytical approach to modeling and calibrating shaking tables for structural dynamic applications. This approach was successfully applied to the shaking table recently built in the structural laboratory of the Civil Engineering Department at Rice University. This shaking table is capable of reproducing model earthquake ground motions with a peak acceleration of 6 g, a peak velocity of 40 inches per second, and a peak displacement of 3 inches, for a maximum payload of 1500 pounds. It has a frequency bandwidth of approximately 70 Hz and is designed to test structural specimens up to 1/5 scale. The rail/table system is mounted on a reaction mass of about 70,000 pounds consisting of three 12 ft x 12 ft x 1 ft reinforced concrete slabs, post-tensioned together and connected to the strong laboratory floor. The slip table is driven by a hydraulic actuator governed by a 407 MTS controller, which employs a proportional-integral-derivative-feedforward-differential-pressure algorithm to control the actuator displacement. Feedback signals are provided by two LVDTs (monitoring the slip table relative displacement and the servovalve main-stage spool position) and by one differential pressure transducer (monitoring the actuator force). The dynamic actuator-foundation-specimen system is modeled and analyzed by combining linear control theory and linear structural dynamics. The analytical model developed accounts for the effects of actuator oil compressibility, oil leakage in the actuator, time delay in the response of the servovalve spool to a given electrical signal, foundation flexibility, and the dynamic characteristics of multi-degree-of-freedom specimens. In order to study the actual dynamic behavior of the shaking table, the transfer function between target and actual table accelerations was identified using experimental results and spectral estimation techniques. The power spectral density of the system input and the cross power spectral
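
    The spectral identification step named above (transfer function from averaged cross- and auto-power spectra, the H1 estimator) can be sketched as follows; the signals and segment count are synthetic, not the thesis's data:

```python
import numpy as np

def h1_transfer_estimate(x, y, nseg=8):
    """H1 transfer-function estimate between input x (target
    acceleration) and output y (measured table acceleration):
    H(f) = S_xy(f) / S_xx(f), with spectra averaged over `nseg`
    non-overlapping Hann-windowed segments (bare-bones Welch style)."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    L = len(x) // nseg
    Sxx = np.zeros(L // 2 + 1)
    Sxy = np.zeros(L // 2 + 1, complex)
    for k in range(nseg):
        win = np.hanning(L)
        X = np.fft.rfft(x[k * L:(k + 1) * L] * win)
        Y = np.fft.rfft(y[k * L:(k + 1) * L] * win)
        Sxx += (X.conj() * X).real   # auto-power spectrum of the input
        Sxy += X.conj() * Y          # cross-power spectrum
    return Sxy / Sxx

# Synthetic check: y = 2 * x with no dynamics, so |H| ~ 2 at every bin
rng = np.random.default_rng(1)
x = rng.standard_normal(4096)
H = h1_transfer_estimate(x, 2.0 * x)
print(np.round(np.abs(H[1:10]), 3))
```

    With real table data the magnitude and phase of H(f) expose the oil-column resonance and servovalve delay that the analytical model must reproduce.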

  12. Probing the mutational interplay between primary and promiscuous protein functions: a computational-experimental approach.

    Science.gov (United States)

    Garcia-Seisdedos, Hector; Ibarra-Molero, Beatriz; Sanchez-Ruiz, Jose M

    2012-01-01

    Protein promiscuity is of considerable interest due to its role in adaptive metabolic plasticity, its fundamental connection with molecular evolution and its biotechnological applications. Current views on the relation between primary and promiscuous protein activities stem largely from laboratory evolution experiments aimed at increasing promiscuous activity levels. Here, on the other hand, we attempt to assess the main features of the simultaneous modulation of the primary and promiscuous functions during the course of natural evolution. The computational/experimental approach we propose for this task involves the following steps: a function-targeted, statistical coupling analysis of evolutionary data is used to determine a set of positions likely linked to the recruitment of a promiscuous activity for a new function; a combinatorial library of mutations on this set of positions is prepared and screened for both the primary and the promiscuous activities; a partial-least-squares reconstruction of the full combinatorial space is carried out; finally, an approximation to the Pareto set of variants with optimal primary/promiscuous activities is derived. Application of the approach to the emergence of folding catalysis in thioredoxin scaffolds reveals an unanticipated scenario: diverse patterns of primary/promiscuous activity modulation are possible, including a moderate (but likely significant in a biological context) simultaneous enhancement of both activities. We show that this scenario can be most simply explained on the basis of the conformational diversity hypothesis, although alternative interpretations cannot be ruled out. Overall, the results reported may help clarify the mechanisms of the evolution of new functions. From a different viewpoint, the partial-least-squares-reconstruction/Pareto-set-prediction approach we have introduced provides the computational basis for an efficient directed-evolution protocol aimed at the simultaneous

  13. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  14. An integral equation approach to the interval reliability of systems modelled by finite semi-Markov processes

    International Nuclear Information System (INIS)

    Csenki, A.

    1995-01-01

    The interval reliability for a repairable system which alternates between working and repair periods is defined as the probability of the system being functional throughout a given time interval. In this paper, a set of integral equations is derived for this dependability measure, under the assumption that the system is modelled by an irreducible finite semi-Markov process. The result is applied to the semi-Markov model of a two-unit system with sequential preventive maintenance. The method used for the numerical solution of the resulting system of integral equations is a two-point trapezoidal rule. The implementation platform is the matrix computation package MATLAB on the Apple Macintosh SE/30. The numerical results are discussed and compared with those from simulation
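
    For the simplest special case, an alternating renewal process with exponential up and down times, the interval reliability defined above has a closed form that a Monte Carlo simulation can be checked against: with memoryless up times, IR(t, x) = A(t) * exp(-lam*x), where A(t) is the point availability. The sketch below illustrates the measure only; the record itself treats general semi-Markov processes via integral equations, which this toy case does not reproduce.

```python
import math
import random

# Interval reliability IR(t, x): probability the system is functional
# throughout [t, t+x]. Two-state alternating renewal process with
# exponential up times (rate lam) and repair times (rate mu).

def interval_reliability_mc(t, x, lam, mu, n=100_000, seed=1):
    """Monte Carlo estimate: simulate sojourns until one covers time t,
    then check the system is up and stays up past t + x."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        clock, up = 0.0, True
        while True:
            dur = rng.expovariate(lam if up else mu)
            if clock + dur > t:                  # this sojourn covers time t
                ok += up and (clock + dur >= t + x)
                break
            clock += dur
            up = not up
    return ok / n

def interval_reliability_exact(t, x, lam, mu):
    """Closed form for the exponential case: IR(t, x) = A(t) * exp(-lam*x)."""
    a = mu / (lam + mu) + lam / (lam + mu) * math.exp(-(lam + mu) * t)
    return a * math.exp(-lam * x)

mc = interval_reliability_mc(5.0, 2.0, lam=0.1, mu=1.0)
exact = interval_reliability_exact(5.0, 2.0, lam=0.1, mu=1.0)
```

    The agreement between the two estimates plays the same validating role that the comparison with simulation plays in the record.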

  15. QA support for TFTR reliability improvement program in preparation for DT operation

    International Nuclear Information System (INIS)

    Parsells, R.F.; Howard, H.P.

    1987-01-01

    As TFTR approaches experiments in the Q=1 regime, machine reliability becomes a major variable in achieving experimental objectives. This paper describes the methods used to quantify current reliability levels, the levels required for D-T operations, proposed methods for reliability growth and improvement, and the tracking of reliability performance during that growth. Included in this scope are data collection techniques and their shortcomings, bounding of current reliability on the upper end, and requirements for D-T operations. Problem characterization through Pareto diagrams provides insight into recurrent failure modes, and the use of Duane plots for charting reliability changes, both cumulative and instantaneous, is explained and demonstrated
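
    The Duane plot mentioned above charts cumulative MTBF (total test time divided by cumulative failures) against test time on log-log axes; a straight-line fit gives the growth slope, and the instantaneous MTBF is the cumulative MTBF divided by one minus that slope. A minimal sketch with synthetic failure times (not TFTR data):

```python
import math

# Hedged sketch of a Duane reliability-growth fit. The failure times are
# synthetic; in practice they would come from the machine's failure log.

def duane_fit(failure_times):
    """Least-squares line through (log T_i, log cumulative MTBF at T_i).
    Returns (alpha, intercept): slope alpha is the growth rate."""
    xs = [math.log(t) for t in failure_times]
    ys = [math.log(t / (i + 1)) for i, t in enumerate(failure_times)]  # cum. MTBF
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    alpha = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return alpha, my - alpha * mx

def instantaneous_mtbf(T, alpha, intercept):
    """Instantaneous MTBF = cumulative MTBF / (1 - alpha) at test time T."""
    cumulative = math.exp(intercept) * T ** alpha
    return cumulative / (1.0 - alpha)

# Synthetic failure times consistent with a Duane growth slope of 0.4:
times = [i ** (1 / 0.6) for i in range(1, 21)]
alpha, intercept = duane_fit(times)
```

    A positive fitted slope indicates reliability growth; tracking both the cumulative and the instantaneous curves is what the record describes.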

  16. Heat waves and their significance for a temperate benthic community: A near-natural experimental approach.

    Science.gov (United States)

    Pansch, Christian; Scotti, Marco; Barboza, Francisco R; Al-Janabi, Balsam; Brakel, Janina; Briski, Elizabeta; Bucholz, Björn; Franz, Markus; Ito, Maysa; Paiva, Filipa; Saha, Mahasweta; Sawall, Yvonne; Weinberger, Florian; Wahl, Martin

    2018-04-23

    Climate change will not only shift environmental means but will also increase the intensity of extreme events, exerting additional stress on ecosystems. While field observations on the ecological consequences of heat waves are emerging, experimental evidence is rare, and lacking at the community level. Using a novel "near-natural" outdoor mesocosm approach, this study tested whether marine summer heat waves have detrimental consequences for the macrofauna of a temperate coastal community, and whether sequential heat waves provoke an increase or decrease of sensitivity to thermal stress. Three treatments were applied, defined and characterized through a statistical analysis of 15 years of temperature records from the experimental site: (1) no heat wave, (2) two heat waves in June and July followed by a summer heat wave in August and (3) the summer heat wave only. Overall, 50% of the species showed positive, negative or positive/negative responses in abundance and/or biomass. We highlight four possible ways in which single species responded to either three subsequent heat waves or one summer heat wave: (1) absence of a response (tolerance, 50% of species), (2) negative accumulative effects of three subsequent heat waves (tellinid bivalve), (3) buffering by preceding heat waves due to acclimation and/or shifts in phenology (spionid polychaete) and (4) an accumulative positive effect of subsequent heat waves (amphipod). The differential responses to single or sequential heat waves at the species level entailed shifts at the community level. Community-level differences between single and triple heat waves were more pronounced than those between regimes with vs. without heat waves. Detritivory was reduced by the single heat wave, while suspension feeding was less common in the triple heat wave regime. Critical extreme events already occur today and will occur more frequently in a changing climate, leading to detrimental impacts on coastal marine systems.

  17. Experimental approach to investigate the constrained recovery behavior of coiled monofilament polymer fibers

    Science.gov (United States)

    Mendes, S. S.; Nunes, L. C. S.

    2017-11-01

    The aim of this work is to propose a new approach for investigating the thermo-mechanical behavior of coiled oriented polymer fibers with fixed ends and promote an understanding of the actuation response of coiled polymers in constrained recovery applications. In the proposed experimental methodology, a coiled fiber was pre-stretched by 50% and the distance between its ends remained constant, then it was subjected to a heating-cooling cycle ranging from 30 °C to 120 °C and the induced restoring force was measured. Based on these measurements, axial deformation and shear strain were obtained from full-field displacements extracted by the digital image correlation method from images of the coiled fiber. Three coiled fibers with different initial pitch angles were manufactured, and samples with lengths of 15 mm and 20 mm were tested. Bias angles and coil radius were also estimated using the experimental data associated with the helical spring theory. Results show that significant shape changes can be noticed above the glass transition temperature (47 °C), and these changes induce variation in the resultant forces. The effects of thermal softening and thermal contraction for a modest negative thermal expansion coefficient became evident at temperatures ranging from ∼47 °C to ∼90 °C, while the response of a coiled homochiral polymer fiber was achieved at temperatures close to 90 °C. During the cooling process, saturated states of the axial deformation and shear strain of the coiled fibers were observed at temperatures between 120 °C and 100 °C.

  18. Experimental and numerical study of deposit formation in secondary side SG TSP by electrokinetic approach

    International Nuclear Information System (INIS)

    Guillodo, Michael; Foucault, Marc; Ryckelynck, Natacha; Chahma, Farah; Guingo, Mathieu; Mansour, Carine; Alos-Ramos, Olga; Corredera, Geraldine

    2012-09-01

    Corrosion product deposit formation observed in PWR steam generators (SGs) - related to SG free span fouling and SG clogging - has now been reported for several years. SG clogging is a localized phenomenon observed between the leading edge of the Tube Support Plate (TSP) and the SG tubing. Based on visual inspections, it was found that the gaps between the SG tubing and the TSP at the lower part of the broached holes were becoming progressively blocked. Therefore, for safe operation, the most affected PWRs had to be operated at reduced power. TSP blockage was mainly observed for low-pH water chemistry conditioning, which directly depends on the operating water chemistry. The TSP blockage mechanism is complex because the localized flow pattern changes and the chemistry and electrochemical conditions are not well understood. Electrokinetic considerations may explain the coupling of chemistry, materials and thermohydraulic (T/H) conditions. In this framework, AREVA and EDF have launched a long-term R and D program to understand the mechanisms driving the formation of SG clogging. This study, based on parametric laboratory tests, aims to assess the role of secondary water chemistry, material and T/H conditions on deposit formation. The experimental approach focused on electrokinetic measurements of metallic substrates and on the assessment of the oxidation properties of materials in secondary side chemistry. An overall analysis of recent results is presented to address SG deposit formation in secondary water chemistry for various conditioning amines - morpholine, ethanolamine and dimethylamine. To complete the study, the experimental results have been correlated with CFD simulations of particle deposition by means of stochastic Lagrangian models. These calculations have in particular correctly reproduced the location of the most important particle deposit (the leading edge of the test tube), and have stressed the influence of the

  19. Numerical and experimental approaches to study soil transport and clogging in granular filters

    Science.gov (United States)

    Kanarska, Y.; Smith, J. J.; Ezzedine, S. M.; Lomov, I.; Glascoe, L. G.

    2012-12-01

    Failure of a dam by erosion ranks among the most serious accidents in civil engineering. The best way to prevent internal erosion is using adequate granular filters in the transition areas where important hydraulic gradients can appear. In case of cracking and erosion, if the filter is capable of retaining the eroded particles, the crack will seal and the dam safety will be ensured. Numerical modeling has proved to be a cost-effective tool for improving our understanding of physical processes. Traditionally, the consideration of flow and particle transport in porous media has focused on treating the media as a continuum. Practical models typically address flow and transport based on Darcy's law as a function of a pressure gradient and a medium-dependent permeability parameter. Additional macroscopic constitutive relations describe porosity and permeability changes during the migration of a suspension through porous media. However, most of them rely on empirical correlations, which often need to be recalibrated for each application. Grain-scale modeling can be used to gain insight into the scale dependence of continuum macroscale parameters. A finite element numerical solution of the Navier-Stokes equations for fluid flow, together with a Lagrange multiplier technique for solid particles, was applied to the simulation of soil filtration in the filter layers of a gravity dam. The numerical approach was validated through comparison of numerical simulations with the experimental results of base soil particle clogging in the filter layers performed at ERDC. The numerical simulation correctly predicted flow and pressure decay due to particle clogging. The base soil particle distribution was almost identical to those measured in the laboratory experiment. It is believed that the agreement between simulations and experimental data demonstrates the applicability of the proposed approach for prediction of the soil transport and clogging in embankment dams.
To get more precise understanding of
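
    The continuum (Darcy's law) description mentioned in this record can be sketched in a few lines; the fluid and medium properties below are generic illustrative values, not those of the ERDC experiments.

```python
# Minimal illustration of the continuum description of flow in porous media:
# Darcy's law gives the superficial flux q as a function of the pressure
# gradient and a medium-dependent permeability. Values are illustrative.

def darcy_flux(permeability_m2, viscosity_pa_s, dp_pa, length_m):
    """Darcy flux q = -(k/mu) * dP/dx, in m/s (superficial velocity)."""
    return -(permeability_m2 / viscosity_pa_s) * (dp_pa / length_m)

# Water (mu ~ 1e-3 Pa.s) driven through 0.5 m of clean sand (k ~ 1e-11 m^2)
# by a 10 kPa pressure drop (dP = -10 kPa in the flow direction):
q = darcy_flux(1e-11, 1e-3, -10_000.0, 0.5)
print(f"Darcy flux: {q:.1e} m/s")
```

    Clogging enters this picture by reducing the permeability k over time, which is exactly what grain-scale simulation is used to quantify.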

  20. Investment in new product reliability

    International Nuclear Information System (INIS)

    Murthy, D.N.P.; Rausand, M.; Virtanen, S.

    2009-01-01

    Product reliability is of great importance to both manufacturers and customers. Building reliability into a new product is costly, but the consequences of inadequate product reliability can be costlier. This implies that manufacturers need to decide on the optimal investment in new product reliability by achieving a suitable trade-off between the two costs. This paper develops a framework and proposes an approach to help manufacturers decide on the investment in new product reliability.
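
    The trade-off described above can be sketched with a toy cost model (not the authors' model): development cost rises steeply as target reliability approaches 1, while expected failure cost falls, and the optimal investment minimizes their sum. The cost functions and constants are illustrative assumptions.

```python
# Toy sketch of the reliability-investment trade-off. C_dev(r) diverges as
# r -> 1 (it gets ever more expensive to build in reliability), while the
# expected failure cost falls linearly. Both cost models are assumptions.

def total_cost(r, dev_scale=10.0, failure_cost=500.0):
    development = dev_scale * r / (1.0 - r)      # diverges as r -> 1
    warranty = failure_cost * (1.0 - r)          # expected cost of failures
    return development + warranty

def optimal_reliability(step=0.001):
    """Grid search for the reliability level minimizing total cost."""
    candidates = [i * step for i in range(1, int(1 / step))]
    return min(candidates, key=total_cost)

r_opt = optimal_reliability()
print(f"optimal reliability target: {r_opt:.3f}")
```

    For these illustrative constants the optimum sits near r = 0.86: investing beyond that point costs more in development than it saves in failures.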

  1. Response surface methodology approach for structural reliability analysis: An outline of typical applications performed at CEC-JRC, Ispra

    International Nuclear Information System (INIS)

    Lucia, A.C.

    1982-01-01

    The paper presents the main results of the work carried out at JRC-Ispra on the specific problems posed by applying the response surface methodology to the exploration of structural and nuclear reactor safety codes. Several relevant studies have been completed: assessment of structural behaviour under seismic events; determination of the probability of coherent blockage in LWR fuel elements due to a LOCA; analysis of ATWS consequences in PWR reactors by means of the ALMOD code; and analysis of the first wall of an experimental fusion reactor by means of the Bersafe code. (orig.)
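
    The response surface idea can be sketched on a toy one-dimensional problem (the JRC applications are multivariate): run the expensive code at a few design points, fit a quadratic response surface, then estimate the failure probability by cheap Monte Carlo sampling of the surface. The stand-in code, the limit-state value of 1.0 and the load distribution below are all assumptions for illustration.

```python
import random

# Hedged sketch of response surface methodology for reliability analysis.
# expensive_code stands in for a costly structural/safety code run.

def expensive_code(x):
    return 0.2 * x * x + 0.1 * x        # illustrative stand-in response

def fit_quadratic(xs, ys):
    """Least-squares fit of y = c0 + c1*x + c2*x^2 via normal equations,
    solved with a small Gaussian elimination."""
    a = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    for col in range(2):
        for row in range(col + 1, 3):
            f = a[row][col] / a[col][col]
            a[row] = [r - f * p for r, p in zip(a[row], a[col])]
            b[row] -= f * b[col]
    c = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                  # back substitution
        c[i] = (b[i] - sum(a[i][j] * c[j] for j in range(i + 1, 3))) / a[i][i]
    return c

design = [-2.0, -1.0, 0.0, 1.0, 2.0]                 # a few code runs only
c0, c1, c2 = fit_quadratic(design, [expensive_code(x) for x in design])

# Failure probability from the cheap surrogate (load x ~ N(0, 1.5),
# failure when the response exceeds the assumed limit-state value 1.0):
rng = random.Random(0)
samples = [rng.gauss(0.0, 1.5) for _ in range(100_000)]
p_fail = sum(c0 + c1 * x + c2 * x * x > 1.0 for x in samples) / len(samples)
```

    The point of the method is that the 100,000 surrogate evaluations replace 100,000 runs of the expensive code, at the price of the surface-fitting error.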

  2. Experimental approach and micro-mechanical modeling of the creep behavior of irradiated zirconium alloys

    International Nuclear Information System (INIS)

    Ribis, J.

    2007-12-01

    The fuel rod cladding, strongly affected by microstructural changes due to irradiation such as a high density of dislocation loops, is strained by the end-of-life fuel rod internal pressure and the potential release of fission gases and helium during dry storage. Within the temperature range expected during dry interim storage, the cladding undergoes long-term creep under over-pressure. So, in order to have a predictive approach to the behavior of zirconium alloy cladding in dry storage conditions, it is essential to take into account: initial dislocation loops, thermal annealing of loops and creep straining due to over-pressure. Specific experiments and modelling for irradiated samples have been developed to improve our knowledge in this field. A Zr-1%Nb-O alloy was studied using fine microstructural investigations and mechanical testing. Observations by transmission electron microscopy show that the high density of loops disappears during a heat treatment: the loops grow in size while their density falls. Microhardness tests reveal that the fall in loop density leads to softening of the irradiated material. During a creep test, both temperature and applied stress are responsible for the disappearance of loops. The loops could be swept by the activation of the basal slip system while the prism slip system is inhibited. Once deprived of loops, the creep properties of the irradiated material are close to those of the non-irradiated state, a result whose consequence is a sudden acceleration of the creep rate. Finally, a micro-mechanical model based on microscopic deformation mechanisms, taking into account the experimental dislocation loop analyses and creep tests, was used for a predictive approach by constructing a deformation mechanism map of the creep behavior of the irradiated material. (author)

  3. The Australian national reactive phosphate rock project - Aims, experimental approach, and site characteristics

    International Nuclear Information System (INIS)

    McLaughlin, M.J.

    2002-01-01

    Field-based cutting trials were established across Australia in a range of environments to evaluate the agronomic effectiveness of 5 phosphate rocks, and 1 partially acidulated phosphate rock, relative to either single superphosphate or triple superphosphate. The phosphate rocks differed in reactivity, as determined by the degree of carbonate substitution for phosphate in the apatite structure and the solubility of the phosphorus present in the fertilizers in 2% formic acid, 2% citric acid and neutral ammonium citrate. Sechura (Bayovar) and North Carolina phosphate rocks were highly reactive (>70% solubility in 2% formic acid), whilst Khouribja (Moroccan) and Hamrawein (Egypt) phosphate rocks were moderately reactive. Duchess phosphate rock from Queensland was relatively unreactive. Soil pH (CaCl2) ranged from 4.0 to 5.1, and Colwell extractable phosphorus ranged from 3 to 47 μg/g prior to fertilizer application. Two core experiments were established at each site. The first measured the effects of phosphate rock reactivity on agronomic effectiveness, while the second core experiment measured the effects of the degree of water solubility of the phosphorus source on agronomic effectiveness. The National Reactive Phosphate Rock Project trials provided the opportunity to confirm the suitability of accepted procedures to model fertilizer response and to develop new approaches for comparing different fertilizer responses. The Project also provided the framework for subsidiary studies such as the effect of fertilizer source on soil phosphorus extractability; cadmium and fluorine concentrations in herbage; evaluation of soil phosphorus tests; and the influence of particle size on phosphate rock effectiveness. The National Reactive Phosphate Rock Project presents a valuable model for a large, Australia-wide, collaborative team approach to an important agricultural issue. The use of standard and consistent experimental methodologies at every site ensured that maximum benefit was obtained from data

  4. Does Improved Water Access Increase Child School Attendance? A Quasi-Experimental Approach From Rural Ethiopia

    Science.gov (United States)

    Masuda, Y.; Cook, J.

    2012-12-01

    not measure the portion of children that engage in both activities. Indeed, children may very well be "attending" school according to an enrollment measure, but they may be doing so at low rates that prevent them from advancing to higher grade levels. Although enrollment rates may remain constant pre- and post-water access, school attendance may increase with the provision of water. This paper overcomes previous limitations by utilizing panel data from a quasi-experimental study and a continuous measure for school attendance collected over one year via random school attendance checks. In total, we collected data on 642 children from randomly selected households. Using a difference-in-difference estimator, our preliminary analysis finds that water access increases school attendance by 6%, statistically significant at the 5% level. When using school enrollment as the outcome variable, preliminary analysis finds that water access increases enrollment by 3%, although this is only marginally significant at the 10% level. Data on schooling via random school attendance checks provide a more reliable measure for the true impact of water access on schooling, and our preliminary findings suggest that the impact may be higher than previously estimated.
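
    The difference-in-difference estimator used above can be sketched on made-up attendance rates (the real analysis works on panel data with covariates): the effect is the change in the treated group minus the change in the control group, which nets out common time trends.

```python
# Sketch of a difference-in-difference estimate. The attendance rates are
# invented for illustration; only the estimator's logic is from the record.

def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """(change in treated group) - (change in control group)."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Illustrative attendance rates (fraction of random checks at which a child
# was present), before and after a water point was installed:
effect = diff_in_diff(treat_pre=0.70, treat_post=0.78,
                      ctrl_pre=0.71, ctrl_post=0.73)
print(f"estimated effect: {effect:+.2f}")
```

    Subtracting the control group's change guards against attributing a region-wide trend (a good harvest year, a new school policy) to the water intervention.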

  5. Review of cause-based decision tree approach for the development of domestic standard human reliability analysis procedure in low power/shutdown operation probabilistic safety assessment

    International Nuclear Information System (INIS)

    Kang, D. I.; Jung, W. D.

    2003-01-01

    We review the Cause-Based Decision Tree (CBDT) approach to decide whether to incorporate it in the development of a domestic standard Human Reliability Analysis (HRA) procedure for low power/shutdown operation Probabilistic Safety Assessment (PSA). In this paper, we introduce the cause-based decision tree approach, quantify human errors using it, and identify its merits and demerits in comparison with the previously used THERP. The review results show that it is difficult to incorporate the CBDT method in the development of the domestic standard HRA procedure for low power/shutdown PSA because the CBDT method, like THERP, requires the subjective judgment of the HRA analyst. However, it is expected that incorporating the CBDT method into the development of the domestic standard HRA procedure, if only for the comparison of quantitative HRA results, will relieve the burden of developing a detailed HRA procedure and will help maintain consistent quantitative HRA results

  6. Prenatal exposure to maternal smoking and childhood behavioural problems: a quasi-experimental approach.

    Science.gov (United States)

    McCrory, Cathal; Layte, Richard

    2012-11-01

    This retrospective cross-sectional paper examines the relationship between maternal smoking during pregnancy and children's behavioural problems at 9 years of age independent of a wide range of possible confounders. The final sample comprised 7,505 nine-year-old school children participating in the first wave of the Growing Up in Ireland study. The children were selected through the Irish national school system using a 2-stage sampling method and were representative of the nine-year population. Information on maternal smoking during pregnancy was obtained retrospectively at 9 years of age via parental recall and children's behavioural problems were assessed using the Strengths and Difficulties Questionnaire across separate parent and teacher-report instruments. A quasi-experimental approach using propensity score matching was used to create treatment (smoking) and control (non-smoking) groups which did not differ significantly in their propensity to smoke in terms of 16 observed characteristics. After matching on the propensity score, children whose mothers smoked during pregnancy were 3.5 % (p parent and teacher-report respectively. Maternal smoking during pregnancy was more strongly associated with externalising than internalising behavioural problems. Analysis of the dose-response relationship showed that the differential between matched treatment and control groups increased with level of maternal smoking. Given that smoking is a modifiable risk factor, the promotion of successful cessation in pregnancy may prevent potentially adverse long-term consequences.
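
    The matching step of the propensity score approach used above can be sketched in isolation: given precomputed propensity scores, each exposed child is matched to the nearest-scoring unexposed child, and the treatment effect on the treated (ATT) is the mean outcome difference across matched pairs. The scores and outcome values below are invented; the real study matched on 16 observed characteristics.

```python
# Sketch of 1-nearest-neighbour propensity score matching. Scores and
# outcomes are illustrative placeholders, not data from the study.

def nearest_neighbour_att(treated, controls):
    """treated/controls: lists of (propensity_score, outcome).
    Returns the average treatment effect on the treated (ATT)."""
    diffs = []
    for score, outcome in treated:
        _, matched_outcome = min(controls, key=lambda c: abs(c[0] - score))
        diffs.append(outcome - matched_outcome)
    return sum(diffs) / len(diffs)

treated = [(0.62, 14.0), (0.54, 11.0), (0.71, 16.0)]   # (score, outcome score)
controls = [(0.60, 12.0), (0.50, 10.0), (0.70, 13.0), (0.30, 9.0)]
att = nearest_neighbour_att(treated, controls)
```

    Matching on the propensity score compares each exposed child only with unexposed children who were similarly likely to be exposed, which is what makes the design quasi-experimental.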

  7. Assessing Neurocognition via Gamified Experimental Logic: A novel approach to simultaneous acquisition of multiple ERPs

    Directory of Open Access Journals (Sweden)

    Ajay Kumar Nair

    2016-01-01

    Full Text Available The present study describes the development of a neurocognitive paradigm: 'Assessing Neurocognition via Gamified Experimental Logic' (ANGEL), for performing the parametric evaluation of multiple neurocognitive functions simultaneously. ANGEL employs an audiovisual sensory motor design for the acquisition of multiple event related potentials (ERPs) - the C1, P50, MMN, N1, N170, P2, N2pc, LRP, P300 and ERN. The ANGEL paradigm allows assessment of ten neurocognitive variables over the course of three 'game' levels of increasing complexity ranging from simple passive observation to complex discrimination and response in the presence of multiple distractors. The paradigm allows assessment of several levels of rapid decision making: speeded up response vs response-inhibition; responses to easy vs difficult tasks; responses based on gestalt perception of clear vs ambiguous stimuli; and finally, responses with set shifting during challenging tasks. The paradigm has been tested using 18 healthy participants from both sexes and the possibilities of varied data analyses have been presented in this paper. The ANGEL approach provides an ecologically valid assessment (as compared to existing tools) that quickly yields a very rich dataset and helps to assess multiple ERPs that can be studied extensively to assess cognitive functions in health and disease conditions.

  8. Assessing Neurocognition via Gamified Experimental Logic: A Novel Approach to Simultaneous Acquisition of Multiple ERPs.

    Science.gov (United States)

    Nair, Ajay K; Sasidharan, Arun; John, John P; Mehrotra, Seema; Kutty, Bindu M

    2016-01-01

    The present study describes the development of a neurocognitive paradigm: "Assessing Neurocognition via Gamified Experimental Logic" (ANGEL), for performing the parametric evaluation of multiple neurocognitive functions simultaneously. ANGEL employs an audiovisual sensory motor design for the acquisition of multiple event related potentials (ERPs)-the C1, P50, MMN, N1, N170, P2, N2pc, LRP, P300, and ERN. The ANGEL paradigm allows assessment of 10 neurocognitive variables over the course of three "game" levels of increasing complexity ranging from simple passive observation to complex discrimination and response in the presence of multiple distractors. The paradigm allows assessment of several levels of rapid decision making: speeded up response vs. response-inhibition; responses to easy vs. difficult tasks; responses based on gestalt perception of clear vs. ambiguous stimuli; and finally, responses with set shifting during challenging tasks. The paradigm has been tested using 18 healthy participants from both sexes and the possibilities of varied data analyses have been presented in this paper. The ANGEL approach provides an ecologically valid assessment (as compared to existing tools) that quickly yields a very rich dataset and helps to assess multiple ERPs that can be studied extensively to assess cognitive functions in health and disease conditions.

  9. Surface enhanced Raman spectroscopic studies on aspirin : An experimental and theoretical approach

    Energy Technology Data Exchange (ETDEWEB)

    Premkumar, R.; Premkumar, S.; Parameswari, A.; Mathavan, T.; Benial, A. Milton Franklin, E-mail: miltonfranklin@yahoo.com [Department of Physics, N.M.S.S.V.N College, Madurai-625019, Tamilnadu, India. (India); Rekha, T. N. [PG and Research Department of Physics, Lady Doak College, Madurai-625 002, Tamilnadu, India. (India)

    2016-05-06

    Surface enhanced Raman scattering (SERS) studies of aspirin molecules adsorbed on silver nanoparticles (AgNPs) were carried out by experimental and density functional theory approaches. The AgNPs were synthesized by the solution-combustion method and characterized by X-ray diffraction and high resolution-transmission electron microscopy techniques. The average particle size of the synthesized AgNPs was calculated as ∼55 nm. The normal Raman spectrum (nRs) and SERS spectrum of aspirin were recorded. The molecular structures of aspirin and of aspirin adsorbed on a silver cluster were optimized by the DFT/B3PW91 method with the LanL2DZ basis set. The vibrational frequencies were calculated and assigned on the basis of a potential energy distribution calculation. The calculated nRs and SERS frequencies correlated well with the observed frequencies. A flat-on orientation was predicted from the nRs and SERS spectra when aspirin was adsorbed on the AgNPs. Hence, the present studies lead to an understanding of the adsorption process of aspirin on AgNPs, which paves the way for biomedical applications.

  10. Surface enhanced Raman spectroscopic studies on aspirin : An experimental and theoretical approach

    International Nuclear Information System (INIS)

    Premkumar, R.; Premkumar, S.; Parameswari, A.; Mathavan, T.; Benial, A. Milton Franklin; Rekha, T. N.

    2016-01-01

    Surface enhanced Raman scattering (SERS) studies of aspirin molecules adsorbed on silver nanoparticles (AgNPs) were carried out by experimental and density functional theory approaches. The AgNPs were synthesized by the solution-combustion method and characterized by X-ray diffraction and high resolution-transmission electron microscopy techniques. The average particle size of the synthesized AgNPs was calculated as ∼55 nm. The normal Raman spectrum (nRs) and SERS spectrum of aspirin were recorded. The molecular structures of aspirin and of aspirin adsorbed on a silver cluster were optimized by the DFT/B3PW91 method with the LanL2DZ basis set. The vibrational frequencies were calculated and assigned on the basis of a potential energy distribution calculation. The calculated nRs and SERS frequencies correlated well with the observed frequencies. A flat-on orientation was predicted from the nRs and SERS spectra when aspirin was adsorbed on the AgNPs. Hence, the present studies lead to an understanding of the adsorption process of aspirin on AgNPs, which paves the way for biomedical applications.

  11. Experimental approaches to identify cellular G-quadruplex structures and functions.

    Science.gov (United States)

    Di Antonio, Marco; Rodriguez, Raphaël; Balasubramanian, Shankar

    2012-05-01

    Guanine-rich nucleic acids can fold into non-canonical DNA secondary structures called G-quadruplexes. The formation of these structures can interfere with biology that is crucial to sustaining cellular homeostasis and metabolism, via mechanisms that include transcription, translation, splicing, telomere maintenance and DNA recombination. Thus, due to their implication in several biological processes and possible role in promoting genomic instability, G-quadruplex-forming sequences have emerged as potential therapeutic targets. There has been a growing interest in the development of synthetic molecules and biomolecules for sensing G-quadruplex structures in cellular DNA. In this review, we summarise and discuss recent methods developed for cellular imaging of G-quadruplexes, and the application of experimental genomic approaches to detect G-quadruplexes throughout genomic DNA. In particular, we will discuss the use of engineered small molecules and natural proteins to enable pull-down, ChIP-Seq, ChIP-chip and fluorescence imaging of G-quadruplex structures in cellular DNA. Copyright © 2012 Elsevier Inc. All rights reserved.
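
    A common sequence-based complement to the experimental approaches reviewed above is scanning DNA for putative quadruplex-forming sequences with the widely used pattern of four G-tracts (three or more Gs) separated by short loops of 1-7 nucleotides. Such a scan predicts folding potential only, not structures actually folded in cells, which is precisely why the pull-down, ChIP and imaging methods in the record are needed.

```python
import re

# Putative G-quadruplex motif: four runs of >= 3 guanines separated by
# loops of 1-7 nucleotides. This is the classic in-silico pattern; it
# flags potential, not experimentally confirmed, quadruplexes.
G4_PATTERN = re.compile(r"(?:G{3,}[ACGT]{1,7}){3}G{3,}", re.IGNORECASE)

def find_putative_g4(sequence):
    """Return (start_index, matched_sequence) for each non-overlapping hit."""
    return [(m.start(), m.group()) for m in G4_PATTERN.finditer(sequence)]

# The human telomeric repeat (TTAGGG)n contains an obvious hit:
hits = find_putative_g4("TTAGGG" * 5)
```

    On the telomeric repeat the scan reports a single hit beginning at the first G-tract, consistent with the telomere maintenance role mentioned above.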

  12. The emotional and reconstructive determinants of emotional memories: an experimental approach to flashbulb memory investigation.

    Science.gov (United States)

    Lanciano, Tiziana; Curci, Antonietta; Semin, Gun R

    2010-07-01

    Flashbulb memories (FBMs) are vivid and detailed memories of the reception context of a public emotional event. Brown and Kulik (1977) introduced the label FBM to suggest the idea that individuals are able to preserve knowledge of an event in an indiscriminate way, in analogy with a photograph that preserves all details of a scene. Research on FBMs has primarily been conducted using a naturalistic approach in order to explore the role of emotional and reconstructive factors in FBM formation and maintenance. Nevertheless, these studies lack sufficient control over the factors that might intervene in the process of FBM formation. The present studies experimentally investigate the role of emotional and reconstructive factors in emotionally charged memories, specifically FBMs. Paralleling FBM findings, the two studies revealed that simply being in an emotional state allows people to remember all available information, including irrelevant and unrelated details. Furthermore, the resulting memories are affected by reconstructive processes, so they are not as accurate as their richness of detail would suggest.

  13. Numerical and experimental approaches to simulate soil clogging in porous media

    Science.gov (United States)

    Kanarska, Yuliya; LLNL Team

    2012-11-01

    Failure of a dam by erosion ranks among the most serious accidents in civil engineering. The best way to prevent internal erosion is using adequate granular filters in the transition areas where important hydraulic gradients can appear. In case of cracking and erosion, if the filter is capable of retaining the eroded particles, the crack will seal and the dam safety will be ensured. A finite element numerical solution of the Navier-Stokes equations for fluid flow, together with a Lagrange multiplier technique for solid particles, was applied to the simulation of soil filtration. The numerical approach was validated through comparison of numerical simulations with the experimental results of base soil particle clogging in the filter layers performed at ERDC. The numerical simulation correctly predicted flow and pressure decay due to particle clogging. The base soil particle distribution was almost identical to those measured in the laboratory experiment. To get a more precise understanding of soil transport in granular filters, we investigated the sensitivity of particle clogging mechanisms to factors such as particle size ratio, the amplitude of the hydraulic gradient, particle concentration and contact properties. By averaging the results derived from the grain-scale simulations, we investigated how those factors affect the semi-empirical multiphase model parameters in the large-scale simulation tool. The Department of Homeland Security Science and Technology Directorate provided funding for this research.
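
    The continuum-scale consequence of clogging, permeability collapse as deposited particles reduce porosity, can be illustrated with a Kozeny-Carman-type relation, k proportional to phi^3 / (1 - phi)^2. The grain diameter and porosity values below are generic illustrations, not calibrated to the ERDC experiments.

```python
# Toy continuum-scale illustration of clogging: as deposited particles
# reduce porosity phi, permeability falls following a Kozeny-Carman-type
# relation. Grain diameter and porosities are illustrative assumptions.

def kozeny_carman(phi, grain_d=1e-3):
    """Permeability (m^2) of a packed bed of grain diameter grain_d (m)."""
    return grain_d ** 2 * phi ** 3 / (180.0 * (1.0 - phi) ** 2)

# Porosity falls from 0.40 to 0.25 as particles deposit in the filter:
k_clean = kozeny_carman(0.40)
k_clogged = kozeny_carman(0.25)
ratio = k_clogged / k_clean
print(f"permeability retained after clogging: {ratio:.3f}")
```

    The strong (cubic) dependence on porosity is why moderate deposition produces the sharp flow and pressure decay that the simulations in the record predict.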

  14. Development of the Digestive System-Experimental Challenges and Approaches of Infant Lipid Digestion.

    Science.gov (United States)

    Abrahamse, Evan; Minekus, Mans; van Aken, George A; van de Heijning, Bert; Knol, Jan; Bartke, Nana; Oozeer, Raish; van der Beek, Eline M; Ludwig, Thomas

    2012-12-01

    At least during the first 6 months after birth, the nutrition of infants should ideally consist of human milk which provides 40-60 % of energy from lipids. Beyond energy, human milk also delivers lipids with a specific functionality, such as essential fatty acids (FA), phospholipids, and cholesterol. Healthy development, especially of the nervous and digestive systems, depends fundamentally on these. Epidemiological data suggest that human milk provides unique health benefits during early infancy that extend to long-lasting benefits. Preclinical findings show that qualitative changes in dietary lipids, i.e., lipid structure and FA composition, during early life may contribute to the reported long-term effects. Little is known in this respect about the development of digestive function and the digestion and absorption of lipids by the newborn. This review gives a detailed overview of the distinct functionalities that dietary lipids from human milk and infant formula provide and the profound differences in the physiology and biochemistry of lipid digestion between infants and adults. Fundamental mechanisms of infant lipid digestion can, however, almost exclusively be elucidated in vitro. Experimental approaches and their challenges are reviewed in depth.

  15. A Multi-State Physics Modeling approach for the reliability assessment of Nuclear Power Plants piping systems

    International Nuclear Information System (INIS)

    Di Maio, Francesco; Colli, Davide; Zio, Enrico; Tao, Liu; Tong, Jiejuan

    2015-01-01

    Highlights: • We model piping systems degradation of Nuclear Power Plants under uncertainty. • We use Multi-State Physics Modeling (MSPM) to describe a continuous degradation process. • We propose a Monte Carlo (MC) method for calculating time-dependent transition rates. • We apply MSPM to a piping system undergoing thermal fatigue. - Abstract: A Multi-State Physics Modeling (MSPM) approach is here proposed for degradation modeling and failure probability quantification of Nuclear Power Plants (NPPs) piping systems. This approach integrates multi-state modeling, which describes the degradation process by transitions among discrete states (e.g., no damage, micro-crack, flaw, rupture, etc.), with physics modeling by (physics) equations that describe the continuous degradation process within the states. We propose a Monte Carlo (MC) simulation method for the evaluation of the time-dependent transition rates between the states of the MSPM. Account is taken of the uncertainty in the parameters and external factors influencing the degradation process. The proposed modeling approach is applied to a benchmark problem of a piping system of a Pressurized Water Reactor (PWR) undergoing thermal fatigue. The results are compared with those obtained by a continuous-time homogeneous Markov Chain Model.
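
    The Monte Carlo evaluation of a time-dependent transition rate described in the abstract can be sketched as follows. The degradation physics, the lognormal parameter scatter, and all numbers here are invented placeholders, not the paper's thermal-fatigue model: the idea is only that the rate at time t is the fraction of trajectories still in the state at t that leave it during the next small interval.

```python
import random

def estimate_transition_rate(transition_times, t, dt):
    """Empirical time-dependent rate lambda(t): among trajectories still in
    the state at time t, the fraction leaving during [t, t + dt), per dt."""
    at_risk = [s for s in transition_times if s >= t]
    if not at_risk:
        return 0.0
    leaving = sum(1 for s in at_risk if s < t + dt)
    return leaving / (len(at_risk) * dt)

# Hypothetical physics: a micro-crack becomes a flaw after growing a fixed
# depth; the growth rate is an uncertain parameter sampled per MC trial.
random.seed(0)
times = [5.0 / random.lognormvariate(0.0, 0.3) for _ in range(20000)]

rate_at_4y = estimate_transition_rate(times, 4.0, 0.5)  # per year
```

Repeating the estimate on a grid of t values yields the transition-rate curve fed to the multi-state model.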

  16. Teaching psychomotor skills to beginning nursing students using a web-enhanced approach: a quasi-experimental study.

    Science.gov (United States)

    Salyers, Vincent L

    2007-01-01

    To begin to address the problem of psychomotor skills deficiencies observed in many new graduate nurses, a skills laboratory course was developed using a web-enhanced approach. In this quasi-experimental study, the control group attended weekly lectures, observed skill demonstrations by faculty, practiced skills, and were evaluated on skill performance. The experimental group learned course content using a web-enhanced approach. This allowed students to learn course material outside of class at times convenient for them, thus they had more time during class to perfect psychomotor skills. The experimental group performed better on the final cognitive examination. Students in the traditional sections were more satisfied with the course, however. It was concluded that a web-enhanced approach for teaching psychomotor skills can provide a valid alternative to traditional skills laboratory formats.

  17. Correct approach to consideration of experimental resolution in parametric analysis of scaling violation in deep inelastic lepton-nucleon interaction

    International Nuclear Information System (INIS)

    Ammosov, V.V.; Usubov, Z.U.; Zhigunov, V.P.

    1990-01-01

    The problem of parametric analysis of scaling violation in deep inelastic lepton-nucleon interactions in the framework of quantum chromodynamics (QCD) is considered. For a correct treatment of the experimental resolution we use the χ²-method, which is demonstrated by numerical experiments and by analysis of the 15-foot bubble chamber neutrino experimental data. The model parameters obtained with this approach differ noticeably from those obtained earlier. (orig.)
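
    A standard way to fold experimental resolution into such a χ²-method is forward folding: smear the model prediction with a response matrix rather than unfolding the data. The sketch below uses an invented three-bin toy with a single normalization parameter, not the paper's QCD parametrization; the migration matrix, data, and uncertainties are all illustrative.

```python
def chi2(params, data, sigma, response, model):
    """chi2 with the detector resolution folded into the prediction:
    sum_i (d_i - (R @ m(p))_i)^2 / sigma_i^2."""
    pred = model(params)
    smeared = [sum(response[i][j] * pred[j] for j in range(len(pred)))
               for i in range(len(data))]
    return sum((d - s) ** 2 / sg ** 2
               for d, s, sg in zip(data, smeared, sigma))

# Invented toy: one normalization parameter p, 10% bin-to-bin migration.
shape = [1.0, 2.0, 1.0]
R = [[0.9, 0.1, 0.0],
     [0.1, 0.8, 0.1],
     [0.0, 0.1, 0.9]]
data = [2.0, 3.4, 2.0]
sigma = [0.2, 0.2, 0.2]
model = lambda p: [p[0] * s for s in shape]

# Brute-force scan over the single parameter.
best_chi2, best_p = min((chi2([p], data, sigma, R, model), p)
                        for p in (i * 0.01 for i in range(100, 300)))
```

Ignoring the response matrix (fitting the smeared data with the unsmeared model) would bias the recovered parameter, which is the effect the abstract warns about.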

  18. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
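
    Rate estimation problems of the kind mentioned (e.g., the gas pipeline leak rate) are commonly handled with a conjugate Gamma-Poisson model; the prior and observed data below are invented for illustration, not taken from the course.

```python
def gamma_poisson_update(a, b, k, T):
    """Conjugate update: prior Gamma(a, b) on a constant failure rate,
    k failures observed over exposure T -> posterior Gamma(a + k, b + T)."""
    return a + k, b + T

a0, b0 = 2.0, 1000.0          # invented prior: mean rate 2e-3 per hour
k, T = 3, 5000.0              # invented data: 3 failures in 5000 hours
a1, b1 = gamma_poisson_update(a0, b0, k, T)
posterior_mean = a1 / b1      # point estimate of the failure rate
```

The posterior mean (a0 + k) / (b0 + T) shows the characteristic Bayesian blend: with little exposure it stays near the prior mean, and with long exposure it approaches the empirical rate k / T.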

  19. Comparison of two model approaches in the Zambezi river basin with regard to model reliability and identifiability

    Directory of Open Access Journals (Sweden)

    H. C. Winsemius

    2006-01-01

    Full Text Available Variations of water stocks in the upper Zambezi river basin have been determined by 2 different hydrological modelling approaches. The purpose was to provide preliminary terrestrial storage estimates in the upper Zambezi, which will be compared with estimates derived from the Gravity Recovery And Climate Experiment (GRACE in a future study. The first modelling approach is GIS-based, distributed and conceptual (STREAM). The second approach uses Lumped Elementary Watersheds identified and modelled conceptually (LEW). The STREAM model structure has been assessed using GLUE (Generalized Likelihood Uncertainty Estimation) a posteriori to determine parameter identifiability. The LEW approach could, in addition, be tested for model structure, because computational efforts of LEW are low. Both models are threshold models, where the non-linear behaviour of the Zambezi river basin is explained by a combination of thresholds and linear reservoirs. The models were forced by time series of gauged and interpolated rainfall. Where available, runoff station data was used to calibrate the models. Ungauged watersheds were generally given the same parameter sets as their neighbouring calibrated watersheds. It appeared that the LEW model structure could be improved by applying GLUE iteratively. Eventually, it led to better identifiability of parameters and consequently a better model structure than the STREAM model. Hence, the final model structure obtained better represents the true hydrology. After calibration, both models show a comparable efficiency in representing discharge. However, the LEW model shows a far greater storage amplitude than the STREAM model. This emphasizes the storage uncertainty related to hydrological modelling in data-scarce environments such as the Zambezi river basin. It underlines the need and potential for independent observations of terrestrial storage to enhance our understanding and modelling capacity of the hydrological processes.
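
    The GLUE assessment referred to above can be sketched in a few lines: sample parameter sets, score each against observations with a likelihood measure, and keep the "behavioural" sets above a threshold; the spread of retained values indicates parameter identifiability. The linear-reservoir toy model, synthetic observations, and threshold below are invented stand-ins for STREAM/LEW.

```python
import random

def nash_sutcliffe(obs, sim):
    """Likelihood measure used here for GLUE (other choices are possible)."""
    mean_obs = sum(obs) / len(obs)
    err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - err / var

def linear_reservoir(rain, k):
    """Toy stand-in for a watershed model: storage drains a fraction k per step."""
    storage, runoff = 0.0, []
    for r in rain:
        storage += r
        q = k * storage
        storage -= q
        runoff.append(q)
    return runoff

random.seed(1)
rain = [10.0 * random.random() for _ in range(50)]
obs = linear_reservoir(rain, 0.3)          # synthetic "truth", k = 0.3

# GLUE: sample parameter sets, retain the behavioural ones.
behavioural = [k for k in (random.uniform(0.01, 0.99) for _ in range(2000))
               if nash_sutcliffe(obs, linear_reservoir(rain, k)) > 0.9]
```

A narrow cluster of behavioural k values around the true value means the parameter is identifiable; a wide, flat cluster signals the identifiability problems the abstract discusses.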

  20. A combined experimental and mathematical approach for molecular-based optimization of irinotecan circadian delivery.

    Directory of Open Access Journals (Sweden)

    Annabelle Ballesta

    2011-09-01

    Full Text Available Circadian timing largely modifies efficacy and toxicity of many anticancer drugs. Recent findings suggest that optimal circadian delivery patterns depend on the patient genetic background. We present here a combined experimental and mathematical approach for the design of chronomodulated administration schedules tailored to the patient molecular profile. As a proof of concept we optimized exposure of Caco-2 colon cancer cells to irinotecan (CPT11), a cytotoxic drug approved for the treatment of colorectal cancer. CPT11 was bioactivated into SN38 and its efflux was mediated by ATP-Binding-Cassette (ABC) transporters in Caco-2 cells. After cell synchronization with a serum shock defining Circadian Time (CT) 0, circadian rhythms with a period of 26 h 50 min (SD 63 min) were observed in the mRNA expression of clock genes REV-ERBα, PER2, BMAL1, the drug target topoisomerase 1 (TOP1), the activation enzyme carboxylesterase 2 (CES2), the deactivation enzyme UDP-glucuronosyltransferase 1, polypeptide A1 (UGT1A1), and efflux transporters ABCB1, ABCC1, ABCC2 and ABCG2. DNA-bound TOP1 protein amount in the presence of CPT11, a marker of the drug PD, also displayed circadian variations. A mathematical model of CPT11 molecular pharmacokinetics-pharmacodynamics (PK-PD) was designed and fitted to experimental data. It predicted that CPT11 bioactivation was the main determinant of CPT11 PD circadian rhythm. We then adopted the therapeutic strategy of maximizing efficacy in non-synchronized cells, considered as cancer cells, under a constraint of maximum toxicity in synchronized cells, representing healthy ones. We considered exposure schemes in the form of an initial concentration of CPT11 given at a particular CT, over a duration ranging from 1 to 27 h. For any dose of CPT11, optimal exposure durations varied from 3h40 to 7h10. Optimal schemes started between CT2h10 and CT2h30, a time interval corresponding to 1h30 to 1h50 before the nadir of CPT11 bioactivation rhythm in

  1. Integrated approach for combining sustainability and safety into a RAM analysis, RAM2S (Reliability, Availability, Maintainability, Sustainability and Safety) towards greenhouse gases emission targets

    Energy Technology Data Exchange (ETDEWEB)

    Alvarenga, Tobias V. [Det Norske Veritas (DNV), Hovik, Oslo (Norway)

    2009-07-01

    This paper aims to present an approach to integrate sustainability and safety concerns on top of a typical RAM Analysis to support new enterprises in finding alternatives to align themselves with the greenhouse gases emission targets, measured as CO{sub 2} (carbon dioxide) equivalent. This approach can be used to measure the impact of the potential CO{sub 2} equivalent emission levels, mainly related to new enterprises with high CO{sub 2} content, on environment and production, as, for example, in the extraction of oil and gas from the Brazilian pre-salt layers. In this sense, this integrated approach, combining Sustainability and Safety into a RAM analysis, RAM2S (Reliability, Availability, Maintainability, Sustainability and Safety), can be used to assess the impact of CO{sub 2} 'production' along the entire enterprise life-cycle, including the impact of possible facility shutdown due to emission restriction limits, as well as due to the occurrence of additional failure modes related to CO{sub 2} corrosion capabilities. Thus, in the end, this integrated approach would allow companies to find a more cost-effective alternative for adapting their business to the reality of global warming, overcoming the inherent threats of greenhouse gases. (author)

  2. Study of the behaviour of trace elements in estuaries: experimental approaches and modeling

    International Nuclear Information System (INIS)

    Dange, Catherine

    2002-01-01

    the biogeochemistry of Cd, Co and Cs in the estuarine environment and the knowledge obtained in the field. Experiments performed both in the laboratory and in situ were necessary to check the validity of the assumptions of the model and to evaluate model parameters that cannot be measured directly, such as the sorption properties of natural particles. Radiotracers (109Cd, 57Co, 134Cs) were used to determine the key physico-chemical processes and environmental variables that control the speciation and the fate of Cd, Co and Cs. This approach, based on spiking with various radionuclides, allowed us to evaluate the affinity constants of the particles of the four estuaries for the studied metals (global intrinsic complexation and exchange constants), as well as the exchangeable particulate fraction, estimated by comparing the measured distribution coefficients of the natural metals with those of their radioactive equivalents. Other parameters necessary to build the model (specific surface area, concentration of active surface sites, mean intrinsic acid-base constants, ...) were independently estimated by various experimental approaches applied in the laboratory to particle samples taken throughout the estuaries (electrochemical measurements, nitrogen adsorption using the BET method, ...). The results of the validation indicate that, in spite of its simplifications, the model reproduces in a satisfactory way the dissolved/particulate distributions measured for Cd, Co and Cs. With a predictive aim, this type of model must be coupled with a hydro-sedimentary transport model. (author)

  3. A Reliability Comparison of Classical and Stochastic Thickness Margin Approaches to Address Material Property Uncertainties for the Orion Heat Shield

    Science.gov (United States)

    Sepka, Steve; Vander Kam, Jeremy; McGuire, Kathy

    2018-01-01

    The Orion Thermal Protection System (TPS) margin process uses a root-sum-square approach with branches addressing trajectory, aerothermodynamics, and material response uncertainties in ablator thickness design. The material response branch applies a bond line temperature reduction between the Avcoat ablator and EA9394 adhesive by 60 C (108 F) from its peak allowed value of 260 C (500 F). This process is known as the Bond Line Temperature Material Margin (BTMM) and is intended to cover material property and performance uncertainties. The value of 60 C (108 F) is a constant, applied at any spacecraft body location and for any trajectory. By varying only material properties in a random (monte carlo) manner, the perl-based script mcCHAR is used to investigate the confidence interval provided by the BTMM. In particular, this study will look at various locations on the Orion heat shield forebody for a guided and an abort (ballistic) trajectory.
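
    The confidence interval provided by a fixed temperature margin can be probed with a small Monte Carlo of the kind described: sample the uncertain material properties, compute the resulting bond line temperature at the margined design point, and count how often the 60 C margin actually covers the response. The one-parameter response model and property scatter below are invented stand-ins for the mcCHAR material-property sampling.

```python
import random

LIMIT_C = 260.0                     # peak allowed bond line temperature
MARGIN_C = 60.0                     # BTMM reduction
design_temp_c = LIMIT_C - MARGIN_C  # deterministic design point, 200 C

def bond_line_temp(nominal_c, conductivity_factor):
    """Invented one-parameter response: the bond line runs hotter when the
    ablator conducts more heat than nominal."""
    return nominal_c * conductivity_factor

random.seed(42)
trials = 100_000
covered = sum(
    bond_line_temp(design_temp_c, random.gauss(1.0, 0.08)) <= LIMIT_C
    for _ in range(trials)
)
confidence = covered / trials   # fraction of property draws the margin covers
```

Because the margin is a constant while the property scatter varies by body location and trajectory, the estimated confidence will differ from case to case, which is precisely what the study investigates.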

  4. Reliable computation of roots in analytical waveguide modeling using an interval-Newton approach and algorithmic differentiation.

    Science.gov (United States)

    Bause, Fabian; Walther, Andrea; Rautenberg, Jens; Henning, Bernd

    2013-12-01

    For the modeling and simulation of wave propagation in geometrically simple waveguides such as plates or rods, one may employ the analytical global matrix method. That is, a certain (global) matrix depending on the two parameters wavenumber and frequency is built. Subsequently, one must calculate all parameter pairs within the domain of interest where the global matrix becomes singular. For this purpose, one could compute all roots of the determinant of the global matrix when the two parameters vary in the given intervals. This requirement to calculate all roots is actually the method's most serious restriction. Previous approaches are based on so-called mode-tracers, which use the physical phenomenon that solutions, i.e., roots of the determinant of the global matrix, appear in a certain pattern, the waveguide modes, to limit the root-finding algorithm's search space with respect to consecutive solutions. In some cases, these reductions of the search space yield only an incomplete set of solutions, because some roots may be missed as a result of uncertain predictions. Therefore, we propose replacement of the mode-tracer approach with a suitable version of an interval-Newton method. To apply this interval-based method, we extended the interval and derivative computation provided by a numerical computing environment such that corresponding information is also available for Bessel functions used in circular models of acoustic waveguides. We present numerical results for two different scenarios. First, a polymeric cylindrical waveguide is simulated, and second, we show simulation results of a one-sided fluid-loaded plate. For both scenarios, we compare results obtained with the proposed interval-Newton algorithm and commercial software.
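
    The appeal of the interval-Newton method is that, unlike a mode-tracer, it encloses all roots in a search interval. A one-dimensional sketch of the idea follows; the paper works on the determinant of the global matrix with algorithmic differentiation, whereas here the function, its hand-coded derivative enclosure, and the search interval are purely illustrative.

```python
def interval_newton(f, df_enclosure, lo, hi, tol=1e-6):
    """Return enclosures of all roots of f in [lo, hi]. df_enclosure(a, b)
    must return a pair (dlo, dhi) bounding f' over [a, b]."""
    roots, work = [], [(lo, hi)]
    while work:
        a, b = work.pop()
        if b - a < tol:
            if f(a) * f(b) <= 0.0:    # keep only verified sign changes
                roots.append((a, b))
            continue
        m = 0.5 * (a + b)
        fm = f(m)
        dlo, dhi = df_enclosure(a, b)
        if dlo <= 0.0 <= dhi:
            work += [(a, m), (m, b)]  # derivative may vanish: bisect
            continue
        # Newton step N = m - fm / [dlo, dhi], intersected with [a, b]
        c1, c2 = m - fm / dlo, m - fm / dhi
        na, nb = max(a, min(c1, c2)), min(b, max(c1, c2))
        if na > nb:
            continue                  # empty intersection: no root here
        if (na, nb) == (a, b):
            work += [(a, m), (m, b)]  # no contraction: bisect
        else:
            work.append((na, nb))
    return roots

# Illustrative: the single root of x^2 - 2 on [0, 3]; f' = 2x is enclosed
# on [a, b] (with a >= 0) by the pair (2a, 2b).
enclosures = interval_newton(lambda x: x * x - 2.0,
                             lambda a, b: (2.0 * a, 2.0 * b), 0.0, 3.0)
```

Boxes whose Newton step has an empty intersection with the box provably contain no root and are discarded, which is how the method avoids the missed roots that mode-tracers can suffer from.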

  5. New modeling and experimental approaches for characterization of two-phase flow interfacial structure

    International Nuclear Information System (INIS)

    Ishii, Mamoru; Sun, Xiaodong

    2004-01-01

    This paper presents new experimental and modeling approaches in characterizing interfacial structures in gas-liquid two-phase flow. For the experiments, two objective approaches are developed to identify flow regimes and to obtain local interfacial structure data. First, a global measurement technique using a non-intrusive ring-type impedance void-meter and a self-organizing neural network is presented to identify the "one-dimensional" flow regimes. In the application of this measurement technique, two methods are discussed, namely, one based on the probability density function of the impedance probe measurement (PDF input method) and the other based on the sorted impedance signals, which is essentially the cumulative probability distribution function of the impedance signals (instantaneous direct signal input method). In the latter method, the identification can be made close to instantaneously since the required signals can be acquired over a very short time period. In addition, a double-sensor conductivity probe can also be used to obtain "local" flow regimes by using the instantaneous direct signal input method with the bubble chord length information. Furthermore, a newly designed conductivity probe with multiple double-sensor heads is proposed to obtain "two-dimensional" flow regimes across the flow channel. Secondly, a state-of-the-art four-sensor conductivity probe technique has been developed to obtain detailed local interfacial structure information. The four-sensor conductivity probe accommodates the double-sensor probe capability and can be applied in a wide range of flow regimes spanning from bubbly to churn-turbulent flows. The signal processing scheme is developed such that it categorizes the acquired parameters into two groups based on bubble chord length information. Furthermore, for the modeling of the interfacial structure characterization, the interfacial area transport equation proposed earlier has been studied to provide a dynamic and

  6. Efficient Discovery of Novel Multicomponent Mixtures for Hydrogen Storage: A Combined Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Wolverton, Christopher [Northwestern Univ., Evanston, IL (United States). Dept. of Materials Science and Engineering; Ozolins, Vidvuds [Univ. of California, Los Angeles, CA (United States). Dept. of Materials Science and Engineering; Kung, Harold H. [Northwestern Univ., Evanston, IL (United States). Dept. of Chemical and Biological Engineering; Yang, Jun [Ford Scientific Research Lab., Dearborn, MI (United States); Hwang, Sonjong [California Inst. of Technology (CalTech), Pasadena, CA (United States). Dept. of Chemistry and Chemical Engineering; Shore, Sheldon [The Ohio State Univ., Columbus, OH (United States). Dept. of Chemistry and Biochemistry

    2016-11-28

    The objective of the proposed program is to discover novel mixed hydrides for hydrogen storage, which enable the DOE 2010 system-level goals. Our goal is to find a material that desorbs 8.5 wt.% H2 or more at temperatures below 85°C. The research program will combine first-principles calculations of reaction thermodynamics and kinetics with material and catalyst synthesis, testing, and characterization. We will combine materials from distinct categories (e.g., chemical and complex hydrides) to form novel multicomponent reactions. Systems to be studied include mixtures of complex hydrides and chemical hydrides [e.g. LiNH2+NH3BH3] and nitrogen-hydrogen based borohydrides [e.g. Al(BH4)3(NH3)3]. The 2010 and 2015 FreedomCAR/DOE targets for hydrogen storage systems are very challenging, and cannot be met with existing materials. The vast majority of the work to date has delineated materials into various classes, e.g., complex and metal hydrides, chemical hydrides, and sorbents. However, very recent studies indicate that mixtures of storage materials, particularly mixtures between various classes, hold promise to achieve technological attributes that materials within an individual class cannot reach. Our project involves a systematic, rational approach to designing novel multicomponent mixtures of materials with fast hydrogenation/dehydrogenation kinetics and favorable thermodynamics using a combination of state-of-the-art scientific computing and experimentation. We will use the accurate predictive power of first-principles modeling to understand the thermodynamic and microscopic kinetic processes involved in hydrogen release and uptake and to design new material/catalyst systems with improved properties. Detailed characterization and atomic-scale catalysis experiments will elucidate the effect of dopants and nanoscale catalysts in achieving fast kinetics and reversibility. And

  7. Percutaneous transcholecystic approach for an experiment of biliary stent placement: an experimental study in dogs

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Tae Seok [Medical School of Gachon, Inchon (Korea, Republic of); Song, Ho Young; Lim, Jin Oh; Ko, Gi Young; Sung, Kyu Bo; Kim, Tae Hyung; Lee, Ho Jung [College of Medicine, Ulsan Univ., Seoul (Korea, Republic of)

    2002-06-01

    To determine, in an experimental study of biliary stent placement, the usefulness and safety of the percutaneous transcholecystic approach and the patency of a newly designed biliary stent. A stent made of 0.15-mm-thick nitinol wire, 10 mm in diameter and 2 cm in length, was loaded in an introducer with an 8-F outer diameter. The gallbladders of seven mongrel dogs were punctured with a 16-G angiocath needle under sonographic guidance, and cholangiography was performed. After anchoring the anterior wall of the gallbladder to the abdominal wall using a T-fastener, the gallbladder body was punctured again under fluoroscopic guidance. The cystic and common bile ducts were selected using a 0.035-inch guide wire and a cobra catheter, and the stent was placed in the common bile duct. Post-stenting cholangiography was undertaken, and an 8.5-F drainage tube was inserted in the gallbladder. Two dogs each were followed up and sacrificed at 2, 4, and 8 weeks after stent placement, respectively, and the seventh expired 2 days after stent placement. Follow-up cholangiograms were obtained before each animal was sacrificed, and a pathologic examination was performed. Stent placement was technically successful in all cases. One dog expired 2 days after placement because of bile peritonitis due to migration of the drainage tube into the peritoneal cavity, but the other six remained healthy during the follow-up period. Cholangiography performed before the sacrifice of each dog showed that the stents were patent. Pathologic examination revealed the proliferation of granulation tissue at 2 weeks, and complete endothelialization over the stents by granulation tissue at 8 weeks. Percutaneous transcholecystic biliary stent placement appears to be safe, easy and useful. After placement, the stents remained patent during the follow-up period.

  8. An experimental approach to free vibration analysis of smart composite beam

    Science.gov (United States)

    Yashavantha Kumar, G. A.; Sathish Kumar, K. M.

    2018-02-01

    Experimental vibration analysis is the main concern of this study. In designing any structural component, an important parameter that has to be considered is vibration. The present work involves an experimental investigation of the free vibration of a smart beam. The smart beam consists of a glass/epoxy composite as the main substrate and two PZT patches, glued above and below the main beam. Through experimentation, the natural frequencies and mode shapes are obtained for the beam both with and without the PZT patches. Finally, the response of the smart beam is recorded experimentally.

  9. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  10. An in-situ experimental-numerical approach for interface delamination characterization

    NARCIS (Netherlands)

    Murthy Kolluri, N.V.V.R.

    2011-01-01

    Interfacial delamination is a key reliability challenge in composites and microelectronic systems due to (high density) integration of dissimilar materials. Delamination occurs due to significant stresses generated at the interfaces, for instance, caused by thermal cycling due to the mismatch in

  11. Experimental/Computational Approach to Accommodation Coefficients and its Application to Noble Gases on Aluminum Surface (Preprint)

    Science.gov (United States)

    2009-02-03

    computational approach to accommodation coefficients and its application to noble gases on an aluminum surface. Nathaniel Selden, University of Southern California, Los Angeles. FIG. 5: Experimental and computed radiometric force for argon (left) and xenon.

  12. Open Experimentation on Phenomena of Chemical Reactions via the Learning Company Approach in Early Secondary Chemistry Education

    Science.gov (United States)

    Beck, Katharina; Witteck, Torsten; Eilks, Ingo

    2010-01-01

    Presented is a case study on the implementation of open and inquiry-type experimentation in early German secondary chemistry education. The teaching strategy discussed follows the learning company approach. Originally adopted from vocational education, the learning company method is used to redirect lab-oriented classroom practice towards a more…

  13. Reliability and Construct Validity of the Psychopathic Personality Inventory-Revised in a Swedish Non-Criminal Sample - A Multimethod Approach including Psychophysiological Correlates of Empathy for Pain.

    Directory of Open Access Journals (Sweden)

    Karolina Sörman

    Full Text Available Cross-cultural investigation of psychopathy measures is important for clarifying the nomological network surrounding the psychopathy construct. The Psychopathic Personality Inventory-Revised (PPI-R is one of the most extensively researched self-report measures of psychopathic traits in adults. To date however, it has been examined primarily in North American criminal or student samples. To address this gap in the literature, we examined PPI-R's reliability, construct validity and factor structure in non-criminal individuals (N = 227) in Sweden, using a multimethod approach including psychophysiological correlates of empathy for pain. PPI-R construct validity was investigated in subgroups of participants by exploring its degree of overlap with (i) the Psychopathy Checklist: Screening Version (PCL:SV), (ii) self-rated empathy and behavioral and physiological responses in an experiment on empathy for pain, and (iii) additional self-report measures of alexithymia and trait anxiety. The PPI-R total score was significantly associated with PCL:SV total and factor scores. The PPI-R Coldheartedness scale demonstrated significant negative associations with all empathy subscales and with rated unpleasantness and skin conductance responses in the empathy experiment. The PPI-R higher order Self-Centered Impulsivity and Fearless Dominance dimensions were associated with trait anxiety in opposite directions (positively and negatively, respectively). Overall, the results demonstrated solid reliability (test-retest and internal consistency) and promising but somewhat mixed construct validity for the Swedish translation of the PPI-R.

  14. The juvenile face as a suitable age indicator in child pornography cases: a pilot study on the reliability of automated and visual estimation approaches.

    Science.gov (United States)

    Ratnayake, M; Obertová, Z; Dose, M; Gabriel, P; Bröker, H M; Brauckmann, M; Barkus, A; Rizgeliene, R; Tutkuviene, J; Ritz-Timme, S; Marasciuolo, L; Gibelli, D; Cattaneo, C

    2014-09-01

    In cases of suspected child pornography, the age of the victim represents a crucial factor for legal prosecution. The conventional methods for age estimation provide unreliable age estimates, particularly if teenage victims are concerned. In this pilot study, the potential of age estimation for screening purposes is explored for juvenile faces. In addition to a visual approach, an automated procedure is introduced, which has the ability to rapidly scan through large numbers of suspicious image data in order to trace juvenile faces. Age estimations were performed by experts, non-experts and the Demonstrator of a developed software on frontal facial images of 50 females aged 10-19 years from Germany, Italy, and Lithuania. To test the accuracy, the mean absolute error (MAE) between the estimates and the real ages was calculated for each examiner and the Demonstrator. The Demonstrator achieved the lowest MAE (1.47 years) for the 50 test images. Decreased image quality had no significant impact on the performance and classification results. The experts delivered slightly less accurate MAE (1.63 years). Throughout the tested age range, both the manual and the automated approach led to reliable age estimates within the limits of natural biological variability. The visual analysis of the face produces reasonably accurate age estimates up to the age of 18 years, which is the legally relevant age threshold for victims in cases of pedo-pornography. This approach can be applied in conjunction with the conventional methods for a preliminary age estimation of juveniles depicted on images.
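
    The accuracy measure used in the study, the mean absolute error between estimated and chronological ages, is simple to compute; the five image ages and estimates below are invented for illustration.

```python
def mean_absolute_error(estimates, true_ages):
    """MAE between estimated and chronological ages, in years."""
    assert len(estimates) == len(true_ages)
    return sum(abs(e - t) for e, t in zip(estimates, true_ages)) / len(true_ages)

true_ages = [12, 14, 15, 17, 19]   # invented test set
estimates = [13, 14, 17, 16, 18]   # invented estimator output
mae = mean_absolute_error(estimates, true_ages)   # (1+0+2+1+1)/5 = 1.0
```

On this scale, the reported MAEs of 1.47 years (Demonstrator) and 1.63 years (experts) mean the typical estimate misses the true age by roughly a year and a half.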

  15. Ensuring the Reliable Operation of the Power Grid: State-Based and Distributed Approaches to Scheduling Energy and Contingency Reserves

    Science.gov (United States)

    Prada, Jose Fernando

    Keeping a contingency reserve in power systems is necessary to preserve the security of real-time operations. This work studies two different approaches to the optimal allocation of energy and reserves in the day-ahead generation scheduling process. Part I presents a stochastic security-constrained unit commitment model to co-optimize energy and the locational reserves required to respond to a set of uncertain generation contingencies, using a novel state-based formulation. The model is applied in an offer-based electricity market to allocate contingency reserves throughout the power grid, in order to comply with the N-1 security criterion under transmission congestion. The objective is to minimize expected dispatch and reserve costs, together with post contingency corrective redispatch costs, modeling the probability of generation failure and associated post contingency states. The characteristics of the scheduling problem are exploited to formulate a computationally efficient method, consistent with established operational practices. We simulated the distribution of locational contingency reserves on the IEEE RTS96 system and compared the results with the conventional deterministic method. We found that assigning locational spinning reserves can guarantee an N-1 secure dispatch accounting for transmission congestion at a reasonable extra cost. The simulations also showed little value of allocating downward reserves but sizable operating savings from co-optimizing locational nonspinning reserves. Overall, the results indicate the computational tractability of the proposed method. Part II presents a distributed generation scheduling model to optimally allocate energy and spinning reserves among competing generators in a day-ahead market. The model is based on the coordination between individual generators and a market entity. The proposed method uses forecasting, augmented pricing and locational signals to induce efficient commitment of generators based on firm

  16. A Critique on the Effectiveness of Current Human Reliability Analysis Approach for the Human-Machine Interface Design in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Yong Hee

    2010-01-01

    Human Reliability Analysis (HRA), in cooperation with PSA, has been conducted to evaluate the safety of a system and the validity of a system design. HRA has been believed to provide a quantitative value for human error potential and the safety level of a design alternative in Nuclear Power Plants (NPPs). However, it has become doubtful whether current HRA is worth conducting to evaluate the human factors of NPP design, since there have been many critiques of the virtue of HRA. Inevitably, the newer the technology becomes, the larger the effort required to develop newly adapted methods. This paper describes the limitations and the obsolescence of current HRA, especially for the design evaluation of Human-Machine Interfaces (HMI) utilizing recent digital technologies. An alternative approach to the assessment of the human error potential of HMI designs is proposed

  17. SuperTRI: A new approach based on branch support analyses of multiple independent data sets for assessing reliability of phylogenetic inferences.

    Science.gov (United States)

    Ropiquet, Anne; Li, Blaise; Hassanin, Alexandre

    2009-09-01

    Supermatrix and supertree are two methods for constructing a phylogenetic tree from multiple data sets. However, these methods are not a panacea, as conflicting signals between data sets can lead to misinterpretation of the evolutionary history of taxa. In particular, the supermatrix approach is expected to be misleading if the species-tree signal is not dominant after the combination of the data sets. Moreover, most current supertree methods suffer from two limitations: (i) they ignore or misinterpret secondary (non-dominant) phylogenetic signals of the different data sets; and (ii) the logical basis of node robustness measures is unclear. To overcome these limitations, we propose a new approach, called SuperTRI, which is based on the branch support analyses of the independent data sets, and where the reliability of the nodes is assessed using three measures: the supertree Bootstrap percentage and two other values calculated from the separate analyses: the mean branch support (mean Bootstrap percentage or mean posterior probability) and the reproducibility index. The SuperTRI approach is tested on a data matrix including seven genes for 82 taxa of the family Bovidae (Mammalia, Ruminantia), and the results are compared to those found with the supermatrix approach. The phylogenetic analyses of the supermatrix and independent data sets were done using four methods of tree reconstruction: Bayesian inference, maximum likelihood, and unweighted and weighted maximum parsimony. The results indicate, firstly, that the SuperTRI approach is less sensitive to the choice among the four phylogenetic methods, secondly, that it is more accurate in interpreting the relationships among taxa, and thirdly, that interesting conclusions on introgression and radiation can be drawn from the comparisons between SuperTRI and supermatrix analyses.
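
    The two per-node scores computed from the separate analyses can be sketched in a few lines (an illustration of the idea only, not the authors' implementation; the 50% recovery threshold and the sample support values are assumed):

```python
def supertri_scores(bootstrap_by_dataset, threshold=0.5):
    """Reliability scores for one clade across independently analysed
    data sets; a clade absent from an analysis is entered as 0.0."""
    n = len(bootstrap_by_dataset)
    mean_support = sum(bootstrap_by_dataset) / n
    # reproducibility: fraction of separate analyses recovering the clade
    reproducibility = sum(1 for b in bootstrap_by_dataset if b >= threshold) / n
    return mean_support, reproducibility

# hypothetical clade recovered by 5 of 7 gene analyses
support = [0.95, 0.88, 0.72, 0.60, 0.55, 0.10, 0.0]
mean_bp, rep = supertri_scores(support)
print(round(mean_bp, 2), round(rep, 2))  # → 0.54 0.71
```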

  18. Compaction of lanthanide oxide porous microspheres: experimental approach and numerical simulation

    International Nuclear Information System (INIS)

    Parant, Paul

    2016-01-01

    One option envisioned for the future management of high-level nuclear waste is the transmutation of minor actinides into short-lived fission products in sodium fast reactors. This route requires the development of pellet fabrication processes to prepare Minor Actinide Bearing Blankets (MABB) for the transmutation of americium. Currently, those ceramic pellets are produced by powder metallurgy processes involving numerous grinding and milling steps that generate very fine, highly contaminating and irradiating particles. A viable option for reducing the amount of those fine particles would be to develop a dustless process working with much coarser particles. In this context, this study is concerned with the pelletizing of porous, spherical lanthanide oxide precursors (surrogates of actinides). The present work uses both experimental data and numerical simulations to optimize the pelletizing step. The final aim is to obtain, after sintering, homogeneous, dense and undistorted ceramic pellets. Firstly, this study concerns the synthesis and characterization of these oxide microsphere precursors by the Weak Acid Resin process, which consists in loading beads of ion-exchange resin with lanthanide cations; mineralizing the metal-loaded resin then leads to sub-millimetric-sized oxide microspheres. Comprehensive characterizations of the microstructure were carried out as a function of the synthesis parameters, such as calcination temperature, metal nature and diameter/size distribution of the starting resin beads, to better understand their behaviour in the matrix when producing pellets. Secondly, the mechanical properties of a single microsphere were investigated in order to better understand its behaviour during the compaction steps. They were also analysed using multi-scale simulations based on the Discrete Element Method (DEM), which is well suited for such particulate materials. In a second approach, compaction studies were carried out in a three-part die

  19. Soil pH controls the environmental availability of phosphorus: Experimental and mechanistic modelling approaches

    International Nuclear Information System (INIS)

    Devau, Nicolas; Cadre, Edith Le; Hinsinger, Philippe; Jaillard, Benoit; Gerard, Frederic

    2009-01-01

    Inorganic P is the least mobile major nutrient in most soils and is frequently the prime limiting factor for plant growth in terrestrial ecosystems. In this study, the extraction of soil inorganic P with CaCl₂ (P-CaCl₂) and geochemical modelling were combined in order to unravel the processes controlling the environmentally available P (EAP) of a soil over a range of pH values (pH ~ 4-10). Mechanistic descriptions of the adsorption of cations and anions by the soil constituents were used (1-pK Triple Plane, ion-exchange and NICA-Donnan models). These models are implemented in the geochemical code Visual MINTEQ. An additive approach was used for their application to the surface horizon of a Cambisol. The geochemical code accurately reproduced the concentration of extracted P at the different soil pH values (R² = 0.9, RMSE = 0.03 mg kg⁻¹). Model parameters were either directly found in the literature or estimated by fitting published experimental results in single-mineral systems. The strong agreement between measurements and modelling results demonstrated that adsorption processes exerted a major control on the EAP of the soil over a large range of pH values. An influence of the precipitation of P-containing minerals is discounted based on thermodynamic calculations. Modelling results indicated that the variations in P-CaCl₂ with soil pH were controlled by the deprotonation/protonation of the surface hydroxyl groups, the distribution of P surface complexes, and the adsorption of Ca and Cl from the electrolyte background. Iron oxides and gibbsite were found to be the major P-adsorbing soil constituents at acidic and alkaline pH, whereas P was mainly adsorbed by clay minerals at intermediate pH values. This study demonstrates the efficacy of geochemical modelling for understanding soil processes, and the applicability of mechanistic adsorption models to a 'real' soil, with its mineralogical complexity and the additional contribution of soil organic matter.
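
    Goodness-of-fit statistics like the R² and RMSE quoted above can be reproduced in a few lines; the extraction values below are made-up placeholders, not the paper's data:

```python
import math

def fit_metrics(measured, modelled):
    """R2 and RMSE between measured and model-predicted values."""
    n = len(measured)
    mean_obs = sum(measured) / n
    ss_res = sum((o - m) ** 2 for o, m in zip(measured, modelled))
    ss_tot = sum((o - mean_obs) ** 2 for o in measured)
    return 1.0 - ss_res / ss_tot, math.sqrt(ss_res / n)

# hypothetical P-CaCl2 extractions (mg/kg) at four soil pH values
measured = [0.10, 0.25, 0.40, 0.55]
modelled = [0.12, 0.24, 0.38, 0.56]
r2, rmse = fit_metrics(measured, modelled)
print(round(r2, 2), round(rmse, 3))  # → 0.99 0.016
```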

  20. Experimental Systems-Biology Approaches for Clostridia-Based Bioenergy Production

    Energy Technology Data Exchange (ETDEWEB)

    Papoutsakis, Elefterios [Univ. of Delaware, Newark, DE (United States)

    2015-04-30

    This is the final project report for the project "Experimental Systems-Biology Approaches for Clostridia-Based Bioenergy Production" for the funding period of 9/1/12 to 2/28/2015 (three years with a 6-month no-cost extension). OVERVIEW AND PROJECT GOALS: The bottleneck of achieving higher rates and titers of toxic metabolites (such as solvents and carboxylic acids that can be used as biofuels or biofuel precursors) can be overcome by engineering the stress response system. Thus, understanding and modeling the response of cells to toxic metabolites is a problem of great fundamental and practical significance. In this project, our goal is to dissect at the molecular systems level and build models (conceptual and quantitative) for the stress response of C. acetobutylicum (Cac) to its two toxic metabolites: butanol (BuOH) and butyrate (BA). Transcriptional (RNAseq and microarray based), proteomic and fluxomic data and their analysis are key requirements for this goal. Transcriptional data from mid-exponential cultures of Cac under 4 different levels of BuOH and BA stress were obtained using both microarrays (Papoutsakis group) and deep sequencing (RNAseq; Meyers and Papoutsakis groups). These two sets of data not only serve to validate each other, but are also used for identification of stress-induced changes in transcript levels, small regulatory RNAs, and transcriptional start sites. Quantitative proteomic data (Lee group), collected using the iTRAQ technology, are essential for understanding protein levels and turnover under stress and the various protein-protein interactions that orchestrate the stress response. Metabolic flux changes (Antoniewicz group) of core pathways, which provide important information on the re-allocation of energy and carbon resources under metabolite stress, were examined using 13C-labelled chemicals. Omics data are integrated at different levels and scales.
At the metabolic-pathway level, omics data are integrated into a 2nd generation genome

  1. Soil pH controls the environmental availability of phosphorus: Experimental and mechanistic modelling approaches

    Energy Technology Data Exchange (ETDEWEB)

    Devau, Nicolas [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Cadre, Edith Le [Supagro, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Hinsinger, Philippe; Jaillard, Benoit [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Gerard, Frederic, E-mail: gerard@supagro.inra.fr [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France)

    2009-11-15

    Inorganic P is the least mobile major nutrient in most soils and is frequently the prime limiting factor for plant growth in terrestrial ecosystems. In this study, the extraction of soil inorganic P with CaCl{sub 2} (P-CaCl{sub 2}) and geochemical modelling were combined in order to unravel the processes controlling the environmentally available P (EAP) of a soil over a range of pH values (pH {approx} 4-10). Mechanistic descriptions of the adsorption of cations and anions by the soil constituents were used (1-pK Triple Plane, ion-exchange and NICA-Donnan models). These models are implemented into the geochemical code Visual MINTEQ. An additive approach was used for their application to the surface horizon of a Cambisol. The geochemical code accurately reproduced the concentration of extracted P at the different soil pH values (R{sup 2} = 0.9, RMSE = 0.03 mg kg{sup -1}). Model parameters were either directly found in the literature or estimated by fitting published experimental results in single mineral systems. The strong agreement between measurements and modelling results demonstrated that adsorption processes exerted a major control on the EAP of the soil over a large range of pH values. An influence of the precipitation of P-containing mineral is discounted based on thermodynamic calculations. Modelling results indicated that the variations in P-CaCl{sub 2} with soil pH were controlled by the deprotonation/protonation of the surface hydroxyl groups, the distribution of P surface complexes, and the adsorption of Ca and Cl from the electrolyte background. Iron-oxides and gibbsite were found to be the major P-adsorbing soil constituents at acidic and alkaline pHs, whereas P was mainly adsorbed by clay minerals at intermediate pH values. This study demonstrates the efficacy of geochemical modelling to understand soil processes, and the applicability of mechanistic adsorption models to a 'real' soil, with its mineralogical complexity and the additional

  2. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology whereas this has yet to be fully achieved for large scale structures. Structural loading variants over the half-time of the plant are considered to be more difficult to analyse than for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions are considered which enter this problem. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  3. Gas transport in low-permeability formations: a review of experimental evidence and modeling approaches

    International Nuclear Information System (INIS)

    Marschall, Paul; Keller, Lukas; Lanyon, Bill; Senger, Rainer

    2012-01-01

    , fragmentation and coalescence of the non-wetting fluid. The degree of complexity increases when two-phase flow processes occur in a deformable porous medium. Basic experimental research has been conducted by Johnsen et al. (2008) and Kong (2010) on air injections into water-saturated granular material in which grain motion can take place. New insight was gained into the dynamic processes associated with the mobilisation of the solid phase. In this context, Kong distinguishes between three successive dynamic processes of coupled fluid-gas-grain flows: pore-scale tree-like invasion, finger-scale multi-channelized flow and finger-scale single-channelized migration. A dimensionless quantity, the so-called grain mobilization number, is defined to discriminate the different flow regimes. Modelling Approaches. The modelling concepts reported in the scientific literature for the simulation of two-phase flow processes in porous media are underpinned by two different fundamental approaches: (1) discrete pore network models represent the porous medium as a network of connected channels. At a given location, a channel is occupied either by the wetting or by the non-wetting fluid. Flow in the channels occurs by piston-like displacement as a result of the pressure difference between the phases, and the fluid displacement is simulated by simplified invasion percolation (IP) algorithms. Frequently, stochastic approaches are adopted to describe the geostatistical properties of the porous medium on the pore scale. Such stochastic discrete network models have been applied successfully for the simulation of two-phase flow processes in the regime of capillary fingering. The main challenge is the realistic geostatistical description of the pore network in a stochastic framework, which requires comprehensive microstructural databases for the model identification and conditioning process.
(2) Equivalent porous medium models are based on the classical mixing theory of continuum mechanics of fluids. In this
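
    A minimal invasion-percolation step of the kind used in the pore-network models above can be sketched as follows (random thresholds stand in for capillary entry pressures; the grid size and injection point are arbitrary assumptions):

```python
import heapq
import random

def invasion_percolation(n, n_steps, seed=0):
    """Simplified invasion percolation on an n x n pore grid: the
    non-wetting fluid repeatedly invades the accessible pore with the
    lowest entry threshold (piston-like displacement)."""
    rng = random.Random(seed)
    threshold = [[rng.random() for _ in range(n)] for _ in range(n)]
    start = (n // 2, 0)  # injection at the left edge, mid-height
    invaded = {start}
    frontier = []  # min-heap of (threshold, pore)

    def push_neighbours(i, j):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n and (ni, nj) not in invaded:
                heapq.heappush(frontier, (threshold[ni][nj], (ni, nj)))

    push_neighbours(*start)
    while len(invaded) < n_steps and frontier:
        _, pore = heapq.heappop(frontier)
        if pore in invaded:
            continue  # already reached via a cheaper path
        invaded.add(pore)
        push_neighbours(*pore)
    return invaded

cluster = invasion_percolation(n=20, n_steps=60)
print(len(cluster))  # → 60
```

    The ragged cluster this produces is characteristic of the capillary-fingering regime the text mentions; real pore-network models condition the thresholds on measured microstructural statistics rather than uniform random numbers.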

  4. Optimization of the representativeness and transposition approach, for the neutronic design of experimental programs in critical mock-up

    International Nuclear Information System (INIS)

    Dos-Santos, N.

    2013-01-01

    The work performed during this thesis focused on the propagation of uncertainties (nuclear data, technological uncertainties, calculation biases, ...) to integral parameters, and on the development of a novel approach enabling this uncertainty to be reduced a priori, directly from the design phase of a new experimental program. This approach is based on a multi-parameter, multi-criteria extension of the representativeness and transposition theories. The first part of this PhD work covers an optimization study of sensitivity and uncertainty calculation schemes at different modeling scales (cell, assembly and whole core) for LWRs and FBRs. A degraded scheme, based on standard and generalized perturbation theories, has been validated for the calculation of uncertainty propagation to various integral quantities of interest. It demonstrated the good a posteriori representativeness of the EPICURE experiment for the validation of mixed UOX-MOX loadings, as well as the importance of some nuclear data in the power tilt phenomenon in large LWR cores. The second part of this work was devoted to the development of methods and tools for the optimized design of experimental programs in ZPRs. Those methods are based on multi-parameter representativeness, using several quantities of interest simultaneously. Finally, an original study has been conducted on the rigorous estimation of correlations between experimental programs in the transposition process. The coupling of experimental correlations and the multi-parametric representativeness approach enables the efficient design of new programs, able to answer additional qualification requirements on calculation tools. (author) [fr
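
    The representativeness factor at the heart of transposition theory has a standard closed form; a sketch with a toy two-parameter covariance matrix (illustrative numbers, not data from the thesis):

```python
import math

def representativeness(s_exp, s_app, cov):
    """Representativeness factor between an experiment and the target
    application, from their sensitivity vectors S_e, S_a and the
    nuclear-data covariance matrix V:
        r = (S_e^T V S_a) / sqrt((S_e^T V S_e)(S_a^T V S_a))
    """
    def quad(a, b):
        return sum(a[i] * cov[i][j] * b[j]
                   for i in range(len(a)) for j in range(len(b)))
    return quad(s_exp, s_app) / math.sqrt(quad(s_exp, s_exp) * quad(s_app, s_app))

# illustrative 2-parameter case with an identity covariance matrix
cov = [[1.0, 0.0], [0.0, 1.0]]
r = representativeness([1.0, 0.5], [1.0, 0.4], cov)

# a representative experiment (r close to 1) reduces the prior
# application uncertainty delta roughly to delta * sqrt(1 - r**2)
print(round(r, 3))  # → 0.997
```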

  5. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. The application of reliability analysis techniques, from a design engineering perspective, to improving power plant productivity is discussed. (author)
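
    Two of the quantities defined in such overviews, steady-state availability and constant-failure-rate mission reliability, can be computed directly (the MTBF/MTTR and failure-rate numbers below are illustrative):

```python
import math

def availability(mtbf_h, mttr_h):
    """Steady-state availability: A = MTBF / (MTBF + MTTR)."""
    return mtbf_h / (mtbf_h + mttr_h)

def mission_reliability(failure_rate_per_h, mission_h):
    """Constant-failure-rate reliability over a mission: R(t) = exp(-lambda*t)."""
    return math.exp(-failure_rate_per_h * mission_h)

print(round(availability(980.0, 20.0), 2))          # → 0.98
print(round(mission_reliability(1e-4, 1000.0), 3))  # → 0.905
```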

  6. Consistency from the perspective of an experimental systems approach to the sciences and their epistemic objects

    Directory of Open Access Journals (Sweden)

    Hans-Jörg Rheinberger

    2011-06-01

    It is generally accepted that the development of the modern sciences is rooted in experiment. Yet for a long time, experimentation did not occupy a prominent role, either in philosophy or in the history of science. With the 'practical turn' in studying the sciences and their history, this has begun to change. This paper is concerned with systems and cultures of experimentation and the consistencies that are generated within such systems and cultures. The first part of the paper exposes the forms of historical and structural coherence that characterize the experimental exploration of epistemic objects. In the second part, a particular experimental culture in the life sciences is briefly described as an example. A survey is given of what it means and what it takes to analyze biological functions in the test tube.

  7. Opening the Implicit Leadership Theories’ Black Box: An Experimental Approach with Conjoint Analysis

    OpenAIRE

    Gustavo M. Tavares; Filipe Sobral; Rafael Goldszmidt; Felipe Araújo

    2018-01-01

    Although research on implicit leadership theories (ILTs) has concentrated on determining which attributes define a leadership prototype, little attention has been paid to testing the relative importance of each of these attributes for individuals’ leadership perceptions. Building on socio-cognitive theories of impression processes, we experimentally explore the formation of leadership perceptions based on the recognition of six key attributes in a series of three experimental studies comprisi...

  8. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2 - 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, representatives of research institutes, of technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the part of ergonomy dealing with human reliability in using technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de

  9. A Comparison of the Approaches of Generalizability Theory and Item Response Theory in Estimating the Reliability of Test Scores for Testlet-Composed Tests

    Science.gov (United States)

    Lee, Guemin; Park, In-Yong

    2012-01-01

    Previous assessments of the reliability of test scores for testlet-composed tests have indicated that item-based estimation methods overestimate reliability. This study was designed to address issues related to the extent to which item-based estimation methods overestimate the reliability of test scores composed of testlets and to compare several…
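
    The overestimation at issue can be seen in a toy computation of coefficient alpha at the item level versus the testlet level (illustrative data only; this is not the generalizability-theory machinery of the study):

```python
def cronbach_alpha(parts):
    """Internal-consistency reliability; `parts` is a list of per-examinee
    score vectors, one column per scored part (item or testlet)."""
    k = len(parts[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    part_vars = [var([p[j] for p in parts]) for j in range(k)]
    total_var = var([sum(p) for p in parts])
    return k / (k - 1) * (1.0 - sum(part_vars) / total_var)

# four examinees, four items grouped into two testlets (items 0-1 and 2-3);
# responses within a testlet are perfectly dependent in this toy data
items = [[1, 1, 0, 0], [1, 1, 1, 1], [0, 0, 0, 0], [0, 0, 1, 1]]
testlets = [[r[0] + r[1], r[2] + r[3]] for r in items]

print(round(cronbach_alpha(items), 3))     # item-level → 0.667 (inflated)
print(round(cronbach_alpha(testlets), 3))  # testlet-level → 0.0
```

    Treating dependent items as independent inflates the estimate; scoring at the testlet level removes the within-testlet dependence, which is the direction of bias the abstract describes.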

  10. A Combined Experimental and Computational Approach to Subject-Specific Analysis of Knee Joint Laxity

    Science.gov (United States)

    Harris, Michael D.; Cyr, Adam J.; Ali, Azhar A.; Fitzpatrick, Clare K.; Rullkoetter, Paul J.; Maletsky, Lorin P.; Shelburne, Kevin B.

    2016-01-01

    Modeling complex knee biomechanics is a continual challenge, which has resulted in many models of varying levels of quality, complexity, and validation. Beyond modeling healthy knees, accurately mimicking pathologic knee mechanics, such as after cruciate rupture or meniscectomy, is difficult. Experimental tests of knee laxity can provide important information about ligament engagement and overall contributions to knee stability for development of subject-specific models to accurately simulate knee motion and loading. Our objective was to provide combined experimental tests and finite-element (FE) models of natural knee laxity that are subject-specific, have one-to-one experiment to model calibration, simulate ligament engagement in agreement with literature, and are adaptable for a variety of biomechanical investigations (e.g., cartilage contact, ligament strain, in vivo kinematics). Calibration involved perturbing ligament stiffness, initial ligament strain, and attachment location until model-predicted kinematics and ligament engagement matched experimental reports. Errors between model-predicted and experimental kinematics were small, and ligament engagement agreed with literature descriptions. These results demonstrate the ability of our constraint models to be customized for multiple individuals and simultaneously call attention to the need to verify that ligament engagement is in good general agreement with literature. To facilitate further investigations of subject-specific or population-based knee joint biomechanics, data collected during the experimental and modeling phases of this study are available for download by the research community. PMID:27306137

  11. Linking soil chemistry, treeline shifts and climate change: scenario modeling using an experimental approach

    Science.gov (United States)

    Mavris, Christian; Furrer, Gerhard; Anderson, Susanne; Blum, Alex; Wells, Aaron; Dahms, Dennis; Egli, Markus

    2014-05-01

    Climate change and global warming have a strong influence on landscape development. As cold areas become warmer, both flora and fauna must adapt to the new conditions (a). It is widely accepted that climate change deeply influences treeline shifts. In addition, wildfires, plant diseases and insect infestation (i.e. mountain pine beetle) can promote a selective replacement of plants, inhibiting some and favoring others, thus modifying the ecosystem in diverse ways. There is little knowledge of how soil chemistry behaves when such changes occur. Will elemental availability become a crucial factor as a function of climate change? The Sinks Canyon and Stough Basin, on the SE flank of the Wind River Range, Wyoming, USA, offer an ideal case study. Conceptually, the areas were divided into three main subsets: tundra, forest and a subarid environment. All soils were developed on granitoid moraines (b, c). From each subset, a liquid topsoil extract was produced and mixed with the solid subsoil samples in batch reactors at 50 °C. The batch experiments were carried out over 1800 h, and the progress of the dissolution was regularly monitored by analyzing liquid aliquots using IC and ICP-OES. The nutrients were mostly released within the first hours of the experiment. Silicon and Al were continuously released into the solution, while some alkali elements, i.e. Na, showed a more complex trend. Organic acids (acetic, citric) and other ligands produced during biodegradation played an active role in mineral dissolution and nutrient release. The mineral colloids detected in the extract (X-ray diffraction) can significantly control surface reactions (adsorption/desorption) and contributed to specific cationic concentrations. The experimental set-up was then compared to a computed dissolution model using the SerialSTEADYQL software (d, e). Decoding the mechanisms driving mineral weathering is the key to understanding the main geochemical aspects of adaptation during climate

  12. Comparing Different Approaches to Visualizing Light Waves: An Experimental Study on Teaching Wave Optics

    Science.gov (United States)

    Mešic, Vanes; Hajder, Erna; Neumann, Knut; Erceg, Nataša

    2016-01-01

    Research has shown that students have tremendous difficulties developing a qualitative understanding of wave optics, at all educational levels. In this study, we investigate how three different approaches to visualizing light waves affect students' understanding of wave optics. In the first, the conventional, approach light waves are represented…

  13. Theoretical and experimental work on steam generator integrity and reliability with particular reference to leak development and detection. United Kingdom status report. October 1983

    International Nuclear Information System (INIS)

    Smedley, J.A.; Edge, D.M.

    1984-01-01

    This paper reviews the experimental and theoretical work in the UK on the characteristics of sodium-water reactions and describes work on the development of leak detection systems. A review of the operating experience with the PFR steam generators and the protection philosophy used on PFR is also given and the design studies for the Commercial Demonstration Fast Reactor (CDFR) are described

  14. An experimental approach to angular momentum transfer in heavy ion reactions

    International Nuclear Information System (INIS)

    Babinet, R.

    1980-01-01

    The current experimental status of angular momentum transfer in heavy ion reactions is reviewed. After a short presentation of the basic theoretical concepts underlying all the research work in this field, the experimental techniques that have been commonly used are presented. Results obtained by the γ-multiplicity method are discussed first. Then come, for the very heavy systems, the sequential fission data, followed by the results of a recent experiment on light charged particles. The simple theoretical concepts introduced at the outset are used throughout as guidelines to discuss the results. The respective advantages, but also the basic limitations, of the above three experimental techniques are set out. Although they are expected to work best in different regions of the mass table, it is shown that they give complementary information which has been most useful in improving our understanding of the tangential friction mechanism

  15. Validation by theoretical approach to the experimental estimation of efficiency for gamma spectrometry of gas in 100 ml standard flask

    International Nuclear Information System (INIS)

    Mohan, V.; Chudalayandi, K.; Sundaram, M.; Krishnamony, S.

    1996-01-01

    Estimation of gaseous activity forms an important component of air monitoring at the Madras Atomic Power Station (MAPS). The gases of importance are argon-41, an air activation product, and the fission product noble gas xenon-133. For estimating the concentration, the experimental method is used, in which a grab sample is collected in a 100 ml volumetric standard flask. The activity of the gas is then computed by gamma spectrometry using a predetermined efficiency estimated experimentally. An attempt is made, using a theoretical approach, to validate the experimental method of efficiency estimation. Two analytical models, named the relative flux model and the absolute activity model, were developed independently of each other. Attention is focussed on the efficiencies for ⁴¹Ar and ¹³³Xe. Results show that the present method of sampling and analysis using a 100 ml volumetric flask is adequate and acceptable. (author). 5 refs., 2 tabs
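
    The experimentally determined efficiency enters the assay as a simple proportionality; a sketch (the reference activity, count time and emission probability below are illustrative values, not MAPS calibration data):

```python
def peak_efficiency(net_counts, activity_bq, emission_prob, live_time_s):
    """Full-energy-peak efficiency for the 100 ml flask geometry,
    from a count of a reference sample of known activity."""
    return net_counts / (activity_bq * emission_prob * live_time_s)

def activity_from_counts(net_counts, efficiency, emission_prob, live_time_s):
    """Invert the calibration to assay an unknown grab sample (Bq)."""
    return net_counts / (efficiency * emission_prob * live_time_s)

# illustrative calibration: 1000 Bq reference counted for 600 s,
# gamma emission probability taken as 0.992
eff = peak_efficiency(5952, 1000.0, 0.992, 600.0)
print(round(eff, 4))  # → 0.01
```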

  16. A realistic approach to modeling an in-duct desulfurization process based on an experimental pilot plant study

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz, F.J.G.; Ollero, P. [University of Seville, Seville (Spain)

    2008-07-15

    This paper provides a realistic approach to modeling an in-duct desulfurization process, motivated by the disagreement between the results predicted by published kinetic models of the reaction between hydrated lime and SO{sub 2} at low temperature and the experimental results obtained in pilot plants where this process takes place. Results were obtained from an experimental program carried out in a 3-MWe pilot plant. Additionally, five kinetic models from the literature for the sulfation reaction of Ca(OH){sub 2} at low temperatures were assessed by simulation; the desulfurization efficiencies they predict are clearly lower than those experimentally obtained in our own pilot plant as well as in others. Next, a general model was fitted by minimizing the difference between the calculated and the experimental results from the pilot plant, using Matlab{sup TM}. The parameters were reduced as much as possible, to only two. Finally, after implementing this model in a simulation tool of the in-duct sorbent injection process, it was validated and shown to yield a realistic approach useful both for analyzing results and for aiding in the design of an in-duct desulfurization process.
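
    The two-parameter fitting step can be sketched generically; the exponential model form, the data and the brute-force search below are assumptions standing in for the paper's actual model and Matlab minimization:

```python
import math

def model(x, a, b):
    """Hypothetical two-parameter efficiency curve: eta = a*(1 - exp(-b*x))."""
    return a * (1.0 - math.exp(-b * x))

def fit_two_params(xs, ys, a_grid, b_grid):
    """Brute-force least-squares fit over a parameter grid."""
    best = None
    for a in a_grid:
        for b in b_grid:
            sse = sum((model(x, a, b) - y) ** 2 for x, y in zip(xs, ys))
            if best is None or sse < best[0]:
                best = (sse, a, b)
    return best[1], best[2]

# synthetic "pilot plant" data generated with a = 0.8, b = 1.5
xs = [0.5, 1.0, 1.5, 2.0, 2.5]
ys = [model(x, 0.8, 1.5) for x in xs]
grid = [i / 20 for i in range(1, 61)]  # 0.05 .. 3.00
a_hat, b_hat = fit_two_params(xs, ys, grid, grid)
print(a_hat, b_hat)  # → 0.8 1.5
```

    With real pilot-plant data the residual would not vanish, and a gradient-based minimizer would replace the grid, but the structure of the problem, two free parameters tuned against measured efficiencies, is the same.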

  17. Determination of pKa and the corresponding structures of quinclorac using combined experimental and theoretical approaches

    Science.gov (United States)

    Song, Dean; Sun, Huiqing; Jiang, Xiaohua; Kong, Fanyu; Qiang, Zhimin; Zhang, Aiqian; Liu, Huijuan; Qu, Jiuhui

    2018-01-01

    As an emerging environmental contaminant, the herbicide quinclorac (QCR) has attracted much attention in recent years. However, a very fundamental issue, the acid dissociation of quinclorac, has yet to be studied in detail. Herein, the pKa value and the corresponding structures of quinclorac were systematically investigated using combined experimental and theoretical approaches. The experimental pKa of quinclorac was determined by the spectrophotometric method to be 2.65 at 25 °C with an ionic strength of 0.05 M, and was corrected to 2.56 at zero ionic strength. The molecular structures of quinclorac were then located by employing DFT calculations. The anionic quinclorac was directly located with the carboxylic group perpendicular to the aromatic ring, while neutral quinclorac was found to have equivalent twin structures. The result was further confirmed by analyzing the UV/Vis and MS-MS2 spectra from both experimental and theoretical viewpoints. By employing the QSPR approach, the theoretical pKa of QCR was determined to be 2.50, in excellent agreement with the experimental result obtained herein. The protonation of QCR at the carboxylic group instead of the quinoline structure was attributed to the weak electronegative character of the nitrogen atom induced by the electron-withdrawing groups. It is anticipated that this work could not only help in gaining a deep insight into the acid dissociation of quinclorac but also offer key information on its reactions and interactions with other species.

  18. Replicating Experimental Impact Estimates Using a Regression Discontinuity Approach. NCEE 2012-4025

    Science.gov (United States)

    Gleason, Philip M.; Resch, Alexandra M.; Berk, Jillian A.

    2012-01-01

    This NCEE Technical Methods Paper compares the estimated impacts of an educational intervention using experimental and regression discontinuity (RD) study designs. The analysis used data from two large-scale randomized controlled trials--the Education Technology Evaluation and the Teach for America Study--to provide evidence on the performance of…

  19. An engineering approach to business model experimentation – an online investment research startup case study

    NARCIS (Netherlands)

    Kijl, Björn; Boersma, Durk

    2010-01-01

    Every organization needs a viable business model. Strikingly, most of current literature is focused on business model design, whereas there is almost no attention for business model validation and implementation and related business model experimentation. The goal of the research as described in

  20. Immunosuppression for in vivo research: state-of-the-art protocols and experimental approaches

    Institute of Scientific and Technical Information of China (English)

    Rita Diehl; Fabienne Ferrara; Claudia Müller; Antje Y Dreyer; Damian D McLeod; Stephan Fricke; Johannes Boltze

    2017-01-01

    Almost every experimental treatment strategy using non-autologous cell, tissue or organ transplantation is tested in small and large animal models before clinical translation. Because these strategies require immunosuppression in most cases, immunosuppressive protocols are a key element in transplantation experiments. However, standard immunosuppressive protocols are often applied without detailed knowledge regarding their efficacy within the particular experimental setting and in the chosen model species. Optimization of such protocols is pertinent to the translation of experimental results to human patients and thus warrants further investigation. This review summarizes current knowledge regarding immunosuppressive drug classes as well as their dosages and application regimens, with consideration of species-specific drug metabolization and side effects. It also summarizes contemporary knowledge of novel immunomodulatory strategies, such as the use of mesenchymal stem cells or antibodies. Thus, this review is intended to serve as a state-of-the-art compendium for researchers to refine applied experimental immunosuppression and immunomodulation strategies to enhance the predictive value of preclinical transplantation studies.

  1. Proposed reliability cost model

    Science.gov (United States)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements demand understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.

  2. Identification and induction of human, social, and cultural capitals through an experimental approach to stormwater management

    Science.gov (United States)

    Decentralized stormwater management is based on the dispersal of stormwater management practices (SWMP) throughout a watershed to manage stormwater runoff volume and potentially restore natural hydrologic processes. This approach to stormwater management is increasingly popular b...

  3. The combined theoretical and experimental approach to arrive at optimum parameters in friction stir welding

    Science.gov (United States)

    Jagadeesha, C. B.

    2017-12-01

    Even though friction stir welding was invented long ago (in 1991, by TWI England), until now no method, procedure or approach has been developed that helps to quickly obtain the optimum or exact parameters yielding a good or sound weld. An approach has been developed in which an equation is derived from which an approximate rpm can be obtained; by setting a range of ±100 or 50 rpm around this approximate rpm and by setting the welding speed to 60 mm/min or 50 mm/min, one can conduct FSW experiments that converge quickly on the optimum parameters, i.e. the desired rpm and welding speed that yield a sound weld. This approach can be effectively used to obtain sound welds for all similar and dissimilar combinations of materials such as steel, Al, Mg, Ti, etc.

  4. Improvement of the impedance measurement reliability by some new experimental and data treatment procedures applied to the behavior of copper in neutral chloride solutions containing small heterocycle molecules

    International Nuclear Information System (INIS)

    Blajiev, O.L.; Breugelmans, T.; Pintelon, R.; Hubin, A.

    2006-01-01

    The electrochemical behavior of copper in chloride solutions containing 0.001 M concentrations of small five- and six-membered heterocyclic molecules was investigated by means of impedance spectroscopy. The investigation was performed with a new technique based on a broadband multisine excitation. This method allows for the quantification and separation of the measurement and stochastic nonlinear noises and for an estimation of the nonlinear bias contribution. It also reduces the perturbation brought to the studied system by the measurement process itself. The measurement data for some experimental conditions were quantified by fitting to an equivalent circuit corresponding to a physical model, both developed earlier. In general, the experimental results obtained show that the number of atoms in the heterocyclic ring and the molecular conformation have a significant influence on the electrochemical response of copper in the investigated environments

  5. Reliability issues in PACS

    Science.gov (United States)

    Taira, Ricky K.; Chan, Kelby K.; Stewart, Brent K.; Weinberg, Wolfram S.

    1991-07-01

    Reliability is an increasing concern when moving PACS from the experimental laboratory to the clinical environment. Any system downtime may seriously affect patient care. The authors report on the several classes of errors encountered during the pre-clinical release of the PACS during the past several months and present the solutions implemented to handle them. The reliability issues discussed include: (1) environmental precautions, (2) database backups, (3) monitor routines of critical resources and processes, (4) hardware redundancy (networks, archives), and (5) development of a PACS quality control program.

  6. A statistical approach to the experimental design of the sulfuric acid leaching of gold-copper ore

    Directory of Open Access Journals (Sweden)

    Mendes F.D.

    2003-01-01

    Full Text Available The high grade of copper in the Igarapé Bahia (Brazil) gold-copper ore prevents the direct application of the classic cyanidation process. Copper oxides and sulfides react with cyanides in solution, causing a high consumption of leach reagent and thereby raising processing costs and decreasing the recovery of gold. Studies have shown that a feasible route for this ore would be a pretreatment for copper mineral removal prior to the cyanidation stage. The goal of this experimental work was to study the experimental conditions required for copper removal from Igarapé Bahia gold-copper ore by sulfuric acid leaching by applying a statistical approach to the experimental design. By using the Plackett-Burman method, it was possible to select the variables that had the largest influence on the percentage of copper extracted at the sulfuric acid leaching stage. These were the temperature of the leach solution, the stirring speed, the concentration of sulfuric acid in the leach solution and the particle size of the ore. The influence of the individual effects of these variables and their interactions on the experimental response was analyzed by applying the replicated full factorial design method. Finally, the selected variables were optimized by the ascending path statistical method, which determined the best experimental conditions for leaching to achieve the highest percentage of copper extracted. Using the optimized conditions, the best leaching results showed a copper extraction of 75.5%.
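The full factorial analysis this record relies on comes down to computing main effects from coded two-level runs. A minimal sketch, assuming a hypothetical unreplicated 2x2 design with illustrative responses (only the final 75.5% extraction figure appears in the abstract; the factor pairing and the other values are invented for the example):

```python
import itertools
import numpy as np

# Hypothetical two-level (coded -1/+1) full factorial for two of the
# screened variables: column 0 = temperature, column 1 = H2SO4 concentration.
design = np.array(list(itertools.product([-1, 1], repeat=2)))
# One response (% Cu extracted) per factor combination; illustrative values.
response = np.array([48.0, 55.0, 61.0, 75.5])

def main_effect(design, response, col):
    """Average response at the high level minus average at the low level."""
    high = response[design[:, col] == 1].mean()
    low = response[design[:, col] == -1].mean()
    return high - low

temp_effect = main_effect(design, response, 0)  # effect of temperature
acid_effect = main_effect(design, response, 1)  # effect of acid concentration
```

With replicated runs, as in the paper, the same effect estimates gain an error term against which their significance can be tested.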

  7. Frequency response function of motors for switching noise energy with a new experimental approach

    International Nuclear Information System (INIS)

    Kim, Hyunsu; Yoon, Jong-Yun

    2017-01-01

    Switching energy in electric vehicles can create serious noise from the motors. However, the characteristics of switching noise in vehicle motors are not clear due to the complexity of measuring them. This study proposes a new experimental method to investigate the switching noise energy of a vehicle motor based on frequency response functions. A function generator-amplifier system is used to generate the switching energy instead of the complex battery-inverter system that has previously been used to examine the noise energy characteristics. Even though the newly adapted experimental method is simple, the switching noise energy was explicitly investigated under various input signals. Thus, this simple new method can be used to investigate the dynamic characteristics of noise energy in a vehicle motor.
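A frequency response function of the kind used in this study is commonly estimated with the H1 estimator, H(f) = Pxy(f)/Pxx(f), from the cross- and auto-spectra of the excitation and response. The sketch below applies it to a synthetic second-order resonance standing in for the motor; the 120 Hz resonance, the damping ratio and the broadband excitation are assumptions for illustration, not the paper's setup.

```python
import numpy as np
from scipy import signal

fs = 4096
rng = np.random.default_rng(0)
x = rng.standard_normal(8 * fs)  # broadband excitation from the generator

# Hypothetical second-order resonance at 120 Hz standing in for the motor.
wn, zeta = 2 * np.pi * 120.0, 0.05
b, a = signal.bilinear([wn**2], [1.0, 2 * zeta * wn, wn**2], fs=fs)
y = signal.lfilter(b, a, x)  # simulated response to the excitation

# H1 estimator of the frequency response function: H(f) = Pxy(f) / Pxx(f).
f, Pxy = signal.csd(x, y, fs=fs, nperseg=2048)
_, Pxx = signal.welch(x, fs=fs, nperseg=2048)
H1 = Pxy / Pxx

peak_freq = f[np.argmax(np.abs(H1))]  # resonance recovered from the FRF
```

Welch averaging over the 16 segments suppresses the noise in the spectral estimates, so the resonance peak is recovered within the 2 Hz frequency resolution of the analysis.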

  8. Impacts of radiation exposure on the experimental microbial ecosystem: a particle-based model simulation approach

    International Nuclear Information System (INIS)

    Doi, M.; Tanaka, N.; Fuma, S.; Kawabata, Z.

    2004-01-01

    A well-designed experimental model ecosystem can serve as a simple reference for the actual environment and its complex ecological systems. For ecological toxicity testing of radiation and other environmental toxicants, we investigated an aquatic microbial ecosystem (closed microcosm) in a test tube with initial substrates, autotrophic flagellate algae (Euglena G.), heterotrophic ciliate protozoa (Tetrahymena T.) and saprotrophic bacteria (E. coli). These species organize themselves into an ecological system that maintains sustainable population dynamics for more than 2 years after inoculation, only by adding light diurnally and controlling the temperature at 25 degrees Celsius. The objective of the study is to develop a particle-based computer simulation by reviewing interactions among the microbes and their environment, and to analyze the ecological toxicities of radiation on the microcosm by replicating experimental results in the computer simulation. (Author) 14 refs

  9. Exploring SiSn as a performance enhancing semiconductor: A theoretical and experimental approach

    KAUST Repository

    Hussain, Aftab M.

    2014-12-14

    We present a novel semiconducting alloy, silicon-tin (SiSn), as channel material for complementary metal oxide semiconductor (CMOS) circuit applications. The material has been studied theoretically using first principles analysis as well as experimentally by fabricating MOSFETs. Our study suggests that the alloy offers interesting possibilities in the realm of silicon band gap tuning. We have explored diffusion of tin (Sn) into the industry's most widely used substrate, silicon (100), as it is the most cost effective, scalable and CMOS compatible way of obtaining SiSn. Our theoretical model predicts a higher mobility for p-channel SiSn MOSFETs, due to a lower effective mass of the holes, which has been experimentally validated using the fabricated MOSFETs. We report an increase of 13.6% in the average field effect hole mobility for SiSn devices compared to silicon control devices.

  10. Cohesive Laws and Progressive Damage Analysis of Composite Bonded Joints, a Combined Numerical/Experimental Approach

    Science.gov (United States)

    Girolamo, Donato; Davila, Carlos G.; Leone, Frank A.; Lin, Shih-Yung

    2015-01-01

    The results of an experimental/numerical campaign aimed at developing progressive damage analysis (PDA) tools for predicting the strength of a composite bonded joint under tensile loads are presented. The PDA is based on continuum damage mechanics (CDM) to account for intralaminar damage, and cohesive laws to account for interlaminar and adhesive damage. The adhesive response is characterized using standard fracture specimens and digital image correlation (DIC). The displacement fields measured by DIC are used to calculate the J-integrals, from which the associated cohesive laws of the structural adhesive can be derived. A finite element model of a sandwich conventional splice joint (CSJ) under tensile loads was developed. The simulations, in agreement with experimental tests, indicate that the model is capable of predicting the interactions of damage modes that lead to the failure of the joint.
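The step from DIC-measured J-integrals to a cohesive law uses the standard relation that the cohesive traction is the derivative of J with respect to the crack-tip opening displacement, sigma(delta) = dJ/d(delta). A numerical sketch with an assumed J(delta) curve follows; the functional form, toughness and length scale are illustrative stand-ins, not the paper's data.

```python
import numpy as np

# Illustrative J-integral vs. opening-displacement data, as would be
# extracted from DIC measurements (values are invented for the sketch).
delta = np.linspace(0.0, 0.2, 21)   # opening displacement, mm
G_c, delta_c = 1.2, 0.05            # assumed toughness (kJ/m^2) and length scale
J = G_c * (1.0 - np.exp(-delta / delta_c) * (1.0 + delta / delta_c))

# Cohesive traction as the derivative of J with respect to the opening;
# a simple finite-difference gradient suffices for tabulated data.
sigma = np.gradient(J, delta)

# The traction starts near zero, peaks, then decays as the cohesive zone
# fully develops and J approaches the fracture toughness G_c.
peak_traction = float(sigma.max())
```

In practice the measured J(delta) is noisy, so the differentiation step is usually preceded by smoothing or by fitting an analytical form to J before taking the derivative.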

  11. Experimental studies of low salinity water flooding in carbonate reservoirs: A new promising approach

    DEFF Research Database (Denmark)

    Zahid, Adeel; Shapiro, Alexander; Skauge, Arne

    2012-01-01

    Low salinity water flooding is well studied for sandstone reservoirs; both laboratory and field tests have shown improvement in oil recovery in many cases. Until very recently, the low salinity effect had remained undetermined for carbonates. Most recently, Saudi Aramco reported that substantial additional oil recovery can be achieved when successively flooding composite carbonate core plugs with various diluted versions of seawater. The experimental data on carbonates are very limited, so more data and a better understanding of the mechanisms involved are needed to utilize this method for carbonate reservoirs. In this paper, we have experimentally investigated the oil recovery potential of low salinity water flooding for carbonate rocks. We used both reservoir carbonate and outcrop chalk core plugs. The flooding experiments were carried out initially with seawater, and afterwards additional oil...

  12. Frequency response function of motors for switching noise energy with a new experimental approach

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyunsu [Ensemble Center for Automotive Research, Seoul (Korea, Republic of); Yoon, Jong-Yun [Incheon National University, Incheon (Korea, Republic of)

    2017-06-15

    Switching energy in electric vehicles can create serious noise from the motors. However, the characteristics of switching noise in vehicle motors are not clear due to the complexity of measuring them. This study proposes a new experimental method to investigate the switching noise energy of a vehicle motor based on frequency response functions. A function generator-amplifier system is used to generate the switching energy instead of the complex battery-inverter system that has previously been used to examine the noise energy characteristics. Even though the newly adapted experimental method is simple, the switching noise energy was explicitly investigated under various input signals. Thus, this simple new method can be used to investigate the dynamic characteristics of noise energy in a vehicle motor.

  13. Long-term behaviour of concrete in water saturated media - Experimental and modelling approach

    Energy Technology Data Exchange (ETDEWEB)

    Peycelon, H; Mazoin, C

    2004-07-01

    In the context of the management of long-lived nuclear radioactive waste, cement-based materials are currently used for waste encapsulation and container development. Such materials are also likely to be used for engineered barriers in deep repositories. Various types of cement (CEM I, CEM V) are currently being studied, mainly to evaluate the materials' long-term durability. Studies have been performed on the leaching behavior of hardened cement pastes based on these cements. The effect of temperature is taken into account. Leaching experiments at 25 deg C, 50 deg C and 85 deg C were carried out with a standard test developed at CEA. Experimental results were analyzed and calculations were made to estimate calcium fluxes and degraded thicknesses. Experimental and modelling results were compared. (authors)

  14. Electrocatalysis of borohydride oxidation: a review of density functional theory approach combined with experimental validation

    Science.gov (United States)

    Sison Escaño, Mary Clare; Lacdao Arevalo, Ryan; Gyenge, Elod; Kasai, Hideaki

    2014-09-01

    The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4- on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.

  15. PredPsych: A toolbox for predictive machine learning based approach in experimental psychology research

    OpenAIRE

    Cavallo, Andrea; Becchio, Cristina; Koul, Atesh

    2016-01-01

    Recent years have seen an increased interest in machine learning based predictive methods for analysing quantitative behavioural data in experimental psychology. While these methods can achieve relatively greater sensitivity compared to conventional univariate techniques, they still lack an established and accessible software framework. The goal of this work was to build an open-source toolbox – “PredPsych” – that could make these methods readily available to all psychologists. PredPsych is a...

  16. Experimental estimation of mutation rates in a wheat population with a gene genealogy approach.

    Science.gov (United States)

    Raquin, Anne-Laure; Depaulis, Frantz; Lambert, Amaury; Galic, Nathalie; Brabant, Philippe; Goldringer, Isabelle

    2008-08-01

    Microsatellite markers are extensively used to evaluate genetic diversity in natural or experimental evolving populations. Their high degree of polymorphism reflects their high mutation rates. Estimates of the mutation rates are therefore necessary when characterizing diversity in populations. As a complement to the classical experimental designs, we propose to use experimental populations, where the initial state is entirely known and some intermediate states have been thoroughly surveyed, thus providing a short timescale estimation together with a large number of cumulated meioses. In this article, we derived four original gene genealogy-based methods to assess mutation rates with limited bias due to relevant model assumptions incorporating the initial state, the number of new alleles, and the genetic effective population size. We studied the evolution of genetic diversity at 21 microsatellite markers, after 15 generations in an experimental wheat population. Compared to the parents, 23 new alleles were found in generation 15 at 9 of the 21 loci studied. We provide evidence that they arose by mutation. Corresponding estimates of the mutation rates ranged from 0 to 4.97 x 10{sup -3} per generation (i.e., year). Sequences of several alleles revealed that length polymorphism was only due to variation in the core of the microsatellite. Among different microsatellite characteristics, both the motif repeat number and an independent estimation of the Nei diversity were correlated with the novel diversity. Despite a reduced genetic effective size, global diversity at microsatellite markers increased in this population, suggesting that microsatellite diversity should be used with caution as an indicator in biodiversity conservation issues.
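As a back-of-the-envelope companion to such estimates, a naive per-locus mutation rate can be computed from the counts reported above by dividing the number of new alleles by the number of loci times the cumulated meioses. The effective population size below is a hypothetical placeholder, and this naive estimator ignores drift and the loss of new alleles, which the paper's genealogy-based methods are designed to correct for.

```python
# Counts taken from the abstract.
new_alleles = 23
n_loci = 21
generations = 15

# Hypothetical genetic effective population size (not stated in the abstract).
effective_size = 150

# Each diploid individual carries two gene copies per locus, so the number
# of copy-generations (cumulated meioses) over the experiment is:
cumulated_meioses = 2 * effective_size * generations

# Naive average per-locus, per-generation mutation rate.
mu_per_locus = new_alleles / (n_loci * cumulated_meioses)
```

Because new alleles that arise and are immediately lost by drift go uncounted, this naive figure underestimates the true rate, which is one motivation for the genealogy-based corrections the paper develops.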

  17. An Integrated Computational and Experimental Approach Toward the Design of Materials for Fuel Cell Systems

    Science.gov (United States)

    2012-10-01

    Based on the limited work done, the best reported ORR chalcogenide electrocatalysts for PEMFC applications can be ranked as follows: MoRuSe... A key issue for PEMFC catalysts is the durability of the catalyst particles; the particle size distribution tends to shift towards larger particles during... the design of new materials for applications in PEMFCs.

  18. Do Ethnic Enclaves Impede Immigrants’ Integration? Evidence from a Quasi-Experimental Social-Interaction Approach

    OpenAIRE

    Danzer, A. M.; Yaman, F.

    2013-01-01

    It is widely debated whether immigrants who live among co-ethnics are less willing to integrate into the host society. Exploiting the quasi-experimental guest worker placement across German regions during the 1960/70s as well as information on immigrants’ inter-ethnic contact networks and social activities, we are able to identify the causal effect of ethnic concentration on social integration. The exogenous placement of immigrants "switches off" observable and unobservable differences in t...

  19. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas.

  20. Studies of the tautomeric equilibrium of 1,3-thiazolidine-2-thione: Theoretical and experimental approaches

    Energy Technology Data Exchange (ETDEWEB)

    Abbehausen, Camilla; Paiva, Raphael E.F. de [Institute of Chemistry, University of Campinas - UNICAMP, P.O. Box 6154, 13083-970 Campinas, SP (Brazil); Formiga, Andre L.B., E-mail: formiga@iqm.unicamp.br [Institute of Chemistry, University of Campinas - UNICAMP, P.O. Box 6154, 13083-970 Campinas, SP (Brazil); Corbi, Pedro P. [Institute of Chemistry, University of Campinas - UNICAMP, P.O. Box 6154, 13083-970 Campinas, SP (Brazil)

    2012-10-26

    Highlights: • Tautomeric equilibrium in solution. • Spectroscopic and theoretical studies. • UV-Vis theoretical and experimental spectra. • {sup 1}H NMR theoretical and experimental spectra. -- Abstract: The tautomeric equilibrium of the thione/thiol forms of 1,3-thiazolidine-2-thione was studied by nuclear magnetic resonance, infrared and ultraviolet-visible spectroscopies. Density functional theory was used to support the experimental data and indicates the predominance of the thione tautomer in the solid state, in agreement with previously reported crystallographic data. In solution, the tautomeric equilibrium was evaluated using {sup 1}H NMR at different temperatures in four deuterated solvents: acetonitrile, dimethylsulfoxide, chloroform and methanol. The equilibrium constants, K = [thiol]/[thione], and Gibbs free energies were obtained by integration of the N-bonded hydrogen signals at each temperature for each solvent, excluding methanol. The endothermic tautomerization is entropy-driven, and the combined effect of solvent and temperature can be used to achieve almost 50% thiol concentration in solution. The nature of the electronic transitions was investigated theoretically, and the bands were assigned using time-dependent DFT; the influence of the solvent on the energies of the most important bands of the spectra was also examined.
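The quantities this record describes follow from two textbook relations: K from the ratio of NMR integrals with ΔG = -RT ln K, and the reaction enthalpy from a van't Hoff comparison of K at two temperatures. The integrals and temperatures below are illustrative assumptions, not the paper's data; only the signs (endothermic tautomerization, thione favored) mirror the abstract.

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def equilibrium_constant(thiol_integral, thione_integral):
    """K = [thiol]/[thione] taken directly from the NH-signal integral ratio."""
    return thiol_integral / thione_integral

def gibbs_energy(K, T):
    """Delta G = -R T ln K for the thione -> thiol tautomerization (J/mol)."""
    return -R * T * math.log(K)

# Hypothetical integrals at two temperatures (illustrative values only).
K_298 = equilibrium_constant(0.25, 0.75)   # ~25% thiol at 298 K
dG_298 = gibbs_energy(K_298, 298.15)       # positive: thione favored

# Van't Hoff estimate of the reaction enthalpy from K at two temperatures:
# ln(K2/K1) = -(dH/R) * (1/T2 - 1/T1); a positive dH matches the reported
# endothermic, entropy-driven tautomerization.
K_328 = equilibrium_constant(0.40, 0.60)   # more thiol at higher temperature
dH = -R * math.log(K_328 / K_298) / (1.0 / 328.15 - 1.0 / 298.15)
```

With these illustrative numbers, ΔG at 298 K is about +2.7 kJ/mol and the van't Hoff enthalpy is about +19 kJ/mol, i.e. the thiol fraction grows with temperature as the abstract describes.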