Sample records for reliable experimental approach

  1. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.


    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque(R) drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  2. Software Reliability Experimentation and Control

    Kai-Yuan Cai


    This paper classifies software research into theoretical, experimental, and engineering research, and is mainly concerned with experimental research, with a focus on software reliability experimentation and control. The state of the art of experimental and empirical studies is reviewed. A new experimentation methodology, oriented largely toward theory discovery, is proposed. Several unexpected results of experimental studies are presented to demonstrate the importance of software reliability experimentation and control. Finally, a few topics that deserve future investigation are identified.

  3. New Approaches to Reliability Assessment

    Ma, Ke; Wang, Huai; Blaabjerg, Frede


    Power electronics are facing continuous pressure to be cheaper and smaller, have a higher power density, and, in some cases, also operate at higher temperatures. At the same time, power electronics products are expected to have reduced failures because it is essential for reducing the cost of energy. New approaches for reliability assessment are being taken in the design phase of power electronics systems based on the physics-of-failure in components. In this approach, many new methods, such as multidisciplinary simulation tools, strength testing of components, translation of mission profiles, and statistical analysis, are involved to enable better prediction and design of reliability for products. This article gives an overview of the new design flow in the reliability engineering of power electronics from the system-level point of view and discusses some of the emerging needs for the technology...

  4. Experimental Test and Simulations on a Linear Generator-Based Prototype of a Wave Energy Conversion System Designed with a Reliability-Oriented Approach

    Valeria Boscaino


    In this paper, we propose a reliability-oriented design of a linear generator-based prototype of a wave energy conversion (WEC) system, useful for the production of hydrogen in a sheltered water area like the Mediterranean Sea. The hydrogen production has been confirmed by extensive experimental testing and simulation. The system design aims to enhance robustness and reliability and is based on an analysis of the main WEC failures reported in the literature. The results of this analysis led to some improvements that are applied to a WEC system prototype for hydrogen production and storage. The proposed WEC system includes the electrical linear generator, the power conversion system, and a sea-water electrolyzer. A modular architecture is conceived to provide ease of extension of the power capability of the marine plant. The experimental results obtained on the permanent magnet linear electric generator have allowed identification of the stator winding typology and, consequently, sizing of the power electronics system. The produced hydrogen has supplied a low-power fuel cell stack directly connected to the hydrogen output from the electrolyzer. The small-scale prototype is designed to be installed, in the near future, in the Mediterranean Sea. As shown by experimental and simulation results, the small-scale prototype is suitable for hydrogen production and storage from sea water in this area.

  5. Experimental approaches and applications

    Crasemann, Bernd


    Atomic Inner-Shell Processes, Volume II: Experimental Approaches and Applications focuses on the physics of atomic inner shells, with emphasis on experimental aspects including the use of radioactive atoms for studies of atomic transition probabilities. Surveys of modern techniques of electron and photon spectrometry are also presented, and selected practical applications of inner-shell processes are outlined. Comprised of six chapters, this volume begins with an overview of the general principles underlying the experimental techniques that make use of radioactive isotopes for inner-sh

  6. Dependent systems reliability estimation by structural reliability approach

    Kostandyan, Erik; Sørensen, John Dalsgaard


    ) and the component lifetimes follow some continuous and non-negative cumulative distribution functions. An illustrative example utilizing the proposed method is provided, where damage is modeled by a fracture mechanics approach with correlated components and a failure assessment diagram is applied for failure...... identification. Application of the proposed method can be found in many real world systems....
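
The effect of component dependence on system reliability can be sketched numerically. Below, a two-component series system with correlated Weibull lifetimes (dependence injected through a Gaussian copula) is compared against the independence assumption; the correlation value, marginal distributions, and service time are all illustrative assumptions, not values from the paper.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
n, rho = 100_000, 0.6                  # assumed lifetime correlation
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((n, 2)) @ L.T
u = 0.5 * (1 + np.vectorize(erf)(z / sqrt(2)))  # Gaussian copula -> correlated uniforms

# Weibull(shape c, scale b) component lifetimes via inverse CDF (assumed marginals)
c, b = 2.0, 20.0
life = b * (-np.log(1 - u)) ** (1 / c)

t = 10.0                                # service time of interest
p_series = np.mean(life.min(axis=1) < t)    # series system fails at first failure
p_indep = 1 - np.exp(-(t / b) ** c) ** 2    # same marginals, independence assumed
print(p_series, p_indep)
```

With positive correlation the simulated series-system failure probability falls below the independence result, which is exactly the kind of difference a dependence-aware method is meant to capture.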

  7. Novel approach for evaluation of service reliability for electricity customers

    Jiang, John N.


    Understanding the reliability value for electricity customers is important to market-based reliability management. This paper proposes a novel approach to evaluating reliability for electricity customers using the indifference curve between economic compensation for power interruptions and the service reliability of electricity. The indifference curve is formed by calculating different planning schemes of network expansion for different customer reliability requirements, which reveals information about the economic value of different reliability levels for electricity customers, so that reliability based on a market supply-demand mechanism can be established and economic signals can be provided for reliability management and enhancement.

  8. Novel approach for evaluation of service reliability for electricity customers

    KANG ChongQing; GAO Yan; JIANG John N; ZHONG Jin; XIA Qing


    Understanding the reliability value for electricity customers is important to market-based reliability management. This paper proposes a novel approach to evaluating reliability for electricity customers using the indifference curve between economic compensation for power interruptions and the service reliability of electricity. The indifference curve is formed by calculating different planning schemes of network expansion for different customer reliability requirements, which reveals information about the economic value of different reliability levels for electricity customers, so that reliability based on a market supply-demand mechanism can be established and economic signals can be provided for reliability management and enhancement.

  9. Different Reliability Assessment Approaches for Wave Energy Converters

    Ambühl, Simon; Kramer, Morten Mejlhede; Sørensen, John Dalsgaard


    Reliability assessments are of importance for wave energy converters (WECs) due to the fact that accessibility might be limited in case of failure and maintenance. These failure rates can be adapted by reliability considerations. There are two different approaches to how reliability can...... be estimated: the so-called classical reliability theory and the probabilistic reliability theory. The classical reliability theory is often used for failure rate estimations of mechanical and electrical components, whereas the probabilistic reliability theory is commonly used for structural components...
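
The two estimation routes named in the abstract can be contrasted in a few lines: a constant-failure-rate (classical) estimate from field counts versus a limit-state (probabilistic, structural) estimate of P(R - S < 0) by Monte Carlo. The counts and distributions below are illustrative assumptions, not WEC data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Classical reliability theory: constant failure rate from field counts
# (counts are assumed, illustrative values).
failures, hours = 3, 120_000.0
lam = failures / hours                     # failures per hour
R_1yr = np.exp(-lam * 8760)                # survival probability over one year

# Probabilistic (structural) reliability theory: P(R - S < 0) by Monte Carlo,
# with an assumed lognormal resistance R and normal load S.
R = rng.lognormal(mean=np.log(500), sigma=0.1, size=200_000)
S = rng.normal(350, 60, size=200_000)
pf = np.mean(R - S < 0)
print(R_1yr, pf)
```

The first route answers "how often does this component type fail in service", the second "how likely is load to exceed resistance", which matches the abstract's split between electrical/mechanical and structural components.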

  10. Experimental design a chemometric approach

    Deming, SN


    Now available in a paperback edition is a book which has been described as "exceptionally lucid, easy-to-read presentation... would be an excellent addition to the collection of every analytical chemist. I recommend it with great enthusiasm." (Analytical Chemistry). Unlike most current textbooks, it approaches experimental design from the point of view of the experimenter, rather than that of the statistician. As the reviewer in 'Analytical Chemistry' went on to say: "Deming and Morgan should be given high praise for bringing the principles of experimental design to the level of the p

  11. A Latent Class Approach to Estimating Test-Score Reliability

    van der Ark, L. Andries; van der Palm, Daniel W.; Sijtsma, Klaas


    This study presents a general framework for single-administration reliability methods, such as Cronbach's alpha, Guttman's lambda-2, and method MS. This general framework was used to derive a new approach to estimating test-score reliability by means of the unrestricted latent class model. This new approach is the latent class reliability…
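
The latent class estimator itself is beyond a short sketch, but the single-administration coefficients the framework generalizes are easy to compute. A minimal sketch, assuming simulated parallel items rather than real test data:

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha for an (n_persons, k_items) score matrix."""
    k = X.shape[1]
    return k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum() / X.sum(axis=1).var(ddof=1))

def guttman_lambda2(X):
    """Guttman's lambda-2 (never below alpha)."""
    k = X.shape[1]
    C = np.cov(X, rowvar=False)
    total_var = X.sum(axis=1).var(ddof=1)
    off = C - np.diag(np.diag(C))          # off-diagonal covariances
    lam1 = 1 - np.trace(C) / total_var
    return lam1 + np.sqrt(k / (k - 1) * (off ** 2).sum()) / total_var

# Simulated test: 4 parallel items = true score plus noise (assumed model)
rng = np.random.default_rng(2)
true_score = rng.normal(0, 1, size=(500, 1))
X = true_score + rng.normal(0, 0.8, size=(500, 4))
print(cronbach_alpha(X), guttman_lambda2(X))
```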

  12. LMI approach to reliable H∞ control of linear systems

    Yao Bo; Wang Fuzhong


    This paper is concerned with the reliable design problem for linear systems. A more practical model of actuator faults than outage is considered. An LMI approach to designing reliable controllers is presented for the case of actuator faults that can be modeled by a scaling factor. The resulting control systems are reliable in that they provide guaranteed asymptotic stability and H∞ performance when some control component (actuator) faults occur. A numerical example is also given to illustrate the design procedure and its effectiveness. Furthermore, the optimal standard controller and the optimal reliable controller are compared to show the necessity of reliable control.

  13. Simulation Approach to Mission Risk and Reliability Analysis Project

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  14. The process group approach to reliable distributed computing

    Birman, Kenneth P.


    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, covering the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  15. Engineering systems reliability, safety, and maintenance an integrated approach

    Dhillon, B S


    Today, engineering systems are an important element of the world economy and each year billions of dollars are spent to develop, manufacture, operate, and maintain various types of engineering systems around the globe. Many of these systems are highly sophisticated and contain millions of parts. For example, a Boeing 747 jumbo jet is made up of approximately 4.5 million parts, including fasteners. Needless to say, reliability, safety, and maintenance of systems such as this have become more important than ever before. Global competition and other factors are forcing manufacturers to produce highly reliable, safe, and maintainable engineering products. Therefore, there is a definite need for reliability, safety, and maintenance professionals to work closely during design and other phases. Engineering Systems Reliability, Safety, and Maintenance: An Integrated Approach eliminates the need to consult many different and diverse sources in the hunt for the information required to design better engineering syste...

  16. Molecular approaches in experimental neuroimaging

    Tavitian, B. [CEA Saclay, 91 - Gif sur Yvette (France)]


    We quantified and compared six parameters (resolution, depth, sensitivity, portability, quantification and cost) of four molecular imaging techniques (MRI, optics, ultrasound, and PET), with the three types of electromagnetic radiation used in vivo (Frequencies (10{sup 6} to 10{sup 22} Hz), Photonic Energy (10{sup -4} to 10{sup 9} eV) and Wavelengths (10{sup -2} to 10{sup -15} m)). This form of molecular imaging demands the most sensitive technique available (Pl. 26-2 to 26-4). Four examples of experimental in vivo approaches on small animals are shown: molecular passage through the blood-brain barrier (endothelial cells, astrocytes and occludin, pharmacokinetics, studied with PET) (Pl. 2-5 to 2-11); imaging of receptors and ligands, especially peripheral benzodiazepine receptors (PBR) by PET and MRI in the rat (Pl. 2-12 to Pl. 2-15); neuro-pathology of neuro-degenerative and inflammatory diseases and stroke by PET and MRI in the rat (Pl. 2-16 to 2-17); and the study of responses to stimulation explored with in vivo imaging of calcium signals and their variations by photonic analysis, as on the scale of mitochondrial calcium (Pl.2-18 to Pl.2-22). (author)

  17. Integrated approach to economical, reliable, safe nuclear power production


    An Integrated Approach to Economical, Reliable, Safe Nuclear Power Production is the latest evolution of a concept which originated with the Defense-in-Depth philosophy of the nuclear industry. As Defense-in-Depth provided a framework for viewing physical barriers and equipment redundancy, the Integrated Approach gives a framework for viewing nuclear power production in terms of functions and institutions. In the Integrated Approach, four plant Goals are defined (Normal Operation, Core and Plant Protection, Containment Integrity and Emergency Preparedness) with the attendant Functional and Institutional Classifications that support them. The Integrated Approach provides a systematic perspective that combines the economic objective of reliable power production with the safety objective of consistent, controlled plant operation.

  18. Sequential Bayesian technique: An alternative approach for software reliability estimation

    S Chatterjee; S S Alam; R B Misra


    This paper proposes a sequential Bayesian approach, similar to a Kalman filter, for estimating the reliability growth or decay of software. The main advantage of the proposed method is that it shows the variation of the parameter over time as new failure data become available. The usefulness of the method is demonstrated with some real-life data.
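
The paper's Kalman-like filter is not reproduced here, but the sequential flavor can be sketched with a conjugate Beta-Binomial update of a per-test success probability, recomputed as each outcome arrives; the outcome sequence and prior are assumed for illustration.

```python
# Sequential Bayesian updating of a per-test success probability with a
# conjugate Beta prior: a minimal stand-in for the paper's Kalman-like
# filter, showing how the estimate tracks the data as it arrives.
a, b = 1.0, 1.0                                  # Beta(1, 1) uniform prior
outcomes = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]        # 1 = pass, 0 = failure (assumed data)

trajectory = []
for ok in outcomes:
    a += ok
    b += 1 - ok
    trajectory.append(a / (a + b))               # posterior mean after each test
print(trajectory[-1])
```

Each new failure pulls the estimate down and each pass pulls it up, so the trajectory exposes reliability growth or decay over time rather than a single end-of-test number.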

  19. Generating function approach to reliability analysis of structural systems


    The generating function approach is an important tool for performance assessment of multi-state systems. Aiming at strength reliability analysis of structural systems, the generating function approach is introduced and developed here. Static reliability models of statically determinate and indeterminate systems, as well as fatigue reliability models, are built by constructing special generating functions, which are used to describe probability distributions of strength (resistance), stress (load), and fatigue life, and by defining composite operators of generating functions and their associated performance structure functions. When composition operators are executed, computational costs can be reduced by a large margin by collecting like terms. The results of theoretical analysis and numerical simulation show that the generating function approach can be widely used for probability modeling of large complex systems with hierarchical structures, owing to its unified form, compact expression, computer program realizability, and high universality. Because the new method considers twin loads giving rise to component failure dependency, it can provide a theoretical reference and act as a powerful tool for static and dynamic reliability analysis of civil engineering structures and mechanical equipment systems with multi-mode damage coupling.
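
A minimal numeric sketch of the idea: a pmf stored as a coefficient vector is a generating function, multiplying two of them (polynomial convolution, which collects like terms automatically) gives the distribution of a combined load, and a stress-strength failure probability follows by summation. The grids and probabilities below are illustrative, not from the paper.

```python
import numpy as np

# pmfs as coefficient vectors: index = load/strength value, entry = probability.
# A vector [p0, p1, ...] is exactly the coefficient list of a generating function.
load = np.array([0.0, 0.0, 0.5, 0.3, 0.2])               # one load: values 2..4
strength = np.array([0, 0, 0, 0, 0, 0, 0.3, 0.5, 0.2])   # strength: values 6..8

# Product of generating functions = pmf of the sum of two independent loads;
# np.convolve collects like terms automatically.
total_load = np.convolve(load, load)

# P(failure) = P(strength < total load)
p_fail = sum(pl * strength[:k].sum()        # strength[:k].sum() = P(strength < k)
             for k, pl in enumerate(total_load))
print(p_fail)
```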

  20. Discrete event simulation versus conventional system reliability analysis approaches

    Kozine, Igor


    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. One part of the paper briefly describes the author's experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches......
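
The comparison can be made concrete in a toy case where both routes are tractable: a one-component repairable system simulated as a sequence of discrete up/down events against the closed-form steady-state availability MTBF/(MTBF + MTTR). The time constants are assumed values.

```python
import random

random.seed(3)
MTBF, MTTR = 1000.0, 50.0       # assumed exponential up/down times (hours)
horizon = 5_000_000.0           # simulated operating history

t, up_time, state_up = 0.0, 0.0, True
while t < horizon:
    dwell = random.expovariate(1.0 / (MTBF if state_up else MTTR))
    dwell = min(dwell, horizon - t)
    if state_up:
        up_time += dwell
    t += dwell
    state_up = not state_up

A_sim = up_time / horizon                  # DES estimate of availability
A_analytic = MTBF / (MTBF + MTTR)          # conventional closed-form result
print(A_sim, A_analytic)
```

Where the closed form stops (non-exponential dwell times, shared repair crews, operator behavior), the simulation loop keeps working by swapping in other sampling rules, which is the flexibility the paper argues for.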

  1. Reliability of four experimental mechanical pain tests in children

    Søe, Ann-Britt Langager; Thomsen, Lise L; Tornoe, Birte


    In order to study pain in children, it is necessary to determine whether pain measurement tools used in adults are reliable measurements in children. The aim of this study was to explore the intrasession reliability of pressure pain thresholds (PPT) in healthy children. Furthermore, the aim...... was also to study the intersession reliability of the following four tests: (1) Total Tenderness Score; (2) PPT; (3) Visual Analog Scale score at suprapressure pain threshold; and (4) area under the curve (stimulus-response functions for pressure versus pain)....

  2. An efficient approach for reliability-based topology optimization

    Kanakasabai, Pugazhendhi; Dhingra, Anoop K.


    This article presents an efficient approach for reliability-based topology optimization (RBTO) in which the computational effort involved in solving the RBTO problem is equivalent to that of solving a deterministic topology optimization (DTO) problem. The methodology presented is built upon the bidirectional evolutionary structural optimization (BESO) method used for solving the deterministic optimization problem. The proposed method is suitable for linear elastic problems with independent and normally distributed loads, subjected to deflection and reliability constraints. The linear relationship between the deflection and stiffness matrices along with the principle of superposition are exploited to handle reliability constraints to develop an efficient algorithm for solving RBTO problems. Four example problems with various random variables and single or multiple applied loads are presented to demonstrate the applicability of the proposed approach in solving RBTO problems. The major contribution of this article comes from the improved efficiency of the proposed algorithm when measured in terms of the computational effort involved in the finite element analysis runs required to compute the optimum solution. For the examples presented with a single applied load, it is shown that the CPU time required in computing the optimum solution for the RBTO problem is 15-30% less than the time required to solve the DTO problems. The improved computational efficiency allows for incorporation of reliability considerations in topology optimization without an increase in the computational time needed to solve the DTO problem.

  3. Reliability of four experimental mechanical pain tests in children

    Soee AL


    Ann-Britt L Soee,1 Lise L Thomsen,2 Birte Tornoe,1,3 Liselotte Skov1 1Department of Pediatrics, Children's Headache Clinic, Copenhagen University Hospital Herlev, Copenhagen, Denmark; 2Department of Neuropediatrics, Juliane Marie Centre, Copenhagen University Hospital Rigshospitalet, København Ø, Denmark; 3Department of Physiotherapy, Medical Department O, Copenhagen University Hospital Herlev, Herlev, Denmark. Purpose: In order to study pain in children, it is necessary to determine whether pain measurement tools used in adults are reliable measurements in children. The aim of this study was to explore the intrasession reliability of pressure pain thresholds (PPT) in healthy children. Furthermore, the aim was also to study the intersession reliability of the following four tests: (1) Total Tenderness Score; (2) PPT; (3) Visual Analog Scale score at suprapressure pain threshold; and (4) area under the curve (stimulus–response functions for pressure versus pain). Participants and methods: Twenty-five healthy school children, 8–14 years of age, participated. Test 2, PPT, was repeated three times at 2-minute intervals on the same day to estimate PPT intrasession reliability using Cronbach's alpha. Tests 1–4 were repeated after a median of 21 (interquartile range 10.5–22) days, and Pearson's correlation coefficient was used to describe the intersession reliability. Results: The PPT test was precise and reliable (Cronbach's alpha ≥ 0.92). All tests showed a good to excellent correlation between days (intersession r = 0.66–0.81). There were no indications of significant systematic differences found in any of the four tests between days. Conclusion: All tests seemed to be reliable measurements in pain evaluation in healthy children aged 8–14 years. Given the small sample size, this conclusion needs to be confirmed in future studies. Keywords: repeatability, intraindividual reliability, pressure pain threshold, pain measurement, algometer
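
A sketch of the two reliability statistics used in the study (Cronbach's alpha within a session, Pearson's r across sessions) on simulated threshold data; the sample size matches the study design, but all distribution parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 25                                    # children, matching the study size
true_ppt = rng.normal(250, 60, n)         # assumed "true" thresholds (kPa)

# Three same-day repetitions -> intrasession consistency via Cronbach's alpha
reps = true_ppt[:, None] + rng.normal(0, 15, (n, 3))
k = reps.shape[1]
alpha = k / (k - 1) * (1 - reps.var(axis=0, ddof=1).sum() / reps.sum(axis=1).var(ddof=1))

# A second session weeks later -> intersession reliability via Pearson's r
session2 = true_ppt + rng.normal(0, 35, n)
r = np.corrcoef(reps.mean(axis=1), session2)[0, 1]
print(alpha, r)
```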

  4. Experimental Approach to Teaching Fluids

    Stern, Catalina


    For the last 15 years we have promoted experimental work even in the theoretical courses. Fluids appear in the Physics curriculum of the National University of Mexico in two courses: Collective Phenomena in their sophomore year and Continuum Mechanics in their senior year. In both, students are asked for a final project. Surprisingly, at least 85% choose an experimental subject even though this means working extra hours every week. Some of the experiments were shown in this congress two years ago. This time we present some new results and the methodology we use in the classroom. I acknowledge support from the Physics Department, Facultad de Ciencias, UNAM.

  5. Exponentiated Weibull distribution approach based inflection S-shaped software reliability growth model

    B.B. Sagar


    The aim of this paper is to estimate the number of defects in software and remove them successfully. The paper incorporates a Weibull distribution approach along with an inflection S-shaped Software Reliability Growth Model (SRGM); in this combination, a two-parameter Weibull distribution methodology is used. The Relative Prediction Error (RPE) is calculated as the validity criterion of the developed model. Experimental results on actual data from five data sets are compared with two other existing models, showing that the proposed software reliability growth model gives better estimates for defect removal. The paper presents a software reliability growth model that combines features of both the Weibull distribution and the inflection S-shaped SRGM to estimate the defects of a software system, providing help to researchers and software industries in developing highly reliable software products.
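
A sketch of fitting a Weibull-type mean value function and computing a Relative Prediction Error. Instead of nonlinear least squares, this sketch linearizes the model with an assumed total defect content `a`; the weekly defect counts are synthetic, not the paper's data sets.

```python
import numpy as np

t = np.arange(1, 13, dtype=float)          # weekly test intervals
observed = np.array([3, 9, 17, 24, 32, 38, 43, 48, 51, 54, 56, 57.0])

a = 60.0                                   # assumed total defect content
# Linearize m(t) = a * (1 - exp(-(t/b)**c)):
#   ln(-ln(1 - m/a)) = c*ln(t) - c*ln(b)
y = np.log(-np.log(1 - observed / a))
c, intercept = np.polyfit(np.log(t), y, 1)
b = np.exp(-intercept / c)

predicted = a * (1 - np.exp(-(t[-1] / b) ** c))
rpe = (predicted - observed[-1]) / observed[-1]   # Relative Prediction Error
print(b, c, rpe)
```

An RPE near zero indicates the fitted growth curve reproduces the most recent cumulative count, the same validity check the paper applies to its model.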

  6. A General Approach to Study the Reliability of Complex Systems

    G. M. Repici


    In recent years new complex systems have been developed in the automotive field to increase safety and comfort. These systems integrate hardware and software to guarantee the best results in vehicle handling and make products competitive on the market. However, the increase in technical details and the utilization and integration of these complicated systems require a high level of dynamic control system reliability. To improve this fundamental characteristic, methods used in the aeronautical field to deal with reliability can be adapted and integrated into one simplified method for application in the automotive field. Firstly, as a case study, we decided to analyse VDC (the Vehicle Dynamics Control system) by defining a possible approach to reliability techniques. A VDC Fault Tree Analysis represents the first step in this activity: FTA enables us to recognize the critical components in all possible working conditions of a car, including cranking, during 'key-on'-'key-off' phases, which is particularly critical for the electrical on-board system (because of voltage reduction). By associating FA (Functional Analysis) and FTA results with a good FFA (Functional Failure Analysis), it is possible to define the best architecture for the general system to achieve the aim of a high-reliability structure. The paper will show some preliminary results from the application of this methodology, taken from various typical handling conditions from well-established test procedures for vehicles.

  7. A new approach to reliability assessment of dental caries examinations.

    Altarakemah, Yacoub; Al-Sane, Mona; Lim, Sungwoo; Kingman, Albert; Ismail, Amid I


    The objective of this study was to evaluate the reliability of the International Caries Detection and Assessment System (ICDAS) and identify sources of disagreement among eight Kuwaiti dentists with no prior knowledge of the system. A 90-min introductory e-course was given, followed by an examination of extracted teeth using the ICDAS coding system on the first day. Three sessions of clinical examinations were then performed. This study used only the data from the last session, in which 705 tooth surfaces of 10 patients were examined, to assess bias in the caries examination and to determine the codes on which the examiners disagreed most. Compared with the gold standard, we evaluated bias of the ICDAS coding using three approaches (Bland-Altman plot, maximum kappa statistic, and Bhapkar's chi-square test). Linear weighted kappa statistics were computed to assess interexaminer reliability. Marginal ICDAS distributions for most examiners were significantly different from that of the gold standard (bias present). The primary source of these marginal differences was misclassifying sound surfaces as noncavitated lesions. Interexaminer reliability of the 3-level ICDAS (codes 0, 1-2, and 3-6) classification ranged between 0.43 and 0.73, indicating substantial inconsistency between examiners. The primary source of examiner differences was in diagnosing noncavitated lesions. This study highlights the importance of assessing both systematic and random sources of examiner agreement to correctly interpret kappa measures of reliability. © 2012 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
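
The linear weighted kappa used for interexaminer reliability can be computed directly from a cross-tabulation of two raters; the two rating vectors below are invented for illustration, not the study's data.

```python
import numpy as np

def weighted_kappa(r1, r2, k):
    """Linear weighted kappa for two raters on categories 0..k-1."""
    O = np.zeros((k, k))
    for u, v in zip(r1, r2):
        O[u, v] += 1
    O /= O.sum()                                 # observed proportions
    E = np.outer(O.sum(axis=1), O.sum(axis=0))   # chance-expected proportions
    i, j = np.indices((k, k))
    W = np.abs(i - j) / (k - 1)                  # linear disagreement weights
    return 1 - (W * O).sum() / (W * E).sum()

# Invented 3-level ratings (e.g. ICDAS collapsed to 0, 1-2, 3-6)
examiner = [0, 0, 1, 1, 2, 2, 0, 1, 2, 0, 1, 1]
gold     = [0, 0, 1, 2, 2, 2, 1, 1, 2, 0, 0, 1]
print(weighted_kappa(examiner, gold, 3))
```

Because near-miss disagreements receive smaller weights than distant ones, the weighted kappa rewards an examiner who confuses adjacent severity levels less harshly than one who confuses sound and cavitated surfaces.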

  8. Mutation Analysis Approach to Develop Reliable Object-Oriented Software

    Monalisa Sarma


    In general, modern programs are large and complex, and it is essential that they be highly reliable in applications. To support the development of highly reliable software, the Java programming language provides a rich set of exceptions and exception handling mechanisms. Exception handling mechanisms are intended to help developers build robust programs. Given a program with exception handling constructs, effective testing must detect whether all possible exceptions are raised and caught. However, complex exception handling constructs make it tedious to trace which exceptions are handled where, and which exceptions are passed on. In this paper, we address this problem and propose a mutation analysis approach to developing reliable object-oriented programs. We apply a number of mutation operators to create a large set of mutant programs with different types of faults. We then generate test cases and test data to uncover exception-related faults. The resulting test suite is applied to the mutant programs, measuring the mutation score and thereby assessing its effectiveness. We have tested our approach on a number of case studies to substantiate the efficacy of the proposed mutation analysis technique.
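
A toy version of the mutation-score measurement described above: hand-written mutants of a small guarded function are run against a test set, and a mutant is killed when any test observes a different result or a raised exception. The function, mutants, and tests are all invented for illustration (and written in Python rather than the paper's Java).

```python
# Mutation analysis in miniature: each mutant of a small guarded function is
# "killed" when any test sees a different result or a raised exception.
def original(x, y):
    return x / y if y != 0 else 0          # guarded division

mutants = [
    lambda x, y: x * y if y != 0 else 0,   # arithmetic operator replaced
    lambda x, y: x / y if y != 1 else 0,   # guard condition mutated
    lambda x, y: x / y if y != 0 else 1,   # default value mutated
]

tests = [(6, 3), (5, 0), (4, 1)]

def differs(m, x, y):
    try:
        return m(x, y) != original(x, y)
    except Exception:                      # an uncaught exception also kills the mutant
        return True

killed = sum(any(differs(m, x, y) for x, y in tests) for m in mutants)
score = killed / len(mutants)              # mutation score of the test set
print(score)
```

Note that the mutated guard is killed only by the `(5, 0)` test, which triggers the division error: exactly the kind of exception-related fault the paper's operators target.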

  9. A Thermodynamic Entropy Approach to Reliability Assessment with Applications to Corrosion Fatigue

    Anahita Imanian


    This paper outlines a science-based explanation of damage and reliability of critical components and structures within the second law of thermodynamics. The approach relies on the fundamentals of irreversible thermodynamics, specifically the concept of entropy generation as an index of degradation and damage in materials. All damage mechanisms share a common feature, namely energy dissipation. Dissipation, a fundamental measure for irreversibility in a thermodynamic treatment of non-equilibrium processes, is quantified by entropy generation. An entropic-based damage approach to reliability and integrity characterization is presented and supported by experimental validation. Using this theorem, which relates entropy generation to dissipative phenomena, the corrosion fatigue entropy generation function is derived, evaluated, and employed for structural integrity and reliability assessment of aluminum 7075-T651 specimens.

  10. A fast and reliable overset unstructured grids approach

    Zhong-Liang Kang; Chao Yan; Jian Yu; Yuan-Yuan Fang


    A cell-centred overset unstructured grids approach is developed. In this approach, the intergrid boundary is initially established based on the wall distance from the cell centre, and is then optimized. To accelerate the intergrid-boundary definition, a neighbor-to-neighbor donor search algorithm based on the advancing-front method is modified with the help of minimum cuboid boxes. To simplify the communications between different grid cell types and to obtain second-order spatial accuracy, a new interpolation method is constructed based on linear reconstruction, which employs only one layer of fringe cells along the intergrid boundary. For unsteady flows with relative motion, the intergrid boundary can be redefined quickly and automatically. Several numerical results show that the present dynamic overset unstructured grids approach is accurate and reliable.

  11. Reliability of single sample experimental designs: comfortable effort level.

    Brown, W S; Morris, R J; DeGroot, T; Murry, T


    This study was designed to ascertain intrasubject variability across multiple recording sessions, which is most often disregarded in reports of group mean data or unavailable because of single-sample experimental designs. Intrasubject variability was assessed within and across several experimental sessions from measures of speaking fundamental frequency, vocal intensity, and reading rate. Three age groups of men and women (young, middle-aged, and elderly) repeated the vowel /a/, read a standard passage, and spoke extemporaneously during each experimental session. Statistical analyses were performed to assess each speaker's variability from his or her own mean, and that which consistently varied for any one speaking sample type, both within and across days. Results indicated that intrasubject variability was minimal, with approximately 4% of the data exhibiting significant variation across experimental sessions.

  12. Reliability assessment using degradation models: bayesian and classical approaches

    Marta Afonso Freitas


    Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work, with characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheel degradation.
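
One of the classical routes mentioned above, the approximate degradation analysis, can be sketched: fit each unit's degradation path, extrapolate to a failure threshold to get pseudo-failure times, then summarize those times. The linear paths, threshold, and rates below are simulated assumptions (train wheels are only the motivating context, not the source of these numbers).

```python
import numpy as np

rng = np.random.default_rng(5)
n_units, threshold = 8, 2.0               # fail when wear reaches 2.0 mm (assumed)
t = np.arange(0, 110, 10, dtype=float)    # inspection times

# Simulated linear degradation paths: wear = unit rate * t + measurement noise
rates = rng.normal(0.015, 0.003, n_units)
paths = rates[:, None] * t + rng.normal(0, 0.02, (n_units, t.size))

# Classical "approximate" analysis: fit each unit's path, extrapolate to the
# threshold to get a pseudo-failure time, then summarize those times.
pseudo_failures = []
for y in paths:
    slope, intercept = np.polyfit(t, y, 1)
    pseudo_failures.append((threshold - intercept) / slope)
pseudo_failures = np.array(pseudo_failures)
print(pseudo_failures.mean())
```

The pseudo-failure times can then be fed into an ordinary lifetime analysis, which is how degradation monitoring yields reliability estimates without waiting for actual failures.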


    G. W. Parry; J.A Forester; V.N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing


    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA), based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), the development of the associated timeline to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  14. Reliability Approach for Machine Protection Design in Particle Accelerators

    Apollonio, A; Mikulec, B; Puccio, B; Sanchez Alvarez, J L; Schmidt, R; Wagner, S


    Particle accelerators require Machine Protection Systems (MPS) to prevent beam-induced damage of equipment in case of failures. This becomes increasingly important for proton colliders with large energy stored in the beam such as LHC, for high power accelerators with a beam power of up to 10 MW, such as the European Spallation Source (ESS), and for linear colliders with high beam power and very small beam size. The reliability of Machine Protection Systems is crucial for safe machine operation; all possible sources of risk need to be taken into account in the early design stage. This paper presents a systematic approach to classify failures and to assess the associated risk, and discusses the impact of such considerations on the design of Machine Protection Systems. The application of this approach will be illustrated using the new design of the MPS for LINAC4, a linear accelerator under construction at CERN.

  15. Bayesian Approach for Reliability Assessment of Sunshield Deployment on JWST

    Kaminskiy, Mark P.; Evans, John W.; Gallo, Luis D.


    Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications, for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a Bayesian approach for reliability estimation of spacecraft deployment was developed for this purpose. This approach was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the observatory's telescope and science instruments. In order to collect the prior information on deployable systems, detailed studies of "heritage information" were conducted, extending over 45 years of spacecraft launches. The NASA Goddard Space Flight Center (GSFC) Spacecraft Operational Anomaly and Reporting System (SOARS) data were then used to estimate the parameters of the conjugate beta prior distribution for anomaly and failure occurrence, as the most consistent set of available data that could be matched to launch histories. This allows for an empirical Bayesian prediction for the risk of an anomaly occurrence of the complex Sunshield deployment, with credibility limits, using prior deployment data and test information.
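    The conjugate update described in the abstract can be sketched as follows: a Beta prior (standing in for the SOARS-derived prior) is updated with hypothetical binomial deployment data, and a crude grid quantile gives an upper credible bound. All counts and parameter values are invented for illustration.

```python
from math import exp, lgamma, log

def beta_posterior(a0, b0, anomalies, trials):
    """Conjugate Beta-binomial update of the anomaly probability."""
    return a0 + anomalies, b0 + (trials - anomalies)

def beta_pdf(x, a, b):
    logc = lgamma(a + b) - lgamma(a) - lgamma(b)
    return exp(logc + (a - 1) * log(x) + (b - 1) * log(1 - x))

def upper_credible_bound(a, b, level=0.95, steps=200_000):
    """Grid-based quantile of the Beta posterior (crude but dependency-free)."""
    cdf, dx = 0.0, 1.0 / steps
    for i in range(1, steps):
        cdf += beta_pdf(i * dx, a, b) * dx
        if cdf >= level:
            return i * dx
    return 1.0

a0, b0 = 2.0, 38.0                    # hypothetical prior, mean 0.05
a, b = beta_posterior(a0, b0, anomalies=2, trials=20)
posterior_mean = a / (a + b)          # pulled toward the observed rate 0.10
bound_95 = upper_credible_bound(a, b)
```

A production analysis would use a proper quantile routine (e.g. `scipy.stats.beta.ppf`) instead of the grid sum, but the conjugate arithmetic is the same.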

  16. Unlocking water markets: an experimental approach

    Cook, J.; Rabotyagov, S.


    Water markets are frequently referred to as a promising approach to alleviate stress on water systems, especially as future hydrologic assessments suggest increasing demand and less reliable supply. Yet, despite decades of advocacy by water resource economists, water markets (leases and sales of water rights between willing buyers and sellers) have largely failed to develop in the western US. Although there are a number of explanations for this failure, we explore one potential reason that has received less attention: farmers as sellers may have preferences for different elements of a water market transaction that are not captured in the relative comparison of their profits from farming and their profits from agreeing to a deal. We test this explanation by recruiting irrigators with senior water rights in the upper Yakima River Basin in Washington state to participate in a series of experimental auctions. In concept, the Yakima Basin is well situated for water market transactions, as it has significant water shortages for junior water users in ~15% of years, and projections show these are likely to increase in the future. Participants were asked a series of questions about the operation of a hypothetical 100-acre timothy hay farm, including the type of buyer, how the water bank is managed, the lease type, and the offer price. Results from 7 sessions with irrigators (n=49) and a comparison group of undergraduates (n=38) show that irrigators are more likely to accept split-season than full-season leases (controlling for differences in farm profits), more likely to accept a lease from an irrigation district, and less likely to accept an offer from a developer. Most notably, we find farmers were far more likely than students to reject offers from buyers even though accepting would have increased their winnings from the experiment. These results could be used in ongoing water supply policy debates in the Yakima Basin to simulate the amount of water that could be freed by water…

  17. Increasing phytoremediation efficiency and reliability using novel omics approaches.

    Bell, Terrence H; Joly, Simon; Pitre, Frédéric E; Yergeau, Etienne


    Phytoremediation is a cost-effective green alternative to traditional soil remediation technologies, but has experienced varied success in practice. The recent omics revolution has led to leaps in our understanding of soil microbial communities and plant metabolism, and some of the conditions that promote predictable activity in contaminated soils and heterogeneous environments. Combinations of omics tools and new bioinformatics approaches will allow us to understand integrated activity patterns between plants and microbes, and determine how this metaorganism can be modified to maximize growth, appropriate assembly of microbial communities, and, ultimately, phytoremediation activity. Here we provide an overview of how new omics-mediated discoveries can potentially be translated into an effective and reliable environmental technology.

  18. Experimental design research approaches, perspectives, applications

    Stanković, Tino; Štorga, Mario


    This book presents a new, multidisciplinary perspective on and paradigm for integrative experimental design research. It addresses various perspectives on methods, analysis and overall research approach, and how they can be synthesized to advance understanding of design. It explores the foundations of experimental approaches and their utility in this domain, and brings together analytical approaches to promote an integrated understanding. The book also investigates where these approaches lead and how they link design research more fully with other disciplines (e.g. psychology, cognition, sociology, computer science, management). Above all, the book emphasizes the integrative nature of design research in terms of the methods, theories, and units of study—from the individual to the organizational level. Although this approach offers many advantages, it has inherently led to a situation in current research practice where methods are diverging and integration between individual, team and organizational under…

  19. Design and experimentation of an empirical multistructure framework for accurate, sharp and reliable hydrological ensembles

    Seiller, G.; Anctil, F.; Roy, R.


    This paper outlines the design and experimentation of an Empirical Multistructure Framework (EMF) for lumped conceptual hydrological modeling. This concept is inspired by modular frameworks, empirical model development, and multimodel applications, and encompasses the overproduce-and-select paradigm. The EMF concept aims to reduce subjectivity in conceptual hydrological modeling practice and includes model selection in the optimisation steps, reducing initial assumptions on the prior perception of the dominant rainfall-runoff transformation processes. EMF generates thousands of new modeling options from, for now, twelve parent models that share their functional components and parameters. Optimisation resorts to ensemble calibration, ranking and selection of individual child time series based on optimal bias and reliability trade-offs, as well as accuracy and sharpness improvement of the ensemble. Results on 37 snow-dominated Canadian catchments and 20 climatically diversified American catchments reveal the excellent potential of the EMF in generating new individual model alternatives, with high respective performance values, that may be pooled efficiently into ensembles of seven to sixty constitutive members, with low bias and high accuracy, sharpness, and reliability. A group of 1446 new models is highlighted to offer good potential on other catchments or applications, based on their individual and collective interests. An analysis of the preferred functional components reveals the importance of the production and total flow elements. Overall, results from this research confirm the added value of ensemble and flexible approaches for hydrological applications, especially in uncertain contexts, and open up new modeling possibilities.

  20. Experimental Approaches to Studying Biological Electron Transfer.

    Scott, Robert A.; And Others


    Provides an overview on biological electron-transfer reactions, summarizing what is known about how distance, spatial organization, medium, and other factors affect electron transfer. Experimental approaches, including studies of bimolecular electron transfer reactions (electrostatic effects and precursor complexes), are considered. (JN)

  1. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)


    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  2. Wind turbine reliability : a database and analysis approach.

    Linsday, James (ARES Corporation); Briand, Daniel; Hill, Roger Ray; Stinebaugh, Jennifer A.; Benjamin, Allan S. (ARES Corporation)


    The U.S. wind industry has experienced remarkable growth since the turn of the century. At the same time, the physical size and electrical generation capabilities of wind turbines have also grown remarkably. As the market continues to expand, and as wind generation continues to gain a significant share of the generation portfolio, the reliability of wind turbine technology becomes increasingly important. This report addresses how operations and maintenance costs are related to unreliability, that is, the failures experienced by systems and components. Reliability tools are demonstrated, the data needed to understand and catalog failure events are described, and practical wind turbine reliability models are illustrated, including preliminary results. This report also presents a continuing process for controlling industry requirements, needs, and expectations related to Reliability, Availability, Maintainability, and Safety. A simply stated goal of this process is to better understand and improve the operational reliability of wind turbine installations.

  3. Toward a Cooperative Experimental System Development Approach


    This chapter represents a step towards the establishment of a new system development approach, called Cooperative Experimental System Development (CESD). CESD seeks to overcome a number of limitations in existing approaches: specification oriented methods usually assume that system design can be based solely on observation and detached reflection; prototyping methods often have a narrow focus on the technical construction of various kinds of prototypes; Participatory Design techniques—including the Scandinavian Cooperative Design (CD) approaches—seldom go beyond the early analysis… CESD is, however, not limited to this development context; it may be applied to in-house or contract development as well. In system development, particularly in cooperative and experimental system development, we argue that it is necessary to analytically separate the abstract concerns, e.g. analysis, design…

  4. A Statistical Testing Approach for Quantifying Software Reliability; Application to an Example System

    Chu, Tsong-Lun [Brookhaven National Lab. (BNL), Upton, NY (United States); Varuttamaseni, Athi [Brookhaven National Lab. (BNL), Upton, NY (United States); Baek, Joo-Seok [Brookhaven National Lab. (BNL), Upton, NY (United States)


    The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I and C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).

  5. Experimental Finite Element Approach for Stress Analysis

    Ahmet Erklig


    This study aims to determine strain gauge locations in problems of stress concentration, and it includes both experimental and numerical results. Strain gauges were positioned on the beam and blocks at locations corresponding to nodes of the finite element models. Linear and nonlinear cases were studied: a cantilever beam problem was selected as the linear case to validate the approach, and a conforming contact problem was selected as the nonlinear case. An identical mesh structure was prepared for the finite element and the experimental models. The finite element analysis was carried out with ANSYS. It was shown that the results of the experimental and the numerical studies were in good agreement.

  6. Bayesian approach in the power electric systems study of reliability ...

    During the applications change to all fields of engineering, the discipline has, over the years, … Characterization of the failure rate as a random variable … the reliability performance is defined as the conditional probability that the system …

  7. Monte Carlo simulation - a powerful tool to support experimental activities in structure reliability

    Yuritzinn, T. [CEA Saclay, Dept. de Mecanique et de Technologie (DRN/DMT/SEMT/LISN), 91 - Gif-sur-Yvette (France); Chapuliot, S. [CEA Saclay, Dept. Modelisation de Systemes et Structures (DM2S/SEMT), 91 - Gif sur Yvette (France); Eid, M. [CEA Saclay, Dept. de Mecanique et de Technologie (DRN/DMT/SERMA/LCA), 91 - Gif-sur-Yvette (France); Masson, R.; Dahl, A.; Moinereau, D. [Electricite de France (EDF), 75 - Paris (France)


    Monte-Carlo Simulation (MCS) can have different uses in supporting structure reliability investigations and assessments. In this paper we focus our interest on the use of MCS as a numerical tool to support the fitting of the experimental data related to toughness experiments. (authors)
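    A minimal sketch of how MCS can support such fitting work: sample toughness values from an assumed Weibull fit to the experimental data and estimate the probability that toughness falls below an applied stress intensity. The distribution parameters and the failure criterion below are illustrative assumptions, not the authors' model.

```python
import math
import random

def sample_weibull(shape, scale, rng):
    """Inverse-CDF sample of a Weibull-distributed toughness value."""
    return scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)

def failure_probability(k_applied, shape, scale, n=200_000, seed=42):
    """Monte Carlo estimate of P(toughness < applied stress intensity)."""
    rng = random.Random(seed)
    hits = sum(sample_weibull(shape, scale, rng) < k_applied for _ in range(n))
    return hits / n

# Hypothetical Weibull fit to toughness data and a hypothetical load level.
p_fail = failure_probability(k_applied=60.0, shape=4.0, scale=100.0)
# Closed-form check for this simple case: 1 - exp(-(60/100)**4)
```

The value of MCS shows once the failure criterion stops being a single comparison (random loads, correlated flaw sizes, temperature shifts) and the closed form disappears.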

  8. Monitoring Software Reliability using Statistical Process Control: An MMLE Approach

    Bandla Sreenivasa Rao


    This paper considers an MMLE (Modified Maximum Likelihood Estimation) based scheme to estimate software reliability using the exponential distribution. The MMLE is one of the generalized frameworks of software reliability models of Non-Homogeneous Poisson Processes (NHPPs). The MMLE gives analytical estimators rather than an iterative approximation to estimate the parameters. In this paper we propose an SPC (Statistical Process Control) chart mechanism to determine software quality using inter-failure times data. The control charts can be used to measure whether the software process is statistically under control or not.
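    The control-chart mechanism for exponentially distributed inter-failure times can be sketched as follows; because the distribution is heavily skewed, the usual 3-sigma limits are replaced by the 0.135% and 99.865% probability limits. The data are invented for illustration.

```python
import math

def exponential_control_limits(times, p_lo=0.00135, p_hi=0.99865):
    """Probability-based control limits for exponential inter-failure times."""
    theta = sum(times) / len(times)       # MLE of the mean time between failures
    lcl = -theta * math.log(1.0 - p_lo)   # unusually short gap: possible decay
    ucl = -theta * math.log(1.0 - p_hi)   # unusually long gap: reliability growth
    return lcl, ucl

inter_failure = [30, 55, 12, 80, 41, 66, 23, 95, 37, 61]   # hypothetical hours
lcl, ucl = exponential_control_limits(inter_failure)
out_of_control = [t for t in inter_failure if t < lcl or t > ucl]
```

Points below the lower limit signal failures arriving faster than the fitted process predicts; points above it are evidence of reliability growth.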

  9. A Numerical Simulation Approach for Reliability Evaluation of CFRP Composite

    Liu, D. S.-C.; Jenab, K.


    Due to the superior mechanical properties of carbon fiber reinforced plastic (CFRP) materials, they are widely used in industries such as aircraft manufacturing. Aircraft manufacturers are switching from metal to composite structures while studying the reliability (R-value) of CFRP. In this study, a numerical simulation method is proposed to determine the reliability of Multiaxial Warp Knitted (MWK) textiles used to make CFRP composites. This method analyzes the distribution of carbon fiber angle misalignments, from a chosen 0° direction, caused by the sewing process of the textile, and finds the R-value, a value between 0 and 1. The application of this method is demonstrated by an illustrative example.

  10. Involving students in experimental design: three approaches.

    McNeal, A P; Silverthorn, D U; Stratton, D B


    Many faculty want to involve students more actively in laboratories and in experimental design. However, just "turning them loose in the lab" is time-consuming and can be frustrating for both students and faculty. We describe three different ways of providing structures for labs that require students to design their own experiments but guide the choices. One approach emphasizes invertebrate preparations and classic techniques that students can learn fairly easily. Students must read relevant primary literature and learn each technique in one week, and then design and carry out their own experiments in the next week. Another approach provides a "design framework" for the experiments so that all students are using the same technique and the same statistical comparisons, whereas their experimental questions differ widely. The third approach involves assigning the questions or problems but challenging students to design good protocols to answer these questions. In each case, there is a mixture of structure and freedom that works for the level of the students, the resources available, and our particular aims.


    A.C. Rooney



    ENGLISH ABSTRACT: This paper proposes a reliability management process for the development of complex electromechanical systems. Specific emphasis is placed on the development of these systems in an environment of limited development resources and where small production quantities are envisaged.
    The results of this research provide a management strategy for reliability engineering activities within a systems engineering environment, where concurrent engineering techniques are used to reduce development cycles and costs.


  12. The Process Group Approach to Reliable Distributed Computing


    …system, but could make it harder to administer and less reliable. A theme of the paper will be that one overcomes this intrinsic problem by standardizing…

  13. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    Chen, B. H.; Micheletti, M.; Baganz, F.;


    Reliable models of enzyme kinetics are required for the effective design of bioconversion processes. Kinetic expressions of the enzyme-catalysed reaction rate, however, are frequently complex, and establishing accurate values of kinetic parameters normally requires a large number of experiments… The approach incorporates a model-driven experimental design that minimises the number of experiments to be performed while still generating accurate values of kinetic parameters, and has been illustrated with the transketolase-mediated asymmetric synthesis of L… In comparison with conventional methodology, the modelling approach enabled a nearly 4-fold decrease in the number of experiments, while the microwell experimentation enabled a 45-fold decrease in material requirements and a significant increase in experimental throughput.

  14. Creation of reliable relevance judgments in information retrieval systems evaluation experimentation through crowdsourcing: a review.

    Samimi, Parnia; Ravana, Sri Devi


    Test collections are used to evaluate information retrieval systems in laboratory-based evaluation experimentation. In a classic setting, generating relevance judgments involves human assessors and is a costly and time-consuming task. Researchers and practitioners are still being challenged to perform reliable and low-cost evaluation of retrieval systems. Crowdsourcing as a novel method of data acquisition is broadly used in many research fields. It has been proven that crowdsourcing is an inexpensive and quick solution, as well as a reliable alternative, for creating relevance judgments. One of the crowdsourcing applications in IR is judging the relevance of query-document pairs. In order to have a successful crowdsourcing experiment, the relevance judgment tasks should be designed precisely to emphasize quality control. This paper explores different factors that influence the accuracy of relevance judgments accomplished by workers and how to improve the reliability of judgments in crowdsourcing experiments.

  15. Identification of Black Spots Based on Reliability Approach

    Ahmadreza Ghaffari


    Identifying crash "black spots", "hot spots" or "high-risk" locations is one of the most important and prevalent concerns in traffic safety, and various methods have been devised and presented for solving this issue until now. In this paper, a new method based on reliability analysis is presented to identify black spots. Reliability analysis has an ordered framework for considering the probabilistic nature of engineering problems, so crashes, with their probabilistic nature, can be treated within it. In this study, the application of this new method was compared with the commonly implemented Frequency and Empirical Bayesian methods using simulated data. The results indicated that the traditional methods can lead to inconsistent predictions because they do not consider the variance of the number of crashes at each site and depend only on the mean of the data.
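    For contrast, the Empirical Bayesian screening that the paper compares against can be sketched in a few lines: each site's observed crash count is shrunk toward the mean predicted by a safety performance function (SPF), with a weight governed by the negative-binomial overdispersion parameter. All numbers below are hypothetical.

```python
def eb_estimate(observed, predicted, overdispersion):
    """Empirical Bayes expected crash count: shrink the observed count toward
    the SPF prediction; a larger overdispersion parameter trusts the SPF more."""
    w = 1.0 / (1.0 + predicted / overdispersion)
    return w * predicted + (1.0 - w) * observed

sites = [        # (observed crashes, SPF-predicted crashes), hypothetical
    (12, 5.0),
    (3, 4.0),
    (9, 8.5),
]
PHI = 2.0        # hypothetical negative-binomial overdispersion parameter

ranked = sorted(sites, key=lambda s: eb_estimate(s[0], s[1], PHI), reverse=True)
```

Black-spot lists are then read off the top of `ranked`; the shrinkage is what damps the regression-to-the-mean effect that plagues raw frequency ranking.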

  16. An experimental approach to submarine canyon evolution

    Lai, Steven Y. J.; Gerber, Thomas P.; Amblas, David


    We present results from a sandbox experiment designed to investigate how sediment gravity flows form and shape submarine canyons. In the experiment, unconfined saline gravity flows were released onto an inclined sand bed bounded on the downstream end by a movable floor that was used to increase relief during the experiment. In areas unaffected by the flows, we observed featureless, angle-of-repose submarine slopes formed by retrogressive breaching processes. In contrast, areas influenced by gravity flows cascading across the shelf break were deeply incised by submarine canyons with well-developed channel networks. Normalized canyon long profiles extracted from successive high-resolution digital elevation models collapse to a single profile when referenced to the migrating shelf-slope break, indicating self-similar growth in the relief defined by the canyon and intercanyon profiles. Although our experimental approach is simple, the resulting canyon morphology and behavior appear similar in several important respects to that observed in the field.

  17. A damage mechanics based approach to structural deterioration and reliability

    Bhattcharya, B.; Ellingwood, B. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Civil Engineering


    Structural deterioration often occurs without perceptible manifestation. Continuum damage mechanics defines structural damage in terms of the material microstructure, and relates the damage variable to the macroscopic strength or stiffness of the structure. This enables one to predict the state of damage prior to the initiation of a macroscopic flaw, and allows one to estimate residual strength/service life of an existing structure. The accumulation of damage is a dissipative process that is governed by the laws of thermodynamics. Partial differential equations for damage growth in terms of the Helmholtz free energy are derived from fundamental thermodynamical conditions. Closed-form solutions to the equations are obtained under uniaxial loading for ductile deformation damage as a function of plastic strain, for creep damage as a function of time, and for fatigue damage as function of number of cycles. The proposed damage growth model is extended into the stochastic domain by considering fluctuations in the free energy, and closed-form solutions of the resulting stochastic differential equation are obtained in each of the three cases mentioned above. A reliability analysis of a ring-stiffened cylindrical steel shell subjected to corrosion, accidental pressure, and temperature is performed.

  18. A heuristic-based approach for reliability importance assessment of energy producers

    Akhavein, A., E-mail: [Department of Engineering, Science and Research Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Fotuhi Firuzabad, M., E-mail: fotuhi@sharif.ed [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran, Islamic Republic of)


    Reliability of energy supply is one of the most important issues of service quality. On one hand, customers usually have different expectations for service reliability and price. On the other hand, providing different levels of reliability at load points is a challenge for system operators. In order to take reasonable decisions and overcome difficulties in implementing reliability, market players need to know the impacts of their assets on system and load-point reliabilities. One tool to specify the reliability impacts of assets is the criticality or reliability importance measure, by which system components can be ranked based on their effect on reliability. Conventional methods for determining reliability importance are essentially based on risk sensitivity analysis and hence impose a prohibitive calculation burden in large power systems. An approach is proposed in this paper to determine the reliability importance of energy producers from the perspective of consumers or distribution companies in a composite generation and transmission system. In the presented method, while avoiding immense computational burden, the energy producers are ranked based on their rating, unavailability and impact on power flows in the lines connecting to the considered load points. Study results on the IEEE reliability test system show successful application of the proposed method. Research highlights: Required reliability level at load points is a concern in modern power systems. It is important to assess the reliability importance of energy producers or generators. Generators can be ranked based on their impacts on power flow to a selected area. Ranking of generators is an efficient tool to assess their reliability importance.


    LIU Deshun; YUE Wenhui; ZHU Pingyu; DU Xiaoping


    Conventional reliability-based design optimization (RBDO) requires the most probable point (MPP) method for probabilistic analysis of the reliability constraints. A new approach, called the minimum error point (MEP) method or the MEP-based method, is presented for reliability-based design optimization; its idea is to minimize the error produced by approximating performance functions. The MEP-based method uses the first-order Taylor expansion at the MEP instead of the MPP. Examples demonstrate that MEP-based design optimization can ensure product reliability at the required level, which is imperative for many important engineering systems. The MEP-based reliability design optimization method is feasible and is considered an alternative for solving reliability design optimization problems. The MEP-based method is more robust than the commonly used MPP-based method for some irregular performance functions.

  20. Experimental Research of Reliability of Plant Stress State Detection by Laser-Induced Fluorescence Method

    Yury Fedotov


    Experimental laboratory investigations of the laser-induced fluorescence spectra of watercress and lawn grass were conducted. The fluorescence spectra were excited by an Nd:YAG laser emitting at 532 nm. It was established that the influence of stress caused by mechanical damage, overwatering, and soil pollution is manifested in changes of the spectral shapes. The mean values and confidence intervals for the ratio of the two fluorescence maxima near 685 and 740 nm were estimated. The results indicate that the fluorescence ratio can be considered a reliable characteristic of plant stress state.
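The reported statistic, the ratio of the two fluorescence maxima with a mean and confidence interval, is straightforward to reproduce; the intensity values below are invented for illustration:

```python
import statistics

def fluorescence_ratio(i_685, i_740):
    """Ratio of fluorescence maxima near 685 nm and 740 nm."""
    return i_685 / i_740

def mean_and_ci(ratios, z=1.96):
    """Sample mean with an approximate 95% normal-theory confidence interval."""
    m = statistics.mean(ratios)
    half = z * statistics.stdev(ratios) / len(ratios) ** 0.5
    return m, (m - half, m + half)

# Invented intensity pairs (685 nm, 740 nm) for a set of leaves:
pairs = [(1.10, 1.00), (1.25, 1.04), (1.18, 0.98), (1.30, 1.05), (1.15, 1.02)]
ratios = [fluorescence_ratio(a, b) for a, b in pairs]
mean_ratio, (lo, hi) = mean_and_ci(ratios)
```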

  1. Intracerebroventricular administration of streptozotocin as an experimental approach to Alzheimer's disease.

    Kalafatakis, Konstantinos; Zarros, Apostolos


    The in vivo experimental simulation of Alzheimer's disease (AD) has been a field of paramount importance for Experimental Medicine and Neuroscience for more than 20 years. We herein provide a short overview of an experimental approach to sporadic AD that is based on the insulin-resistant state induced in the brains of animals following the intracerebroventricular (icv) administration of streptozotocin (STZ) at low doses. The icv administration of STZ is considered an established, standardized and reproducible approach to sporadic AD, central aspects of the pathology of which it can reliably simulate.

  2. Extending Failure Modes and Effects Analysis Approach for Reliability Analysis at the Software Architecture Design Level

    Sozer, Hasan; Tekinerdogan, Bedir; Aksit, Mehmet; Lemos, de Rogerio; Gacek, Cristina


    Several reliability engineering approaches have been proposed to identify and recover from failures. A well-known and mature approach is the Failure Mode and Effect Analysis (FMEA) method that is usually utilized together with Fault Tree Analysis (FTA) to analyze and diagnose the causes of failures.


    Muradian, L. A.


    Purpose. The article aims to develop an algorithm and a sequence for describing and determining car reliability in order to predict certain quantitative indicators of the studied elements, parts and units, or the car as a whole, on the basis of a probabilistic-physical approach. Methodology. For the calculation of the indicators of reliability, durability and safety of cars, the probabilistic-physical method was used, which takes into account the resource consumption inevitable in the operation of car...

  4. Experimental results of fingerprint comparison validity and reliability: A review and critical analysis.

    Haber, Ralph Norman; Haber, Lyn


    Our purpose in this article is to determine whether the results of the published experiments on the accuracy and reliability of fingerprint comparison can be generalized to fingerprint laboratory casework, and/or to document the error rate of the Analysis-Comparison-Evaluation (ACE) method. We review the existing 13 published experiments on fingerprint comparison accuracy and reliability. These studies comprise the entire corpus of experimental research published on the accuracy of fingerprint comparisons since criminal courts first admitted forensic fingerprint evidence about 120 years ago. We start with the two studies by Ulery, Hicklin, Buscaglia and Roberts (2011, 2012), because they are recent, large, and designed specifically to provide estimates of the accuracy and reliability of fingerprint comparisons and to respond to the criticisms cited in the National Academy of Sciences Report (2009). Following the two Ulery et al. studies, we review and evaluate the other eleven experiments, considering problems that are unique to each. We then evaluate the 13 experiments for the problems common to all or most of them, especially with respect to the generalizability of their results to laboratory casework. Overall, we conclude that the experimental designs employed deviated from casework procedures in critical ways that preclude generalization of the results to casework. The experiments asked examiner-subjects to carry out their comparisons using different responses from those employed in casework; the experiments presented the comparisons in formats that differed from casework; the experiments enlisted highly trained examiners as experimental subjects rather than subjects drawn randomly from among all fingerprint examiners; the experiments did not use fingerprint test items known to be comparable in type and especially in difficulty to those encountered in casework; and the experiments did not require examiners to use the ACE method, nor was that method defined.

  5. Experimental and numerical investigations on reliability of air barrier on oil containment in flowing water.

    Lu, Jinshu; Xu, Zhenfeng; Xu, Song; Xie, Sensen; Wu, Haoxiao; Yang, Zhenbo; Liu, Xueqiang


    Air barriers have recently been developed and employed as a new type of oil containment boom. This paper presents systematic investigations of the reliability of air barriers for oil containment in flowing water, which represents the commonly seen shearing current in reality, using both laboratory experiments and numerical simulations. Both the numerical and experimental investigations are carried out at model scale. In the investigations, a submerged pipe with apertures is installed near the bottom of a tank to generate the air bubbles forming the air curtain, and the shearing water flow is introduced through a narrow inlet near the mean free surface. The effects of the aperture configuration (including the size and spacing of the apertures) and the location of the pipe on the effectiveness of the air barrier in preventing oil spreading are discussed in detail, considering different air discharges and velocities of the flowing water. The research outcome provides a foundation for evaluating and/or improving the reliability of an air barrier in preventing spilled oil from spreading further.

  6. Experimental investigation of the reliability issue of RRAM based on high resistance state conduction.

    Zhang, Lijie; Hsu, Yen-Ya; Chen, Frederick T; Lee, Heng-Yuan; Chen, Yu-Sheng; Chen, Wei-Su; Gu, Pei-Yi; Liu, Wen-Hsing; Wang, Shun-Min; Tsai, Chen-Han; Huang, Ru; Tsai, Ming-Jinn


    In this paper, reliability issues of robust HfO(x)-based RRAM are experimentally investigated in terms of cycling ageing, temperature impact and voltage acceleration. All reliability issues can be estimated from the conduction of the high resistance state (HRS). The conduction current of the HRS increases exponentially with the square root of the applied voltage, which is well explained by 'quasi-Poole-Frenkel-type' trap-assisted tunneling. Further experiments on HRS conduction at different temperatures show that the depth of the potential well of the trap in the HfO(x) film is about 0.31 eV. The degradation induced by cycling ageing is possibly ascribed to the increase in the amount of oxygen ions in the TiO(x) layer of the TiN/TiO(x)/HfO(x)/TiN device. The retention times at various stress voltages and temperatures also exhibit an exponential relationship to the square root of the applied voltage, indicating that stress current plays a dominant role in the degradation of the HRS. An oxygen-release model is proposed to explain the relationship of retention time to HRS conduction current.
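The reported exponential dependence of HRS current on √V can be checked with an ordinary least-squares fit of ln I against √V; the model constants and data below are synthetic, not measurements from the paper:

```python
import math

def fit_sqrt_v(voltages, currents):
    """Least-squares fit of ln(I) = a + b*sqrt(V), the 'quasi-Poole-Frenkel'
    form reported for HRS conduction. Returns (a, b)."""
    xs = [math.sqrt(v) for v in voltages]
    ys = [math.log(i) for i in currents]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Synthetic I-V data generated from ln(I) = -20 + 3*sqrt(V):
volts = [0.1, 0.2, 0.4, 0.8, 1.6]
amps = [math.exp(-20 + 3 * math.sqrt(v)) for v in volts]
a, b = fit_sqrt_v(volts, amps)
```

A straight line of ln I versus √V (rather than versus V) is the quick diagnostic for this conduction type.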

  7. "A Comparison of Consensus, Consistency, and Measurement Approaches to Estimating Interrater Reliability"

    Steven E. Stemler


    This article argues that the general practice of describing interrater reliability as a single, unified concept is at best imprecise and at worst potentially misleading. Rather than representing a single concept, the different statistical methods for computing interrater reliability can be more accurately classified into one of three categories based upon the underlying goals of the analysis. The three general categories introduced and described in this paper are: (1) consensus estimates, (2) consistency estimates, and (3) measurement estimates. The assumptions, interpretation, advantages, and disadvantages of estimates from each of these three categories are discussed, along with several popular methods of computing interrater reliability coefficients that fall under the umbrella of consensus, consistency, and measurement estimates. Researchers and practitioners should be aware that different approaches to estimating interrater reliability carry with them different implications for how ratings across multiple judges should be summarized, which may impact the validity of subsequent study results.
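The contrast between consensus and consistency estimates can be made concrete with two small helpers: a rater who scores every item exactly one point higher shows zero exact agreement (consensus) yet perfect correlation (consistency). The ratings are invented:

```python
def percent_agreement(r1, r2):
    """Consensus estimate: fraction of items on which two raters agree exactly."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def pearson_r(r1, r2):
    """Consistency estimate: do the raters order the items the same way,
    even if their absolute scores differ?"""
    n = len(r1)
    m1, m2 = sum(r1) / n, sum(r2) / n
    cov = sum((a - m1) * (b - m2) for a, b in zip(r1, r2))
    s1 = sum((a - m1) ** 2 for a in r1) ** 0.5
    s2 = sum((b - m2) ** 2 for b in r2) ** 0.5
    return cov / (s1 * s2)

judge_a = [1, 2, 3, 4, 5]
judge_b = [2, 3, 4, 5, 6]  # systematically one point higher
```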


    Ahmad A. Moreb


    The reliability allocation problem is commonly treated using a closed-form expression relating cost to reliability. A recent approach introduced a discrete integer technique for unrepairable systems. This research addresses the allocation problem for repairable systems. It presents an integer formulation for finding the optimum selection of components based on the integer values of their Mean Time To Failure (MTTF) and Mean Time To Repair (MTTR). The objective is to minimize the total cost under a system reliability constraint, in addition to other physical constraints. Although a closed-form expression relating cost to reliability may not be linear, in this formulation the objective function is always linear regardless of the shape of the equivalent continuous closed-form function. An example is solved using the proposed method and compared with the solution of the continuous closed-form version. Formulations for all possible system configurations, components and subsystems are also considered.
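A minimal brute-force stand-in for the integer selection idea: each repairable component has steady-state availability MTTF/(MTTF+MTTR), one option is chosen per series stage, and cost is minimized under an availability floor. The option tables and target are invented, and exhaustive search replaces the paper's integer-programming formulation:

```python
from itertools import product

def availability(mttf, mttr):
    """Steady-state availability of a repairable component."""
    return mttf / (mttf + mttr)

def cheapest_selection(stages, target):
    """Pick one (MTTF, MTTR, cost) option per series stage so the product
    of availabilities meets `target` at minimum total cost."""
    best = None
    for combo in product(*stages):
        avail, cost = 1.0, 0
        for mttf, mttr, c in combo:
            avail *= availability(mttf, mttr)
            cost += c
        if avail >= target and (best is None or cost < best[0]):
            best = (cost, combo, avail)
    return best

# Invented option tables: (MTTF h, MTTR h, cost) per stage.
stages = [
    [(1000, 10, 5), (2000, 10, 9)],
    [(500, 5, 4), (1500, 5, 7)],
]
best = cheapest_selection(stages, target=0.985)
```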

  9. Methodological Approach for Performing Human Reliability and Error Analysis in Railway Transportation System

    Fabio De Felice


    Today, billions of dollars are being spent annually worldwide to develop, manufacture, and operate transportation systems such as trains, ships, aircraft, and motor vehicles. Around 70 to 90 percent of transportation crashes are, directly or indirectly, the result of human error. In fact, with the development of technology, system reliability has increased dramatically during the past decades, while human reliability has remained unchanged over the same period. Accordingly, human error is now considered the most significant source of accidents or incidents in safety-critical systems. The aim of this paper is to propose a methodological approach to improve transportation system reliability, in particular for railway transportation. The methodology presented is based on Failure Modes, Effects and Criticality Analysis (FMECA) and Human Reliability Analysis (HRA).
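A common ranking step in FMECA-style analyses is the Risk Priority Number (RPN), the product of severity, occurrence, and detection scores. The failure modes and 1-10 scores below are hypothetical illustrations, not from the paper:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of 1-10 scores for severity,
    likelihood of occurrence, and difficulty of detection."""
    return severity * occurrence * detection

def rank_failure_modes(modes):
    """Sort (name, S, O, D) tuples from highest to lowest RPN."""
    return sorted(modes, key=lambda m: rpn(*m[1:]), reverse=True)

# Hypothetical railway failure modes with invented scores.
modes = [
    ("signal passed at danger", 9, 3, 4),  # RPN 108
    ("driver fatigue lapse", 7, 6, 7),     # RPN 294
    ("brake command ignored", 10, 2, 3),   # RPN 60
]
ranked = [name for name, *_ in rank_failure_modes(modes)]
```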

  10. A fast and reliable empirical approach for estimating solubility of crystalline drugs in polymers for hot melt extrusion formulations.

    Kyeremateng, Samuel O; Pudlas, Marieke; Woehrle, Gerd H


    A novel empirical analytical approach for estimating the solubility of crystalline drugs in polymers has been developed. The approach utilizes a combination of differential scanning calorimetry measurements and a reliable mathematical algorithm to construct a complete solubility curve of a drug in a polymer. Compared with existing methods, this novel approach reduces the required experimentation time and amount of material by approximately 80%. The predictive power and relevance of such solubility curves in the development of amorphous solid dispersion (ASD) formulations are shown by applications to a number of hot-melt extrudate formulations of ibuprofen and naproxen in Soluplus. On the basis of the temperature-drug load diagrams using the solubility curves and the glass transition temperatures, the physical stability of the extrudate formulations was predicted and checked by placing the formulations on real-time stability studies. An analysis of the stability samples with microscopy, thermal, and imaging techniques confirmed the predicted physical stability of the formulations. In conclusion, this study presents a fast and reliable approach for estimating the solubility of crystalline drugs in polymer matrices. This powerful approach can be applied by formulation scientists as an early and convenient tool in designing ASD formulations for maximum drug load and physical stability.

  11. Gas phase reactive collisions, experimental approach

    Canosa A.


    Since 1937, when the first molecule in space was identified, more than 150 molecules have been detected. Understanding the fate of these molecules requires a clear view of their photochemistry and reactivity with other partners. It is therefore crucial to identify the main processes that produce and destroy them. In this chapter, a general view of experimental techniques able to deliver gas-phase chemical kinetics data at low and very low temperatures is presented. These techniques apply to the study of reactions between neutral reactants on the one hand and reactions involving charged species on the other.

  12. Semiconductor laser engineering, reliability and diagnostics a practical approach to high power and single mode devices

    Epperlein, Peter W


    This reference book provides a fully integrated novel approach to the development of high-power, single-transverse mode, edge-emitting diode lasers by addressing the complementary topics of device engineering, reliability engineering and device diagnostics in the same book, and thus closes the gap in the current book literature. Diode laser fundamentals are discussed, followed by an elaborate discussion of problem-oriented design guidelines and techniques, and by a systematic treatment of the origins of laser degradation and a thorough exploration of the engineering means to enhance the optical strength of the laser. Stability criteria of critical laser characteristics and key laser robustness factors are discussed along with clear design considerations in the context of reliability engineering approaches and models, and typical programs for reliability tests and laser product qualifications. Novel, advanced diagnostic methods are reviewed to discuss, for the first time in detail in book literature, performa...

  13. Reliability Assessment of Fuel Cell System - A Framework for Quantitative Approach

    Lee, Shinae; Zhou, Dao; Wang, Huai


    such as component failures, the system architecture, and operational strategies. This paper suggests an approach that includes Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), and Reliability Block Diagram (RBD). For a case study, and the service lifetime of a commercial 5 kW Proton Exchange...
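The RBD step of such a framework reduces to series/parallel probability algebra. The two helpers below and the fuel-cell-like layout numbers are illustrative assumptions, not values from the study:

```python
def series(*reliabilities):
    """All blocks must work: multiply the block reliabilities."""
    r = 1.0
    for x in reliabilities:
        r *= x
    return r

def parallel(*reliabilities):
    """Any block may work: one minus the product of failure probabilities."""
    q = 1.0
    for x in reliabilities:
        q *= 1.0 - x
    return 1.0 - q

# Hypothetical layout: a stack in series with a redundant converter pair.
system_r = series(0.95, parallel(0.90, 0.90))
```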

  14. An Approach for Evaluating Reliability of Man-Machine System in Evolving Environment

    Wang Wuhong; Zhang Dianye; Cao Qi


    In this paper, the technique for human error rate prediction (THERP) is first presented to discuss its rationale and principal advantages and disadvantages. Then, based on an operator behaviour paradigm which can describe human characteristics, an approach is formulated in a mathematical way as a means of evaluating the effect of operator erroneous actions on the reliability of man-machine systems in a dynamically evolving environment.

  15. Flaw shape reconstruction – an experimental approach

    Marilena STANCULESCU


    Flaws can be classified as acceptable or unacceptable. As a result of nondestructive testing, one takes the Admit/Reject decision regarding the tested product against acceptability criteria. In order to take the right decision, one should know the shape and dimensions of the flaw. On the other hand, flaws considered acceptable develop over time, such that they can become unacceptable. In this case, knowledge of the shape and dimensions of the flaw allows the product lifetime to be determined. For interior flaw shape reconstruction the best procedure is the use of the difference static magnetic field. We have a stationary magnetic field problem, but we face the difficulty posed by nonlinear media. This paper presents the results of experimental work on control specimens with and without flaws.

  16. Experimental approach for optimizing dry fabric formability

    Allaoui, S; Wendling, A; Soulat, D; Chatel, S


    In order to understand the mechanisms involved in the forming step of LCM processes and provide validation data for numerical models, a specific experimental device has been designed in collaboration between the PRISME Institute and EADS. This tool also makes it possible to test the feasibility of obtaining specific double-curved shapes made of dry fabric reinforcement. It contains a mechanical module comprising the classical forming tools (punch, blank holder, and open die), and an optical module to measure the 3D deformed shape and the distribution of local deformations, such as shear angles of the woven reinforcement, throughout the process. The goal of this paper is to present the potential and the first results obtained with this device.

  17. Human brain mapping: Experimental and computational approaches

    Wood, C.C.; George, J.S.; Schmidt, D.M.; Aine, C.J. [Los Alamos National Lab., NM (US); Sanders, J. [Albuquerque VA Medical Center, NM (US); Belliveau, J. [Massachusetts General Hospital, Boston, MA (US)


    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project combined Los Alamos' and collaborators' strengths in noninvasive brain imaging and high performance computing to develop potential contributions to the multi-agency Human Brain Project led by the National Institute of Mental Health. The experimental component of the project emphasized the optimization of spatial and temporal resolution of functional brain imaging by combining: (a) structural MRI measurements of brain anatomy; (b) functional MRI measurements of blood flow and oxygenation; and (c) MEG measurements of time-resolved neuronal population currents. The computational component of the project emphasized development of a high-resolution 3-D volumetric model of the brain based on anatomical MRI, in which structural and functional information from multiple imaging modalities can be integrated into a single computational framework for modeling, visualization, and database representation.

  19. Assessing the reliability of ecotoxicological studies: An overview of current needs and approaches.

    Moermond, Caroline; Beasley, Amy; Breton, Roger; Junghans, Marion; Laskowski, Ryszard; Solomon, Keith; Zahner, Holly


    In general, reliable studies are well designed and well performed, and enough details on study design and performance are reported to assess the study. For hazard and risk assessment in various legal frameworks, many different types of ecotoxicity studies need to be evaluated for reliability. These studies vary in study design, methodology, quality, and level of detail reported (e.g., reviews, peer-reviewed research papers, or industry-sponsored studies documented under Good Laboratory Practice [GLP] guidelines). Regulators have the responsibility to make sound and verifiable decisions and should evaluate each study for reliability in accordance with scientific principles regardless of whether they were conducted in accordance with GLP and/or standardized methods. Thus, a systematic and transparent approach is needed to evaluate studies for reliability. In this paper, 8 different methods for reliability assessment were compared using a number of attributes: categorical versus numerical scoring methods, use of exclusion and critical criteria, weighting of criteria, whether methods are tested with case studies, domain of applicability, bias toward GLP studies, incorporation of standard guidelines in the evaluation method, number of criteria used, type of criteria considered, and availability of guidance material. Finally, some considerations are given on how to choose a suitable method for assessing reliability of ecotoxicity studies. Integr Environ Assess Manag. ©2016 SETAC.

  20. Experimental approach to fission process of actinides

    Baba, Hiroshi [Osaka Univ., Toyonaka (Japan). Faculty of Science


    From the experimental point of view, it seems likely that the mechanism of the nuclear fission process remains unsolved even after Bohr and Wheeler's study in 1939. This is especially marked with respect to the mass distribution in asymmetric nuclear fission. The energy dependence of the mass distribution can be explained with the assumption of two-mode nuclear fission. Further, it was demonstrated that the symmetric fission components and the asymmetric ones have different saddle and scission points. Thus, the presence of the two-mode fission mechanism was confirmed. Here, the transition in the nuclear fission mechanism and its cause were investigated. As the cause of such a transition, four plausible causes were examined at an excitation energy of 14.0 MeV: a contribution of multiple-chance fission, disappearance of shell effects, onset of fission following collective excitation due to GDR, and nuclear phase transition. It was suggested that the transition in nuclear fission might be related to phase transition. In addition, the mechanism of nuclear fission at low energy and the multi-mode hypothesis were examined by determining the energy for thermal-neutron fission (²³³,²³⁵U and ²³⁹Pu) and spontaneous fission (²⁵²Cf). (M.N.)

  1. A Data-Driven Reliability Estimation Approach for Phased-Mission Systems

    Hua-Feng He


    We attempt to address the issues associated with reliability estimation for phased-mission systems (PMS) and present a novel data-driven approach to reliability estimation for PMS using condition monitoring information and degradation data of such systems under dynamic operating scenarios. In this sense, this paper differs from existing methods, which consider only the static scenario without using real-time information and aim to estimate the reliability of a population rather than an individual. In the presented approach, to establish a linkage between the historical data and the real-time information of the individual PMS, we adopt a stochastic filtering model to model the phase duration and obtain an updated estimate of the mission time by Bayes' law at each phase. Meanwhile, the lifetime of the PMS is estimated from degradation data, which are modeled by an adaptive Brownian motion. As such, the mission reliability can be obtained in real time through the estimated distribution of the mission time in conjunction with the estimated lifetime distribution. We demonstrate the usefulness of the developed approach via a numerical example.
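For the degradation part, a Brownian path with positive drift crossing a fixed threshold has the classical inverse-Gaussian first-passage-time distribution. The closed form below is a textbook result, a simplification rather than the paper's adaptive model, and the parameters are invented:

```python
import math

def std_normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def failure_cdf(t, mu, sigma, w):
    """P[first passage of a Brownian degradation path (drift mu > 0,
    diffusion sigma) over threshold w occurs by time t]: the inverse
    Gaussian first-hitting-time distribution."""
    s = sigma * math.sqrt(t)
    return (std_normal_cdf((mu * t - w) / s)
            + math.exp(2.0 * mu * w / sigma ** 2)
            * std_normal_cdf(-(mu * t + w) / s))

# Invented parameters: drift 0.5 units/h, diffusion 1.0, threshold 5.0.
r_20h = 1.0 - failure_cdf(20.0, mu=0.5, sigma=1.0, w=5.0)  # reliability at 20 h
```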

  2. An Efficient Approach for the Reliability Analysis of Phased-Mission Systems with Dependent Failures

    Xing, Liudong; Meshkat, Leila; Donahue, Susan K.


    We consider the reliability analysis of phased-mission systems with common-cause failures in this paper. Phased-mission systems (PMS) are systems supporting missions characterized by multiple, consecutive, and nonoverlapping phases of operation. System components may be subject to different stresses as well as different reliability requirements throughout the course of the mission. As a result, component behavior and relationships may need to be modeled differently from phase to phase when performing a system-level reliability analysis. This consideration poses unique challenges to existing analysis methods. The challenges increase when common-cause failures (CCF) are incorporated in the model. CCF are multiple dependent component failures within a system that are a direct result of a shared root cause, such as sabotage, flood, earthquake, power outage, or human error. Many reliability studies have shown that CCF tend to increase a system's joint failure probabilities and thus contribute significantly to the overall unreliability of systems subject to CCF. We propose a separable phase-modular approach to the reliability analysis of phased-mission systems with dependent common-cause failures as one way to meet the above challenges in an efficient and elegant manner. Our methodology is twofold: first, we separate the effects of CCF from the PMS analysis using the total probability theorem and the common-cause event space developed based on the elementary common causes; next, we apply an efficient phase-modular approach to analyze the reliability of the PMS. The phase-modular approach employs both combinatorial binary decision diagram and Markov-chain solution methods as appropriate. We provide an example of a reliability analysis of a PMS with both static and dynamic phases as well as CCF as an illustration of our proposed approach. The example is based on information extracted from a Mars orbiter project. The reliability model for this orbiter considers
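The separation step via the total probability theorem is a probability-weighted sum over a disjoint common-cause event space. The event names, probabilities, and conditional unreliabilities below are invented for illustration:

```python
def pms_unreliability(cc_state_probs, cond_unreliability):
    """Total-probability decomposition: the common-cause event space is a
    set of disjoint outcomes; system unreliability is the sum over outcomes
    of P(outcome) * P(system fails | outcome)."""
    return sum(p * cond_unreliability[e] for e, p in cc_state_probs.items())

# Invented disjoint event space: no common cause, flood, or power outage.
states = {"none": 0.95, "flood": 0.03, "outage": 0.02}
cond = {"none": 0.001, "flood": 0.5, "outage": 1.0}
u = pms_unreliability(states, cond)
```

Each conditional term can then be evaluated with whatever phase-modular method fits that conditioned model.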

  3. A Simplified Quantitative Approach to the Reliability of a Passive Safety System

    Han, Seok-Jung; Yang, Joon-Eon; Lee, Won-Jea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)


    A simplified quantitative approach to the Reliability of a Passive safety System (RoPS) has been proposed for the risk estimation of a Very High Temperature Reactor (VHTR) for hydrogen conversion. Passive safety systems with high reliability are being introduced in next-generation reactors for enhanced safety. The current risk estimation includes only small passive components such as pipes and check valves; it does not consider the large-scale passive systems adopted in next-generation reactors. There is no approved method to estimate the RoPS, which remains a technical issue for future reactor development.

  4. A Reliability-Based Approach to Nonrepairable Spare Part Forecasting in Aircraft Maintenance System

    Nataša Z. Kontrec


    In recent times, spare parts inventory systems have been extensively researched, but most inventory models are not fully adequate due to the stochastic nature of the inventory environment. This paper proposes an approach that supports the decision-making process in the planning and control of spare parts in aircraft maintenance systems. Reliability characteristics of aircraft consumable parts were analyzed in order to substantiate this approach. Moreover, the proposed reliability model was used to evaluate characteristics of the subassemblies and/or assemblies these parts belong to. Finally, an innovative approach for determining the total amount of parts required in inventory and the underage costs, based on observing the total unit time as a stochastic process, is presented herein.
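One standard sketch of the "how many spares" question, which is not the paper's stochastic-process model: treat nonrepairable-part demand over a support period as Poisson with rate set by fleet operating hours over MTTF, and stock to a fill-rate target. All numbers are invented:

```python
import math

def poisson_cdf(k, lam):
    """P[Poisson(lam) <= k]."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i)
               for i in range(k + 1))

def spares_needed(n_units, hours, mttf, confidence):
    """Smallest stock level s with P(demand <= s) >= confidence, where
    demand ~ Poisson(n_units * hours / mttf) for nonrepairable parts."""
    lam = n_units * hours / mttf
    s = 0
    while poisson_cdf(s, lam) < confidence:
        s += 1
    return s

# Invented fleet: 10 aircraft, 1000 flight hours each, part MTTF 2000 h.
stock = spares_needed(10, 1000, 2000, confidence=0.95)
```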

  5. Reliability and predictors of resistive load detection in children with persistent asthma: a multivariate approach.

    Harver, Andrew; Dyer, Allison; Ersek, Jennifer L; Kotses, Harry; Humphries, C Thomas


    Resistive load detection tasks enable analysis of individual differences in psychophysical outcomes. The purpose of this study was to determine both the reliability and predictors of resistive load detection in children with persistent asthma who completed multiple testing sessions. Both University of North Carolina (UNC) Charlotte and Ohio University institutional review boards approved the research protocol. The detection of inspiratory resistive loads was evaluated in 75 children with asthma between 8 and 15 years of age. Each child participated in four experimental sessions that occurred approximately once every 2 weeks. Multivariate analyses were used to delineate predictors of task performance. Reliability of resistive load detection was determined for each child, and predictors of load detection outcomes were investigated in two groups of children: those who performed reliably in all four sessions (n = 31) and those who performed reliably in three or fewer sessions (n = 44). Three factors (development, symptoms, and compliance) accounted for 66.3% of the variance among variables that predicted 38.7% of the variance in load detection outcomes (Multiple R = 0.62, p = 0.004) and correctly classified performance as reliable or less reliable in 80.6% of the children, χ²(12) = 28.88, p = 0.004. Cognitive and physical development, appraisal of symptom experiences, and adherence-related behaviors (1) account for a significant proportion of the interrelationships among variables that affect perception of airflow obstruction in children with asthma and (2) differentiate between children who perform more or less reliably in a resistive load detection task.

  6. Understanding biomolecular machines: Theoretical and experimental approaches

    Goler, Adam Scott

    This dissertation concerns the study of two classes of molecular machines from a physical perspective: enzymes and membrane proteins. Though the functions of these classes of proteins are different, they each represent important test-beds from which new understanding can be developed by the application of different techniques. HIV1 Reverse Transcriptase is an enzyme that performs multiple functions, including reverse transcription of RNA into an RNA/DNA duplex, RNA degradation by the RNaseH domain, and synthesis of dsDNA. These functions allow for the incorporation of the retroviral genes into the host genome. Its catalytic cycle requires repeated large-scale conformational changes fundamental to its mechanism. Motivated by experimental work, these motions were studied theoretically by the application of normal mode analysis. It was observed that the lowest order modes correlate with largest amplitude (low-frequency) motion, which are most likely to be catalytically relevant. Comparisons between normal modes obtained via an elastic network model to those calculated from the essential dynamics of a series of all-atom molecular dynamics simulations show the self-consistency between these calculations. That similar conformational motions are seen between independent theoretical methods reinforces the importance of large-scale subdomain motion for the biochemical action of DNA polymerases in general. Moreover, it was observed that the major subunits of HIV1 Reverse Transcriptase interact quasi-harmonically. The 5HT3A Serotonin receptor and P2X1 receptor, by contrast, are trans-membrane proteins that function as ligand gated ion channels. Such proteins feature a central pore, which allows for the transit of ions necessary for cellular function across a membrane. The pore is opened by the ligation of binding sites on the extracellular portion of different protein subunits. 
In an attempt to resolve the individual subunits of these membrane proteins beyond the diffraction
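The elastic-network calculation described in this record can be sketched in a few lines: build a Gaussian network model Kirchhoff matrix from Cα positions and diagonalize it, so that the smallest nonzero eigenvalues give the slow, large-amplitude modes. The chain of pseudo-residues and the 7 Å cutoff below are illustrative assumptions, not data from the dissertation.

```python
import numpy as np

def gnm_modes(coords, cutoff=7.0):
    """Build a Gaussian network model Kirchhoff (graph Laplacian) matrix
    from C-alpha coordinates and return its eigenvalues and eigenvectors."""
    n = len(coords)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    contact = (d < cutoff) & ~np.eye(n, dtype=bool)
    kirchhoff = -contact.astype(float)
    np.fill_diagonal(kirchhoff, contact.sum(axis=1))
    # eigh returns ascending eigenvalues: vals[0] ~ 0 (rigid-body mode),
    # the next few are the slow, large-amplitude modes of interest.
    return np.linalg.eigh(kirchhoff)

# Toy chain of 20 pseudo-residues standing in for real C-alpha coordinates
rng = np.random.default_rng(0)
coords = np.cumsum(rng.normal(3.8 / np.sqrt(3), 0.5, (20, 3)), axis=0)
vals, vecs = gnm_modes(coords)
```

The slowest internal mode is `vecs[:, 1]`; its largest components flag the residues dominating the collective motion.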

  7. An approach for the reliability based design optimization of laminated composites

    Holdorf Lopez, Rafael; Lemosse, Didier; Souza de Cursi, José Eduardo; Rojas, Jhojan; El-Hami, Abdelkhalak


    This article aims at optimizing laminated composite plates taking into account uncertainties in the structural dimensions. As laminated composites require a global optimization tool, the Particle Swarm Optimization (PSO) method is employed. A new Reliability Based Design Optimization (RBDO) methodology based on safety factors is presented and coupled with PSO. Such safety factors are derived from the Karush-Kuhn-Tucker optimality conditions of the reliability index approach and eliminate the need for reliability analysis in RBDO. The plate weight minimization is the objective function of the optimization process. The results show that the coupling of the evolutionary algorithm with the safety-factor method proposed in this article successfully performs the RBDO of laminated composite structures.
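The safety-factor idea in this record, replacing the inner reliability loop with a deterministic constraint inflated by a derived factor, can be sketched with a minimal PSO. The objective, the strength measure, and the factor of 1.5 below are illustrative stand-ins, not the article's laminate model.

```python
import numpy as np

def penalized(x, sf=1.5):
    """Weight objective with a penalty enforcing g(x) >= sf, where sf > 1
    is a stand-in for the KKT-derived safety factor that removes the
    inner reliability analysis (all expressions hypothetical)."""
    weight = x[0] * x[1]
    g = x[0] ** 2 * x[1] / 20.0          # hypothetical strength measure
    return weight + 1e3 * max(0.0, sf - g) ** 2

def pso(f, bounds, n=30, iters=200, seed=1):
    """Plain global-best PSO over a 2-D box."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n, 2))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    for _ in range(iters):
        gbest = pbest[pval.argmin()]
        r1, r2 = rng.random((2, n, 2))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
    return pbest[pval.argmin()], pval.min()

best, val = pso(penalized, bounds=(0.1, 10.0))
```

Because the safety factor enters only as a shifted constraint bound, each PSO evaluation stays a cheap deterministic function call.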

  8. The Use of an Experimental Design Approach to Investigate the ...

    The Use of an Experimental Design Approach to Investigate the Interactions of Additives ... When a conventional starting, lighting and ignition (SLI) lead acid battery is ... Typical flooded nominal 8 Ah test cells were assembled in a reverse ratio ...

  9. Operational reliability evaluation of restructured power systems with wind power penetration utilizing reliability network equivalent and time-sequential simulation approaches

    Ding, Yi; Cheng, Lin; Zhang, Yonghong


    and reserve providers, fast reserve providers and transmission network in restructured power systems. A contingency management scheme for real-time operation considering its coupling with the day-ahead market is proposed. The time-sequential Monte Carlo simulation is used to model the chronological...... with high wind power penetration. The proposed technique is based on the combination of the reliability network equivalent and time-sequential simulation approaches. The operational reliability network equivalents are developed to represent reliability models of wind farms, conventional generation...... characteristics of corresponding reliability network equivalents. A simplified method is also developed in the simulation procedures for improving the computational efficiency. The proposed technique can be used to evaluate customers’ reliabilities considering high penetration of wind power during the power......
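The time-sequential simulation named in this record can be illustrated with the simplest possible case: one repairable unit alternating exponentially distributed up and down times, with availability read off as the fraction of simulated time spent up. The MTTF/MTTR values are illustrative, not from the paper.

```python
import random

def simulate_availability(mttf, mttr, horizon, seed=42):
    """Time-sequential Monte Carlo for a single repairable unit:
    alternate exponential up-times (mean mttf) and down-times (mean mttr)
    and return the fraction of the horizon spent in the up state."""
    rng = random.Random(seed)
    t, up_time, state_up = 0.0, 0.0, True
    while t < horizon:
        dur = rng.expovariate(1.0 / (mttf if state_up else mttr))
        dur = min(dur, horizon - t)      # truncate the last interval
        if state_up:
            up_time += dur
        t += dur
        state_up = not state_up
    return up_time / horizon

a = simulate_availability(mttf=1000.0, mttr=50.0, horizon=1e6)
# Analytical steady-state availability is MTTF / (MTTF + MTTR) = 1000/1050
```

A full reliability-network-equivalent study chains many such chronological component histories together; this scalar case is just the kernel of that simulation.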

  10. Reliability of objects in aerospace technologies and beyond: Holistic risk management approach

    Shai, Yair; Ingman, D.; Suhir, E.

    A “high level”, deductive-reasoning-based (“holistic”) approach is aimed at the direct analysis of the behavior of a system as a whole, rather than at an attempt to understand the system's behavior by conducting first a “low level”, inductive-reasoning-based analysis of the behavior and the contributions of the system's elements. The holistic view on treatment is widely accepted in medical practice, and the “holistic health” concept upholds that all the aspects of people's needs (psychological, physical or social) should be seen as a whole, and that a disease is caused by the combined effect of physical, emotional, spiritual, social and environmental imbalances. Holistic reasoning is applied in our analysis to model the behavior of engineering products (“species”) subjected to various economic, marketing, and reliability “health” factors. Vehicular products (cars, aircraft, boats, etc.), e.g., might be still robust enough, but could be out-of-date, or functionally obsolete, or their further use might be viewed as unjustifiably expensive. High-level-performance functions (HLPF) are the essential feature of the approach. HLPFs are, in effect, “signatures” of the “species” of interest. The HLPFs describe, in a “holistic”, and certainly in a probabilistic, way, numerous complex multi-dependable relations among the representatives of the “species” under consideration. Numerous inter-related “stresses”, both actual (“physical”) and nonphysical, which affect the probabilistic predictions are inherently taken into account by the HLPFs. There is no need, and it might even be counter-productive, to conduct tedious, time- and labor-consuming experimentation and to invest a significant amount of time and resources to accumulate “representative statistics” to predict the governing probabilistic characteristics of the system behavior, such as, e.g., the life expectancy of a particular type of product.

  11. An efficient particle swarm approach for mixed-integer programming in reliability-redundancy optimization applications

    Santos Coelho, Leandro dos [Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR, Imaculada Conceicao, 1155, 80215-901 Curitiba, Parana (Brazil)]


    Reliability-redundancy optimization problems can involve the selection of components with multiple choices and redundancy levels that produce maximum benefits, subject to cost, weight, and volume constraints. Many classical mathematical methods have failed in handling nonconvexities and nonsmoothness in reliability-redundancy optimization problems. As an alternative to classical optimization approaches, meta-heuristics have been given much attention by many researchers due to their ability to find near-globally optimal solutions. One of these meta-heuristics is particle swarm optimization (PSO). PSO is a population-based heuristic optimization technique inspired by the social behavior of bird flocking and fish schooling. This paper presents an efficient PSO algorithm based on a Gaussian distribution and a chaotic sequence (PSO-GC) to solve reliability-redundancy optimization problems. In this context, two examples of reliability-redundancy design problems are evaluated. Simulation results demonstrate that the proposed PSO-GC is a promising optimization technique. PSO-GC performs well for the two examples of mixed-integer programming in reliability-redundancy applications considered in this paper. The solutions obtained by the PSO-GC are better than the previously best-known solutions available in the recent literature.
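A sketch of the PSO-GC flavor described here, Gaussian-distributed acceleration coefficients plus a logistic chaotic map for the inertia weight, applied to a toy series-system redundancy allocation. All reliabilities, costs, and the budget are invented for illustration.

```python
import numpy as np

comp_rel = np.array([0.80, 0.85, 0.90])   # component reliabilities (toy)
comp_cost = np.array([2.0, 3.0, 4.0])
budget = 40.0

def fitness(n):
    """Penalized system reliability for integer redundancy levels n_i:
    series arrangement of parallel groups, with a cost-budget penalty."""
    n = np.clip(np.round(n), 1, 6)
    rel = np.prod(1 - (1 - comp_rel) ** n)
    cost = float(comp_cost @ n)
    return rel - 10.0 * max(0.0, cost - budget)

def pso_gc(iters=150, swarm=25, seed=3):
    rng = np.random.default_rng(seed)
    x = rng.uniform(1, 6, (swarm, 3))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([fitness(p) for p in x])
    z = 0.48                                  # chaotic inertia state
    for _ in range(iters):
        z = 4.0 * z * (1.0 - z)               # logistic chaotic map
        g = pbest[pval.argmax()]
        c1 = np.abs(rng.normal(size=x.shape)) # Gaussian coefficients
        c2 = np.abs(rng.normal(size=x.shape))
        v = z * v + c1 * (pbest - x) + c2 * (g - x)
        x = np.clip(x + v, 1, 6)
        fx = np.array([fitness(p) for p in x])
        better = fx > pval
        pbest[better], pval[better] = x[better], fx[better]
    best = np.clip(np.round(pbest[pval.argmax()]), 1, 6)
    return best, float(np.prod(1 - (1 - comp_rel) ** best))

n_opt, r_sys = pso_gc()
```

The mixed-integer character is handled here by simple rounding inside the fitness, a common (if crude) relaxation for swarm methods.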

  12. An efficient approach to reliability-based design optimization within the enhanced sequential optimization and reliability assessment framework

    Huang, Hong Zhong; Zhang, Xudong; Meng, De Biao; Wang, Zhonglai; Liu, Yu [University of Electronic Science and Technology of China, Chengdu (China)


    Reliability based design optimization (RBDO) has been widely implemented in engineering practices for high safety and reliability. It is an important challenge to improve computational efficiency. Sequential optimization and reliability assessment (SORA) has made great efforts to improve computational efficiency by decoupling a RBDO problem into sequential deterministic optimization and reliability analysis as a single-loop method. In this paper, in order to further improve computational efficiency and extend the application of the current SORA method, an enhanced SORA (ESORA) is proposed by considering constant and varying variances of random design variables while keeping the sequential framework. Some mathematical examples and an engineering case are given to illustrate the proposed method and validate the efficiency.

  13. A genetic algorithm approach for assessing soil liquefaction potential based on reliability method

    M H Bagheripour; I Shooshpasha; M Afzalirad


    Deterministic approaches are unable to account for the variations in soil’s strength properties, earthquake loads, as well as source of errors in evaluations of liquefaction potential in sandy soils which make them questionable against other reliability concepts. Furthermore, deterministic approaches are incapable of precisely relating the probability of liquefaction and the factor of safety (FS). Therefore, the use of probabilistic approaches and especially, reliability analysis is considered since a complementary solution is needed to reach better engineering decisions. In this study, Advanced First-Order Second-Moment (AFOSM) technique associated with genetic algorithm (GA) and its corresponding sophisticated optimization techniques have been used to calculate the reliability index and the probability of liquefaction. The use of GA provides a reliable mechanism suitable for computer programming and fast convergence. A new relation is developed here, by which the liquefaction potential can be directly calculated based on the estimated probability of liquefaction (), cyclic stress ratio (CSR) and normalized standard penetration test (SPT) blow counts while containing a mean error of less than 10% from the observational data. The validity of the proposed concept is examined through comparison of the results obtained by the new relation and those predicted by other investigators. A further advantage of the proposed relation is that it relates and FS and hence it provides possibility of decision making based on the liquefaction risk and the use of deterministic approaches. This could be beneficial to geotechnical engineers who use the common methods of FS for evaluation of liquefaction. As an application, the city of Babolsar which is located on the southern coasts of Caspian Sea is investigated for liquefaction potential. The investigation is based primarily on in situ tests in which the results of SPT are analysed.
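For intuition on the reliability index underlying AFOSM, the linear limit state g = CRR − CSR with independent normal variables admits a closed form (the paper's GA search handles the general nonlinear case). The statistics below are hypothetical.

```python
import math

# Capacity (cyclic resistance ratio, CRR) and demand (cyclic stress
# ratio, CSR) as independent normals -- illustrative values only.
mu_crr, mu_csr = 0.35, 0.25
sd_crr, sd_csr = 0.07, 0.05

# Hasofer-Lind reliability index for the linear limit state
# g = CRR - CSR; for nonlinear g an iterative (or GA-based) search
# for the design point is needed instead.
beta = (mu_crr - mu_csr) / math.sqrt(sd_crr ** 2 + sd_csr ** 2)

# Probability of liquefaction from the standard normal CDF,
# P_L = Phi(-beta)
p_f = 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))
```

With these numbers beta is about 1.16, i.e. roughly a 12% probability of liquefaction, which is the kind of quantity the record's relation ties back to the deterministic factor of safety.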

  14. Reliable experimental model of hepatic veno-occlusive disease caused by monocrotaline

    Miao-Yan Chen; Jian-Ting Cai; Qin Du; Liang-Jing Wang; Jia-Min Chen; Li-Ming Shao


    BACKGROUND: Hepatic veno-occlusive disease (HVOD) is a severe complication of chemotherapy before hematopoietic stem cell transplantation and of dietary ingestion of pyrrolizidine alkaloids. Many experimental models have been established to study its mechanisms or therapy, but few are ideal. This work aimed at evaluating a rat model of HVOD induced by monocrotaline to help advance research into this disease. METHODS: Thirty-two male rats were randomly classified into 5 groups, and PBS or monocrotaline was administered (100 mg/kg or 160 mg/kg). They were sacrificed on day 7 (groups A, B and D) or day 10 (groups C and E). Blood samples were collected to determine liver enzyme concentrations. The weight of the liver and body and the amount of ascites were measured. Histopathological changes of liver tissue on light microscopy were assessed by a modified Deleve scoring system. The positivity of proliferating cell nuclear antigen (PCNA) was estimated. RESULTS: The rats that were treated with 160 mg/kg monocrotaline presented with severe clinical symptoms (including two deaths) and the histopathological picture of HVOD. On the other hand, the rats that were fed 100 mg/kg monocrotaline had milder and reversible manifestations. Comparison of the rats sacrificed on day 10 with those sacrificed on day 7 showed that the positivity of PCNA increased, especially that of hepatocytes. CONCLUSIONS: Monocrotaline induces acute, dose-dependent HVOD in rats. The model is potentially reversible with a low dose, but reliable and irreversible with a higher dose. The modified scoring system seems to be more accurate than the traditional one in reflecting the histopathology of HVOD. The enhancement of PCNA positivity may be associated with hepatic tissue undergoing recovery.

  15. Optimal design of snow avalanche passive defence structure using reliability approach to quantify buildings vulnerability

    Favier, P.; Bertrand, D.; Eckert, N.; Naaim, M.


    To protect elements at risk (humans, roads, houses, etc.) against snow avalanches, civil engineering structures, such as dams or mounds, are used. The design of such defence structures is done following a deterministic approach which considers European regulation. The minimization of expected total losses is an interesting alternative that generalizes the cost-benefit approach to a continuous decision variable. For this purpose, not only the hazard magnitude but also the buildings' vulnerability must be evaluated carefully. The aim of this work is therefore to combine state-of-the-art sub-models for the probabilistic description of avalanche flows and the numerical evaluation of damage to buildings. We define the risk as the expectation of the cost consequences of avalanche activity. Disposal consequences are quantified using reliability methods. In this formulation, the accuracy of both the hazard estimation and the vulnerability calculation has to be consistent with respect to precision and computational cost. To this end, a numerical approach has been developed to evaluate the physical vulnerability of concrete buildings subjected to avalanche loading. The ensuing application illustrates our approach. A reinforced concrete slab is considered to model the building with a finite element method. The reliability approach makes it possible to produce a response spectrum of the structure against avalanche impact. Finally, vulnerability curves are built. Outcomes of the risk calculation are examined to assess the sensitivity of the optimal design of snow defence structures.

  16. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Beernink, P; Barsky, D; Pesavento, B


    can be used to understand how an amino acid change affects the protein. The experimental methods that provide the most detailed structural information on proteins are X-ray crystallography and NMR spectroscopy. However, these methods are labor intensive and currently cannot be carried out on a genomic scale. Nonetheless, Structural Genomics projects are being pursued by more than a dozen groups and consortia worldwide and as a result the number of experimentally determined structures is rising exponentially. Based on the expectation that protein structures will continue to be determined at an ever-increasing rate, reliable structure prediction schemes will become increasingly valuable, leading to information on protein function and disease for many different proteins. Given known genetic variability and experimentally determined protein structures, can we accurately predict the effects of single amino acid substitutions? An objective assessment of this question would involve comparing predicted and experimentally determined structures, which thus far has not been rigorously performed. The completed research leveraged existing expertise at LLNL in computational and structural biology, as well as significant computing resources, to address this question.

  17. Reliability Analysis of a 3-Machine Power Station Using State Space Approach

    Wasiu Akande Ahmed


    With the advent of high-integrity fault-tolerant systems, the ability to account for repairs of partially failed (but still operational) systems becomes increasingly important. This paper presents a systematic method of determining the reliability of a 3-machine electric power station, taking into consideration the failure rates and repair rates of the individual components (machines) that make up the system. A state-space transition process for the 3-machine system with 2³ states was developed and, consequently, steady-state equations were generated based on Markov mathematical modeling of the power station. Important reliability components were deduced from this analysis. The simulation was implemented in the Excel®-VBA programming environment. The state-space approach proves to be a viable and efficient reliability-prediction technique, as it is able to predict the state of the system under consideration. For neatness and easy entry of data, a Graphical User Interface (GUI) was designed.
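The 2³-state Markov computation described here can be reproduced for independent, identical machines: assemble the generator matrix and solve the stationary equations. The failure and repair rates below are illustrative, not the paper's.

```python
import numpy as np
from itertools import product

# Each machine fails at rate lam and is repaired at rate mu,
# independently, giving 2^3 = 8 system states (rates hypothetical).
lam, mu = 0.01, 0.5
states = list(product([1, 0], repeat=3))   # 1 = up, 0 = down
n = len(states)

Q = np.zeros((n, n))                       # Markov generator matrix
for i, s in enumerate(states):
    for m in range(3):
        t = list(s)
        t[m] ^= 1                          # toggle machine m
        j = states.index(tuple(t))
        Q[i, j] = lam if s[m] == 1 else mu # failure or repair rate
    Q[i, i] = -Q[i].sum()

# Stationary distribution: solve pi Q = 0 with sum(pi) = 1
A = np.vstack([Q.T, np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

p_all_up = pi[states.index((1, 1, 1))]     # steady-state full availability
```

Because the machines are independent here, the answer must equal (mu/(lam+mu))³, a handy consistency check before adding coupled failure or repair policies.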

  18. A Penalized Likelihood Approach to Parameter Estimation with Integral Reliability Constraints

    Barry Smith


    Stress-strength reliability problems arise frequently in applied statistics and related fields. Often they involve two independent and possibly small samples of measurements on strength and breakdown pressures (stress. The goal of the researcher is to use the measurements to obtain inference on reliability, which is the probability that stress will exceed strength. This paper addresses the case where reliability is expressed in terms of an integral which has no closed form solution and where the number of observed values on stress and strength is small. We find that the Lagrange approach to estimating constrained likelihood, necessary for inference, often performs poorly. We introduce a penalized likelihood method and it appears to always work well. We use third order likelihood methods to partially offset the issue of small samples. The proposed method is applied to draw inferences on reliability in stress-strength problems with independent exponentiated exponential distributions. Simulation studies are carried out to assess the accuracy of the proposed method and to compare it with some standard asymptotic methods.
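The stress-strength integral R = P(stress < strength) for exponentiated exponential laws can be evaluated numerically as a sketch; conveniently, when both distributions share the same rate parameter, R reduces to α_strength/(α_strength + α_stress), which gives a check on the quadrature. Parameters are illustrative.

```python
import numpy as np

def ee_cdf(x, a, lam):
    """Exponentiated exponential CDF: (1 - exp(-lam x))^a."""
    return (1 - np.exp(-lam * x)) ** a

def ee_pdf(x, a, lam):
    return a * lam * np.exp(-lam * x) * (1 - np.exp(-lam * x)) ** (a - 1)

def reliability(a_strength, a_stress, lam=1.0, upper=60.0, n=200001):
    """R = P(stress < strength) = integral of f_strength(x) F_stress(x) dx,
    evaluated with a trapezoidal rule on a fine grid."""
    x = np.linspace(1e-9, upper, n)
    y = ee_pdf(x, a_strength, lam) * ee_cdf(x, a_stress, lam)
    dx = x[1] - x[0]
    return 0.5 * dx * float((y[:-1] + y[1:]).sum())

R_equal = reliability(2.0, 2.0)   # identical laws, so R must be 1/2
R = reliability(2.0, 1.5)         # equal-rate case: closed form 2/3.5
```

With unequal rate parameters the closed form disappears and only the numerical integral (the paper's setting) remains.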

  19. Fast and Reliable Thermodynamic Approach for Determining the Protonation State of the Asp Dyad.

    Huang, Jinfeng; Sun, Bin; Yao, Yuan; Liu, Junjun


    The protonation state of the Asp dyad is critically important in revealing enzymatic mechanisms and developing drugs. It is, however, hard to determine by calculating free energy changes between possible protonation states, because the free energy changes due to protein conformational flexibility are usually much larger than those originating from different locations of protons. Sophisticated and computationally expensive methods such as free energy perturbation, thermodynamic integration (TI), and quantum mechanics/molecular mechanics are therefore usually used for this purpose. In the present study, we have developed a simple thermodynamic approach that effectively eliminates the free energy changes arising from protein conformational flexibility and estimates the free energy changes originating only from the locations of protons, which provides a fast and reliable method for determining the protonation state of Asp dyads. The test of this approach on a total of 15 Asp dyad systems, including BACE-1 and HIV-1 protease, shows that the predictions from this approach are all consistent with experiments or with the computationally expensive TI calculations. It is clear that our thermodynamic approach could be used to rapidly and reliably determine the protonation state of the Asp dyad.

  20. A New Approach for Reliability Life Prediction of Rail Vehicle Axle by Considering Vibration Measurement

    Meral Bayraktar


    The effect of vibration on the axle has been considered. Vibration measurements at different speeds have been performed on the axle of a running rail vehicle to obtain the displacement, acceleration, time, and frequency responses. Based on this experimental work, equivalent stress has been used to estimate the life of the axle for 90% and 10% reliability. The calculated life values of the rail vehicle axle have been compared with real-life data, and the life of the axle calculated taking vibration effects into account is found to be in good agreement with the real life of the axle.

  1. Reliability Assessment of Fuel Cell System - A Framework for Quantitative Approach

    Lee, Shinae; Zhou, Dao; Wang, Huai


    Hydrogen Fuel Cell (FC) technologies have been developed to overcome the operational and environmental challenges associated with using conventional power sources. The telecommunication industry, in particular, has implemented FC systems for the backup power function. The designers and manufacturers...... of such FC systems have great interest in verifying the performance and safety of their systems. Reliability assessment is designed to support decision-making about the optimal design and the operation strategies for FC systems to be commercially viable. This involves the properties of the system...... such as component failures, the system architecture, and operational strategies. This paper suggests an approach that includes Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), and Reliability Block Diagram (RBD). For a case study, and the service lifetime of a commercial 5 kW Proton Exchange......
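The RBD part of the suggested approach can be sketched with the usual series/parallel reduction formulas, here on a hypothetical fuel cell layout (stack and controller in series, two redundant converters in parallel); all reliability values are invented for illustration.

```python
def series(*rs):
    """Series blocks: all must work, so reliabilities multiply."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """Parallel (redundant) blocks: system fails only if all fail."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

# Hypothetical fuel cell backup system: stack and controller in series,
# feeding two redundant DC/DC converters in parallel.
r_stack, r_ctrl, r_conv = 0.95, 0.99, 0.90
r_system = series(r_stack, r_ctrl, parallel(r_conv, r_conv))
```

Nesting these two helpers reduces any series-parallel diagram; FMEA and FTA then supply which failure modes the blocks actually represent.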

  2. Ultra-reliable computer systems: an integrated approach for application in reactor safety systems

    Chisholm, G.H.


    Improvements in operation and maintenance of nuclear reactors can be realized with the application of computers in the reactor control systems. In the context of this paper, a reactor control system encompasses the control aspects of the Reactor Safety System (RSS). Equipment qualification for application in reactor safety systems requires a rigorous demonstration of reliability. For the purpose of this paper, the reliability demonstration will be divided into two categories: demonstrations of compliance with (a) environmental and (b) functional design constraints. This paper presents an approach for the reliability demonstration of computer-based RSS with respect to functional design constraints only. It is herein postulated that the design for compliance with environmental design constraints is a reasonably definitive problem and within the realm of available technology. The demonstration of compliance with design constraints with respect to functionality, as described herein, is an extension of available technology and requires development.

  3. Strength and Reliability of Wood for the Components of Low-cost Wind Turbines: Computational and Experimental Analysis and Applications

    Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan


    This paper reports the latest results of the comprehensive program of experimental and computational analysis of strength and reliability of wooden parts of low cost wind turbines. The possibilities of prediction of strength and reliability of different types of wood are studied in the series of experiments and computational investigations. Low cost testing machines have been designed, and employed for the systematic analysis of different sorts of Nepali wood, to be used for the wind turbine construction. At the same time, computational micromechanical models of deformation and strength of wood......

  4. Active subspace approach to reliability and safety assessments of small satellite separation

    Hu, Xingzhi; Chen, Xiaoqian; Zhao, Yong; Tuo, Zhouhui; Yao, Wen


    Ever-increasing launch of small satellites demands an effective and efficient computer-aided analysis approach to shorten the ground test cycle and save the economic cost. However, the multiple influencing factors hamper the efficiency and accuracy of separation reliability assessment. In this study, a novel evaluation approach based on active subspace identification and response surface construction is established and verified. The formulation of small satellite separation is firstly derived, including equations of motion, separation and gravity forces, and quantity of interest. The active subspace reduces the dimension of uncertain inputs with minimum precision loss and a 4th degree multivariate polynomial regression (MPR) using cross validation is hand-coded for the propagation and error analysis. A common spring separation of small satellites is employed to demonstrate the accuracy and efficiency of the approach, which exhibits its potential use in widely existing needs of satellite separation analysis.
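The active subspace identification step can be sketched as: estimate C = E[∇f∇fᵀ] from sampled gradients, eigendecompose, and keep the dominant eigenvectors. The toy quantity of interest below varies along a single direction, a stand-in for the separation dynamics, and is not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy quantity of interest f(x) = sin(w . x) over 5 uncertain inputs:
# its gradient is always parallel to w, so the active subspace is 1-D.
w = np.array([1.0, 2.0, 0.5, 0.0, 0.0])

def f_grad(x):
    return np.cos(w @ x) * w

# Monte Carlo estimate of the gradient covariance C = E[grad grad^T]
X = rng.uniform(-1, 1, (500, 5))
G = np.array([f_grad(x) for x in X])
C = G.T @ G / len(X)

vals, vecs = np.linalg.eigh(C)
vals, vecs = vals[::-1], vecs[:, ::-1]   # sort descending

# A large gap after the first eigenvalue signals a 1-D active subspace;
# its direction is the dominant eigenvector.
active_dir = vecs[:, 0]
```

After projection onto `active_dir`, a low-dimensional response surface (such as the paper's 4th-degree polynomial regression) can be fitted cheaply.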

  5. Experimental approaches for studying non-equilibrium atmospheric plasma jets

    Shashurin, A. [School of Aeronautics & Astronautics, Purdue University, West Lafayette, Indiana 47907 (United States); Keidar, M. [Department of Mechanical and Aerospace Engineering, The George Washington University, Washington, District of Columbia 20052 (United States)


    This work reviews recent research efforts undertaken in the area of non-equilibrium atmospheric plasma jets, with special focus on experimental approaches. The physics of small non-equilibrium atmospheric plasma jets operating in the kHz frequency range at powers of around a few watts will be analyzed, including the mechanism of breakdown, the process of ionization front propagation, electrical coupling of the ionization front with the discharge electrodes, distributions of excited and ionized species, discharge current spreading, transient dynamics of various plasma parameters, etc. Experimental diagnostic approaches utilized in the field will be considered, including Rayleigh microwave scattering, Thomson laser scattering, electrostatic streamer scatterers, optical emission spectroscopy, fast photography, etc.


    Cristiano Fragassa


    In this paper a Total Quality Management strategy is proposed, refined and used with the aim of improving the quality of large-mass industrial products far beyond the technical specifications demanded at the end-customer level. This approach combines standard and non-standard tools used for Reliability, Availability and Maintainability analysis. The procedure also realizes a stricter correlation between theoretical evaluation methods and experimental evidence as part of a modern integrated method for strengthening quality in design and process. A commercial intake manifold, widely available on the market, is used as a test case for the validation of the methodology. As an additional general result, the research underlines the impact of Total Quality Management and its tools on the development of innovation.

  7. A least squares approach for efficient and reliable short-term versus long-term optimization

    Christiansen, Lasse Hjuler; Capolei, Andrea; Jørgensen, John Bagterp


    the balance between the objectives, leaving an unfulfilled potential to increase profits. To promote efficient and reliable short-term versus long-term optimization, this paper introduces a natural way to characterize desirable Pareto points and proposes a novel least squares (LS) method. Unlike hierarchical...... approaches, the method is guaranteed to converge to a Pareto optimal point. Also, the LS method is designed to properly balance multiple objectives, independently of the Pareto front’s shape. As such, the method poses a practical alternative to a posteriori methods in situations where the frontier is intractable...

  8. Orotracheal Intubation Using the Retromolar Space: A Reliable Alternative Intubation Approach to Prevent Dental Injury

    Linh T. Nguyen


    Despite recent advances in airway management, perianesthetic dental injury remains one of the most common anesthesia-related adverse events and a cause of malpractice litigation against anesthesia providers. Recommended precautions for prevention of dental damage may not always be effective because these techniques involve contact and pressure exerted on vulnerable teeth. We describe a novel approach using the retromolar space to insert a flexible fiberscope for tracheal tube placement as a reliable method to achieve atraumatic tracheal intubation. Written consent for publication has been obtained from the patient.

  9. A New Approach to Provide Reliable Data Systems Without Using Space-Qualified Electronic Components

    Häbel, W.

    This paper describes the present situation and the expected trends with regard to the availability of electronic components, their quality levels, technology trends and sensitivity to the space environment. Many recognized vendors have already discontinued their MIL production line and state of the art components will in many cases not be offered in this quality level because of the shrinking market. It becomes therefore obvious that new methods need to be considered "How to build reliable Data Systems for space applications without High-Rel parts". One of the most promising approaches is the identification, masking and suppression of faults by developing Fault Tolerant Computer systems which is described in this paper.

  10. Reliable multicast for the Grid: a case study in experimental computer science.

    Nekovee, Maziar; Barcellos, Marinho P; Daw, Michael


    In its simplest form, multicast communication is the process of sending data packets from a source to multiple destinations in the same logical multicast group. IP multicast allows the efficient transport of data through wide-area networks, and its potentially great value for the Grid has been highlighted recently by a number of research groups. In this paper, we focus on the use of IP multicast in Grid applications, which require high-throughput reliable multicast. These include Grid-enabled computational steering and collaborative visualization applications, and wide-area distributed computing. We describe the results of our extensive evaluation studies of state-of-the-art reliable-multicast protocols, which were performed on the UK's high-speed academic networks. Based on these studies, we examine the ability of current reliable multicast technology to meet the Grid's requirements and discuss future directions.

  11. A New Approach for Improving Reliability of Personal Navigation Devices under Harsh GNSS Signal Conditions

    Gérard Lachapelle


    In natural and urban canyon environments, Global Navigation Satellite System (GNSS signals suffer from various challenges such as signal multipath, limited or lack of signal availability and poor geometry. Inertial sensors are often employed to improve the solution continuity under poor GNSS signal quality and availability conditions. Various fault detection schemes have been proposed in the literature to detect and remove biased GNSS measurements to obtain a more reliable navigation solution. However, many of these methods are found to be sub-optimal and often lead to unavailability of reliability measures, mostly because of the improper characterization of the measurement errors. A robust filtering architecture is thus proposed which assumes a heavy-tailed distribution for the measurement errors. Moreover, the proposed filter is capable of adapting to the changing GNSS signal conditions such as when moving from open sky conditions to deep canyons. Results obtained by processing data collected in various GNSS challenged environments show that the proposed scheme provides a robust navigation solution without having to excessively reject usable measurements. The tests reported herein show improvements of nearly 15% and 80% for position accuracy and reliability, respectively, when applying the above approach.

  12. A new approach for improving reliability of personal navigation devices under harsh GNSS signal conditions.

    Dhital, Anup; Bancroft, Jared B; Lachapelle, Gérard


    In natural and urban canyon environments, Global Navigation Satellite System (GNSS) signals suffer from various challenges such as signal multipath, limited or lack of signal availability and poor geometry. Inertial sensors are often employed to improve the solution continuity under poor GNSS signal quality and availability conditions. Various fault detection schemes have been proposed in the literature to detect and remove biased GNSS measurements to obtain a more reliable navigation solution. However, many of these methods are found to be sub-optimal and often lead to unavailability of reliability measures, mostly because of the improper characterization of the measurement errors. A robust filtering architecture is thus proposed which assumes a heavy-tailed distribution for the measurement errors. Moreover, the proposed filter is capable of adapting to the changing GNSS signal conditions such as when moving from open sky conditions to deep canyons. Results obtained by processing data collected in various GNSS challenged environments show that the proposed scheme provides a robust navigation solution without having to excessively reject usable measurements. The tests reported herein show improvements of nearly 15% and 80% for position accuracy and reliability, respectively, when applying the above approach.
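The heavy-tailed de-weighting idea shared by this record and the previous one can be sketched in one dimension with a Huber-type weight on the normalized innovation, which deflates the Kalman gain for outlying measurements. The scalar model and constants below are illustrative, not the proposed filter.

```python
def robust_update(x, P, z, R, k=1.345):
    """Scalar Kalman-style measurement update with a Huber weight:
    measurements whose normalized innovation exceeds k are downweighted
    rather than rejected outright (a stand-in for a heavy-tailed model)."""
    innov = z - x
    s = (P + R) ** 0.5            # innovation standard deviation
    t = abs(innov) / s            # normalized innovation
    w = 1.0 if t <= k else k / t  # Huber weight in (0, 1]
    K = P / (P + R / w)           # inflating R/w shrinks the gain
    return x + K * innov, (1 - K) * P

x, P = 0.0, 4.0
x1, P1 = robust_update(x, P, z=1.0, R=1.0)    # clean: full gain applied
x2, P2 = robust_update(x, P, z=50.0, R=1.0)   # outlier: gain deflated
```

A standard update would drag the estimate most of the way to the outlier; the deflated gain keeps usable information from the measurement while limiting its pull, matching the record's "without excessively rejecting usable measurements".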

  13. Pruning Chinese trees : an experimental and modelling approach

    Zeng, Bo


    Pruning of trees, in which some branches are removed from the lower crown of a tree, has been extensively used in China in silvicultural management for many purposes. With an experimental and modelling approach, the effects of pruning on tree growth and on the harvest of plant material were studied.

  14. Alternatives for Mixed-Effects Meta-Regression Models in the Reliability Generalization Approach: A Simulation Study

    López-López, José Antonio; Botella, Juan; Sánchez-Meca, Julio; Marín-Martínez, Fulgencio


    Since heterogeneity between reliability coefficients is usually found in reliability generalization studies, moderator analyses constitute a crucial step for that meta-analytic approach. In this study, different procedures for conducting mixed-effects meta-regression analyses were compared. Specifically, four transformation methods for the…

  15. Parametric and experimental analysis using a power flow approach

    Cuschieri, J. M.


    A structural power flow approach for the analysis of structure-borne transmission of vibrations is used to analyze the influence of structural parameters on transmitted power. The parametric analysis is also performed using the Statistical Energy Analysis approach, and the results are compared with those obtained using the power flow approach. The advantages of structural power flow analysis are demonstrated by comparing the types of results obtained by the two analytical methods. Also, to demonstrate that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental study of structural power flow is presented. This experimental study presents results for an L-shaped beam for which a solution was already available. Various methods of measuring vibrational power flow are compared to study their advantages and disadvantages.
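
The quantity at the heart of such measurements is, at a single frequency, the time-averaged product of force and in-phase velocity. A minimal sketch, assuming complex amplitudes at one frequency:

```python
import numpy as np

def power_flow(force, velocity):
    """Time-averaged vibrational power transmitted at a point, from the
    complex force and velocity amplitudes at one frequency:
    P = 0.5 * Re(F * conj(v)). Only the in-phase component of the
    velocity carries net energy through the structure."""
    return 0.5 * np.real(force * np.conj(velocity))
```

This is why power flow is a directly measurable physical parameter: cross-spectra of force and velocity (or of two accelerometer signals, in the common two-transducer method) give exactly this product.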

  16. An Innovative Approach for Improving the Reliability of Reticulated Porous Ceramics


    An innovative approach has been developed to fabricate reticulated porous ceramics (RPCs) with a uniform macrostructure by using polymeric sponge as the template. In this approach, the coating process comprises two stages. In the first stage, a thicker slurry is used to coat the sponge substrate uniformly. The green body is preheated to produce a reticulated preform with sufficient handling strength after the sponge is burned out. In the second stage, a thinner slurry is used to coat the preform uniformly. The recoating process significantly reduces the population of microscopic and macroscopic flaws in the structure. A few filled cells and cell faces occur during fabrication, and the struts are thickened. A statistical evaluation by means of Weibull statistics was carried out on the bend-strength data of RPCs prepared by the traditional approach and the innovative approach, respectively. The result shows that the mechanical reliability of RPCs is improved by the innovative approach. This innovative approach is very simple, easily controlled, and will open up new technological applications for RPCs.
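
A Weibull evaluation of bend-strength data of the kind described can be sketched with the standard linearized fit; the median-rank-style plotting position below is an assumption, not the paper's stated procedure. A higher Weibull modulus m indicates less strength scatter, i.e. better mechanical reliability.

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m and characteristic strength s0
    from strength data via the linearized fit
    ln(-ln(1 - F)) = m*ln(s) - m*ln(s0), with F_i = (i - 0.5)/n."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    F = (np.arange(1, n + 1) - 0.5) / n        # plotting positions
    y = np.log(-np.log(1.0 - F))
    x = np.log(s)
    m, c = np.polyfit(x, y, 1)                 # slope = modulus
    s0 = np.exp(-c / m)                        # intercept gives s0
    return m, s0
```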

  17. Developing a novel hierarchical approach for multiscale structural reliability predictions for ultra-high consequence applications.

    Emery, John M.; Coffin, Peter; Robbins, Brian A.; Carroll, Jay; Field, Richard V.; Yoo, Yung Suk Jeremy; Kacher, Josh


    Microstructural variabilities are among the predominant sources of uncertainty in structural performance and reliability. We seek to develop efficient algorithms for multiscale calculations for polycrystalline alloys such as aluminum alloy 6061-T6 in environments where ductile fracture is the dominant failure mode. Our approach employs concurrent multiscale methods, but does not focus on their development. They are a necessary but not sufficient ingredient to multiscale reliability predictions. We have focused on how to efficiently use concurrent models for forward propagation because practical applications cannot include fine-scale details throughout the problem domain due to exorbitant computational demand. Our approach begins with a low-fidelity prediction at the engineering scale that is subsequently refined with multiscale simulation. The results presented in this report focus on plasticity and damage at the meso-scale, efforts to expedite Monte Carlo simulation with microstructural considerations, modeling aspects regarding geometric representation of grains and second-phase particles, and contrasting algorithms for scale coupling.

  18. A New Sampling Approach for Response Surface Method Based Reliability Analysis and Its Application

    Xin-Jia Meng


    A response surface method based on the all sample point interpolation approach (ASPIA) is proposed to improve the efficiency of reliability computation. ASPIA obtains new sample points through linear interpolation. These new sample points are occasionally extremely dense, thus easily generating an ill-conditioned problem for the approximation functions. A mobile most-probable-failure-point strategy is used to solve this problem. The advantage of the proposed method is proven by two numerical examples. With ASPIA, the approximated process can rapidly and accurately approach the actual limit-state equation. Additionally, there is no difficulty in applying the proposed ASPIA to response surface models with or without cross terms. Solution results for the numerical examples indicate that using a response surface function with cross terms increases time cost without significantly improving accuracy. Finally, the proposed method is successfully used to analyze the reliability of the hydraulic cylinder of a forging hydraulic press by combining MATLAB and ANSYS software. The engineering example confirms the practicality of this method.
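
A generic response-surface reliability calculation (not ASPIA itself) can be sketched as: evaluate the limit-state function at a small axial design, fit a quadratic surface without cross terms, then estimate the failure probability by cheap Monte Carlo on the surrogate. The design offsets and sample counts below are illustrative assumptions.

```python
import numpy as np

def rsm_failure_probability(limit_state, mu, sigma, n_mc=100_000, seed=0):
    """Fit a quadratic response surface (no cross terms) to the limit
    state g at a center point plus +/- 2 sigma axial points, then
    estimate P(g < 0) by Monte Carlo on the surrogate."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    d = len(mu)
    X = [mu.copy()]                          # axial experimental design
    for i in range(d):
        for h in (-2.0, 2.0):
            x = mu.copy()
            x[i] += h * sigma[i]
            X.append(x)
    X = np.array(X)
    g = np.array([limit_state(x) for x in X])
    A = np.hstack([np.ones((len(X), 1)), X, X**2])   # basis: 1, x_i, x_i^2
    coef, *_ = np.linalg.lstsq(A, g, rcond=None)
    rng = np.random.default_rng(seed)
    S = rng.normal(mu, sigma, size=(n_mc, d))        # cheap surrogate MC
    g_hat = np.hstack([np.ones((n_mc, 1)), S, S**2]) @ coef
    return float(np.mean(g_hat < 0.0))
```

The expensive model (e.g. an ANSYS run) is called only at the 2d+1 design points; all sampling happens on the polynomial surrogate.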

  19. Reliability-oriented multi-objective optimal decision-making approach for uncertainty-based watershed load reduction

    Dong, Feifei [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Liu, Yong, E-mail: [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Institute of Water Sciences, Peking University, Beijing 100871 (China); Su, Han [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Zou, Rui [Tetra Tech, Inc., 10306 Eaton Place, Ste 340, Fairfax, VA 22030 (United States); Yunnan Key Laboratory of Pollution Process and Management of Plateau Lake-Watershed, Kunming 650034 (China); Guo, Huaicheng [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China)


    Water quality management and load reduction are subject to inherent uncertainties in watershed systems and competing decision objectives. Optimal decision-making modeling in watershed load reduction therefore faces the following challenges: (a) it is difficult to obtain absolutely “optimal” solutions, and (b) decision schemes may be vulnerable to failure. The probability that solutions are feasible under uncertainties is defined as reliability. A reliability-oriented multi-objective (ROMO) decision-making approach was proposed in this study for optimal decision making with stochastic parameters and multiple decision reliability objectives. Lake Dianchi, one of the three most eutrophic lakes in China, was examined as a case study for optimal watershed nutrient load reduction to restore lake water quality. This study aimed to maximize reliability levels from considerations of cost and load reductions. The Pareto solutions of the ROMO optimization model were generated with a multi-objective evolutionary algorithm, yielding schemes representing different biases towards reliability. The Pareto fronts of six maximum allowable emission (MAE) scenarios were obtained, which indicated that decisions may be unreliable under impractical load reduction requirements. A decision scheme identification process was conducted using the back-propagation neural network (BPNN) method to provide a shortcut for identifying schemes at specific reliability levels for decision makers. The model results indicated that the ROMO approach can offer decision makers great insights into reliability tradeoffs and can thus help them to avoid ineffective decisions. - Highlights: • Reliability-oriented multi-objective (ROMO) optimal decision approach was proposed. • The approach can avoid specifying reliability levels prior to optimization modeling. • Multiple reliability objectives can be systematically balanced using Pareto fronts. • Neural network model was used to identify schemes at specific reliability levels.
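
The Pareto-front idea underlying such reliability/cost trade-offs can be sketched with a plain non-dominance filter; the encoding of a scheme as a (cost, reliability) pair is an illustrative assumption.

```python
def pareto_front(schemes):
    """Return the non-dominated load-reduction schemes. Each scheme is a
    (cost, reliability) pair: minimize cost, maximize reliability. A
    scheme is dominated if another costs no more AND is at least as
    reliable, with strict improvement in at least one objective."""
    front = []
    for i, (c_i, r_i) in enumerate(schemes):
        dominated = any(
            (c_j <= c_i and r_j >= r_i) and (c_j < c_i or r_j > r_i)
            for j, (c_j, r_j) in enumerate(schemes) if j != i
        )
        if not dominated:
            front.append((c_i, r_i))
    return front
```

In practice a multi-objective evolutionary algorithm applies such a dominance test repeatedly while evolving the candidate set; the filter above shows only the selection criterion.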

  20. [Experimental studies of reliability of symbolic information perception from the aviation LCD panel].

    Ivanov, A I; Lapa, V V; Davydov, V V; Riabinin, V A; Golosov, S Iu


    Results of tachistoscopic experiments on reliability of symbol recognition on an LCD panel as a function of screen resolution (640 x 480, 800 x 600 and 1024 x 768 pixels), angular size of a picture element (10, 15, 20 and 30 angular min) and luminance contrast (LC) with the background (0.2 to 1.4 standard units) are presented. The obtained quantitative relations indicate the significance of the above parameters for recognition reliability. Symbols with a size of 30 angular min and LC of 0.5 were recognizable irrespective of screen resolution in the study. Recognition of symbols 20 and 15 angular min in size was strongly dependent on screen resolution and symbol LC. For symbols of size 10 angular min and LC ≥ 1.0, the recognition probability did not exceed 0.59-0.7.

  1. Reliable Selection and Holistic Stability Evaluation of Reference Genes for Rice Under 22 Different Experimental Conditions.

    Wang, Zhaohai; Wang, Ya; Yang, Jing; Hu, Keke; An, Baoguang; Deng, Xiaolong; Li, Yangsheng


    Stable and uniform expression of reference genes across samples plays a key role in accurate normalization of gene expression by reverse-transcription quantitative polymerase chain reaction (RT-qPCR). For rice studies, there is still a lack of validation and recommendation of appropriate reference genes with high stability depending on experimental conditions. Eleven candidate reference genes potentially possessing high stability were evaluated by geNorm and NormFinder for their expression stability in 22 different experimental conditions. The best combinations of multiple reference genes were recommended depending on experimental conditions, and the holistic stability of reference genes was also evaluated. Reference genes were more variable, and thus needed to be critically selected, in experimental groups of tissues, heat, 6-benzylamino purine, and drought, but they were comparatively stable under cold, wound, and ultraviolet-B stresses. Triosephosphate isomerase (TI), profilin-2 (Profilin-2), ubiquitin-conjugating enzyme E2 (UBC), endothelial differentiation factor (Edf), and ADP-ribosylation factor (ARF) were stable in most of the experimental conditions. No universal reference gene showed good stability in all experimental conditions. To obtain accurate expression results, a suitable combination of multiple reference genes for a specific experimental condition would be a better choice. This study provides an application guideline for selecting stable reference genes for rice gene expression studies.
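
The geNorm stability measure used in such evaluations can be sketched directly from its definition: gene j's M value is the average, over all other candidates k, of the standard deviation across samples of the pairwise log2 expression ratio (lower M = more stable). This is a minimal sketch of the published measure, not the geNorm software itself.

```python
import numpy as np

def genorm_m_values(expr):
    """geNorm-style stability measure M for candidate reference genes.
    expr: (n_samples, n_genes) array of relative expression quantities.
    M_j = mean over other genes k of SD across samples of
    log2(expr_j / expr_k)."""
    expr = np.asarray(expr, float)
    n_genes = expr.shape[1]
    log_expr = np.log2(expr)
    M = np.zeros(n_genes)
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        M[j] = np.mean(sds)
    return M
```

geNorm then iteratively discards the gene with the highest M and recomputes, until the most stable pair remains.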

  2. Managing Cybersecurity Research and Experimental Development: The REVO Approach

    Dan Craigen


    We present a systematic approach for managing a research and experimental development cybersecurity program that must be responsive to continuously evolving cybersecurity, and other, operational concerns. The approach will be of interest to research-program managers, academe, corporate leads, government leads, chief information officers, chief technology officers, and social and technology policy analysts. The approach is compatible with international standards and procedures published by the Organisation for Economic Co-operation and Development (OECD) and the Treasury Board of Canada Secretariat (TBS). The key benefits of the approach are the following: (i) the breadth of the overall (cybersecurity) space is described; (ii) depth statements about specific (cybersecurity) challenges are articulated and mapped to the breadth of the problem; (iii) specific (cybersecurity) initiatives that have been resourced through funding or personnel are tracked and linked to specific challenges; and (iv) progress is assessed through key performance indicators. Although we present examples from cybersecurity, the method may be transferred to other domains. We have found the approach to be rigorous yet adaptive to change; it challenges an organization to be explicit about the nature of its research and experimental development in a manner that fosters alignment with evolving business priorities, knowledge transfer, and partner engagement.

  3. A modelling approach to find stable and reliable soil organic carbon values for further regionalization.

    Bönecke, Eric; Franko, Uwe


    Soil organic matter (SOM) and soil organic carbon (SOC) might be the most important components for describing the fertility of agriculturally used soils. SOC is sensitive to temporal and spatial changes due to varying weather conditions, uneven crops, and soil management practices, and reliable delineation of its spatial variability remains difficult. Soil organic carbon is, furthermore, an essential initial parameter for dynamic modelling, e.g. for understanding carbon and nitrogen processes. However, attaining and using this information requires cost- and time-intensive field and laboratory work. The objective of this study is to assess an approach that reduces the effort of laboratory and field analyses by using a method to find stable initial soil organic carbon values for further soil process modelling and regionalization at field scale. The demand for strategies, techniques, and tools that improve reliable high-resolution soil organic carbon maps while reducing cost constraints is hence still receiving increasing attention in scientific research. Although combining effective sampling schemes with geophysical sensing techniques is nowadays a widely used practice for describing the within-field variability of soil organic carbon, it still faces large uncertainties, even at field scale, in both science and agriculture. Therefore, an analytical and modelling approach might facilitate and improve this strategy at small and large field scales. This study shows a method for finding reliable steady-state values of soil organic carbon at particular points, using the approved soil process model CANDY (Franko et al. 1995). It focuses on an iterative algorithm of adjusting the key driving components: soil physical properties, meteorological data, and management information, for which we quantified the inputs and losses of soil carbon (manure, crop residues, other organic inputs, decomposition, leaching). Furthermore, this approach can be combined with geophysical

  4. Features of applying systems approach for evaluating the reliability of cryogenic systems for special purposes

    E. D. Chertov


    The analysis of cryogenic installations confirms an objective trend: the number of tasks solved by special-purpose systems is increasing. One of the most important directions in the development of cryogenics is the creation of installations for producing air separation products, namely oxygen and nitrogen. Modern aviation complexes require these gases in large quantities, both in the gaseous and in the liquid state. The onboard gas systems used in aircraft of the Russian Federation are subdivided into: the oxygen system; the air (nitrogen) system; the neutral-gas system; and the fire-protection system. The technological schemes of air separation installations (ADI) are in many respects determined by the pressure of the compressed air or, in a general sense, by the refrigerating cycle. In the majority of ADI, the working body of the refrigerating cycle is the separated air itself; that is, the technological and refrigerating cycles are integrated in the installation. By this principle, one distinguishes installations of low pressure; of medium and high pressure; with expander; and with preliminary chilling. There is also a small number of ADI types in which the refrigerating and technological cycles are separated; these are installations with external chilling. To solve the tasks of monitoring the technical condition of the BRV hardware in real time and estimating reliability indicators, the use of multi-agent technologies is proposed. The multi-agent approach is the most suitable for building a decision-support system (SPPR) for reliability assessment because it allows: redistributing information processing across system elements, which increases overall performance; solving the problems of accumulating, storing, and reusing knowledge, which significantly increases the efficiency of reliability-assessment tasks; and considerably reducing human intervention in the functioning of the system, which saves the time of the decision maker (PMD) and does not require special skills in working with the system.

  5. Commutative and Non-commutative Parallelogram Geometry: an Experimental Approach

    Bertram, Wolfgang


    By "parallelogram geometry" we mean the elementary, "commutative", geometry corresponding to vector addition, and by "trapezoid geometry" a certain "non-commutative deformation" of the former. This text presents an elementary approach via exercises using dynamical software (such as GeoGebra), hopefully accessible to a wide mathematical audience, from undergraduate students and high school teachers to researchers, proceeding in three steps: (1) experimental geometry, (2) algebra (linear algebr...

  6. A Statistical Testing Approach for Quantifying Software Reliability; Application to an Example System

    Chu, Tsong-Lun [Brookhaven National Lab. (BNL), Upton, NY (United States); Varuttamaseni, Athi [Brookhaven National Lab. (BNL), Upton, NY (United States); Baek, Joo-Seok [Brookhaven National Lab. (BNL), Upton, NY (United States)


    The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan [1] for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I and C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling of digital hardware [2], development of a philosophical basis for defining software failure [3], and identification of desirable attributes of quantitative software reliability methods [4]. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability.

  7. SMART empirical approaches for predicting field performance of PV modules from results of reliability tests

    Hardikar, Kedar Y.; Liu, Bill J. J.; Bheemreddy, Venkata


    Gaining an understanding of degradation mechanisms and their characterization is critical in developing relevant accelerated tests to ensure PV module performance warranty over a typical lifetime of 25 years. As newer technologies are adapted for PV, including new PV cell technologies, new packaging materials, and newer product designs, the availability of field data over extended periods of time for product performance assessment cannot be expected within the typical timeframe for business decisions. In this work, to enable product design decisions and product performance assessment for PV modules utilizing newer technologies, Simulation and Mechanism based Accelerated Reliability Testing (SMART) methodology and empirical approaches to predict field performance from accelerated test results are presented. The method is demonstrated for field life assessment of flexible PV modules based on degradation mechanisms observed in two accelerated tests, namely, Damp Heat and Thermal Cycling. The method is based on design of an accelerated testing scheme with the intent to develop relevant acceleration factor models. The acceleration factor model is validated by extensive reliability testing under different conditions going beyond the established certification standards. Once the acceleration factor model is validated for the test matrix, a modeling scheme is developed to predict field performance from results of accelerated testing for particular failure modes of interest. Further refinement of the model can continue as more field data becomes available. While the demonstration of the method in this work is for thin film flexible PV modules, the framework and methodology can be adapted to other PV products.
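
Acceleration factor models of the kind validated in such testing are often Arrhenius-type for thermally activated mechanisms. A minimal sketch follows; the activation energy in the test is an illustrative value, since in practice it is fit from the accelerated test data for the specific failure mode.

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between field-use and accelerated
    test temperatures (deg C) for a thermally activated mechanism:
    AF = exp(Ea/k * (1/T_use - 1/T_stress)), temperatures in kelvin."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp(ea_ev / K_B_EV * (1.0 / t_use - 1.0 / t_stress))
```

With the AF in hand, hours in a Damp Heat or Thermal Cycling chamber are translated into equivalent field exposure; humidity- and cycling-driven mechanisms use analogous (Peck, Coffin-Manson) forms.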

  8. Assessment of landslide distribution map reliability in Niigata prefecture - Japan using frequency ratio approach

    Rahardianto, Trias; Saputra, Aditya; Gomez, Christopher


    Research on landslide susceptibility has evolved rapidly over the last few decades thanks to the availability of large databases. Landslide research used to focus on discrete events, but the usage of large inventory datasets has become a central pillar of landslide susceptibility, hazard, and risk assessment. Indeed, extracting meaningful information from large databases is now at the forefront of geoscientific research, following the big-data research trend. The more comprehensive the information on past landslides in a particular area is, the better the produced map will be at supporting effective decision making, planning, and engineering practice. The landslide inventory data that is freely accessible online gives an opportunity for many researchers and decision makers to prevent casualties and economic loss caused by future landslides. This data is advantageous especially for areas with poor landslide historical data. Since the construction criteria for landslide inventory maps and their quality evaluation remain poorly defined, an assessment of open-source landslide inventory map reliability is required. The present contribution aims to assess the reliability of open-source landslide inventory data based on the particular topographical setting of the observed area in Niigata prefecture, Japan. A Geographic Information System (GIS) platform and a statistical approach are applied to analyze the data. The frequency ratio method is utilized to model and assess the landslide map. The outcomes of the generated model showed unsatisfactory results, with an AUC value of 0.603 indicating the low prediction accuracy and unreliability of the model.
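
The frequency ratio method named above can be sketched directly from its definition: for each class of a terrain factor (slope band, lithology, etc.), FR is the class's share of landslide cells divided by its share of total area. Array shapes and class encoding below are illustrative assumptions.

```python
import numpy as np

def frequency_ratio(class_ids, landslide_mask):
    """Frequency ratio per terrain class. FR > 1 means the class is
    more landslide-prone than average; summing the FRs of a cell's
    classes over all factors gives its susceptibility index."""
    class_ids = np.asarray(class_ids)
    landslide_mask = np.asarray(landslide_mask, bool)
    n_total = class_ids.size
    n_slides = landslide_mask.sum()
    fr = {}
    for c in np.unique(class_ids):
        in_class = class_ids == c
        pct_slides = landslide_mask[in_class].sum() / n_slides
        pct_area = in_class.sum() / n_total
        fr[int(c)] = float(pct_slides / pct_area)
    return fr
```

The resulting susceptibility index map is then scored against the inventory (e.g. by AUC, as in the 0.603 reported above).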

  9. Is orthopantomography reliable for TMJ diagnosis? An experimental study on a dry skull.

    Ruf, S; Pancherz, H


    The accuracy of orthopantomography in reproducing the temporomandibular joint area was analyzed on a dry skull. The results based on this study of a single skull revealed that the radiographic image of the temporomandibular joint did not correspond to the anatomic condylar and fossa components or to their actual relationship. To a large extent, changes in skull position affected the radiographic temporomandibular joint image, simulating anterior condylar flattening, osteophytes, narrowing of joint space, and left/right condylar asymmetry. Orthopantomography may have questionable reliability for temporomandibular joint diagnostic purposes.

  10. Enforcing a system approach to composite failure criteria for reliability analysis

    Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian


    The use of failure criteria for composite materials poses challenges, since composite materials are a discontinuous medium, which invokes multiple failure modes. Under deterministic conditions the material properties and the stress vector are constant and will result in a single dominating failure mode. When any of these input parameters are random, multiple failure modes may be identified, which will jeopardize the FORM analysis, and a system approach should be applied to assure a correct analysis. Although crude Monte Carlo simulation may automatically account for such effects, time constraints limit its usability in problems involving advanced FEM models. When applying more computationally efficient methods based on FORM/SORM it is important to carefully account for the multiple failure modes described by the failure criterion. The present paper discusses how to handle this problem and presents examples where reliability...

  11. On the reliability of NMR relaxation data analyses: a Markov Chain Monte Carlo approach.

    Abergel, Daniel; Volpato, Andrea; Coutant, Eloi P; Polimeno, Antonino


    The analysis of NMR relaxation data is revisited along the lines of a Bayesian approach. Using a Markov Chain Monte Carlo strategy of data fitting, we investigate conditions under which relaxation data can be effectively interpreted in terms of internal dynamics. The limitations to the extraction of kinetic parameters that characterize internal dynamics are analyzed, and we show that extracting characteristic time scales shorter than a few tens of ps is very unlikely. However, using MCMC methods, reliable estimates of the marginal probability distributions and estimators (average, standard deviations, etc.) can still be obtained for subsets of the model parameters. Thus, unlike more conventional strategies of data analysis, the method avoids a model selection process. In addition, it indicates what information may be extracted from the data, but also what cannot.
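
The MCMC strategy referred to can be sketched with a plain random-walk Metropolis sampler; the Gaussian proposal, step size, and chain length below are illustrative choices, not the paper's settings.

```python
import numpy as np

def metropolis(log_post, x0, n_steps=5000, step=0.1, seed=0):
    """Random-walk Metropolis sampler: propose x' = x + step*N(0,1),
    accept with probability min(1, exp(logpi(x') - logpi(x))). The
    chain's histogram approximates the marginal posterior, from which
    estimators (mean, SD, credible intervals) are read off directly."""
    rng = np.random.default_rng(seed)
    chain = np.empty(n_steps)
    x, lp = x0, log_post(x0)
    for i in range(n_steps):
        x_prop = x + step * rng.normal()
        lp_prop = log_post(x_prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            x, lp = x_prop, lp_prop
        chain[i] = x
    return chain
```

In the relaxation-data setting, `log_post` would combine the likelihood of the measured relaxation rates given the dynamical model parameters with the priors; sampling the full posterior is what lets one report marginal distributions instead of committing to a single selected model.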

  12. Interactions among biotic and abiotic factors affect the reliability of tungsten microneedles puncturing in vitro and in vivo peripheral nerves: A hybrid computational approach

    Sergi, Pier Nicola, E-mail: [Translational Neural Engineering Laboratory, The Biorobotics Institute, Scuola Superiore Sant' Anna, Viale Rinaldo Piaggio 34, Pontedera, 56025 (Italy); Jensen, Winnie [Department of Health Science and Technology, Fredrik Bajers Vej 7, 9220 Aalborg (Denmark); Yoshida, Ken [Department of Biomedical Engineering, Indiana University - Purdue University Indianapolis, 723 W. Michigan St., SL220, Indianapolis, IN 46202 (United States)


    Tungsten is an elective material for producing slender and stiff microneedles able to enter soft tissues and minimize puncture wounds. In particular, tungsten microneedles are used to puncture peripheral nerves and insert neural interfaces, bridging the gap between the nervous system and robotic devices (e.g., hand prostheses). Unfortunately, microneedles fail during the puncture process, and this failure is not dependent on the stiffness or fracture toughness of the constituent material. In addition, the microneedles' performance decreases during in vivo trials with respect to in vitro ones. This further effect is independent of internal biotic effects, while it seems to be related to external biotic causes. Since the exact synergy of phenomena decreasing the in vivo reliability is still not known, this work explored the connection between the in vitro and in vivo behavior of tungsten microneedles through the study of interactions between biotic and abiotic factors. A hybrid computational approach, simultaneously using theoretical relationships and in silico models of nerves, was implemented to model the change of reliability with varying microneedle diameter, and to predict in vivo performance by using in vitro reliability and local differences between the in vivo and in vitro mechanical response of nerves. - Highlights: • We provide phenomenological Finite Element (FE) models of peripheral nerves to study the interactions with W microneedles • We provide a general interaction-based approach to model the reliability of slender microneedles • We evaluate the reliability of W microneedles to puncture in vivo nerves • We provide a novel synergistic hybrid approach (theory + simulations) involving interactions among biotic and abiotic factors • We validate the hybrid approach by using experimental data from literature.

  13. Systematic review of survival time in experimental mouse stroke with impact on reliability of infarct estimation

    Klarskov, Carina Kirstine; Klarskov, Mikkel Buster; Hasseldam, Henrik


    Background: Stroke is the second most common cause of death worldwide. Only one treatment for acute ischemic stroke is currently available, thrombolysis with rt-PA, but it is limited in its use. Many efforts have been invested in order to find additive treatments, without success. A multitude of reasons for the translational problems from mouse experimental stroke to clinical trials probably exists, including infarct size estimations around the peak time of edema formation. Furthermore, edema is a more prominent feature of stroke in mice than in humans, because of the tendency to produce larger infarcts with more substantial edema. Purpose: This paper will give an overview of previous studies of experimental mouse stroke, and correlate survival time to peak time of edema formation. Furthermore, investigations of whether the included studies corrected the infarct measurements for edema...

  14. Assessment of buckling-restrained braced frame reliability using an experimental limit-state model and stochastic dynamic analysis

    Andrews, Blake M.; Song, Junho; Fahnestock, Larry A.


    Buckling-restrained braces (BRBs) have recently become popular in the United States for use as primary members of seismic lateral-force-resisting systems. A BRB is a steel brace that does not buckle in compression but instead yields in both tension and compression. Although design guidelines for BRB applications have been developed, systematic procedures for assessing performance and quantifying reliability are still needed. This paper presents an analytical framework for assessing buckling-restrained braced frame (BRBF) reliability when subjected to seismic loads. This framework efficiently quantifies the risk of BRB failure due to low-cycle fatigue fracture of the BRB core. The procedure includes a series of components that: (1) quantify BRB demand in terms of BRB core deformation histories generated through stochastic dynamic analyses; (2) quantify the limit-state of a BRB in terms of its remaining cumulative plastic ductility capacity based on an experimental database; and (3) evaluate the probability of BRB failure, given the quantified demand and capacity, through structural reliability analyses. Parametric studies were conducted to investigate the effects of the seismic load, and characteristics of the BRB and BRBF on the probability of brace failure. In addition, fragility curves (i.e., conditional probabilities of brace failure given ground shaking intensity parameters) were created by the proposed framework. While the framework presented in this paper is applied to the assessment of BRBFs, the modular nature of the framework components allows for application to other structural components and systems.
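
Fragility curves of the kind produced by the framework are commonly parameterized in lognormal form; a minimal sketch, where the median capacity theta and the dispersion beta would come from the stochastic dynamic analyses and the experimental limit-state database rather than from the illustrative values in the test.

```python
import math

def lognormal_fragility(im, theta, beta):
    """Conditional probability of brace failure given a ground-motion
    intensity measure im, in the standard lognormal form
    P(failure | IM = im) = Phi(ln(im / theta) / beta),
    with theta the median capacity and beta the lognormal dispersion."""
    z = math.log(im / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

Evaluating this over a grid of intensity values traces out the fragility curve; convolving it with a site hazard curve gives the annual probability of brace failure.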



    I. V. Kudriavtsev


    This review is focused on analysis of currently used flow cytometric methods designed for identifying apoptotic cells in various invertebrate and vertebrate species. Apoptosis can be characterized by stage-specific morphological and biochemical changes that are typical of all kinds of eukaryotic cells. In this article, we consider different techniques of apoptosis detection based on assessment of cellular morphology and plasma membrane alterations, activation of intracellular enzymes and components of the caspase cascade, as well as DNA fragmentation and failure of mitochondrial transmembrane potential, as assessed in various animal groups. Apoptosis is recognized as a key mechanism for maintaining cellular homeostasis in multicellular organisms, and such investigations represent a necessary component of fundamental and applied studies in diverse fields of experimental biology and immunology. A broad spectrum of apoptosis markers is used, and preference is given to the optimal approaches as determined by the experimental tasks and technical opportunities of the laboratory.

  17. How Random Is Quantum Randomness? An Experimental Approach

    Calude, Cristian S; Dumitrescu, Monica; Svozil, Karl


    Our aim is to experimentally study the possibility of distinguishing between quantum sources of randomness--recently proved to be theoretically incomputable--and some well-known computable sources of pseudo-randomness. Incomputability is a necessary, but not sufficient, "symptom" of "true randomness". We base our experimental approach on algorithmic information theory, which provides characterizations of algorithmic random sequences in terms of the degrees of incompressibility of their finite prefixes. Algorithmic random sequences are incomputable, but the converse implication is false. We have performed tests of randomness on pseudo-random strings (finite sequences) of length $2^{32}$ generated with software (Mathematica, Maple), which are cyclic (so, strongly computable), the bits of $\pi$, which is computable but not cyclic, and strings produced by quantum measurements (with the commercial device Quantis and by the Vienna IQOQI group). Our empirical tests indicate quantitative differences, some statistically...

  18. An Approach to Online Reliability Evaluation and Prediction of Mechanical Transmission Components

    Matthias Maisch; Bernd Bertsche; Ralf Hettich


    New development trends in electronic operating data logging systems enable classification, recording and storage of load spectrums of mechanical transmission components during usage. Based on this fact, the application of online reliability evaluation and reliability prediction procedures is presented. Different methods are considered to calculate reliability, depending on the actual load spectrum and a Wöhler curve. The prediction of a reliability trend is analyzed by the application of time series models. For this purpose, the exponential smoothing model, the regression model, and the ARIMA model are considered to evaluate data and predict decreasing reliability trends during usage.
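The exponential smoothing model mentioned above can be sketched in a few lines. The smoothing constant and the declining reliability series below are arbitrary illustrations, not values from the paper:

```python
def exp_smooth(series, alpha=0.3):
    """Simple exponential smoothing of a reliability time series.

    Each smoothed value blends the newest observation with the previous
    smoothed value: s_t = alpha * x_t + (1 - alpha) * s_{t-1}.
    """
    s = series[0]
    out = [s]
    for x in series[1:]:
        s = alpha * x + (1 - alpha) * s
        out.append(s)
    return out
```

The final smoothed value can then be extrapolated as a one-step-ahead forecast of the reliability trend; the regression and ARIMA models in the abstract serve the same forecasting role with more structure.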

  19. A Bayesian experimental design approach to structural health monitoring

    Farrar, Charles [Los Alamos National Laboratory]; Flynn, Eric [UCSD]; Todd, Michael [UCSD]


    Optimal system design for SHM involves two primary challenges. The first is the derivation of a proper performance function for a given system design. The second is the development of an efficient optimization algorithm for choosing a design that maximizes, or nearly maximizes, the performance function. In this paper we outline how an SHM practitioner can construct the proper performance function by casting the entire design problem into a framework of Bayesian experimental design. The approach demonstrates how the design problem necessarily ties together all steps of the SHM process.

  20. Skill and reliability of experimental GEFS ensemble forecast guidance designed to inform decision-making in reservoir management in California

    Scheuerer, Michael; Webb, Robert S.; Hamill, Thomas M.


    Many reservoirs operated by the U.S. Army Corps of Engineers (Corps) in California provide flood control as well as water supply, recreation and stream flow regulation. Operations for flood control follow seasonally specified elevations for an upper volume of reservoir storage with unused storage capacity designated for flood risk management and thus not available for water supply storage. In the flood control operation of these reservoirs, runoff is captured during rain events and then released soon after at rates that do not result in downstream flooding (typically over a 5 to 8 day period), resulting in evacuated storage space to capture runoff from the next potential storm. As part of the Forecast-Informed Reservoir Operations (FIRO) partnership to more effectively balance flood and drought risks, we developed an experimental California medium-range precipitation forecast system based on NCEP GEFS reforecasts and Climatology-Calibrated Precipitation Analysis (CCPA). We have applied this experimental forecast system to predict the probability of day 5-10 precipitation accumulations at each CCPA grid point within California to exceed certain pre-specified thresholds. Discussions with flood and water supply managers indicate that forecast guidance for the very low risk of extreme precipitation for watersheds above reservoirs can be valuable for decision making. In this study, we assess the skill and reliability of this experimental forecast system to predict low probabilities of precipitation extreme events for select watersheds during recent winter precipitation seasons. Our analysis indicates that there may be sufficient reliability in forecast guidance for low probabilities of heavy precipitation events to inform decision making in reservoir management in select California river basins, managing flood risk while increasing water supply for consumptive use and ecosystem services.
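Skill of such threshold-exceedance guidance is commonly summarized with the Brier score, the mean squared difference between forecast probability and the 0/1 outcome. This is a standard verification measure, not necessarily the exact one used in the study:

```python
def brier_score(probs, outcomes):
    """Brier score for probabilistic exceedance forecasts (lower is better).

    probs: forecast probabilities in [0, 1]; outcomes: 1 if the event
    (e.g. day 5-10 precipitation above a threshold) occurred, else 0.
    """
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)
```

A perfectly sharp, perfectly calibrated forecast scores 0; a constant 0.5 forecast scores 0.25 regardless of the outcome.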

  1. An approach to the design of a simple, radiation hard, highly integrated and reliable, lightweight satellite bus

    Nock, K. T.; Salazar, R. P.; Buck, C.; Eisenman, D.


    A spacecraft system concept has been developed which uses ion propulsion and a small, simple, lightweight design approach to carry out a focused lunar science mission. The design relies on existing technology to achieve high reliability and the possibility of a near-term launch. Key characteristics of the mission are highly focused objectives, low launch costs, minimization of development risks and design simplicity. The spacecraft for the lunar mission is described with particular emphasis on the design and reliability approaches used. The degree of adaptability of this basic class of spacecraft bus is discussed with particular emphasis on its application as a technology testbed vehicle. Several potential mission applications are reviewed.

  2. Probabilistic Approach to System Reliability of Mechanism with Correlated Failure Models

    Xianzhen Huang


    In this paper, based on kinematic accuracy theory and a matrix-based system reliability analysis method, a practical method for system reliability analysis of the kinematic performance of planar linkages with correlated failure modes is proposed. The Taylor series expansion is utilized to derive a general expression for the kinematic performance errors caused by random variables. A proper limit state function (performance function) for reliability analysis of the kinematic performance of planar linkages is established. Through reliability theory and the linear programming method, the upper and lower bounds of the system reliability of planar linkages are provided. In the course of system reliability analysis, the correlation of different failure modes is considered. Finally, the practicality, efficiency, and accuracy of the proposed method are shown by a numerical example.
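The Taylor-series step can be illustrated by first-order error propagation: for independent inputs, Var[g(X)] ≈ Σ (∂g/∂x_i)² Var[x_i]. The gradients and variances below are placeholders, and the independence assumption is a simplification of the paper's correlated setting:

```python
def first_order_variance(grads, variances):
    """First-order (delta-method) variance of a kinematic error function.

    grads: partial derivatives of the performance function at the mean
    point; variances: variances of the independent random variables.
    """
    return sum(g * g * v for g, v in zip(grads, variances))
```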

  3. A new approach to real-time reliability analysis of transmission system using fuzzy Markov model

    Tanrioven, M.; Kocatepe, C. [University of Yildiz Technical, Istanbul (Turkey). Dept. of Electrical Engineering; Wu, Q.H.; Turner, D.R.; Wang, J. [Liverpool Univ. (United Kingdom). Dept. of Electrical Engineering and Economics


    To date, studies of power system reliability over a specified time period have used average values of the system transition rates in Markov techniques [Singh C, Billinton R. System reliability modeling and evaluation. London: Hutchison Educational; 1977]. However, the level of power system reliability varies from time to time due to weather conditions, power demand and random faults [Billinton R, Wojczynski E. Distributional variation of distribution system reliability indices. IEEE Trans Power Apparatus Systems 1985; PAS-104(11):3152-60]. It is essential to obtain an estimate of system reliability under all environmental and operating conditions. In this paper, fuzzy logic is used in the Markov model to describe both transition rates and temperature-based seasonal variations, which identifies multiple weather conditions such as normal, less stormy, very stormy, etc. A three-bus power system model is considered to determine the variation of system reliability in real time, using this newly developed fuzzy Markov model (FMM). The results cover different aspects such as daily and monthly reliability changes during January and August. The reliability of the power transmission system is derived as a function of augmentation in peak load level. Finally, the variation of the system reliability with weather conditions is determined. (author)
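For a single repairable component, the Markov steady state underlying such models reduces to A = μ/(λ + μ); weather dependence enters by swapping in per-condition rates. The rates below are illustrative, not taken from the paper:

```python
def steady_state_availability(failure_rate, repair_rate):
    """Long-run availability of a two-state (up/down) Markov component:
    A = mu / (lambda + mu), with failure/repair rates per hour."""
    return repair_rate / (failure_rate + repair_rate)

# Hypothetical weather-dependent failure rates (repair rate held fixed):
normal_weather = steady_state_availability(0.001, 0.1)
stormy_weather = steady_state_availability(0.01, 0.1)
```

The fuzzy Markov model in the abstract generalizes this idea, replacing crisp per-weather rates with fuzzy membership-weighted ones.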

  4. An approach to verification and validation of a reliable multicasting protocol: Extended Abstract

    Callahan, John R.; Montgomery, Todd L.


    This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. This initial version did not handle off-nominal cases such as network partitions or site failures. Meanwhile, the V&V team concurrently developed a formal model of the requirements using a variant of SCR-based state tables. Based on these requirements tables, the V&V team developed test cases to exercise the implementation. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test in the model and implementation agreed, then the test either found a potential problem or verified a required behavior. However, if the execution of a test was different in the model and implementation, then the differences helped identify inconsistencies between the model and implementation. In either case, the dialogue between both teams drove the co-evolution of the model and implementation. We have found that this

  5. Reliability of core test – Critical assessment and proposed new approach

    Shafik Khoury


    Core testing is commonly required in the concrete industry to evaluate concrete strength, and it sometimes becomes the unique tool for safety assessment of existing concrete structures. The core test is therefore introduced in most codes. An extensive literature survey on different international codes’ provisions for core analysis, including the Egyptian, British, European and ACI codes, is presented. All studied codes’ provisions seem to be unreliable for predicting the in-situ concrete cube strength from the results of core tests. A comprehensive experimental study was undertaken to examine the factors affecting the interpretation of core test results. The program involves four concrete mixes, three concrete grades (18, 30 and 48 MPa), five core diameters (1.5, 2, 3, 4 and 6 in.), five core aspect ratios (between 1 and 2), two types of coarse aggregate (pink limestone and gravel), two coring directions, three moisture conditions and 18 different steel arrangements. Prototypes for concrete slabs and columns were constructed. More than 500 cores were prepared and tested, in addition to a tremendous number of concrete cubes and cylinders. Results indicate that the core strength reduces with an increase in aspect ratio, a reduction in core diameter, the presence of reinforcing steel, the incorporation of gravel in concrete, an increase in core moisture content, drilling perpendicular to the casting direction, and a reduction in concrete strength. The Egyptian code provision for core interpretation is critically examined. Based on the experimental evidence throughout this study, statistical analysis has been performed to determine reliable strength correction factors that account for the studied variables. A simple weighted regression analysis of a model without an intercept was carried out using the “SAS Software” package as well as “Data Fit” software. A new model for interpretation of core test results is proposed considering
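The weighted no-intercept regression used to fit the correction factors has a closed form, b = Σw·x·y / Σw·x². A minimal sketch (the data and weights in the test are fabricated, not core results from the study):

```python
def wls_slope_no_intercept(x, y, w):
    """Weighted least-squares slope for the model y ≈ b*x (no intercept):
    b = sum(w*x*y) / sum(w*x*x)."""
    num = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    den = sum(wi * xi * xi for wi, xi in zip(w, x))
    return num / den
```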

  6. An Evidence Theoretic Approach to Design of Reliable Low-Cost UAVs


    example, work by Sobieszczanski-Sobieski et al. [31], Giunta et al. [12], and Peoples & Willcox [26]. Reliability-based design optimization (RBDO) has been proposed to incorporate failure modes and component (un)reliability into the optimization process [1]. MDO and RBDO are popular among

  7. Management of reliability and maintainability; a disciplined approach to fleet readiness

    Willoughby, W. J., Jr.


    Material acquisition fundamentals were reviewed and include: mission profile definition, stress analysis, derating criteria, circuit reliability, failure modes, and worst case analysis. Military system reliability was examined with emphasis on the sparing of equipment. The Navy's organizational strategy for 1980 is presented.

  8. Analyzing patterns in experts' approaches to solving experimental problems

    Čančula, Maja Poklinek; Planinšič, Gorazd; Etkina, Eugenia


    We report detailed observations of three pairs of expert scientists and a pair of advanced undergraduate students solving an experimental optics problem. Using a new method ("transition graphs") of visualizing sequences of logical steps, we were able to compare the groups and identify patterns that could not be found using previously existing methods. While the problem solving of undergraduates significantly differed from that of experts at the beginning of the process, it gradually became more similar to the expert problem solving. We mapped problem solving steps and their sequence to the elements of an approach to teaching and learning physics called Investigative Science Learning Environment (ISLE), and we speculate that the ISLE educational framework closely represents the actual work of physicists.

  9. Fourier transform approach in modulation technique of experimental measurements.

    Khazimullin, M V; Lebedev, Yu A


    An application of the Fourier transform approach in the modulation technique of experimental studies is considered. This method has obvious advantages compared with the traditional lock-in amplifier technique: a simple experimental setup, quickly available information on all the required harmonics, and high speed of data processing using the fast Fourier transform algorithm. A computationally simple, fast and accurate Fourier coefficients interpolation (FCI) method has been implemented to obtain useful information from the harmonics of a multimode signal. Our analysis shows that in this case the FCI method has a systematic error (bias) in the estimation of signal parameters, which becomes essential for short data sets. Hence, a new differential Fourier coefficients interpolation (DFCI) method has been suggested, which is less sensitive to the presence of several modes in a signal. The analysis has been confirmed by simulations and by measurements of a quartz wedge birefringence by means of a photoelastic modulator. The obtained bias, noise level, and measuring speed are comparable to, and even better than, those of the lock-in amplifier technique. Moreover, the presented DFCI method is a promising candidate for use in actively developing imaging systems based on the modulation technique, which require fast digital signal processing of large data sets.
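The starting point of such methods, reading a harmonic amplitude out of the discrete Fourier coefficients of a modulated signal, can be sketched as follows. This is a plain bin read-out; the FCI/DFCI interpolation refinements described above are omitted:

```python
import math

def harmonic_amplitude(samples, k):
    """Amplitude of the k-th harmonic of a real signal sampled over an
    integer number of periods, from its discrete Fourier coefficient."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return 2.0 / n * math.hypot(re, im)
```

When the sampling window is not an integer number of periods, the energy leaks across bins; that leakage is what the interpolation methods in the abstract are designed to correct.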

  10. A reliable approach to distinguish between transients with and without HFOs using TQWT and MCA.

    Chaibi, Sahbi; Lajnef, Tarek; Sakka, Zied; Samet, Mounir; Kachouri, Abdennaceur


    Recent studies have reported that discrete high frequency oscillations (HFOs) in the range of 80-500 Hz may serve as promising biomarkers of the seizure focus in humans. Visual scoring of HFOs is tiring, time consuming, highly subjective and requires a great deal of mental concentration. Given the recent explosion of HFO research, development of a robust automated detector is expected to play a vital role in studying HFOs and their relationship to epileptogenesis. Therefore, a handful of automated detectors have been introduced in the literature over the past few years. In fact, all the proposed methods have been associated with high false-positive rates, which essentially arise from filtered sharp transients such as spikes, sharp waves and artifacts. In order to specifically minimize false positive rates and improve the specificity of HFO detection, we propose a new approach combining the tunable Q-factor wavelet transform (TQWT), morphological component analysis (MCA) and the complex Morlet wavelet (CMW). The main findings of this study can be summarized as follows: the proposed method results in a sensitivity of 96.77%, a specificity of 85.00% and a false discovery rate (FDR) of 7.41%. Compared to this, the classical CMW method applied directly to the signals, without pre-processing by TQWT-MCA, achieves a sensitivity of 98.71%, a specificity of 18.75%, and an FDR of 29.95%. The proposed method may be considered highly accurate for distinguishing between transients with and without HFOs. Consequently, it is remarkably reliable and robust for the detection of HFOs.
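The reported figures follow from the standard confusion-matrix definitions. As a sanity check (the counts below are arbitrary, not those of the study):

```python
def detector_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and false discovery rate of a detector.

    sensitivity = TP/(TP+FN); specificity = TN/(TN+FP); FDR = FP/(FP+TP).
    """
    return tp / (tp + fn), tn / (tn + fp), fp / (fp + tp)
```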

  11. Fatigue reliability analysis of fixed offshore structures: A first passage problem approach



    This paper describes a methodology for computing the reliability of members of fixed offshore platform structures, with respect to fatigue. Failure criteria were formulated using fracture mechanics principles. The problem is posed as a "first passage problem". The method was illustrated through application to a typical plane frame structure. The fatigue reliability degradation curve established can be used for planning in-service inspection of offshore platforms. A very limited parametric study was carried out to obtain insight into the effect of important variables on the fatigue reliability.

  12. An experimental psychophysiological approach to human bradycardiac reflexes.

    Furedy, J J


    Bradycardic reflexes in man are both of scientific and clinical interest. Using the methods of experimental psychophysiology, control over relevant independent variables permits the study of fine-grained temporal physiologic response topographies, and of psychological factors that may modify the reflex. In addition, information can also be sought through interdisciplinary collaborations with experimental physiologists in order to shed light on the mechanism of the reflexes. These general features of the approach are illustrated by presenting data on two bradycardic reflex preparations: the laboratory dive analog, and the 90-degree negative tilt. The dive-analog studies have shown that a) the dive-reflex proper is a late-occurring bradycardia accompanied by a late-occurring vasoconstriction; and b) for the elicitation of this reflex, both breath-holding and face immersion are necessary. In addition, the physiologic manipulation of temperature affects the reflex in an inverse way over the range of 10 degrees to 40 degrees C, while the sense of control (a psychological variable) attenuates the reflex. The negative-tilt preparation produces a bradycardic response that is ideal as a Pavlovian unconditional response. Some Pavlovian conditioning arrangements, especially an "imaginational" form, do produce significant conditional bradycardic responding, and this has both potential clinical (e.g., biofeedback-related) and theoretical (e.g., S-R vs. S-S accounts of Pavlovian conditioning) applications. The paper ends with a comment on the cognitive paradigm shift in psychology. Although this shift is of importance, it is suggested that it is also important to "remember the response."

  13. Science and society: different bioethical approaches towards animal experimentation.

    Brom, Frans W A


    The use of live animals for experiments plays an important role in many forms of research. This gives rise to an ethical dilemma. On the one hand, most of the animals used are sentient beings who may be harmed by the experiments. The research, on the other hand, may be vital for preventing, curing or alleviating human diseases. There is no consensus on how to tackle this dilemma. One extreme is the view taken by adherents of the so-called animal rights view. According to this view, we are never justified in harming animals for human purposes, however vital these purposes may be. The other extreme is the ruthless view, according to which animals are there to be used at our discretion. However, most people hold a view situated somewhere between these two extremes. It is accepted that animals may be used for research, contrary to the animal rights view. However, contrary to the ruthless view, such use is accepted only under certain conditions. The aim of this presentation is to outline different ethical views that may serve as a foundation for specifying the circumstances under which it is acceptable to use animals for research. Three views serving this role are contractarianism, utilitarianism and a deontological approach. According to contractarianism, the key ethical issue is concern for the sentiments of other human beings in society, on whose co-operation those responsible for research depend. Thus it is acceptable to use animals as long as most people can see the point of the experiment and are not offended by the way it is done. According to utilitarianism, the key ethical issue is about the consequences for humans and animals. Thus it is justified to use animals for research if enough good comes out of it in terms of preventing suffering and creating happiness, and if there is no better alternative. 
In the deontological approach the prima facie duty of beneficence towards human beings has to be weighed against the prima facie duties not to harm animals and to

  14. Assessing the reliability of amplified RNA used in microarrays: a DUMB table approach.

    Bearden, Edward D; Simpson, Pippa M; Peterson, Charlotte A; Beggs, Marjorie L


    A certain minimal amount of RNA from biological samples is necessary to perform a microarray experiment with suitable replication. In some cases, the amount of RNA available is insufficient, necessitating RNA amplification prior to target synthesis. However, there is some uncertainty about the reliability of targets that have been generated from amplified RNA, because of nonlinearity and preferential amplification. This current work develops a straightforward strategy to assess the reliability of microarray data obtained from amplified RNA. The tabular method we developed, which utilises a Down-Up-Missing-Below (DUMB) classification scheme, shows that microarrays generated with amplified RNA targets are reliable within constraints. There was an increase in false negatives because of the need for increased filtering. Furthermore, this analysis method is generic and can be broadly applied to evaluate all microarray data. A copy of the Microsoft Excel spreadsheet is available upon request from Edward Bearden.
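A toy version of labelling one microarray spot under such a scheme might look like this. The abstract does not give the actual cutoffs; `floor` and `threshold` here are hypothetical values chosen only to make the logic concrete:

```python
def dumb_label(signal, ratio, floor=100.0, threshold=2.0):
    """Label one microarray spot Down/Up/Missing/Below (toy sketch).

    signal: raw intensity (None if the spot failed to hybridize);
    ratio: amplified vs. unamplified expression ratio.
    Cutoff values are invented for illustration.
    """
    if signal is None:
        return "Missing"
    if signal < floor:
        return "Below"
    if ratio >= threshold:
        return "Up"
    if ratio <= 1.0 / threshold:
        return "Down"
    return "Unchanged"
```

Cross-tabulating these labels for amplified versus unamplified targets then shows where amplification preserves, loses, or inverts calls.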

  15. Force-based and displacement-based reliability assessment approaches for highway bridges under multiple hazard actions

    Chao Huang


    The strength limit state of the American Association of State Highway and Transportation Officials (AASHTO) Load and Resistance Factor Design (LRFD) Bridge Design Specifications is developed based on the failure probabilities of the combination of non-extreme loads. The proposed design limit state equation (DLSE) has been fully calibrated for dead load and live load by using the reliability-based approach. On the other hand, most DLSEs in other limit states, including extreme events Ⅰ and Ⅱ, have not been developed and calibrated, though certain probability-based concepts were taken into account. This paper presents an assessment procedure for highway bridge reliabilities under the limit state of extreme event Ⅰ, i.e., the combination of dead load, live load and earthquake load. A force-based approach and a displacement-based approach are proposed and implemented on a set of nine simplified bridge models. Results show that the displacement-based approach yields more convergent and accurate reliabilities for the selected models, and it can be applied to other hazards.

  16. Teachers' implicit personality theories about the gifted: an experimental approach.

    Baudson, Tanja Gabriele; Preckel, Franzis


    The implicit theories teachers hold about the gifted influence their perception of and behavior toward highly able students, thus impacting the latter's educational opportunities. Two persistent stereotypes about the gifted can be distinguished: the harmony hypothesis (gifted students are superior in almost all domains) and the disharmony hypothesis (giftedness implies maladaptive social behavior and emotional problems). The present study investigated whether teachers' implicit personality theories about the gifted are in line with the harmony or the disharmony hypothesis. Using an experimental vignette approach, we examined 321 prospective and practicing teachers' implicit personality theories (based on the big five personality framework) about students described along three dimensions (ability level, gender, and age, resulting in 8 different vignettes), controlling for teachers' age, gender, experience with gifted students, and knowledge about giftedness. Ability level had the strongest effect on teachers' ratings (partial η² = .60). Students described as gifted were perceived as more open to new experiences, more introverted, less emotionally stable, and less agreeable (all ps < .001). No differences were found for conscientiousness. Gender and its interaction with ability level had a small effect (partial η²s = .04 and .03). Thus, teachers' implicit personality theories about the gifted were in line with the disharmony hypothesis. Possible consequences for gifted identification and education are discussed. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  17. Acting like a physicist: Student approach study to experimental design

    Karelina, Anna; Etkina, Eugenia


    National studies of science education have unanimously concluded that preparing our students for the demands of the 21st century workplace is one of the major goals. This paper describes a study of student activities in introductory college physics labs, which were designed to help students acquire abilities that are valuable in the workplace. In these labs [called Investigative Science Learning Environment (ISLE) labs], students design their own experiments. Our previous studies have shown that students in these labs acquire scientific abilities such as the ability to design an experiment to solve a problem, the ability to collect and analyze data, the ability to evaluate assumptions and uncertainties, and the ability to communicate. These studies mostly concentrated on analyzing students’ writing, evaluated by specially designed scientific ability rubrics. Recently, we started to study whether the ISLE labs make students not only write like scientists but also engage in discussions and act like scientists while doing the labs. For example, do students plan an experiment, validate assumptions, evaluate results, and revise the experiment if necessary? A brief report of some of our findings that came from monitoring students’ activity during ISLE and nondesign labs was presented in the Physics Education Research Conference Proceedings. We found differences in student behavior and discussions that indicated that ISLE labs do in fact encourage a scientistlike approach to experimental design and promote high-quality discussions. This paper presents a full description of the study.

  18. Experimental approach for adhesion strength of ATF cladding

    Kim, Donghyun; Kim, Hyochan; Yang, Yongsik; In, Wangkee [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Haksung [Hanyang University, Seoul (Korea, Republic of)


    The quality of a coating depends on the adhesion bond strength between the coating and the underlying substrate. Therefore, it is essential to evaluate the adhesion properties of the coating. There are many available test methods for the evaluation of coating adhesion bond strength. Considering the restrictions of the coated cladding, the scratch test is useful for evaluation of adhesion properties compared to other methods. The purpose of the present study is to analyze the possibility of evaluating the adhesion bond strength of ATF coated cladding by scratch testing on coating cross sections. An experimental approach for the adhesion strength of ATF coated cladding was investigated in the present study. Scratch testing was chosen as the testing method. An uncoated Zircaloy-4 tube was employed as a reference, and plasma spray and arc ion coatings were selected as the ATF coated claddings for comparison. As a result, the adhesion strengths of the specimens affect the measured normal and tangential forces. In the future, the test will be conducted for CrAl coated cladding produced by laser coating, which is the most promising ATF cladding. Computational analysis with the finite element method will also be conducted to analyze the stress distribution in the cladding tube.

  19. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    Iskandar, Ismed; Satria Gondokaryono, Yudi


    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed. In many real situations, failures may arise from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analysis is more beneficial than the classical approach in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that changing one true parameter value relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimation methods outperform those of maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
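
A minimal sketch of the competing-risk setup described above (the shape/scale parameters are invented, and this is plain inverse-CDF Weibull sampling, not the paper's estimation code): each unit fails at the minimum of independent Weibull cause lifetimes, and the failing cause is recorded.

```python
import math
import random

def simulate_competing_risks(n, params, seed=42):
    """Each unit fails at the minimum of independent Weibull cause lifetimes,
    sampled by inverse CDF: t = scale * (-ln U) ** (1 / shape).
    params: one (shape, scale) pair per failure cause.
    Returns (failure time, index of the failing cause) per unit."""
    rng = random.Random(seed)
    records = []
    for _ in range(n):
        times = [scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)
                 for shape, scale in params]
        t = min(times)
        records.append((t, times.index(t)))
    return records

# two invented independent causes: wear-out (shape > 1) and random shocks (shape = 1)
data = simulate_competing_risks(1000, [(2.0, 100.0), (1.0, 150.0)])
share_wearout = sum(1 for _, cause in data if cause == 0) / len(data)
```

Data of this form (time plus observed cause) is exactly what the Weibull likelihood, Bayesian or classical, would be fitted to.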

  20. Design for reliability of BEoL and 3-D TSV structures - A joint effort of FEA and innovative experimental techniques

    Auersperg, Jürgen; Vogel, Dietmar; Auerswald, Ellen; Rzepka, Sven; Michel, Bernd


    Copper-TSVs for 3D-IC-integration generate novel challenges for reliability analysis and prediction, e.g. the need to master multiple failure criteria for combined loading, including residual stress, interface delamination, cracking, and fatigue issues. For example, the thermal expansion mismatch between copper and silicon leads to a stress situation in the silicon surrounding the TSVs that influences the electron mobility and, as a result, the transient behavior of transistors. Furthermore, pumping and protrusion of copper is a challenge for Back-end of Line (BEoL) layers of advanced CMOS technologies already during manufacturing. These effects depend strongly on the temperature-dependent elastic-plastic behavior of the TSV copper and on the residual stresses determined by the electrodeposition chemistry and annealing conditions. The authors therefore pursued combined simulation/experimental approaches to extract the Young's modulus, initial yield stress, and hardening coefficients of copper TSVs from nanoindentation experiments, as well as the temperature-dependent initial yield stress and hardening coefficients from bow measurements of electroplated thin copper films on silicon under thermal cycling conditions. A FIB trench technique combined with digital image correlation is furthermore used to capture the residual stress state near the surface of TSVs. The extracted properties are discussed and used accordingly to investigate the pumping and protrusion of copper TSVs during thermal cycling. Moreover, the cracking and delamination risks caused by the elevated temperature variation during BEoL ILD deposition are investigated with the help of fracture mechanics approaches.

  1. A comprehensive approach to identify reliable reference gene candidates to investigate the link between alcoholism and endocrinology in Sprague-Dawley rats.

    Taki, Faten A; Abdel-Rahman, Abdel A; Zhang, Baohong


    Gender and hormonal differences are often correlated with alcohol dependence and related complications like addiction and breast cancer. Estrogen (E2) is an important sex hormone because it serves as a key protein involved in organism-level signaling pathways. Alcoholism has been reported to affect estrogen receptor signaling; however, identifying the players involved in such a multi-faceted syndrome is complex and requires an interdisciplinary approach. In many situations, preliminary investigations include straightforward, yet informative, biotechniques such as gene expression analyses using quantitative real-time PCR (qRT-PCR). The validity of qRT-PCR-based conclusions is affected by the choice of reliable internal controls. With this in mind, we compiled a list of 15 commonly used housekeeping genes (HKGs) as potential reference gene candidates in rat biological models. A comprehensive comparison among 5 statistical approaches (geNorm, dCt method, NormFinder, BestKeeper, and RefFinder) was performed to identify the minimal number as well as the most stable reference genes required for reliable normalization in experimental rat groups that comprised sham-operated (SO) rats and ovariectomized rats in the absence (OVX) or presence of E2 (OVXE2). These rat groups were subdivided into subgroups that received alcohol in a liquid diet or an isocaloric control liquid diet for 12 weeks. Our results showed that U87, 5S rRNA, GAPDH, and U5a were the most reliable candidates for reference genes in heart and brain tissue. However, the gene stability ranking was specific to each tissue input combination. The present preliminary findings highlight the variability in reference gene rankings across different experimental conditions and analytic methods and constitute a fundamental step for gene expression assays.
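
Among the five approaches listed, the comparative dCt method is the simplest to sketch: a gene is stable if its Ct differences with every other candidate vary little across samples. The Ct values below are invented for illustration (only the gene names follow the entry):

```python
from statistics import mean, stdev

def delta_ct_stability(ct_values):
    """Comparative dCt method: a gene's stability score is the average standard
    deviation of its Ct differences with every other candidate across samples;
    lower scores mean more stable expression."""
    genes = list(ct_values)
    scores = {}
    for g in genes:
        sds = [stdev([a - b for a, b in zip(ct_values[g], ct_values[h])])
               for h in genes if h != g]
        scores[g] = mean(sds)
    return sorted(genes, key=lambda g: scores[g])  # most stable first

# invented Ct values for three candidate reference genes over four samples
ct = {
    "U87":   [18.0, 18.1, 17.9, 18.1],
    "GAPDH": [20.0, 20.1, 19.9, 20.0],
    "ACTB":  [22.0, 23.5, 21.0, 24.0],
}
ranking = delta_ct_stability(ct)
print(ranking)  # → ['U87', 'GAPDH', 'ACTB']
```

Tools such as geNorm and NormFinder use different stability measures, but the input (a gene-by-sample Ct matrix) and the output (a stability ranking) have this same shape.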

  2. A comprehensive approach to identify reliable reference gene candidates to investigate the link between alcoholism and endocrinology in Sprague-Dawley rats.

    Faten A Taki

    Gender and hormonal differences are often correlated with alcohol dependence and related complications like addiction and breast cancer. Estrogen (E2) is an important sex hormone because it serves as a key protein involved in organism-level signaling pathways. Alcoholism has been reported to affect estrogen receptor signaling; however, identifying the players involved in such a multi-faceted syndrome is complex and requires an interdisciplinary approach. In many situations, preliminary investigations include straightforward, yet informative, biotechniques such as gene expression analyses using quantitative real-time PCR (qRT-PCR). The validity of qRT-PCR-based conclusions is affected by the choice of reliable internal controls. With this in mind, we compiled a list of 15 commonly used housekeeping genes (HKGs) as potential reference gene candidates in rat biological models. A comprehensive comparison among 5 statistical approaches (geNorm, dCt method, NormFinder, BestKeeper, and RefFinder) was performed to identify the minimal number as well as the most stable reference genes required for reliable normalization in experimental rat groups that comprised sham-operated (SO) rats and ovariectomized rats in the absence (OVX) or presence of E2 (OVXE2). These rat groups were subdivided into subgroups that received alcohol in a liquid diet or an isocaloric control liquid diet for 12 weeks. Our results showed that U87, 5S rRNA, GAPDH, and U5a were the most reliable candidates for reference genes in heart and brain tissue. However, the gene stability ranking was specific to each tissue input combination. The present preliminary findings highlight the variability in reference gene rankings across different experimental conditions and analytic methods and constitute a fundamental step for gene expression assays.

  3. A new approach for interexaminer reliability data analysis on dental caries calibration

    Andréa Videira Assaf


    Objectives: (a) to evaluate the interexaminer reliability in caries detection considering different diagnostic thresholds and (b) to indicate, by using Kappa statistics, the best way of measuring interexaminer agreement during the calibration process in dental caries surveys. Methods: Eleven dentists participated in the initial training, which was divided into theoretical discussions and practical activities, and in calibration exercises performed at baseline, 3 and 6 months after the initial training. For the examinations of 6-7-year-old schoolchildren, the World Health Organization (WHO) recommendations were followed and different diagnostic thresholds were used: WHO (decayed/missing/filled teeth - DMFT index) and WHO + IL (initial lesion) diagnostic thresholds. The interexaminer reliability was calculated by Kappa statistics, according to the WHO and WHO+IL thresholds, considering: (a) the entire dentition; (b) upper/lower jaws; (c) sextants; (d) each tooth individually. Results: Interexaminer reliability was high for both diagnostic thresholds; nevertheless, it decreased in all calibration sessions when considering teeth individually. Conclusion: Interexaminer reliability was maintained over the period of 6 months under both caries diagnosis thresholds. However, great disagreement was observed for posterior teeth, especially using the WHO+IL criteria. Analysis considering dental elements individually was the best way of detecting interexaminer disagreement during the calibration sessions.
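
The Kappa statistic used throughout this entry can be illustrated with a short sketch (the data below are invented, not from the study): Cohen's kappa corrects the observed agreement between two examiners for the agreement expected by chance.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # agreement expected if both raters assigned categories independently
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# two examiners scoring ten teeth as decayed (1) or sound (0) -- invented data
examiner_1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
examiner_2 = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(round(cohens_kappa(examiner_1, examiner_2), 2))  # → 0.6
```

Computing kappa per tooth, per sextant, or per jaw, as the entry does, just means restricting the two rating lists to the corresponding subset of items.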

  4. An estimation-based approach for range image segmentation: on the reliability of primitive extraction

    Wang, Guoyu; Houkes, Zweitze; Ji, Guangrong; Zheng, Bing; Li, Xin


    This paper presents a new algorithm for estimation-based range image segmentation. Aiming at surface-primitive extraction from range data, we focus on the reliability of the primitive representation in the process of region estimation. We introduce an optimal description of surface primitives, by wh

  5. Estimating reliability coefficients with heterogeneous item weightings using Stata: A factor based approach

    Boermans, M.A.; Kattenberg, M.A.C.


    We show how to estimate a Cronbach's alpha reliability coefficient in Stata after running a principal component or factor analysis. Alpha evaluates to what extent items measure the same underlying content when the items are combined into a scale or used for a latent variable. Stata allows for testing
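
As a rough illustration of what the coefficient measures (a plain-Python sketch with invented scores, not the Stata workflow the entry describes): alpha compares the sum of item variances with the variance of the scale total.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha from raw scores; items is a list of per-item score
    lists, all over the same respondents (population variances throughout)."""
    k = len(items)
    sum_item_var = sum(pvariance(scores) for scores in items)
    totals = [sum(resp) for resp in zip(*items)]  # scale total per respondent
    return k / (k - 1) * (1 - sum_item_var / pvariance(totals))

# three items answered by five respondents on a 1-5 scale -- invented data
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [3, 3, 4, 2, 5],
]
print(round(cronbach_alpha(items), 3))  # → 0.871
```

The factor-based variant the authors discuss weights items by their loadings before forming the total; the unweighted formula above is the baseline case.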

  6. Reliable Devanagri Handwritten Numeral Recognition using Multiple Classifier and Flexible Zoning Approach

    Pratibha Singh


    A reliability evaluation system for the recognition of Devanagri numerals is proposed in this paper. Reliability of classification is very important in applications of optical character recognition. Since outliers and ambiguity may affect the performance of a recognition system, a rejection measure is needed for reliable recognition of patterns. For each character image, pre-processing steps such as normalization, binarization, noise removal, and boundary extraction are performed. After calculating the bounding box, features are extracted for each partition of the numeral image. Features are calculated with three different zoning methods. A directional feature is considered, obtained using chain code and gradient direction quantization of the orientations. The zoning is first made up of uniform partitions and second of non-uniform compartments based on the density of the pixels. For classification, a 1-nearest-neighbor classifier, a quadratic Bayes classifier, and a linear Bayes classifier are chosen as base classifiers. The base classifiers are combined using four decision combination rules, namely Maximum, Median, Average, and Majority Voting. The framework is used to test the reliability of the recognition system against ambiguity.
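
The four decision combination rules named above can be sketched in a few lines (the scores and labels below are invented for illustration): three of the rules combine per-class posterior scores, while majority voting combines the base classifiers' predicted labels.

```python
from collections import Counter
from statistics import mean, median

def combine_scores(per_classifier_scores, rule):
    """Combine posterior scores for one class from several base classifiers."""
    rules = {"maximum": max, "median": median, "average": mean}
    return rules[rule](per_classifier_scores)

def majority_vote(labels):
    """Return the label predicted by most base classifiers."""
    return Counter(labels).most_common(1)[0][0]

# three base classifiers give posterior scores for the digit '7' -- invented data
scores = [0.80, 0.55, 0.70]
print(combine_scores(scores, "maximum"))            # → 0.8
print(combine_scores(scores, "median"))             # → 0.7
print(round(combine_scores(scores, "average"), 3))  # → 0.683
print(majority_vote(["7", "1", "7"]))               # → 7
```

A rejection measure of the kind the entry calls for can then be built on top: e.g., reject the input when the best combined score falls below a threshold.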

  7. Scenario based approach to structural damage detection and its value in a risk and reliability perspective

    Hovgaard, Mads Knude; Hansen, Jannick Balleby; Brincker, Rune


    A scenario- and vibration-based structural damage detection method is demonstrated through simulation. The method is Finite Element (FE) based. The value of the monitoring is calculated using structural reliability theory. A high cycle fatigue crack propagation model is assumed as the damage mecha...

  8. Towards Reliable Multi-Hop Broadcast in VANETs: An Analytical Approach

    Gholibeigi, Mozhdeh; Baratchi, Mitra; van den Berg, Hans Leo; Heijenk, Gerhard J.


    Intelligent Transportation Systems in the domain of vehicular networking have recently been subject to rapid development. In vehicular ad hoc networks, data broadcast is one of the main communication types and its reliability is crucial for high performance applications. However, due to the lack of

  9. Fluid-Solid Interaction and Multiscale Dynamic Processes: Experimental Approach

    Arciniega-Ceballos, Alejandra; Spina, Laura; Mendo-Pérez, Gerardo M.; Guzmán-Vázquez, Enrique; Scheu, Bettina; Sánchez-Sesma, Francisco J.; Dingwell, Donald B.


    Analysis of time series, both experimental and synthetic, synchronized with high-speed imaging enables the explanation and interpretation of distinct phases of the dynamics of these fluids and the extraction of the time and frequency characteristics of the individual processes. We observed that the pressure-drop triggering function and the viscosity both control the characteristics of the micro-signals in time and frequency. This suggests the great potential of experimental and numerical approaches for untangling, from field volcanic seismograms, the multiscale processes of the stress field, driving forces, and fluid-rock interaction that determine volcanic conduit dynamics.

  10. Reliability of serum metabolite concentrations over a 4-month period using a targeted metabolomic approach.

    Anna Floegel

    Metabolomics is a promising tool for the discovery of novel biomarkers of chronic disease risk in prospective epidemiologic studies. We investigated the between- and within-person variation of the concentrations of 163 serum metabolites over a period of 4 months to evaluate metabolite reliability, expressed by the intraclass correlation coefficient (ICC: the ratio of between-person variance and total variance). The analyses were performed with the BIOCRATES AbsoluteIDQ™ targeted metabolomics technology, covering acylcarnitines, amino acids, glycerophospholipids, sphingolipids and hexose, in 100 healthy individuals from the European Prospective Investigation into Cancer and Nutrition (EPIC)-Potsdam study who had provided two fasting blood samples 4 months apart. Overall, serum reliability of metabolites over a 4-month period was good. The median ICC of the 163 metabolites was 0.57. The highest ICC was observed for hydroxysphingomyelin C14:1 (ICC = 0.85) and the lowest was found for acylcarnitine C3:1 (ICC = 0). Reliability was high for hexose (ICC = 0.76), sphingolipids (median ICC = 0.66; range: 0.24-0.85), amino acids (median ICC = 0.58; range: 0.41-0.72) and glycerophospholipids (median ICC = 0.58; range: 0.03-0.81). Among acylcarnitines, reliability of short and medium chain saturated compounds was good to excellent (ICC range: 0.50-0.81). Serum reliability was lower for most hydroxyacylcarnitines and monounsaturated acylcarnitines (ICC range: 0.11-0.45 and 0.00-0.63, respectively). For most of the metabolites a single measurement may be sufficient for risk assessment in epidemiologic studies with healthy subjects.
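
The ICC defined in this entry, the ratio of between-person variance to total variance, can be sketched with a one-way ANOVA estimator (the metabolite values below are invented, not EPIC-Potsdam data):

```python
from statistics import mean

def icc_oneway(measurements):
    """One-way random-effects ICC(1,1): between-person variance over total
    variance, estimated from per-subject lists of repeated measurements."""
    n = len(measurements)        # subjects
    k = len(measurements[0])     # repeats per subject
    grand = mean(v for subject in measurements for v in subject)
    ms_between = k * sum((mean(s) - grand) ** 2 for s in measurements) / (n - 1)
    ms_within = sum((v - mean(s)) ** 2 for s in measurements for v in s) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# one metabolite measured twice, 4 months apart, in five subjects -- invented values
data = [[5.1, 5.0], [6.2, 6.0], [4.8, 5.0], [7.1, 6.8], [5.5, 5.6]]
print(round(icc_oneway(data), 3))  # → 0.973
```

An ICC near 1 (as here) means between-person differences dominate the 4-month within-person drift, which is why a single measurement can suffice for ranking individuals.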

  11. Reliability Engineering

    Lazzaroni, Massimo


    This book gives a practical guide for designers and users in the Information and Communication Technology context. In particular, in the first section, definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing laboratory tests, puts in evidence the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  12. New experimental approaches to the biology of flight control systems.

    Taylor, Graham K; Bacic, Marko; Bomphrey, Richard J; Carruthers, Anna C; Gillies, James; Walker, Simon M; Thomas, Adrian L R


    Here we consider how new experimental approaches in biomechanics can be used to attain a systems-level understanding of the dynamics of animal flight control. Our aim in this paper is not to provide detailed results and analysis, but rather to tackle several conceptual and methodological issues that have stood in the way of experimentalists in achieving this goal, and to offer tools for overcoming these. We begin by discussing the interplay between analytical and empirical methods, emphasizing that the structure of the models we use to analyse flight control dictates the empirical measurements we must make in order to parameterize them. We then provide a conceptual overview of tethered-flight paradigms, comparing classical 'open-loop' and 'closed-loop' setups, and describe a flight simulator that we have recently developed for making flight dynamics measurements on tethered insects. Next, we provide a conceptual overview of free-flight paradigms, focusing on the need to use system identification techniques in order to analyse the data they provide, and describe two new techniques that we have developed for making flight dynamics measurements on freely flying birds. First, we describe a technique for obtaining inertial measurements of the orientation, angular velocity and acceleration of a steppe eagle Aquila nipalensis in wide-ranging free flight, together with synchronized measurements of wing and tail kinematics using onboard instrumentation and video cameras. Second, we describe a photogrammetric method to measure the 3D wing kinematics of the eagle during take-off and landing. In each case, we provide demonstration data to illustrate the kinds of information available from each method. We conclude by discussing the prospects for systems-level analyses of flight control using these techniques and others like them.

  13. Reliable Gait Recognition Using 3D Reconstructions and Random Forests - An Anthropometric Approach

    Sandau, Martin; Heimbürger, Rikke V.; Jensen, Karl E.


    Photogrammetric measurements of bodily dimensions and analysis of gait patterns in CCTV are important tools in forensic investigations but accurate extraction of the measurements is challenging. This study tested whether manual annotation of the joint centers on 3D reconstructions could provide reliable recognition. Sixteen participants performed normal walking where 3D reconstructions were obtained continually. Segment lengths and kinematics from the extremities were manually extracted by eight expert observers. The results showed that all the participants were recognized, assuming the same expert annotated the data. Recognition based on data annotated by different experts was less reliable, achieving 72.6% correct recognitions, as some parameters were heavily affected by interobserver variability. This study verified that 3D reconstructions are feasible for forensic gait analysis...

  14. Substation design improvement with a probabilistic reliability approach using the TOPASE program

    Bulot, M.; Heroin, G.; Bergerot, J-L.; Le Du, M. [Electricite de France (France)


    TOPASE (the French acronym for Probabilistic Tools and Data Processing for the Analysis of Electric Systems), developed by Electricite de France (EDF) to perform reliability studies on transmission substations, is described. TOPASE serves the dual objective of assisting in the automation of HV substation studies and of enabling electrical systems experts who are not necessarily specialists in reliability studies to perform such studies. The program is capable of quantifying the occurrence rate of undesirable events and of identifying critical equipment and the main incident scenarios. The program can be used to improve an existing substation, to choose an HV structure during the design stage, or to choose a system of protective devices. Data collected during 1996 and 1997 will be analyzed to identify useful experiences and to validate the basic concepts of the program. 4 figs.

  15. Reliable Gait Recognition Using 3D Reconstructions and Random Forests - An Anthropometric Approach.

    Sandau, Martin; Heimbürger, Rikke V; Jensen, Karl E; Moeslund, Thomas B; Aanaes, Henrik; Alkjaer, Tine; Simonsen, Erik B


    Photogrammetric measurements of bodily dimensions and analysis of gait patterns in CCTV are important tools in forensic investigations but accurate extraction of the measurements is challenging. This study tested whether manual annotation of the joint centers on 3D reconstructions could provide reliable recognition. Sixteen participants performed normal walking where 3D reconstructions were obtained continually. Segment lengths and kinematics from the extremities were manually extracted by eight expert observers. The results showed that all the participants were recognized, assuming the same expert annotated the data. Recognition based on data annotated by different experts was less reliable, achieving 72.6% correct recognitions, as some parameters were heavily affected by interobserver variability. This study verified that 3D reconstructions are feasible for forensic gait analysis as an improved alternative to conventional CCTV. However, further studies are needed to account for the use of different clothing, field conditions, etc.

  16. Calibration Of Self-Reported Dietary Measures Using Biomarkers: An Approach To Enhancing Nutritional Epidemiology Reliability

    Prentice, Ross L.; Tinker, Lesley F.; Huang, Ying; Neuhouser, Marian L.


    Reports from nutritional epidemiology studies lack reliability if based solely on self-reported dietary consumption estimates. Consumption biomarkers are available for some components of diet. These can be collected in subsets of study cohorts, along with corresponding self-report assessments. Linear regression of (log-transformed) biomarker values on corresponding self-report values and other pertinent study subject characteristics yields calibration equations for dietary consumption, from w...
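
The calibration idea described above, regressing log-transformed biomarker values on log self-report values and then correcting new self-reports, can be sketched as follows (the toy data are constructed so the biomarker is exactly twice the self-report; real calibration equations also include subject characteristics as covariates):

```python
import math

def fit_calibration(self_report, biomarker):
    """OLS fit of log(biomarker) on log(self-report); returns (intercept, slope)."""
    x = [math.log(v) for v in self_report]
    y = [math.log(v) for v in biomarker]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

def calibrate(intercept, slope, reported):
    """Calibrated consumption estimate for a new self-reported value."""
    return math.exp(intercept + slope * math.log(reported))

# toy data: the biomarker equals twice the self-report, so slope = 1, intercept = ln 2
intercept, slope = fit_calibration([1, 2, 4, 8], [2, 4, 8, 16])
print(round(slope, 3), round(calibrate(intercept, slope, 3), 3))  # → 1.0 6.0
```

The fitted equation is then applied cohort-wide, replacing raw self-reports with calibrated consumption estimates in the disease-risk models.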


    Hsiang LEE; Way KUO; Chunghun HA


    Two heuristics, the max-min approach and the Nakagawa and Nakashima method, are considered for the redundancy allocation problem with series-parallel structure. The max-min approach can formulate the problem as an integer linear programming problem instead of an integer nonlinear problem. This paper presents a comparison between those methods from the standpoint of solution quality and computational complexity. The experimental results show that the max-min approach is superior to the Nakagawa and Nakashima method in terms of solution quality in small-scale problems, but analysis of computational complexity shows that the max-min approach is inferior to other greedy heuristics.
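
The series-parallel structure underlying the redundancy allocation problem can be sketched as follows (component reliabilities and redundancy levels below are invented): the system is a series of stages, each a parallel bank of identical components, so system reliability is the product of the stage reliabilities that the heuristics try to maximize under cost constraints.

```python
def subsystem_reliability(r, n):
    """A parallel bank of n identical components, each with reliability r."""
    return 1.0 - (1.0 - r) ** n

def system_reliability(component_rel, allocation):
    """Series of parallel subsystems: the product of subsystem reliabilities."""
    total = 1.0
    for r, n in zip(component_rel, allocation):
        total *= subsystem_reliability(r, n)
    return total

# three stages with invented component reliabilities and redundancy levels
rel = system_reliability([0.9, 0.8, 0.95], [2, 3, 1])
print(round(rel, 4))  # → 0.933
```

The max-min approach exploits the fact that the weakest stage bounds this product, which is what allows the integer linear reformulation.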

  18. Mechanical behaviour of the heel pad: experimental and numerical approach

    Matteoli, Sara; Fontanella, C. G.; Virga, A.

    The aim of the present work was to investigate the stress relaxation phenomena of the heel pad region under different loading conditions. A 31-year-old healthy female was enrolled in this study and her left foot underwent both MRI and experimental compression tests. Experimental results were... during daily activities...

  19. The possibilities of applying a risk-oriented approach to the NPP reliability and safety enhancement problem

    Komarov, Yu. A.


    An analysis and some generalizations of approaches to risk assessments are presented. Interconnection between different interpretations of the "risk" notion is shown, and the possibility of applying the fuzzy set theory to risk assessments is demonstrated. A generalized formulation of the risk assessment notion is proposed in applying risk-oriented approaches to the problem of enhancing reliability and safety in nuclear power engineering. The solution of problems using the developed risk-oriented approaches aimed at achieving more reliable and safe operation of NPPs is described. The results of studies aimed at determining the need (advisability) to modernize/replace NPP elements and systems are presented together with the results obtained from elaborating the methodical principles of introducing the repair concept based on the equipment technical state. The possibility of reducing the scope of tests and altering the NPP systems maintenance strategy is substantiated using the risk-oriented approach. A probabilistic model for estimating the validity of boric acid concentration measurements is developed.

  20. A new approach to numerical analysis of reliability indices in electronics

    Geniy Kuznetsov


    Spatial modeling of unsteady temperature fields is conducted for a microelectronic printed circuit board (PCB), with account taken of convective and radiative heat transfer with the environment. The data from numerical modeling of the temperature fields serve as a basis for determining the aging characteristics of the polymer material as a structural component of electronic engineering products. The obtained results support the conclusion that spatially nonuniform temperature fields must be considered when estimating the degree of polymeric material degradation during continuous service of products, as well as the impact of polymer aging on the reliability features of microelectronic devices.

  1. Statistical approach for uncertainty quantification of experimental modal model parameters

    Luczak, M.; Peeters, B.; Kahsin, M.


    estimates obtained from vibration experiments. Modal testing results are influenced by numerous factors introducing uncertainty to the measurement results. Different experimental techniques applied to the same test item or testing numerous nominally identical specimens yields different test results...

  2. A reliable methodology for monitoring unstable slopes: the multi-platform and multi-sensor approach

    Castagnetti, Cristina; Bertacchini, Eleonora; Corsini, Alessandro; Rivola, Riccardo


    High resolution topography, involving Digital Terrain Models (DTMs) and other accurate techniques for proper displacement identification, is a valuable tool for a good and reliable description of unstable slopes. By comparing multitemporal surveys, the geomorphology of a landslide may be analyzed, as well as its changes over time, volume transport, and boundary evolution. Aware that a single technique is not sufficient to perform a reliable and accurate survey, this paper discusses the use of multi-platform, multi-source and multi-scale observations (both in terms of spatial scale and time scale) for the study and monitoring of unstable slopes. The final purpose is to highlight and validate a methodology based on multiple sensors and data integration, useful for building a comprehensive GIS (Geographic Information System) that can successfully be used to manage natural disasters or to improve the knowledge of a specific phenomenon in order to prevent and mitigate hydro-geological risk. The novelty of the present research lies in the spatial integration of multiple remote sensing techniques, such as: integration of Airborne Laser Scanning (ALS) and Terrestrial Laser Scanning (TLS) to provide a comprehensive and accurate surface description (DTM) at a fixed epoch (spatial continuity); and continuous monitoring by means of spatial integration of an Automated Total Station (ATS) and GNSS (Global Navigation Satellite System) to provide accurate surface displacement identification (time continuity). The discussion makes reference to a rockslide located in the northern Apennines of Italy, monitored from 2010 to 2013.

  3. A System Approach to Advanced Practice Clinician Standardization and High Reliability.

    Okuno-Jones, Susan; Siehoff, Alice; Law, Jennifer; Juarez, Patricia

    Advanced practice clinicians (APCs) are an integral part of the health care team. Opportunities exist within Advocate Health Care to standardize and optimize APC practice across the system. To enhance the role and talents of APCs, an approach to role definition and optimization of practice and a structured approach to orientation and evaluation are shared. Although in the early stages of development, definition and standardization of accountabilities in a framework to support system changes are transforming the practice of APCs.

  4. Critical aspects of integrated monitoring systems for landslides risk management: strategies for a reliable approach

    Castagnetti, C.; Bertacchini, E.; Capra, A.; Corsini, A.


    The use of advanced technologies for remotely monitoring surface processes is a successful way to improve the knowledge of phenomena evolution. In addition, the integration of various techniques is becoming more and more common for implementing early warning systems that can monitor the evolution of landslides in time and prevent emergencies. The reliability of such systems plays a key role when Public Administrations have to plan actions in case of disasters or to prevent an incoming emergency. Having confidence in the information given by the system is an essential condition for a successful policy aiming to protect the population. The research deals with the major critical aspects to be taken into account when implementing a reliable monitoring system for unstable slopes. The importance of these aspects is often neglected, even though a careless implementation and management of the system can lead to erroneous interpretations of the phenomenon itself. The case study that guided the research and highlighted the actual need for guidelines for setting up a reliable monitoring system is the Valoria landslide, located in Northern Italy. The system is based on the integration of an automatic Total Station (TS) measuring 45 reflectors and a master GPS acting as the reference station for three rovers placed within the landslide. In order to monitor local disturbing effects, a bi-dimensional clinometer has been applied to the TS pillar. Topographic measurements have also been integrated with geotechnical sensors (inclinometers and piezometers) in a GIS for landslide risk management. At the very beginning, periodic measurements were carried out, while the system has been operating continuously since 2008. The system permitted the evaluation of movements from a few millimeters up to some meters per day in the most dangerous areas. A more spatially continuous description has also been provided by LiDAR and terrestrial SAR interferometry. Some of the most

  5. A new approach to provide high-reliability data systems without using space-qualified electronic components

    Haebel, Wolfgang


    This paper describes the present situation and the expected trends with regard to the availability of electronic components, their quality levels, technology trends and sensitivity to the space environment. Many recognized vendors have already discontinued their MIL production lines, and state-of-the-art components will in many cases not be offered at this quality level because of the shrinking market. It therefore becomes obvious that new methods need to be considered: "How to build reliable Data Systems for space applications without High-Rel parts". One of the most promising approaches is the identification, masking and suppression of faults by developing fault-tolerant computer systems, which is described in this paper.

  6. A New Approach for Analyzing the Reliability of the Repair Facility in a Series System with Vacations

    Renbin Liu


    Based on renewal process theory, we develop a decomposition method to analyze the reliability of the repair facility in an n-unit series system with vacations. Using this approach, we study the unavailability and the mean replacement number during (0,t] of the repair facility. The method proposed in this work is novel and concise, and it shows clearly that the structures of the facility indices of a series system with an unreliable repair facility are two convolution relations. Special cases and numerical examples are given to show the validity of our method.

  7. Distinguishing between forensic science and forensic pseudoscience: testing of validity and reliability, and approaches to forensic voice comparison.

    Morrison, Geoffrey Stewart


    In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four-and-a-half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle.

  8. In-Store Experimental Approach to Pricing and Consumer Behavior

    Sigurdsson, Valdimar; Foxall, Gordon; Saevarsson, Hugi


    This study assessed how, and to what extent, it is possible to use behavioral experimentation and relative sales analysis to study the effects of price on consumers' brand choices in the store environment. An in-store experiment was performed in four stores to investigate the effects of different prices of a target brand on consumers' relative…

  9. Comparison as an Approach to the Experimental Method

    Turner, David A.


    In his proposal for comparative education, Marc-Antoine Jullien de Paris argues that the comparative method offers a viable alternative to the experimental method. In an experiment, the scientist can manipulate the variables in such a way that he or she can see any possible combination of variables at will. In comparative education, or in…

  11. Experimental typography : reviewing the modernist and the current approaches

    Makal, Eray


    Ankara : The Department of Graphic Design and Institute of Fine Arts, Bilkent Univ., 1993. Thesis (Master's) -- Bilkent University, 1993. Includes bibliographical references, leaves 65-66. The intention of this study is to evaluate experimental typography within the history of graphic design by taking into consideration two epochs: the Modernist and the Current. Makal, Eray M.S.

  12. Evaluating a novel approach to reliability decision support for offshore wind turbine installation

    Gintautas, Tomas; Sørensen, John Dalsgaard


    This paper briefly describes a novel approach of estimating weather windows for decision support in offshore wind turbine installation projects. The proposed methodology is based on statistical analysis of extreme physical responses of the installation equipment (such as lifting cable loads, moti...

  13. A cyber-physical approach to experimental fluid mechanics

    Mackowski, Andrew Williams

    This Thesis documents the design, implementation, and use of a novel type of experimental apparatus, termed Cyber-Physical Fluid Dynamics (CPFD). Unlike traditional fluid mechanics experiments, CPFD is a general-purpose technique that allows one to impose arbitrary forces on an object submerged in a fluid. By combining fluid mechanics with robotics, we can perform experiments that would otherwise be incredibly difficult or time-consuming. More generally, CPFD allows a high degree of automation and control of the experimental process, allowing for much more efficient use of experimental facilities. Examples of CPFD's capabilites include imposing a gravitational force in the horizontal direction (allowing a test object to "fall" sideways in a water channel), simulating nonlinear springs for a vibrating fluid-structure system, or allowing a self-propelled body to move forward under its own force. Because experimental parameters (including forces and even the mass of the test object) are defined in software, one can define entire ensembles of experiments to run autonomously. CPFD additionally integrates related systems such as water channel speed control, LDV flow speed measurements, and PIV flowfield measurements. The end result is a general-purpose experimental system that opens the door to a vast array of fluid-structure interaction problems. We begin by describing the design and implementation of CPFD, the heart of which is a high-performance force-feedback control system. Precise measurement of time-varying forces (including removing effects of the test object's inertia) is more critical here than in typical robotic force-feedback applications. CPFD is based on an integration of ideas from control theory, fluid dynamics, computer science, electrical engineering, and solid mechanics. We also describe experiments using the CPFD experimental apparatus to study vortex-induced vibration (VIV) and oscillating-airfoil propulsion. 
We show how CPFD can be used to simulate

  14. DG Allocation Based on Reliability, Losses and Voltage Sag Considerations: an expert system approach

    Sahar Abdel Moneim Moussa


    Expert Systems (ES), as a branch of Artificial Intelligence (AI) methodology, can potentially help in solving complicated power system problems. This may be a more appropriate methodology than conventional optimization techniques when contradictions between objectives appear in reaching the optimum solution. When such a contradiction is the hindrance to reaching the required system operation through the application of traditional methods, ES can give a hand. In this paper, a knowledge-based ES technique is proposed to reach a near-optimum solution, which is then directed toward the optimum solution through the particle swarm optimization (PSO) technique. This idea is known as a Hybrid Expert System (HES). The proposed idea is used to obtain the optimum allocation of a number of distributed generation (DG) units on Distribution System (DS) busbars, taking into consideration three issues: reliability, voltage sag, and line losses. Optimality is assessed on an economic basis by calculating the monetary benefits (or losses) resulting from DG addition with respect to the three aforementioned issues. The effectiveness of the proposed technique is ascertained through an example.
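
    The abstract names PSO as the refinement step but gives no algorithmic detail. The sketch below is a generic, minimal PSO over a hypothetical stand-in cost function (a quadratic in two DG sizes), not the paper's reliability/voltage-sag/loss model; all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso(cost, lo, hi, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer over a box [lo, hi]."""
    dim = lo.size
    x = rng.uniform(lo, hi, (n_particles, dim))          # positions
    v = np.zeros_like(x)                                 # velocities
    pbest = x.copy()
    pbest_cost = np.array([cost(p) for p in x])
    g = pbest[pbest_cost.argmin()].copy()                # global best
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = x[better], c[better]
        g = pbest[pbest_cost.argmin()].copy()
    return g, pbest_cost.min()

# Hypothetical stand-in objective: monetary loss as a quadratic in two DG
# sizes (MW), minimized at 1.5 MW and 2.0 MW.
cost = lambda s: (s[0] - 1.5) ** 2 + (s[1] - 2.0) ** 2
best, best_cost = pso(cost, np.zeros(2), 5.0 * np.ones(2))
```

In a hybrid scheme like the one described, the ES's near-optimum solution would seed the initial swarm rather than a uniform random start.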

  15. Toward formal analysis of ultra-reliable computers: A total systems approach

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.


    This paper describes the application of modeling and analysis techniques to software that is designed to execute on a four-channel version of the Charles Stark Draper Laboratory (CSDL) Fault-Tolerant Processor, referred to as the Draper FTP. The software performs sensor validation of four independent measures (signals) from the primary pumps of the Experimental Breeder Reactor-II operated by Argonne National Laboratory-West, and from the validated signals formulates a flow trip signal for the reactor safety system. 11 refs., 4 figs.

  16. Coal miner's pneumoconiosis: epidemiological and experimental approaches

    Amoudru, C.


    A historical review is presented showing the belated recognition of coal miner's pneumoconiosis as a discrete pathological state, a fact due to a lack of experimental pathological research. The distribution of the disease in France was studied. Its incidence has decreased among active miners, but the number of cases has increased as a result of increased longevity of miners. Physiological and pathological concepts of the disease are discussed, which has become a post-professional disease with delayed radiological symptoms, which frequently entails chronic cor pulmonale, and which varies in incidence from one mining region to another. Lines of ongoing experimental research, which take account of new work in epidemiology and in dust analysis as well as recent biological studies on man are summarized. 29 references.

  17. Morphodynamics of submarine channel inception revealed by new experimental approach

    de Leeuw, Jan; Eggenhuisen, Joris T.; Cartigny, Matthieu J. B.


    Submarine channels are ubiquitous on the seafloor and their inception and evolution is a result of dynamic interaction between turbidity currents and the evolving seafloor. However, the morphodynamic links between channel inception and flow dynamics have not yet been monitored in experiments and only in one instance on the modern seafloor. Previous experimental flows did not show channel inception, because flow conditions were not appropriately scaled to sustain suspended sediment transport. Here we introduce and apply new scaling constraints for similarity between natural and experimental turbidity currents. The scaled currents initiate a leveed channel from an initially featureless slope. Channelization commences with deposition of levees in some slope segments and erosion of a conduit in other segments. Channel relief and flow confinement increase progressively during subsequent flows. This morphodynamic evolution determines the architecture of submarine channel deposits in the stratigraphic record and efficiency of sediment bypass to the basin floor.

  18. Impact of instructional approach on students' epistemologies about experimental physics

    Wilcox, Bethany R


    Student learning in undergraduate physics laboratory courses has garnered increased attention within the PER community. Considerable work has been done to develop curricular materials and pedagogical techniques designed to enhance student learning within laboratory learning environments. Examples of these transformation efforts include the Investigative Science Learning Environment (ISLE), Modeling Instruction, and integrated lab/lecture environments (e.g., studio physics). In addition to improving students' understanding of the physics content, lab courses often have an implicit or explicit goal of increasing students' understanding and appreciation of the nature of experimental physics. We examine students' responses to a laboratory-focused epistemological assessment -- the Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) -- to explore whether courses using transformed curricula or pedagogy show more expert-like student epistemologies relative to courses using traditional ...

  19. A New Two-Step Approach for Hands-On Teaching of Gene Technology: Effects on Students' Activities During Experimentation in an Outreach Gene Technology Lab

    Scharfenberg, Franz-Josef; Bogner, Franz X.


    Emphasis on improving higher level biology education continues. A new two-step approach to the experimental phases within an outreach gene technology lab, derived from cognitive load theory, is presented. We compared our approach using a quasi-experimental design with the conventional one-step mode. The difference consisted of additional focused discussions combined with students writing down their ideas (step one) prior to starting any experimental procedure (step two). We monitored students' activities during the experimental phases by continuously videotaping 20 work groups within each approach (N = 131). Subsequent classification of students' activities yielded 10 categories (with well-fitting intra- and inter-observer scores with respect to reliability). Based on the students' individual time budgets, we evaluated students' roles during experimentation from their prevalent activities (by independently using two cluster analysis methods). Independently of the approach, two common clusters emerged, which we labeled as 'all-rounders' and as 'passive students', and two clusters specific to each approach: 'observers' as well as 'high-experimenters' were identified only within the one-step approach whereas under the two-step conditions 'managers' and 'scribes' were identified. Potential changes in group-leadership style during experimentation are discussed, and conclusions for optimizing science teaching are drawn.

  20. A probabilistic fracture mechanics approach for structural reliability assessment of space flight systems

    Sutharshana, S.; Creager, M.; Ebbeler, D.; Moore, N.


    A probabilistic fracture mechanics approach for predicting the failure life distribution due to subcritical crack growth is presented. A state-of-the-art crack propagation method is used in a Monte Carlo simulation to generate a distribution of failure lives. The crack growth failure model expresses failure life as a function of stochastic parameters including environment, loads, material properties, geometry, and model specification errors. A stochastic crack growth rate model that considers the uncertainties due to scatter in the data and model misspecification is proposed. The rationale for choosing a particular type of probability distribution for each stochastic input parameter and for specifying the distribution parameters is presented. The approach is demonstrated through a probabilistic crack growth failure analysis of a welded tube in the Space Shuttle Main Engine. A discussion of the results from this application of the methodology is given.
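
    The general structure of such a simulation can be sketched with a textbook Paris-law crack-growth model. The distributions and parameter values below are illustrative assumptions, not the paper's SSME weld data.

```python
import numpy as np

# Monte Carlo sketch of a crack-growth failure-life distribution using the
# Paris law da/dN = C (ΔK)^m with ΔK = Y Δσ √(πa).
rng = np.random.default_rng(1)
n = 100_000

C  = rng.lognormal(np.log(1e-11), 0.3, n)   # Paris coefficient [(m/cycle)/(MPa·√m)^m]
ds = rng.normal(100.0, 10.0, n)             # stress range Δσ [MPa]
a0 = rng.lognormal(np.log(0.5e-3), 0.2, n)  # initial crack size [m]
ac, m, Y = 10e-3, 3.0, 1.0                  # final crack size, exponent, geometry factor

# Closed-form integral of the Paris law from a0 to ac (valid for m != 2):
# N = (ac^(1-m/2) - a0^(1-m/2)) / (C (Y Δσ)^m π^(m/2) (1 - m/2))
N = (ac**(1 - m/2) - a0**(1 - m/2)) / (C * (Y * ds)**m * np.pi**(m/2) * (1 - m/2))

life_b01 = np.percentile(N, 0.1)   # low quantile used as a conservative design life
```

The empirical distribution of N, rather than a single deterministic life, is the output; a low quantile of it serves as the reliability-based design life.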

  1. A model order reduction approach to construct efficient and reliable virtual charts in computational homogenisation

    Kerfriden, Pierre; Goury, Olivier; Khac Chi, Hoang; Bordas, Stéphane


    Computational homogenisation is a widely spread technique to calculate the overall properties of a composite material from the knowledge of the constitutive laws of its microscopic constituents [1, 2]. Indeed, it relies on fewer assumptions than analytical or semi-analytical homogenisation approaches and can be used to coarse-grain a large range of micro-mechanical models. However, this accuracy comes at large computational costs, which prevents computational homogenisation from b...

  2. From integrated control to integrated farming, an experimental approach

    Vereijken, P.H.


    Integrated control or integrated pest management (IPM), as envisaged originally, is not being practised to any large extent in arable farming, notwithstanding considerable research efforts. The reasons for this are discussed. A more basic approach called integrated farming is suggested. Preliminary

  3. Synergy between experimental and computational approaches to homogeneous photoredox catalysis.

    Demissie, Taye B; Hansen, Jørn H


    In this Frontiers article, we highlight how state-of-the-art density functional theory calculations can contribute to the field of homogeneous photoredox catalysis. We discuss challenges in the fields and potential solutions to be found at the interface between theory and experiment. The exciting opportunities and insights that can arise through such an interdisciplinary approach are highlighted.

  4. A test-retest reliability study of human experimental models of histaminergic and non-histaminergic itch

    Andersen, Hjalte Holm; Sørensen, Anne-Kathrine R.; Nielsen, Gebbie A. R.


    Numerous exploratory, proof-of-concept and interventional studies have used histaminergic and non-histaminergic human models of itch. However, no reliability studies for such surrogate models have been conducted. This study investigated the test-retest reliability for the response to histamine......- and cowhage- (5, 15, 25 spiculae) induced itch in healthy volunteers. Cowhage spiculae were individually applied with tweezers and 1% histamine was applied with a skin prick test (SPT) lancet, both on the volar forearm. The intensity of itch was recorded on a visual analogue scale and self-reported area...

  5. The impact of loneliness on paranoia: An experimental approach.

    Lamster, Fabian; Nittel, Clara; Rief, Winfried; Mehl, Stephanie; Lincoln, Tania


    Loneliness is a common problem in patients with schizophrenia, and may be particularly linked with persecutory ideation. Nevertheless, its role as a potential risk factor in the formation and maintenance of persecutory delusions is largely unexplored. Loneliness was experimentally manipulated using a false-feedback paradigm in a non-clinical sample (n = 60). Change in state paranoia was compared between the induction of increased loneliness, the induction of reduced loneliness and a control condition. Distinct associations between pre-post scores of loneliness and state paranoia were examined at three (medium/high/low) levels of proneness to psychosis across the experimental conditions. Reduction of loneliness was associated with a significant reduction of present paranoid beliefs, while induction of loneliness led to more pronounced paranoia at the trend-significance level. Moreover, proneness to psychosis significantly moderated the impact of loneliness on paranoia. Persons with a pronounced level of proneness to psychosis showed a stronger reduction of paranoid beliefs as a consequence of a decrease in loneliness than less prone individuals. A limitation is the small size of our sample, which may have limited the power to detect significant within-group changes in state paranoia in the high-loneliness condition and changes in loneliness in the low-loneliness condition. The findings support the feasibility of the experimental design to manipulate loneliness and suggest that loneliness could be a cause of paranoia. However, the findings need to be confirmed in high-risk samples to draw conclusions about the role of loneliness in the genesis of clinically relevant levels of paranoia and derive implications for cognitive behaviour therapy. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Kriging approach for the experimental cross-section covariances estimation

    Garlaud A.


    In the classical use of a generalized χ2 to determine the evaluated cross-section uncertainty, we need the covariance matrix of the experimental cross sections. The usual error-propagation method for estimating the covariances is hardly usable, and the lack of data prevents use of the direct empirical estimator. We propose in this paper to apply the kriging method, which makes it possible to estimate the covariances via the distances between the points and with some assumptions on the covariance matrix structure. All the results are illustrated with measurements of the 55Mn nucleus.
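
    The core idea, building a covariance matrix from distances between measurement points under an assumed correlation structure, can be sketched as follows. The exponential kernel, correlation length, and data values are all illustrative assumptions, not the paper's model.

```python
import numpy as np

# Assumed measurement grid: cross sections at a few incident energies (MeV)
# with known standard deviations (the diagonal comes from experiment).
E     = np.array([0.5, 1.0, 1.5, 3.0, 5.0])       # energy points
sigma = np.array([0.04, 0.03, 0.03, 0.05, 0.06])  # std. dev. of each point

# Kriging-style assumption: correlation decays with the distance between
# points.  Exponential kernel with correlation length L (a modelling choice).
L = 1.0
corr = np.exp(-np.abs(E[:, None] - E[None, :]) / L)
cov  = sigma[:, None] * sigma[None, :] * corr     # full covariance matrix
```

The resulting matrix is symmetric and positive definite by construction, which is exactly what a generalized χ2 evaluation requires.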

  7. Reliable experimental setup to test the pressure modulation of Baerveldt Implant tubes for reducing post-operative hypotony

    Ramani, Ajay

    future to evaluate custom inserts and their effects on the pressure drop over 4 -- 6 weeks. The design requirements were: simulate physiological conditions [flow rate between 1.25 and 2.5 μl/min], evaluate small-inner-diameter tubes [50 and 75 μm] and annuli, and demonstrate reliability and repeatability. The current study was focused on benchmarking the experimental setup for the IOP range of 15 -- 20 mm Hg. Repeated experiments were conducted using distilled water with configurations [diameter of tube, insert diameter, lengths of insert and tube, and flow rate] that produce pressure variations which include the 15 -- 20 mm Hg range. Two similar setups were assembled and evaluated for repeatability between the two. Experimental measurements of pressure drop were validated using theoretical calculations. Theory predicted a range of expected values by considering manufacturing and performance tolerances of the apparatus components: tube diameter, insert diameter, and the flow rate and pressure [controlled by the pump]. In addition, preliminary experiments evaluated the dissolution of suture samples in a balanced salt solution and in distilled water. The balanced salt solution approximates the properties of the eye's aqueous humor, and it was expected that the salt and acid would help to hydrolyze sutures much faster than distilled water. Suture samples in a balanced salt solution showed signs of deterioration [flaking] within 23 days, while distilled-water samples showed only slight signs of deterioration after about 30 days. These preliminary studies indicate that future dissolution and flow experiments should be conducted using the balanced salt solution. Also, the absorbable sutures showed signs of bulk erosion/deterioration in a balanced salt solution after 14 days, which indicates that they may not be suitable as inserts in the implant tubes because flakes could block the tube entrance. Further long-term studies should be performed in order to understand the effects of
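
    The theoretical pressure-drop calculation for such narrow tubes is presumably laminar Hagen-Poiseuille flow. A quick sketch with an assumed tube length and water viscosity (both assumptions for illustration) shows the numbers land near the stated IOP range:

```python
import math

# Hagen-Poiseuille pressure drop ΔP = 128 μ L Q / (π d^4) for laminar flow.
mu = 1.0e-3            # Pa·s, distilled water at ~20 °C
L  = 10e-3             # m, assumed 10 mm tube length (illustrative)
d  = 50e-6             # m, 50 μm inner diameter (from the study)
Q  = 2.5e-9 / 60.0     # m³/s, 2.5 μl/min (upper end of physiological range)

dP_pa   = 128 * mu * L * Q / (math.pi * d**4)
dP_mmhg = dP_pa / 133.322   # convert Pa to mm Hg
```

With these assumed values the drop is roughly 20 mm Hg, i.e. the same order as the 15 -- 20 mm Hg IOP benchmark range, and the strong d⁴ sensitivity explains why tube-diameter tolerances dominate the predicted uncertainty band.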

  8. Single Component Sorption-Desorption Test Experimental Design Approach Discussions

    Phil Winston


    A task was identified within the fission-product-transport work package to develop a path forward for testing to determine the behavior of volatile fission products and to engage members of the NGNP community to advise and dissent on the approach. The following document is a summary of the discussions and the specific approaches suggested for components of the testing. Included in the summary are the minutes of the conference call that was held with INL and external interested parties to elicit comments on the approaches brought forward by the INL participants. The conclusion was that an initial non-radioactive, single-component test will be useful to establish the limits of currently available chemical detection methods and to evaluate source-dispersion uniformity. In parallel, development of a real-time low-concentration monitoring method is believed to be useful in detecting rapid dispersion as well as desorption phenomena. Ultimately, the test cycle is expected to progress to the use of radio-traced species, simply because this method will allow the lowest possible detection limits. The consensus of the conference call was that there is no need for an in-core test because the duct and heat exchanger surfaces that will be the sorption target will be outside the main neutron flux and will not be affected by irradiation. Participants in the discussion and contributors to the INL approach were Jeffrey Berg, Pattrick Calderoni, Gary Groenewold, Paul Humrickhouse, Brad Merrill, and Phil Winston. Participants from outside the INL included David Hanson of General Atomics; Todd Allen, Tyler Gerczak, and Izabela Szlufarska of the University of Wisconsin; Gary Was of the University of Michigan; Sudarshan Loyalka and Tushar Ghosh of the University of Missouri; and Robert Morris of Oak Ridge National Laboratory.

  9. Experimental approach of the single pedestrian-induced excitation

    Kala, J.; Bajer, M.; Barnat, J.; Smutný, J.


    Pedestrian-induced vibrations are a criterion for serviceability. This loading is significant for light-weight footbridge structures, but was established as a basic loading for the ceilings of various ordinary buildings. Wide variations of this action exist. To verify the different conclusions of various authors, vertical pressure measurements invoked during walking were performed. In the article the approaches of different design codes are also shown.

  10. Experimental approach for thermal parameters estimation during glass forming process

    Abdulhay, B.; Bourouga, B.; Alzetto, F.; Challita, C.


    In this paper, an experimental device designed and developed to estimate thermal conditions at the glass/piston contact interface is presented. This device is made of two parts: the upper part contains the piston, made of metal, and a heating device to raise the temperature of the piston up to 500 °C. The lower part is composed of a lead crucible and a glass sample. The assembly is provided with a heating system, an induction furnace of 6 kW, for heating the glass up to 950 °C. The developed experimental procedure permitted, in a previously published study, estimation of the Thermal Contact Resistance (TCR) using the inverse technique developed by Beck [1]. The semi-transparent character of the glass has been taken into account by an additional radiative heat flux and an equivalent thermal conductivity. After the set-up tests, reproducibility experiments for a specific contact pressure were carried out with a maximum dispersion that does not exceed 6%. Then, experiments under different conditions for a specific glass forming process regarding the application (packaging, buildings and automobile) were carried out. The objective is to determine experimentally, for each application, the typical conditions capable of minimizing the glass temperature loss during the glass forming process.


    Amin Salem


    The present investigation provides a detailed relationship between powder composition and the reliability of random ceramic beds. This evaluation is important for beds standing in liquid-gas contactors as well as for predicting lifetime. It is still unclear whether the normal distribution is the most suitable function for estimating failure. With the growing application of ceramic beds in chemical plants, special attention has been paid to screening strength distributions. To achieve this goal, an experimental-theoretical study of the compressive strength distribution is presented. The powder compositions were prepared according to statistical response surface methodology and then formed by a single-screw extruder as Raschig rings. The compressive strength of the specimens was measured, and the strength data sets were evaluated with normal and Weibull distributions. The results were analyzed by the Akaike information criterion and the Anderson-Darling test. The accuracy of the distributions in predicting fracture is discussed.
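
    The distribution comparison described here can be sketched with synthetic strength data: fit both candidate distributions and compare them by AIC. The data, the probability-plot Weibull estimator, and the parameter values are illustrative assumptions, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic compressive-strength data (MPa); drawn Weibull here purely for
# illustration.  Real data would come from the Raschig-ring crush tests.
x = 300.0 * rng.weibull(1.8, size=1000)
n = x.size

# Normal fit (MLE) and its log-likelihood.
mu, s2 = x.mean(), x.var()
logl_norm = -0.5 * n * (np.log(2 * np.pi * s2) + 1)

# Two-parameter Weibull fit via the probability-plot (least-squares) method:
# ln(-ln(1-F)) = k ln x - k ln λ is linear in ln x.
xs = np.sort(x)
F = (np.arange(1, n + 1) - 0.5) / n
k, intercept = np.polyfit(np.log(xs), np.log(-np.log(1 - F)), 1)
lam = np.exp(-intercept / k)
logl_wb = (n * np.log(k) - n * k * np.log(lam)
           + (k - 1) * np.log(x).sum() - ((x / lam) ** k).sum())

# AIC = 2·(number of parameters) − 2·logL; lower is better.
aic_norm, aic_wb = 4 - 2 * logl_norm, 4 - 2 * logl_wb
```

With skewed (low-shape) strength data the Weibull AIC comes out lower, which mirrors the kind of screening the abstract describes; an Anderson-Darling test would add a formal goodness-of-fit check on top.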

  12. An approach for the condensed presentation of intuitive citation impact metrics which remain reliable with very few publications

    Campbell, D.; Tippett, Ch.; Côté, G.; Roberge, G.; Archambault, E.


    An approach is presented for displaying citation data in a condensed and intuitive manner that allows for their reliable interpretation by policy analysts even in cases where the number of peer-reviewed publications produced by a given entity remains small. The approach is described using country-level data in Agronomy & Agriculture (2004–2013), an area of specialisation for many developing countries with a small output size. Four citation impact metrics, and a synthesis graph that we call the distributional micro-charts of relative citation counts, are considered in building our “preferred” presentation layout. These metrics include two indicators that have long been used by Science-Metrix in its bibliometric reports, the Average of Relative Citations (ARC) and the percentage of publications among the 10% most cited publications in the database (HCP), as well as two newer metrics, the Median of Relative Citations (MRC) and the Relative Integration Score (RIS). The findings reveal that the proposed approach, combining the MRC and HCP with the distributional micro-charts, makes it possible to better qualify the citation impact of entities in terms of central location, density of the upper citation tail, and overall distribution than Science-Metrix's former approach based on the ARC and HCP. This is especially true of cases with small population sizes, where a strong presence of outliers (denoted by strong HCP scores) can have a significant effect on the central location of the citation data when estimated with an average. (Author)
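
    The ARC/MRC/HCP metrics named above are simple to compute, and a toy example makes the outlier argument concrete. The citation counts, world baseline, and top-10% threshold below are hypothetical values, not the paper's data.

```python
import numpy as np

# Hypothetical citation counts for one country's papers, plus the world
# baseline (expected citations per paper for the same field and year) and
# an assumed database-wide 90th-percentile citation threshold.
cites = np.array([0, 1, 2, 2, 3, 5, 8, 12, 40, 150])
world_avg = 6.0
top10_threshold = 20

rc  = cites / world_avg                  # relative citations per paper
arc = rc.mean()                          # Average of Relative Citations
mrc = np.median(rc)                      # Median of Relative Citations
hcp = (cites >= top10_threshold).mean()  # share of highly cited papers

# The single outlier (150 citations) drags the ARC far above the MRC,
# which is why the median is the more robust central-location measure
# for small publication counts.
```

Here ARC ≈ 3.7 while MRC ≈ 0.67: the average suggests far-above-world impact, the median says the typical paper sits below the world level, and HCP (0.2) separately flags the strong upper tail.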

  13. YETE: Distributed, Networked Embedded Control Approaches for Efficient Reliable Mobile Systems

    Mikschl, Tobias; Hilgarth, Alexander; Kempf, Florian; Kheirkah, Ali; Tzschichholz, Tristan; Montenegro, Sergio; Schilling, Klaus


    A typical space vehicle (space robot, orbiter, lander, rover) integrates many computers, which are limited and optimized to their specific tasks. In this paper we present an alternative approach to the overall data processing architecture of these mobile systems. By using a computing cluster instead of specialized nodes and by implementing an innovative data processing concept, optimal use of processing power is achieved and the robustness of the whole system is increased. As the whole system is agnostic of the connection used between processing nodes and sensors or actuators, we will later use wireless links. This enables expansion of the system beyond the structural boundaries of the vehicle and thereby the construction and assembly of fractionated spacecraft.

  14. Bayes reliability experimental design for exponentially distributed products based on experimental loss

    冯文哲; 刘琦


    The reliability demonstration experimental design for exponentially distributed products is developed based on Bayes theory. Considering the losses caused by the producer's risk and the consumer's risk, as well as the reliability experiment cost, a reliability experimental design model based on experimental loss (REDMEL) is constructed for the reliability experimental design of exponentially distributed products. For a given number of failures, a method for calculating the least experimental time is presented, under the principle of minimizing the posterior expected loss, according to a 0-1 loss function. Considering the average risk criteria, formulas for the producer's risk and consumer's risk are deduced. Finally, an example is given to show the validity of the method.
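
    The abstract gives no formulas, but the conjugate-Bayes machinery behind such exponential test plans can be sketched. The sketch below uses a gamma prior on the failure rate and a simple posterior-confidence stopping rule rather than the paper's 0-1-loss REDMEL model; every parameter value is an illustrative assumption.

```python
import math

# For exponential lifetimes, a Gamma(a0, b0) prior on the failure rate λ is
# conjugate: after observing r failures in total test time T, the posterior
# is Gamma(a0 + r, b0 + T).  We search for the shortest T that gives 90%
# posterior confidence that λ is below a requirement lam_max.
a0, b0 = 1, 100.0        # integer prior shape keeps the gamma CDF closed-form
r = 2                    # allowed failure count during the demonstration test
lam_max = 0.01           # required upper bound on the failure rate [1/h]

def post_conf(T):
    """P(λ <= lam_max | r failures in total test time T)."""
    k, x = a0 + r, (b0 + T) * lam_max
    return 1.0 - sum(math.exp(-x) * x**i / math.factorial(i) for i in range(k))

# Shortest test time giving 90% posterior confidence (coarse 1-hour steps).
T = 0.0
while post_conf(T) < 0.90:
    T += 1.0
```

A loss-based design like REDMEL would replace the fixed 0.90 confidence target with a trade-off between test cost (growing with T) and the expected losses from the producer's and consumer's risks.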

  15. Different concepts of univentricular heart. Experimental embryological approach.

    de la Cruz, M V


    A brief historical review is made of the main descriptions, definitions, classifications and nomenclatures of the univentricular heart, within the historical context of the evolution of technical and scientific knowledge. The development of the ventricles, and particularly of the posterior interventricular septum (posterior smooth plus posterior trabeculated septa), was studied experimentally in the chick embryo by in vivo labelling techniques, since these structures are very important for the definition of the univentricular heart. It was demonstrated that: 1. In the apical region of the embryonic heart only a primitive interventricular septum develops, and this gives origin to the posterior interventricular septum. 2. The walls of the bulbus cordis and the primitive ventricle adjacent to the primitive interventricular septum contribute to the development of this septum. 3. Cell division and cell movement are the basic developmental processes which seem to take part in the development of the posterior interventricular septum.

  16. An Experimental Approach to Simulations of the CLIC Interaction Point

    Esberg, Jakob


    an understanding of definitions of processes and quantities that will be used throughout the rest of the thesis. The 7th chapter focuses on the parts of my work that are related to experiment. The main topic is the NA63 Trident experiment, which will be discussed in detail. Results of the crystalline undulator...... experiments conducted at MAMI will be presented. Furthermore, the chapter discusses the performance of new CMOS-based detectors to be used in future experiments by the NA63 collaboration. The chapter on collider simulations introduces the beam-beam simulation codes GUINEA-PIG and GUINEA-PIG++, their methods...... to theoretical ones, and simulations are applied to the 3 TeV CLIC scenario. Here, experimentally based conclusions on the applicability of the theory for strong field production of pairs will be made. In the chapter on depolarization, simulations of the beam-beam depolarization will be presented. The chapter...

  17. Impact of drilling fluids on seagrasses: an experimental community approach

    Morton, R.D.; Duke, T.W.; Macauley, J.M.; Clark, J.R.; Price, W.A.


    Effects of a used drilling fluid on an experimental seagrass community (Thalassia testudinum) were measured by exposing the community to the suspended particulate phase (SPP) in laboratory microcosms. Structure of the macroinvertebrate assemblage, growth and chlorophyll content of grass and associated epiphytes, and rates of decomposition as indicated by weight loss of grass leaves in treated and untreated microcosms were compared. There were statistically significant differences in community structure and function between untreated microcosms and those receiving the clay and drilling fluid. For example, drilling fluid and clay caused a significant loss in the numbers of the ten most numerically abundant (dominant) macroinvertebrates, and drilling fluid decreased the rate at which Thalassia leaves decomposed.

  18. Flowability of granular materials with industrial applications - An experimental approach

    Torres-Serra, Joel; Romero, Enrique; Rodríguez-Ferran, Antonio; Caba, Joan; Arderiu, Xavier; Padullés, Josep-Manel; González, Juanjo


    Designing bulk material handling equipment requires a thorough understanding of the mechanical behaviour of powders and grains. Experimental characterization of granular materials is introduced focusing on flowability. A new prototype is presented which performs granular column collapse tests. The device consists of a channel whose design accounts for test inspection using visualization techniques and load measurements. A reservoir is attached where packing state of the granular material can be adjusted before run-off to simulate actual handling conditions by fluidisation and deaeration of the pile. Bulk materials on the market, with a wide range of particle sizes, can be tested with the prototype and the results used for classification in terms of flowability to improve industrial equipment selection processes.

  19. Inulin, oligofructose and bone health: experimental approaches and mechanisms.

    Weaver, Connie M


    Inulin-type fructans have been proposed to benefit mineral retention, thereby enhancing bone health. Many, but not all, experimental animal studies have shown increased mineral absorption from feeding non-digestible oligosaccharides. Possible reasons for the inconsistencies are explored. A few studies have reported enhanced bone mineral density or content. Bone health can be evaluated in chronic feeding studies with bone densitometry, bone breaking strength, bone mineral concentration and bone structure. Isotopic Ca tracers can be used to determine the point of metabolism affected by feeding a functional food ingredient. These methods and the effects of feeding inulin-type fructans are reviewed. Inulin-type fructans enhance Mg retention. Chicory long-chain inulin and oligofructose enhance femoral Ca content, bone mineral density and Ca retention through enhanced Ca absorption and suppressed bone turnover rates, although they are not bone-promoting under all conditions.

  20. Flowability of granular materials with industrial applications - An experimental approach

    Torres-Serra, Joel


    Full Text Available Designing bulk material handling equipment requires a thorough understanding of the mechanical behaviour of powders and grains. Experimental characterization of granular materials is introduced focusing on flowability. A new prototype is presented which performs granular column collapse tests. The device consists of a channel whose design accounts for test inspection using visualization techniques and load measurements. A reservoir is attached where packing state of the granular material can be adjusted before run-off to simulate actual handling conditions by fluidisation and deaeration of the pile. Bulk materials on the market, with a wide range of particle sizes, can be tested with the prototype and the results used for classification in terms of flowability to improve industrial equipment selection processes.

  1. A Step by Step Approach for Evaluating the Reliability of the Main Engine Lube Oil System for a Ship's Propulsion System

    Mohan Anantharaman


    Full Text Available Effective and efficient maintenance is essential to ensure the reliability of a ship's main propulsion system, which in turn is interdependent on the reliability of a number of associated sub-systems. A primary step in evaluating the reliability of the ship's propulsion system is to evaluate the reliability of each of the sub-systems. This paper discusses the methodology adopted to quantify the reliability of one of the vital sub-systems, viz. the lubricating oil system, and the development of a model thereof based on Markov analysis. Once the model has been developed, means to improve the reliability of the system should be considered. The cost of the incremental reliability should be measured to evaluate cost benefits. A maintenance plan can then be devised to achieve the higher level of reliability. A similar approach can be adopted to evaluate the reliability of all other sub-systems. This will finally lead to the development of a model to evaluate and improve the reliability of the main propulsion system.
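The kind of Markov availability model referred to above can be illustrated with a minimal sketch. The two-state repair model and the series/parallel combinators below are textbook constructs, not the authors' actual lube oil system model, and all rates in the usage note are hypothetical.

```python
import math

def steady_availability(lam, mu):
    """Steady-state availability of a two-state Markov repair model:
    Up --(failure rate lam)--> Down --(repair rate mu)--> Up.
    The balance equation lam * P_up = mu * P_down with P_up + P_down = 1
    gives A = P_up = mu / (lam + mu)."""
    return mu / (lam + mu)

def series(*avail):
    """All subsystems must be up (e.g., pump -> cooler -> filter)."""
    return math.prod(avail)

def parallel(*avail):
    """At least one redundant unit up (e.g., main + standby pump)."""
    return 1.0 - math.prod(1.0 - a for a in avail)
```

With hypothetical per-unit figures, a redundant pump pair feeding a cooler and a filter would be scored as `series(parallel(A_pump, A_pump), A_cooler, A_filter)`, and the marginal cost of raising any one unit's availability can then be weighed against the system-level gain.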

  2. Reliable fault detection and diagnosis of photovoltaic systems based on statistical monitoring approaches

    Harrou, Fouzi


    This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of the one-diode model and those of the univariate and multivariate exponentially weighted moving average (EWMA) charts to better detect faults. Specifically, we generate the array's residuals of current, voltage and power using measured temperature and irradiance. These residuals capture the difference between the measurements and the one-diode model's maximum power point (MPP) predictions for the current, voltage and power, and are used as fault indicators. Then, we apply the multivariate EWMA (MEWMA) monitoring chart to the residuals to detect faults. However, a MEWMA scheme cannot identify the type of fault. Once a fault is detected in the MEWMA chart, the univariate EWMA chart based on the current and voltage indicators is used to identify the type of fault (e.g., short-circuit, open-circuit and shading faults). We applied this strategy to real data from the grid-connected PV system installed at the Renewable Energy Development Center, Algeria. Results show the capacity of the proposed strategy to monitor the DC side of PV systems and detect partial shading.
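The univariate EWMA chart used for fault identification can be sketched as below (the MEWMA step is omitted). This is a generic EWMA implementation under the usual in-control assumptions (zero-mean residuals with known sigma); the smoothing constant and control-limit width are conventional defaults, not values from the study.

```python
import math

def ewma_alarms(residuals, lam=0.2, L=3.0, sigma=1.0):
    """Univariate EWMA chart on a residual stream.
    z_t = lam * x_t + (1 - lam) * z_{t-1}, with z_0 = 0 (in-control mean 0).
    Time-varying control limit: L * sigma * sqrt(lam/(2-lam) * (1-(1-lam)^(2t))).
    Returns the 1-based sample indices at which |z_t| exceeds the limit."""
    z, alarms = 0.0, []
    for t, x in enumerate(residuals, start=1):
        z = lam * x + (1.0 - lam) * z
        limit = L * sigma * math.sqrt(
            lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * t)))
        if abs(z) > limit:
            alarms.append(t)
    return alarms
```

A sustained shift in a current or voltage residual, such as the one a short-circuit fault would induce, drives z_t past the limit within a few samples, whereas an in-control stream raises no alarms.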

  3. The Threat of Uncertainty: Why Using Traditional Approaches for Evaluating Spacecraft Reliability are Insufficient for Future Human Mars Missions

    Stromgren, Chel; Goodliff, Kandyce; Cirillo, William; Owens, Andrew


    Through the Evolvable Mars Campaign (EMC) study, the National Aeronautics and Space Administration (NASA) continues to evaluate potential approaches for sending humans beyond low Earth orbit (LEO). A key aspect of these missions is the strategy that is employed to maintain and repair the spacecraft systems, ensuring that they continue to function and support the crew. Long-duration missions beyond LEO present unique and severe maintainability challenges due to a variety of factors, including: limited to no opportunities for resupply, the distance from Earth, mass and volume constraints of spacecraft, high sensitivity of transportation element designs to variation in mass, the lack of abort opportunities to Earth, limited hardware heritage information, and the operation of human-rated systems in a radiation environment with little to no experience. The current approach to maintainability, as implemented on ISS, which includes a large number of spares pre-positioned on ISS, a larger supply sitting on Earth waiting to be flown to ISS, and on-demand delivery of logistics from Earth, is not feasible for future deep space human missions. For missions beyond LEO, significant modifications to the maintainability approach will be required. Through the EMC evaluations, several key findings related to the reliability and safety of the Mars spacecraft have been made. The nature of random and induced failures presents significant issues for deep space missions. Because spare parts cannot be flown as needed for Mars missions, all required spares must be flown with the mission or pre-positioned. These spares must cover all anticipated failure modes and provide a level of overall reliability and safety that is satisfactory for human missions. This will require a large amount of mass and volume to be dedicated to the storage and transport of spares for the mission. Further, there is, and will continue to be, a significant amount of uncertainty regarding failure rates for spacecraft

  4. [The shaping of ethical approach to medical experimentation in Poland].

    Heimrath, Tadeusz


    Contemporary awareness of bioethical norms in treatment and research experiments in medicine underwent a long evolution in Poland. The process was similar to that in other countries developing this kind of research. The negative experiences of World War II resulted in a tendency to curb experimentation on humans. The seventies witnessed a heated discussion leading to the settlement of this problem. The argument was that stringent limitations on such experiments would stop the development of new, indispensable drugs. This discussion also reached Poland, where the majority of participants were doctors and lawyers expressing their views mostly in the press. This led in consequence to the establishment of the first ethical norms for medical experiments on humans. These were expressed in subsequent codes of ethics for the medical profession, as well as in the founding of the first bioethical commissions at medical schools. Currently, such norms, based on the so-called "good clinical practice" standards, are incorporated in the legal acts that establish bioethical committees at medical schools and other medical institutes. This paper also presents some specific aspects of research on female patients.

  5. Functional complexity and ecosystem stability: an experimental approach

    Van Voris, P.; O' Neill, R.V.; Shugart, H.H.; Emanuel, W.R.


    The complexity-stability hypothesis was experimentally tested using intact terrestrial microcosms. Functional complexity was defined as the number and significance of component interactions (i.e., population interactions, physical-chemical reactions, biological turnover rates) influenced by nonlinearities, feedbacks, and time delays. It was postulated that functional complexity could be nondestructively measured through analysis of a signal generated from the system. The power spectrum of hourly CO2 efflux from eleven old-field microcosms was analyzed for the number of low-frequency peaks and used to rank the functional complexity of each system. Ranking of ecosystem stability was based on the capacity of the system to retain essential nutrients and was measured by the net loss of Ca after the system was stressed. Rank correlation supported the hypothesis that increasing ecosystem functional complexity leads to increasing ecosystem stability. The results indicated that complex functional dynamics can serve to stabilize the system. The results also demonstrated that microcosms are useful tools for system-level investigations.
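The spectral ranking step can be illustrated with a small sketch: a raw periodogram of an hourly signal plus a count of local maxima as a crude proxy for the number of low-frequency peaks. This is a generic illustration under assumed conventions (mean-removed discrete Fourier transform, peaks as strict local maxima), not the authors' analysis pipeline.

```python
import cmath
import math

def periodogram(x):
    """Raw periodogram (mean removed) at Fourier frequencies k = 1 .. n/2 - 1,
    computed by a naive DFT; adequate for short hourly records."""
    n = len(x)
    m = sum(x) / n
    xs = [v - m for v in x]
    return [abs(sum(xs[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(1, n // 2)]

def count_spectral_peaks(power):
    """Count strict local maxima of the spectrum, a crude stand-in for the
    low-frequency peak count used to rank functional complexity."""
    return sum(1 for i in range(1, len(power) - 1)
               if power[i - 1] < power[i] > power[i + 1])
```

A pure sinusoid at the k-th Fourier frequency shows up as a single spike in the corresponding bin; a real CO2 efflux record would show several such peaks at diurnal and slower periods.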

  6. Electrochemistry of moexipril: experimental and computational approach and voltammetric determination.

    Taşdemir, Hüdai I; Kiliç, E


    The electrochemistry of moexipril (MOE) was studied by electrochemical methods, with theoretical calculations performed at the B3LYP/6-31+G(d)//AM1 level. Cyclic voltammetric studies were carried out based on a reversible, adsorption-controlled reduction peak at -1.35 V on a hanging mercury drop electrode (HMDE). Concurrently, an irreversible, diffusion-controlled oxidation peak at 1.15 V on a glassy carbon electrode (GCE) was also employed. Potential values are given versus Ag/AgCl (3.0 M KCl), and measurements were performed in Britton-Robinson buffer of pH 5.5. Tentative electrode mechanisms were proposed according to the experimental results and ab initio calculations. Square-wave adsorptive stripping voltammetric methods were developed and validated for quantification of MOE in pharmaceutical preparations. The linear working range was established as 0.03-1.35 microM for the HMDE and 0.2-20.0 microM for the GCE. The limit of quantification (LOQ) was calculated to be 0.032 and 0.47 microM for the HMDE and GCE, respectively. The methods were successfully applied to assay the drug in tablets by calibration and standard addition methods, with good recoveries between 97.1% and 106.2% and relative standard deviations less than 10%.

  7. Identification of energy storage rate components. Theoretical and experimental approach

    Oliferuk, W; Maj, M, E-mail: [Fundamental Technological Research, Polish Academy of Sciences, Pawinskiego 5b, 02-106 Warszawa (Poland)


    The subject of the present paper is the decomposition of the energy storage rate into terms related to different modes of deformation. The stored energy is the change in internal energy due to plastic deformation after specimen unloading; hence, this energy describes the state of the cold-worked material. The ratio of the stored energy increment to the corresponding increment of plastic work, in turn, is the measure of the energy conversion process; this ratio is called the energy storage rate. Experimental results show that the energy storage rate depends on plastic strain. This dependence is influenced by different microscopic deformation mechanisms. It has been shown that the energy storage rate can be presented as a sum of components, each related to a separate internal microscopic mechanism. Two of the components are identified. One is the storage rate of statistically stored dislocation energy, related to uniform deformation. The other is connected with non-uniform deformation at the grain level: the storage rate of the long-range stress energy and geometrically necessary dislocation energy. The maximum of the energy storage rate, which appears at the initial stage of plastic deformation, is discussed in terms of internal micro-stresses.

  8. Sparsely Sampling the Sky: A Bayesian Experimental Design Approach

    Paykari, P


    The next generation of galaxy surveys will observe millions of galaxies over large volumes of the universe. These surveys are expensive both in time and cost, raising questions regarding the optimal investment of this time and money. In this work we investigate criteria for selecting amongst observing strategies for constraining the galaxy power spectrum and a set of cosmological parameters. Depending on the parameters of interest, it may be more efficient to observe a larger, but sparsely sampled, area of sky instead of a smaller contiguous area. In this work, by making use of the principles of Bayesian Experimental Design, we will investigate the advantages and disadvantages of the sparse sampling of the sky and discuss the circumstances in which a sparse survey is indeed the most efficient strategy. For the Dark Energy Survey (DES), we find that by sparsely observing the same area in a smaller amount of time, we only increase the errors on the parameters by a maximum of 0.45%. Conversely, investing the sam...

  9. Charge Transport in LDPE Nanocomposites Part I—Experimental Approach

    Anh T. Hoang


    Full Text Available This work presents results of bulk conductivity and surface potential decay measurements on low-density polyethylene and its nanocomposites filled with uncoated MgO and Al2O3, with the aim to highlight the effect of the nanofillers on charge transport processes. Material samples at various filler contents, up to 9 wt %, were prepared in the form of thin films. The performed measurements show a significant impact of the nanofillers on the reduction of the material's direct current (dc) conductivity. The investigations thus focused on the nanocomposites having the lowest dc conductivity. Various mechanisms of charge generation and transport in solids, including space-charge-limited current, the Poole-Frenkel effect and Schottky injection, were utilized for examining the experimental results. The mobilities of charge carriers were deduced from the measured surface potential decay characteristics and were found to be at least two times lower for the nanocomposites. The temperature dependencies of the mobilities were compared for the different materials.

  10. Organophosphorous pesticides research in Mexico: epidemiological and experimental approaches.

    Sánchez-Guerra, M; Pérez-Herrera, N; Quintanilla-Vega, B


    Non-persistent pesticides such as organophosphorous (OP) insecticides have been extensively used in Mexico and have become a public health problem. This review presents data on OP use and related toxicity from epidemiological and experimental studies conducted in Mexico. Studies in agricultural workers from several regions of the country reported moderate to severe cholinergic symptoms, including decreased acetylcholinesterase (AChE) activity (the main acute OP toxic effect, which causes an over-accumulation of the neurotransmitter acetylcholine), revealing the potential risk of intoxication of Mexican farmers. OP exposure in occupational settings has been associated with decreased semen quality, sperm DNA damage and endocrine disruption, particularly in agricultural workers. Alterations in female reproductive function have also been observed, as well as adverse effects on embryo development from prenatal exposure in agricultural communities. This illustrates that OP exposure represents a risk for reproduction and offspring well-being in Mexico. The genotoxic effects of this group of pesticides in somatic and sperm cells are also documented. Lastly, we present data about gene-environment interactions regarding OP-metabolizing enzymes, such as paraoxonase-1 (PON1), and their role in modulating OP toxicity, particularly on semen quality and sperm DNA integrity. In summary, readers will see the important health problems associated with OP exposure in Mexican populations, hence the need for training programs that communicate to farmers the proper handling of agrochemicals to prevent their toxic effects, and for better-designed human studies to document the current situation of workers and communities dedicated to agricultural activities.

  11. Numerical and Experimental Approaches Toward Understanding Lava Flow Heat Transfer

    Rumpf, M.; Fagents, S. A.; Hamilton, C.; Crawford, I. A.


    We have performed numerical modeling and experimental studies to quantify the heat transfer from a lava flow into an underlying particulate substrate. This project was initially motivated by a desire to understand the transfer of heat from a lava flow into the lunar regolith. Ancient regolith deposits that have been protected by a lava flow may contain ancient solar wind, solar flare, and galactic cosmic ray products that can give insight into the history of our solar system, provided the records were not heated and destroyed by the overlying lava flow. In addition, lava-substrate interaction is an important aspect of lava fluid dynamics that requires consideration in lava emplacement models. Our numerical model determines the depth to which the heat pulse will penetrate beneath a lava flow into the underlying substrate. Rigorous treatment of the temperature dependence of lava and substrate thermal conductivity, specific heat capacity, density, and latent heat release is imperative to an accurate model. Experiments were conducted to verify the numerical model. Experimental containers with interior dimensions of 20 x 20 x 25 cm were constructed from 1-inch-thick calcium silicate sheeting. For initial experiments, the boxes were packed with lunar regolith simulant (GSC-1) to a depth of 15 cm, with thermocouples embedded at regular intervals. Basalt collected at Kilauea Volcano, HI, was melted in a gas forge and poured directly onto the simulant. Initial lava temperatures ranged from ~1200 to 1300 °C. The system was allowed to cool while internal temperatures were monitored by a thermocouple array and external temperatures were monitored by a Forward Looking Infrared (FLIR) video camera. Numerical simulations of the experiments elucidate the details of lava latent heat release and constrain the temperature dependence of the thermal conductivity of the particulate substrate. The temperature dependence of the thermal conductivity of particulate material is not well known.

  12. Chemistry of neutral species in Titan's ionosphere: an experimental approach

    Dubois, David


    Titan's gas-phase atmospheric chemistry leading to the formation of solid organic aerosols can be simulated in laboratory experiments. Typically, plasma reactors can be used to achieve Titan-like conditions. The discharge induces photodissociation and photoionization processes in the N2-CH4 mixture. It faithfully reproduces the energy range of the magnetospheric electrons entering Titan's atmosphere, and it can also approximate the solar UV input at Titan's ionosphere. In this context, it is deemed necessary to apply and exploit such a technique in order to better understand the chemical reactivity occurring under Titan-like conditions. In the present work, we use the Pampre cold dusty plasma experiment with an N2-CH4 gaseous mixture under controlled pressure and gas influx, hence emphasizing the gas phase, which we know is key to the formation of aerosols on Titan. An internal cryogenic trap has been developed to accumulate the gas products during their production and facilitate their detection. These are identified and quantified by in situ mass spectrometry and infrared spectroscopy. We present here results from this experiment with a 90-10% N2-CH4 mixing ratio, using a quantitative approach on nitriles and polycyclic aromatic hydrocarbons.

  13. Experimental Approaches to Study Genome Packaging of Influenza A Viruses

    Catherine Isel


    Full Text Available The genome of influenza A viruses (IAV) consists of eight single-stranded negative-sense viral RNAs (vRNAs) encapsidated into viral ribonucleoproteins (vRNPs). It is now well established that genome packaging (i.e., the incorporation of a set of eight distinct vRNPs into budding viral particles) follows a specific pathway guided by segment-specific cis-acting packaging signals on each vRNA. However, the precise nature and function of the packaging signals, and the mechanisms underlying the assembly of vRNPs into sub-bundles in the cytoplasm and their selective packaging at the viral budding site, remain largely unknown. Here, we review the diverse and complementary methods currently being used to elucidate these aspects of the viral cycle. They range from conventional and competitive reverse genetics, single-molecule imaging of vRNPs by fluorescence in situ hybridization (FISH) and high-resolution electron microscopy and tomography of budding viral particles, to solely in vitro approaches to investigate vRNA-vRNA interactions at the molecular level.

  14. The effect of peptide adsorption on signal linearity and a simple approach to improve reliability of quantification☆

    Warwood, Stacey; Byron, Adam; Humphries, Martin J.; Knight, David


    Peptide quantification using MS often relies on the comparison of peptide signal intensities between different samples, which is based on the assumption that the observed signal intensity has a linear relationship to peptide abundance. A typical proteomics experiment is subject to multiple sources of variance, so we focussed here on properties affecting peptide linearity under simple, well-defined conditions. Peptides from a standard protein digest were analysed by multiple reaction monitoring (MRM) MS to determine peptide linearity over a range of concentrations. We show that many peptides do not display a linear relationship between signal intensity and amount under standard conditions. Increasing the organic content of the sample solvent increased peptide linearity by increasing the accuracy and precision of quantification, which suggests that peptide non-linearity is due to concentration-dependent surface adsorption. Using multiple peptides at various dilutions, we show that peptide non-linearity is related to observed retention time and predicted hydrophobicity. Whereas the effect of adsorption on peptide storage has been investigated previously, here we demonstrate the deleterious effect of peptide adsorption on the quantification of fresh samples, highlight aspects of sample preparation that can minimise the effect, and suggest bioinformatic approaches to enhance the selection of peptides for quantification. Biological significance: Accurate quantification is central to many aspects of science, especially those examining dynamic processes or comparing molecular stoichiometries. In biological research, the quantification of proteins is an important yet challenging objective. Large-scale quantification of proteins using MS often depends on the comparison of peptide intensities with only a single-level calibrant (as in stable isotope labelling and absolute quantification approaches) or no calibrants at all (as in label-free approaches). For these approaches to be
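A linearity screen of the kind implied above, fitting signal intensity against amount and thresholding the coefficient of determination, can be sketched as follows. The r-squared cut-off of 0.99 is an assumed, illustrative criterion, not one from the study.

```python
def linear_fit(x, y):
    """Ordinary least squares fit y ~ a + b*x; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

def is_linear(amounts, intensities, r2_min=0.99):
    """Flag a peptide as quantifiable by a linear-response criterion."""
    return linear_fit(amounts, intensities)[2] >= r2_min
```

A saturating, adsorption-like response fails the screen over a wide dilution series even though a narrow concentration range might look locally linear.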

  15. In vivo application of an optical segment tracking approach for bone loading regimes recording in humans: a reliability study.

    Yang, Peng-Fei; Sanno, Maximilian; Ganse, Bergita; Koy, Timmo; Brüggemann, Gert-Peter; Müller, Lars Peter; Rittweger, Jörn


    This paper demonstrates an optical segment tracking (OST) approach for assessing in vivo bone loading regimes in humans. The relative movement between retro-reflective marker clusters affixed to the tibia cortex by bone screws was tracked and expressed as tibia loading regimes in terms of segment deformation. Stable in vivo fixation of the bone screws was verified by assessing the resonance frequency of the screw-marker structure and the relative marker position changes after hopping and jumping. Tibia deformation was recorded during squatting exercises to demonstrate the reliability of the OST approach. Results indicated that the resonance frequencies remained unchanged before and after all exercises. The changes in Cardan angles between marker clusters induced by the exercises were minor, at most 0.06°. The variability of the deformation angles during squatting remained small (0.04°/m-0.65°/m). Most importantly, all surgical and testing procedures were well tolerated. The OST method promises to provide more insight into the mechanical loading acting on bone than has been possible in the past.

  16. Towards a High Reliable Enforcement of Safety Regulations - A Workflow Meta Data Model and Probabilistic Failure Management Approach

    Heiko Henning Thimm


    Full Text Available Today’s companies are able to automate the enforcement of Environmental, Health and Safety (EH&S) duties through the use of workflow management technology. This approach requires specifying activities that are combined into workflow models for EH&S enforcement duties. In order to meet given safety regulations, these activities must be completed correctly and within given deadlines. Otherwise, activity failures emerge, which may lead to breaches of safety regulations. A novel domain-specific workflow meta data model is proposed. The model enables a system to detect and predict activity failures through the use of data about the company, failure statistics, and activity proxies. Since the detection and prediction methods are based on the evaluation of constraints specified on EH&S regulations, a system approach is proposed that builds on the integration of a Workflow Management System (WMS) with an EH&S Compliance Information System. The main principles of the failure detection and prediction are described. For EH&S managers, the system shall provide insights into the current failure situation. This can help to prevent and mitigate critical situations, such as safety enforcement measures that are behind their deadlines. As a result, a more reliable enforcement of safety regulations can be achieved.

  17. Cortical projection of the inferior choroidal point as a reliable landmark to place the corticectomy and reach the temporal horn through a middle temporal gyrus approach

    Thomas Frigeri


    Full Text Available Objective To establish preoperatively the localization of the cortical projection of the inferior choroidal point (ICP) and use it as a reliable landmark when approaching the temporal horn through a middle temporal gyrus access, and to review relevant anatomical features regarding selective amygdalohippocampectomy (AH) for treatment of mesial temporal lobe epilepsy (MTLE). Method The cortical projection of the inferior choroidal point was used in more than 300 surgeries by one of the authors as a reliable landmark to reach the temporal horn. In the laboratory, forty cerebral hemispheres were examined. Conclusion The cortical projection of the ICP is a reliable landmark for reaching the temporal horn.

  18. Personal Reflections on Observational and Experimental Research Approaches to Childhood Psychopathology

    Rapoport, Judith L.


    The past 50 years have seen dramatic changes in childhood psychopathology research. The goal of this overview is to contrast observational and experimental research approaches; both have grown more complex such that the boundary between these approaches may be blurred. Both are essential. Landmark observational studies with long-term follow-up…

  19. Reliability and validity of neurobehavioral function on the Psychology Experimental Building Language test battery in young adults

    Brian J. Piper


    Full Text Available Background. The Psychology Experiment Building Language (PEBL) software consists of over one hundred computerized tests based on classic and novel cognitive neuropsychology and behavioral neurology measures. Although the PEBL tests are becoming more widely utilized, there is currently very limited information about the psychometric properties of these measures. Methods. Study I examined inter-relationships among nine PEBL tests including indices of motor function (Pursuit Rotor and Dexterity), attention (Test of Attentional Vigilance and Time-Wall), working memory (Digit Span Forward), and executive function (PEBL Trail Making Test, Berg/Wisconsin Card Sorting Test, Iowa Gambling Test, and Mental Rotation) in a normative sample (N = 189, ages 18–22). Study II evaluated test–retest reliability with a two-week inter-test interval between administrations in a separate sample (N = 79, ages 18–22). Results. Moderate intra-test, but low inter-test, correlations were observed, and ceiling/floor effects were uncommon. Sex differences were identified on the Pursuit Rotor (Cohen’s d = 0.89) and Mental Rotation (d = 0.31) tests. The correlation between test and retest was high for tests of motor learning (Pursuit Rotor time on target r = .86) and attention (Test of Attentional Vigilance response time r = .79), intermediate for memory (Digit Span r = .63) but lower for the executive function indices (Wisconsin/Berg Card Sorting Test perseverative errors r = .45, Tower of London moves r = .15). Significant practice effects were identified on several indices of executive function. Conclusions. These results are broadly supportive of the reliability and validity of the individual PEBL tests in this sample. These findings indicate that the freely downloadable, open-source PEBL battery is a versatile research tool to study individual differences in neurocognitive performance.

  20. Reliability and validity of neurobehavioral function on the Psychology Experimental Building Language test battery in young adults.

    Piper, Brian J; Mueller, Shane T; Geerken, Alexander R; Dixon, Kyle L; Kroliczak, Gregory; Olsen, Reid H J; Miller, Jeremy K


Background. The Psychology Experiment Building Language (PEBL) software consists of over one hundred computerized tests based on classic and novel cognitive neuropsychology and behavioral neurology measures. Although the PEBL tests are becoming more widely utilized, there is currently very limited information about the psychometric properties of these measures. Methods. Study I examined inter-relationships among nine PEBL tests including indices of motor function (Pursuit Rotor and Dexterity), attention (Test of Attentional Vigilance and Time-Wall), working memory (Digit Span Forward), and executive function (PEBL Trail Making Test, Berg/Wisconsin Card Sorting Test, Iowa Gambling Test, and Mental Rotation) in a normative sample (N = 189, ages 18-22). Study II evaluated test-retest reliability with a two-week intertest interval between administrations in a separate sample (N = 79, ages 18-22). Results. Moderate intra-test, but low inter-test, correlations were observed, and ceiling/floor effects were uncommon. Sex differences were identified on the Pursuit Rotor (Cohen's d = 0.89) and Mental Rotation (d = 0.31) tests. The test-retest correlation was high for tests of motor learning (Pursuit Rotor time on target, r = .86) and attention (Test of Attentional Vigilance response time, r = .79), intermediate for memory (digit span, r = .63), but lower for the executive function indices (Wisconsin/Berg Card Sorting Test perseverative errors, r = .45; Tower of London moves, r = .15). Significant practice effects were identified on several indices of executive function. Conclusions. These results are broadly supportive of the reliability and validity of individual PEBL tests in this sample. These findings indicate that the freely downloadable, open-source PEBL battery is a versatile research tool to study individual differences in neurocognitive performance.

  1. Examples of feedback, experimental and theoretical approaches for concrete durability assessment

    Toutlemonde F.


This paper presents experimental data obtained from an Ultra-High Performance Fibre-Reinforced Concrete (UHPFRC) exposed for 10 years in a cooling tower and from a high-slag-content concrete exposed for 30 years in a marine environment. The experimental data are then used for assessing concrete durability through a theoretical approach, namely performance-based analysis. The results from the application of this approach are consistent with the penetration depth of aggressive agents measured on core samples. Finally, a simulation method currently being developed by EDF is presented, which has great relevance to durability assessment.

  2. System of systems design: Evaluating aircraft in a fleet context using reliability and non-deterministic approaches

    Frommer, Joshua B.

, the analysis of the vehicle for high-speed attack combined with a long loiter period is considerably different from that for quick cruise to an area combined with a low-speed search. However, the framework developed to solve this class of system-of-systems problem handles both scenarios and leads to a solution type for this kind of problem. At the vehicle level, different technologies can have an impact at the fleet level. One such technology is morphing, the ability to change shape, which is an ideal candidate for missions with dissimilar segments, such as the two mentioned above. A framework using surrogate models based on optimally sized aircraft, with probabilistic parameters defining a concept of operations, is investigated; this has provided insight into the setup of the optimization problem, the use of the reliability metric, and the measurement of fleet-level impacts of morphing aircraft. The research consisted of four phases. The two initial phases built and defined the framework to solve the system-of-systems problem; these investigations used the search-and-find scenario as the example application. The first phase included the design of fixed-geometry and morphing aircraft for a range of missions and evaluated the aircraft capability using non-deterministic mission parameters. The second phase introduced the idea of multiple aircraft in a fleet, but only considered a fleet consisting of one aircraft type. The third phase incorporated the simultaneous design of a new vehicle and allocation into a fleet for the search-and-find scenario; in this phase, multiple types of aircraft are considered. The fourth phase repeated the simultaneous new-aircraft design and fleet allocation for the SEAD scenario to show that the approach is not specific to the search-and-find scenario.
The framework presented in this work appears to be a viable approach for concurrently designing and allocating constituents in a system, specifically aircraft in a fleet.

  3. Power electronics reliability analysis.

    Smith, Mark A.; Atcitty, Stanley


    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
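The fault-tree route from component to system reliability can be sketched as follows; the AND/OR gate helpers are standard, but the example tree and failure probabilities are illustrative assumptions, not data from the report.

```python
# Hedged sketch: deriving system reliability from component reliabilities
# with a fault tree, as the abstract describes. Independent basic events
# are assumed; the example device is hypothetical.

def or_gate(*fail_probs):
    # The event occurs if ANY input event occurs.
    p = 1.0
    for f in fail_probs:
        p *= (1.0 - f)
    return 1.0 - p

def and_gate(*fail_probs):
    # The event occurs only if ALL input events occur.
    p = 1.0
    for f in fail_probs:
        p *= f
    return p

# Illustrative converter: it fails if the controller fails, or if
# both redundant power-stage legs fail.
controller_fail = 0.02
leg_a_fail, leg_b_fail = 0.05, 0.05

system_fail = or_gate(controller_fail, and_gate(leg_a_fail, leg_b_fail))
system_reliability = 1.0 - system_fail
```

The same reduction generalizes to deeper trees by nesting the gate calls.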

  4. An approach to experimental evaluation of real-time fault-tolerant distributed computing schemes

    Kim, K. H.


    A testbed-based approach to the evaluation of fault-tolerant distributed computing schemes is discussed. The approach is based on experimental incorporation of system structuring and design techniques into real-time distributed-computing testbeds centered around tightly coupled microcomputer networks. The effectiveness of this approach has been experimentally confirmed. Primary advantages of this approach include the accuracy of the timing and logical-complexity data and the degree of assurance of the practical effectiveness of the scheme evaluated. Various design issues encountered in the course of establishing the network testbed facilities are discussed, along with their augmentation to support some experiments. The shortcomings of the testbeds are also discussed together with the desired extensions of the testbeds.

  5. A New Metamodeling Approach for Time-dependent Reliability of Dynamic Systems with Random Parameters Excited by Input Random Processes



  6. Combining Best-Practice and Experimental Approaches: Redundancy, Images, and Misperceptions in Multimedia Learning

    Fenesi, Barbara; Heisz, Jennifer J.; Savage, Philip I.; Shore, David I.; Kim, Joseph A.


    This experiment combined controlled experimental design with a best-practice approach (i.e., real course content, subjective evaluations) to clarify the role of verbal redundancy, confirm the multimodal impact of images and narration, and highlight discrepancies between actual and perceived understanding. The authors presented 1 of 3…

  7. A semantic web approach applied to integrative bioinformatics experimentation: a biological use case with genomics data.

    Post, L.J.G.; Roos, M.; Marshall, M.S.; van Driel, R.; Breit, T.M.


    The numerous public data resources make integrative bioinformatics experimentation increasingly important in life sciences research. However, it is severely hampered by the way the data and information are made available. The semantic web approach enhances data exchange and integration by providing

  8. Experimental semiotics: a new approach for studying communication as a form of joint action.

    Galantucci, Bruno


In the last few years, researchers have begun to investigate the emergence of novel forms of human communication in the laboratory. I survey this growing line of research, which may be called experimental semiotics, from three distinct angles. First, I situate the new approach in its theoretical and historical context. Second, I review a sample of studies that exemplify experimental semiotics. Third, I present an empirical study that illustrates how the new approach can help us understand the socio-cognitive underpinnings of human communication. The main conclusion of the paper is that, by reproducing micro-samples of historical processes in the laboratory, experimental semiotics offers powerful new tools for investigating human communication as a form of joint action.

  9. A comparative experimental study on structural and interface damping approaches for vibration suppression purposes

    Liu, Yi; Sanchez, Alberto; Zogg, Markus; Ermanni, Paolo


Dynamic loadings in automotive structures may lead to reduced driving comfort and even to failure of components. Damping treatments are applied in order to attenuate the vibrations and improve the long-term fatigue behavior of the structures. This experimental study targets applications in floor panels that are mounted to the load-carrying primary structure of the vehicle. The objective is to reach outstanding damping performance considering the stringent weight and cost requirements in the automotive industry. An experimental setup has been developed and validated for the determination of the damping properties of structural specimens, also considering interface damping effects. This contribution is structured in three main parts: test rig design, experimental results, and discussion. Reliable and easy-to-use devices for the characterization of the damping properties of specimens between 200×40 mm² and 400×400 mm² are not available off the shelf. In this context, we present a flexible experimental setup which has been realized to (1) support the development of novel damping solutions for multi-functional composite structures; (2) characterize the loss factor of the different damping concepts, including boundary effects. A variety of novel passive and active damping treatments have been investigated, including viscoelastic, Coulomb, magnetorheological (MR), particle, magnetic, and eddy current damping. The particle, interface, and active damping systems show promising performance in comparison to the classical viscoelastic treatments.

  10. Análisis de confiabilidad y riesgo de una instalación experimental para el tratamiento de aguas residuales//Reliability and risk analysis of an experimental set-up for wastewater treatment

    María Adelfa Abreu‐Zamora


One of the modern requirements for using equipment in all areas of the economy, science, and education is its safe operation. In this work, a reliability and risk analysis of an experimental set-up for wastewater treatment with ultraviolet radiation was carried out. The fault tree technique was used, and two variants of calculation were analyzed. The first variant considered unreliable sources of electricity supply, and the second considered the existence of reliable sources. As a result, 20 minimal cut sets were identified, 12 of first order and 8 of third order. In addition, the need for an alternative electrical power source was inferred, and it is important to establish redundant component groups for industrial-scale facilities. The analysis demonstrated that the set-up is safe for use in laboratory-scale experimentation. Keywords: reliability, risk, experimental set-up, wastewater treatment, ultraviolet radiation, fault tree.

  11. Geomorphological Dating Using an Improved Scarp Degradation Model: Is This a Reliable Approach Compared With Common Absolute Dating Methods?

    Oemisch, M.; Hergarten, S.; Neugebauer, H. J.


Geomorphological dating of a certain landform or geomorphological structure is based on the evolution of the landscape itself. In this context it is difficult to use common absolute dating techniques such as luminescence and radiocarbon dating because they require datable material, which is often not available. Additionally, these methods do not always date the time since the formation of these structures. For these reasons, geomorphological dating seems a viable possibility for dating certain geomorphological features. The aim of our work is to relate present-day shapes of fault scarps and terrace risers to their ages. The time span since scarp formation ceased is reflected by the stage of degradation as well as the rounding of the profile edges due to erosive processes. It is assumed that the average rate of downslope soil movement depends on the local slope angle and can be described in terms of a diffusion equation. On the basis of these assumptions we present a model to simulate the temporal development of scarp degradation by erosion. A diffusivity reflecting the effects of soil erosion, surface runoff, and detachability of particles, as well as present-day shapes of scarps, are included in the model. As observations of present-day scarps suggest a higher diffusivity at the toe than at the head of a slope, we suggest a linear approach with diffusivity increasing in the downslope direction. First results show a better match between simulated and observed profiles of the Upper Rhine Graben in comparison to models using a constant diffusivity. To date the scarps, the model has to be calibrated. For this purpose we estimate diffusivities by fitting modelled profiles to observed profiles of known age. Field data have been collected in the area around Bonn, Germany, and in the Alps, Switzerland. It is a matter of current research to assess the quality of this dating technique and to compare the results and the applicability with some of the common absolute dating methods.
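The degradation model described above can be sketched numerically as explicit finite-difference diffusion of a scarp profile; the grid, time step, and diffusivity values below are illustrative assumptions, not the authors' calibrated parameters.

```python
import numpy as np

# Hillslope elevation z(x) evolving by dz/dt = -dq/dx with flux
# q = -k(x) dz/dx, where the diffusivity k increases downslope,
# as the abstract suggests. Boundaries are held fixed.

def degrade_scarp(z, kappa, dx=1.0, dt=0.1, steps=1000):
    z = z.copy()
    for _ in range(steps):
        flux = -kappa[:-1] * np.diff(z) / dx   # q at cell faces
        z[1:-1] -= dt * np.diff(flux) / dx     # update interior nodes
    return z

x = np.linspace(0.0, 100.0, 101)
z0 = np.where(x < 50.0, 10.0, 0.0)   # initially sharp scarp edge
kappa = 0.5 + 0.01 * x               # higher diffusivity toward the toe
z = degrade_scarp(z0, kappa)
```

Fitting such modelled profiles to observed profiles of known age is what the calibration step in the abstract amounts to.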

  12. Superiority of preventive antibiotic treatment compared with standard treatment of poststroke pneumonia in experimental stroke: a bed to bench approach

    Hetze, Susann; Engel, Odilo; Römer, Christine; Mueller, Susanne; Dirnagl, Ulrich; Meisel, Christian; Meisel, Andreas


Stroke patients are prone to life-threatening bacterial pneumonia. Previous experimental stroke studies have demonstrated that preventive antibiotic treatment (PAT) improves outcome compared with placebo treatment, which, however, does not model the clinical setting properly. Here we investigate whether PAT is superior to the current clinical 'gold standard' for treating poststroke infections. We therefore modeled stroke care according to the current stroke guidelines, which recommend early antibiotic treatment after diagnosing infections. To reliably diagnose pneumonia in living mice, we established a general health score and a magnetic resonance imaging protocol for radiologic confirmation. Compared with standard treatment after diagnosis by these methods, PAT not only abolished pneumonia successfully but also improved general medical outcome. Both preventive and standard antibiotic treatment using enrofloxacin improved survival in a similar way compared with placebo treatment. However, in contrast to standard treatment, only PAT improved functional outcome as assessed by gait analysis. In conclusion, both the standard and the preventive treatment approaches reduced poststroke mortality, but standard treatment came at the cost of a worse neurologic outcome compared with the preventive approach. These data support the concept of PAT for treating patients at risk for poststroke infections and warrant phase III trials to prove this concept in the clinical setting. PMID:23361393

  13. Coupling damage and reliability model of low-cycle fatigue and high energy impact based on the local stress-strain approach

    Chen Hongxia; Chen Yunxia; Yang Zhou


Products subject to fatigue generally bear fatigue loads accompanied by impact processes, which rapidly reduces their reliable life. This paper introduces a reliability assessment model based on a local stress-strain approach considering both low-cycle fatigue and high-energy impact loads. Two coupling relationships between fatigue and impact are given: the effects of an impact process on fatigue damage, and the effects of fatigue damage on impact performance. For the former, the fatigue parameters and the Manson-Coffin equation for fatigue life are modified based on material theories. For the latter, coupling variables and the change in fracture toughness caused by accumulated fatigue damage are proposed. To form an overall reliability model including both fatigue failure and impact failure, a competing risk model is developed. A case study of an actuator cylinder is given to validate this method.
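For reference, the standard (unmodified) Manson-Coffin relation mentioned above expresses the total strain amplitude as the sum of an elastic and a plastic term:

```latex
\frac{\Delta\varepsilon}{2} = \frac{\sigma_f'}{E}\,(2N_f)^b + \varepsilon_f'\,(2N_f)^c
```

where \(\Delta\varepsilon/2\) is the strain amplitude, \(2N_f\) the number of load reversals to failure, \(E\) the elastic modulus, \(\sigma_f'\) and \(b\) the fatigue strength coefficient and exponent, and \(\varepsilon_f'\) and \(c\) the fatigue ductility coefficient and exponent. The paper's impact-coupled modification of these parameters is not reproduced here.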


    Arsalan Farooq, Muhammad


This paper addresses the study of the pre-experimental planning phase of the Design of Experiments (DoE) in order to improve final product quality. The pre-experimental planning phase includes a clear identification of the problem statement and the selection of control factors and their respective levels and ranges. To improve production quality based on DoE, a new approach for the pre-experimental planning phase, called the Non-Conformity Matrix (NCM), is presented. This article also addresses the key steps of the pre-experimental runs, considering a consumer goods manufacturing process. Results of the application to an industrial case show that this methodology can support a clear definition of the problem and also a correct identification of the factor ranges in particular situations. The proposed new approach allows modeling the entire manufacturing system holistically and correctly defining the factor ranges and respective levels for a more effective application of DoE. This new approach can be a useful resource for both research and industrial practitioners who are dedicated to large DoE projects with unknown factor interactions, when the operational levels and ranges are not completely defined.

  15. Adaptive combinatorial design to explore large experimental spaces: approach and validation.

    Lejay, L V; Shasha, D E; Palenchar, P M; Kouranov, A Y; Cruikshank, A A; Chou, M F; Coruzzi, G M


    Systems biology requires mathematical tools not only to analyse large genomic datasets, but also to explore large experimental spaces in a systematic yet economical way. We demonstrate that two-factor combinatorial design (CD), shown to be useful in software testing, can be used to design a small set of experiments that would allow biologists to explore larger experimental spaces. Further, the results of an initial set of experiments can be used to seed further 'Adaptive' CD experimental designs. As a proof of principle, we demonstrate the usefulness of this Adaptive CD approach by analysing data from the effects of six binary inputs on the regulation of genes in the N-assimilation pathway of Arabidopsis. This CD approach identified the more important regulatory signals previously discovered by traditional experiments using far fewer experiments, and also identified examples of input interactions previously unknown. Tests using simulated data show that Adaptive CD suffers from fewer false positives than traditional experimental designs in determining decisive inputs, and succeeds far more often than traditional or random experimental designs in determining when genes are regulated by input interactions. We conclude that Adaptive CD offers an economical framework for discovering dominant inputs and interactions that affect different aspects of genomic outputs and organismal responses.
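As a hedged illustration of the two-factor CD idea borrowed from software testing (not the authors' implementation), a greedy construction of a pairwise-covering run set for six binary inputs looks like this:

```python
from itertools import combinations, product

# Two-factor combinatorial design: choose a small set of runs so that
# every pair of binary factors is observed in all four joint settings.
# The greedy selection below is a generic textbook construction.

def pairwise_design(n_factors):
    uncovered = {(i, j, vi, vj)
                 for i, j in combinations(range(n_factors), 2)
                 for vi, vj in product((0, 1), repeat=2)}
    candidates = list(product((0, 1), repeat=n_factors))
    runs = []
    while uncovered:
        # pick the candidate run covering the most uncovered pairs
        best = max(candidates, key=lambda r: sum(
            (i, j, r[i], r[j]) in uncovered
            for i, j in combinations(range(n_factors), 2)))
        runs.append(best)
        uncovered -= {(i, j, best[i], best[j])
                      for i, j in combinations(range(n_factors), 2)}
    return runs

runs = pairwise_design(6)   # 6 binary inputs, as in the study
```

The point of the abstract is that a handful of such runs, rather than all 64 full-factorial combinations, already exposes the decisive inputs.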

  16. Learning and unlearning dignity in care: Experiential and experimental educational approaches.

    Kyle, Richard G; Medford, Wayne; Blundell, Julie; Webster, Elaine; Munoz, Sarah-Anne; Macaden, Leah


Guarding against loss of human dignity is fundamental to nursing practice. It is assumed in the existing literature that 'dignity' as a concept and 'dignity in care' as a practice is amenable to education. Building on this assumption, a range of experiential and experimental educational approaches have been used to enhance students' understanding of dignity. However, little is known about student nurses' views on whether dignity is amenable to education and, if so, which educational approaches would be welcomed. This mixed-methods study used an online questionnaire survey and focus groups to address these questions. Student nurses in Scotland completed online questionnaires (n = 111) and participated in focus groups (n = 35). Students concluded that education has transformative potential to encourage learning around the concept of dignity and practice of dignity in care but also believed that dignity could be unlearned through repeated negative practice exposures. Experiential and experimental educational approaches were welcomed by student nurses, including patient testimony, role-play, simulation, and empathy exercises to step into the lives of others. Nurse educators should further integrate experiential and experimental educational approaches into undergraduate and postgraduate nursing curricula to guard against the loss of learning around dignity students believed occurred over time.

  17. Risk-Control Approach for a Bottleneck Spanning Tree Problem with the Total Network Reliability under Uncertainty

    Takashi Hasuike


parameters to edge costs is introduced as objective functions in the risk control. Furthermore, in order to maintain the constructed spanning tree network as a whole, a reliability for each edge is introduced, and maximizing the total reliability of the spanning tree is taken as the third objective function. The proposed model is a multiobjective programming problem and hence difficult to solve directly without setting some optimality criterion. Therefore, satisfaction functions for each objective and an integrated function are introduced, and an exact solution algorithm is developed by performing deterministic equivalent transformations. A numerical example is provided comparing our proposed model with previous standard models.
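One way to read the third objective: because the logarithm is monotone, a spanning tree maximizing the product of independent edge reliabilities is simply a minimum spanning tree under weights −log r. The Kruskal sketch and example graph below are illustrative assumptions, not the paper's algorithm.

```python
import math

def most_reliable_spanning_tree(n, edges):
    # edges: list of (u, v, reliability) with 0 < reliability <= 1.
    # Kruskal with union-find on weights -log(r): smallest weight
    # (i.e. highest reliability) edges are accepted first.
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    tree = []
    for u, v, r in sorted(edges, key=lambda e: -math.log(e[2])):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.append((u, v, r))
    return tree

# Hypothetical 4-node network with per-edge reliabilities.
edges = [(0, 1, 0.99), (1, 2, 0.90), (0, 2, 0.95), (2, 3, 0.80)]
tree = most_reliable_spanning_tree(4, edges)
reliability = math.prod(r for _, _, r in tree)
```

The multiobjective model in the abstract layers fuzzy cost and bottleneck criteria on top of this reliability objective, which is why satisfaction functions are needed there.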

  18. Inhibition of the ferric uptake regulator by peptides derived from anti-FUR peptide aptamers: coupled theoretical and experimental approaches.

    Cissé, Cheickna; Mathieu, Sophie V; Abeih, Mohamed B Ould; Flanagan, Lindsey; Vitale, Sylvia; Catty, Patrice; Boturyn, Didier; Michaud-Soret, Isabelle; Crouzy, Serge


The FUR protein (ferric uptake regulator) is an iron-dependent global transcriptional regulator. Specific to bacteria, FUR is an attractive antibacterial target since virulence is correlated with iron bioavailability. Recently, four anti-FUR peptide aptamers, composed of 13-amino-acid variable loops inserted into a thioredoxin A scaffold, were identified, which were able to interact with Escherichia coli FUR (EcFUR), inhibit its binding to DNA, and decrease the virulence of pathogenic E. coli in a fly infection model. The first characterization of anti-FUR linear peptides (pF1; 6 to 13 amino acids) derived from the variable part of the F1 anti-FUR peptide aptamer is described herein. Theoretical and experimental approaches, in an original combination, were used to study the interactions of these peptides with FUR in order to understand their mechanism of inhibition. After modeling EcFUR by homology, docking with Autodock was combined with molecular dynamics simulations in implicit solvent to take into account the flexibility of the partners. All calculations were cross-checked either with other programs or with experimental data. As a result, reliable structures of EcFUR and its complex with pF1 are given, and an inhibition pocket formed by the groove between the two FUR subunits is proposed. The location of the pocket was validated through experimental mutation of key EcFUR residues at the site of proposed peptide interaction. Cyclisation of pF1, mimicking the peptide constraint in F1, improved inhibition. The details of the interactions between peptide and protein were analyzed, and a mechanism of inhibition of these anti-FUR molecules is proposed.

  19. Experimental study on the reliability of nonel tube initiation by industrial detonators



Non-electrical initiation systems are widely applied in blasting projects, so ensuring that industrial detonators initiate nonel tubes safely and reliably is very important. This article describes simulation experiments in which nonel tubes were initiated by No. 8 industrial detonators. The experiments investigated the initiating probability for the front and back shooting methods and under different enlacing (binding) forces, and determined the reliable distance for initiating the nonel tubes as well as the safe distance at which the nonel tubes are not destroyed. The experimental results showed that: (1) the initiating probability differed little between the front and back shooting methods, but with the front shooting method the metal jet or detonator fragments may cut off the nonel tubes; (2) the larger the enlacing force on the nonel tubes, the larger the number of effectively initiated tubes, i.e., the higher the initiating probability; (3) the reliable distance for the detonators to initiate the nonel tubes was less than 1 cm, and the safe distance at which the detonators did not destroy the nonel tubes was greater than 9 cm. The experimental results and analyses in this article can serve as a reference for analyzing the safety and reliability of non-electrical initiation systems.

  20. Experimental and Numerical Analysis of Triaxially Braided Composites Utilizing a Modified Subcell Modeling Approach

    Cater, Christopher; Xiao, Xinran; Goldberg, Robert K.; Kohlman, Lee W.


A combined experimental and analytical approach was performed for characterizing and modeling triaxially braided composites with a modified subcell modeling strategy. Tensile coupon tests were conducted on a [0°/60°/−60°] braided composite at angles of 0°, 30°, 45°, 60°, and 90° relative to the axial tow of the braid. It was found that measured coupon strength varied significantly with the angle of the applied load, and each coupon direction exhibited unique final failures. The subcell modeling approach, implemented in the finite element software LS-DYNA, was used to simulate the various tensile coupon test angles. The modeling approach was successful in predicting both the coupon strength and the reported failure mode for the 0°, 30°, and 60° loading directions. The model over-predicted the strength in the 90° direction; however, the experimental results show a strong influence of free-edge effects on damage initiation and failure. In the absence of these local free-edge effects, the subcell modeling approach showed promise as a viable and computationally efficient analysis tool for triaxially braided composite structures. Future work will focus on validation of the approach for predicting the impact response of the braided composite against flat panel impact tests.

  1. Comparing Conventional Bank Credit Vis A Vis Shariah Bank Musharakah: Experimental Economic Approach

    Muhamad Abduh


The Central Bank of Indonesia, with its dual banking system (i.e., Shariah and conventional banks), keeps developing a system considered an answer to generating national economic growth. One of the banking activities emphasized by the Central Bank of Indonesia is fund distribution, through either conventional bank credit or shariah bank financing. Using an experimental economic approach based on Induced Value Theory and employing ANOVA, this paper found that the shariah bank musharakah financing system offers a higher profit opportunity than the conventional credit system. One main reason is that musharakah financing in shariah banks applies a profit and loss sharing (PLS) scheme, so it does not burden the customer when he finds low profit. Keywords: Credit Loan, Musharakah Financing, Induced Value Theory, Experimental Economic Approach, Analysis of Variance (ANOVA).

  2. Study of Photovoltaic Energy Storage by Supercapacitors through Both Experimental and Modelling Approaches

    Pierre-Olivier Logerais


The storage of photovoltaic energy by supercapacitors is studied using two approaches. An overview of the integration of supercapacitors in solar energy conversion systems is provided first. In the first approach, an experimental charge/discharge setup with supercapacitors fed by a photovoltaic array was operated with fine-grained data acquisition. The second approach consists in simulating photovoltaic energy storage by supercapacitors with a faithful and accessible model composed of solar irradiance evaluation, an equivalent electrical circuit for photovoltaic conversion, and a multibranch circuit for the supercapacitor. The experimental and calculated results are compared, and an error of 1% on the stored energy is found, with a correction largely within ±10% of the transmission line capacitance according to temperature.
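A minimal single-branch sketch of the storage model gives the flavor of the simulation approach; the paper uses a richer multibranch circuit, and R, C, and the PV current below are assumed values, not the study's parameters.

```python
# Supercapacitor charged from a photovoltaic source approximated as a
# constant current, modeled as one series resistance plus one capacitor.

R, C = 0.05, 100.0        # series resistance (ohm), capacitance (F)
i_pv = 2.0                # charging current from the PV array (A)
dt, t_end = 0.1, 60.0     # time step and duration (s)

v_cell = 0.0              # internal capacitor voltage (V)
for _ in range(round(t_end / dt)):
    v_cell += i_pv * dt / C       # dv/dt = i/C for the ideal capacitor

v_terminal = v_cell + R * i_pv    # terminal voltage includes the IR drop
energy = 0.5 * C * v_cell ** 2    # stored energy (J)
```

Comparing such simulated stored energy against measured charge/discharge data is exactly the confrontation of the two approaches the abstract describes.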

  3. Aging and fear of crime: an experimental approach to an apparent paradox.

    Ziegler, Raphael; Mitchell, David B


    Many fear of crime studies have revealed an interesting paradox: Although older adults are less likely to be victims, they report a higher fear of crime than younger adults. In this study, we experimentally manipulated vicarious exposure to crime. Younger (ages 18-29) and older adults (ages 61-78) were randomly assigned to view either a vivid video reenactment of a violent crime or a crime report newscast. Subjects in the violent video condition demonstrated significantly higher fear than did control group participants, but this effect was reliable only for younger adults. The older adults appeared to be unfazed by the violent video, and reported significantly less fear than the younger group. This could not be explained away on the basis of age group differences in neighborhood crime rates, victimization experience, or media exposure. Thus, when greater fear of crime is found in older adults, "old age" per se is not the cause.

  4. Treatment of secondary burn wound progression in contact burns-a systematic review of experimental approaches.

    Schmauss, Daniel; Rezaeian, Farid; Finck, Tom; Machens, Hans-Guenther; Wettstein, Reto; Harder, Yves


    After a burn injury, superficial partial-thickness burn wounds may progress to deep partial-thickness or full-thickness burn wounds if left untreated. This phenomenon is called secondary burn wound progression, or conversion. Burn wound depth is an important determinant of patient morbidity and mortality. Therefore, reducing or even preventing secondary burn wound progression is one goal of the acute care of burned patients. The objective of this study was to review preclinical approaches evaluating therapies to reduce burn wound progression. A systematic review of experimental approaches in animals that aim at reducing or preventing secondary burn wound progression was performed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The selected references consist of all the peer-reviewed studies performed in vivo in animals, as well as review articles, published in the English, German, Italian, Spanish, or French language and relevant to the topic of secondary burn wound progression. We searched MEDLINE, the Cochrane Library, and Google Scholar, including all articles from the earliest records to the present; the search was conducted between May 3, 2012 and December 26, 2013. We included 29 experimental studies in this review, investigating agents that maintain or increase local perfusion, as well as agents with anti-coagulatory, anti-inflammatory, or anti-apoptotic properties. Warm water, simvastatin, EPO, and cerium nitrate may represent particularly promising candidates for translation into clinical use in the near future. This review identifies promising experimental approaches that might reduce secondary burn wound progression. Nevertheless, translation into clinical application will need to confirm the results compiled in these experimental animal studies.

  5. Quantitative photoacoustic assessment of red blood cell aggregation under pulsatile blood flow: experimental and theoretical approaches

    Bok, Tae-Hoon; Hysi, Eno; Kolios, Michael C.


    In the present paper, the optical-wavelength dependence of the photoacoustic (PA) assessment of pulsatile blood flow was investigated by means of experimental and theoretical approaches, analyzing PA radiofrequency spectral parameters such as the spectral slope (SS) and mid-band fit (MBF). For the experimental approach, the pulsatile flow of human whole blood at 60 bpm was imaged using the VevoLAZR system (40-MHz linear-array probe, 700-900 nm illumination). For the theoretical approach, a Monte Carlo simulation of light transport into a layered tissue phantom and a Green's function based method for PA wave generation were implemented for illumination wavelengths of 700, 750, 800, 850, and 900 nm. The SS and MBF for the experimental results were compared to the theoretical ones as a function of the illumination wavelength. The MBF increased with the optical wavelength in both theory and experiment. This was expected, because the MBF is representative of the PA magnitude, and the PA signal from red blood cells (RBCs) depends on the molar extinction coefficient of oxyhemoglobin. On the other hand, the SS decreased with the wavelength, even though the RBC size (the absorber size, which is related to the SS) cannot depend on the illumination wavelength. This conflicting result can be interpreted in terms of the changes in the fluence pattern for different illumination wavelengths. The SS decrease with increasing illumination wavelength should be further investigated.

  6. Comparison of repulsive interatomic potentials calculated with an all-electron DFT approach with experimental data

    Zinoviev, A. N.; Nordlund, K.


    The interatomic potential determines the nuclear stopping power in materials. Most ion irradiation simulation models are based on the universal Ziegler-Biersack-Littmark (ZBL) potential (Ziegler et al., 1983), which, however, is an average and hence may not describe the stopping of all ion-material combinations well. Here we consider pair-specific interatomic potentials determined experimentally and by density-functional theory simulations with the DMol approach (DMol software, 1997), which is used to choose the basis wave functions. The interatomic potentials calculated using the DMol approach demonstrate an unexpectedly good agreement with experimental data. Differences are mainly observed for heavy-atom systems, which suggests they can be improved by extending the basis set and treating relativistic effects more accurately. Experimental data prove that the approach of determining interatomic potentials from quasielastic scattering can be successfully used for modeling collision cascades in ion-solid collisions. The data obtained clearly indicate that the use of any universal potential is limited to internuclear distances R < 7 af (af is the Firsov length).
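    For reference, the universal ZBL screened Coulomb potential that the abstract compares against can be evaluated directly from its published four-exponential screening function; a sketch in eV·Å units (element choice and distances here are illustrative):

```python
import math

KE2 = 14.3996   # e^2 / (4*pi*eps0) in eV·Å
A0 = 0.529177   # Bohr radius in Å

def zbl_potential(r, z1, z2):
    """Universal ZBL screened Coulomb potential V(r) in eV (r in Å)."""
    a = 0.8854 * A0 / (z1**0.23 + z2**0.23)   # ZBL universal screening length
    x = r / a
    phi = (0.18175 * math.exp(-3.19980 * x)   # ZBL screening function
           + 0.50986 * math.exp(-0.94229 * x)
           + 0.28022 * math.exp(-0.40290 * x)
           + 0.02817 * math.exp(-0.20162 * x))
    return KE2 * z1 * z2 / r * phi

# Example: Si-Si pair (Z = 14) at a few separations
for r in (0.5, 1.0, 2.0):
    print(f"V({r:.1f} Å) = {zbl_potential(r, 14, 14):.2f} eV")
```

Because the screening function sums to 1 at x = 0, the potential reduces to the bare Coulomb interaction at very small separations, and the pair-specific deviations discussed in the abstract appear at larger r.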

  7. New approaches for the reliability-oriented structural optimization considering time-variant aspects; Neue Ansaetze fuer die zuverlaessigkeitsorientierte Strukturoptimierung unter Beachtung zeitvarianter Aspekte

    Kuschel, N.


    The optimization of structures with respect to cost, weight, or performance is a well-known application of nonlinear optimization. Reliability-based structural optimization, however, has been the subject of only very few studies. The approaches suggested so far have been unsatisfactory regarding general applicability or ease of use. The objective of this thesis is the development of general approaches to solve both optimization problems: the minimization of cost subject to reliability constraints, and the maximization of reliability under cost constraints. The extended approach of a one-level method is introduced in detail for time-invariant problems. Here, the reliability of the structure is analysed in the framework of the First-Order Reliability Method (FORM). The use of time-variant reliability analysis is necessary for a realistic modelling of many practical problems. Therefore, several generalizations of the new approaches are derived for time-variant reliability-based structural optimization. Some important properties of the optimization problems are proved. In addition, some interesting extensions of the one-level method are presented in the thesis, for example the cost optimization of structural series systems and cost optimization in the framework of the Second-Order Reliability Method (SORM). (orig.)

  8. Processing and Device Oriented Approach to CIGS Module Reliability; SunShot Initiative, U.S. Department of Energy (DOE)

    Ramanathan, K.; Mansfield, L.; Garris, R.; Deline, C.; Silverman, T.


    Abstract: A device level understanding of thin film module reliability has been lacking. We propose that device performance and stability issues are strongly coupled and that simultaneous attention to both is necessary. Commonly discussed technical issues such as light soaking, metastability, reverse bias breakdown, and junction breakdown can be understood by comparing the behaviors of cells made in the laboratory and in industry. It will then be possible to attribute the observed effects to processing and cell design. Connecting processing to stability studies can help identify root causes and a path for mitigating degradation.

  9. An enhanced reliability-oriented workforce planning model for process industry using combined fuzzy goal programming and differential evolution approach

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.


    This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resource and time allocations. This concept offers a worthwhile point of departure for three adjustments to the literature model, in terms of maintenance time, workforce performance, and return on workforce investment, which together account for the reported results. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, we have, for the first time, successfully implemented fuzzy goal programming (FGP) and differential evolution (DE) techniques for the solution of an optimisation problem in the maintenance of a process plant. The results obtained in this work showed a better quality of solution from the DE algorithm than from the genetic algorithm and the particle swarm optimisation algorithm, demonstrating the superiority of the proposed procedure. Second, the analytical discourse, framed on stochastic theory and focused on a specific application to a process plant in Nigeria, is novel. The work provides more insight into maintenance workforce planning during overhaul rework and overtime maintenance activities in manufacturing systems, and demonstrates a capacity to generate substantially helpful information for practice.

  10. Consolidating guided wave simulations and experimental data: a dictionary learning approach

    Alguri, K. Supreet; Harley, Joel B.


    Modeling and simulating guided wave propagation in complex, geometric structures is a topic of significant interest in structural health monitoring. These models have the potential to benefit damage detection, localization, and characterization in structures where traditional algorithms fail. Numerical modelling (for example, using finite element or semi-analytical finite element methods) is a popular approach for simulating complex wave behavior. Yet, using these models to improve experimental data analysis remains difficult. Numerical simulations and experimental data rarely match due to uncertainty in the properties of the structures and the guided waves traveling within them. As a result, there is a significant need to reduce this uncertainty by incorporating experimental data into the models. In this paper, we present a dictionary learning framework to address this challenge. Specifically, we use dictionary learning to combine numerical wavefield simulations with 24 simulated guided wave measurements with different frequency-dependent velocity characteristics (emulating an experimental system) to make accurate, global predictions about experimental wave behavior. From just 24 measurements, we show that we can predict and extrapolate guided wave behavior with accuracies greater than 92%.
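    The core idea of fitting sparse measurements against a simulation-derived dictionary can be sketched in a toy setting. Here plain sinusoids stand in for simulated wavefield atoms, and 24 noisy point samples (matching the abstract's measurement count) are used to extrapolate the full field; everything else is an invented illustration, not the paper's method or data.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)                 # spatial sampling grid
# "Simulated" dictionary: toy wavefield atoms (plain sinusoidal modes)
D = np.stack([np.sin(2 * np.pi * k * x) for k in range(1, 11)], axis=1)

truth = D @ np.array([0, 1.0, 0, 0, 0.5, 0, 0, 0, 0, 0])   # unknown field
idx = rng.choice(len(x), size=24, replace=False)           # 24 sensor sites
y = truth[idx] + 0.01 * rng.standard_normal(24)            # noisy readings

coef, *_ = np.linalg.lstsq(D[idx], y, rcond=None)          # fit coefficients
pred = D @ coef                                            # extrapolate field
err = np.linalg.norm(pred - truth) / np.linalg.norm(truth)
print(f"relative prediction error: {err:.3f}")
```

The fitted coefficients describe the measurements in the simulation's basis, so the prediction generalizes to unsampled locations, which is the sense in which the dictionary "consolidates" simulation and experiment.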

  11. MEMS reliability

    Hartzell, Allyson L; Shea, Herbert R


    This book focuses on the reliability and manufacturability of MEMS at a fundamental level. It demonstrates how to design MEMS for reliability and provides detailed information on the different types of failure modes and how to avoid them.

  12. Real-Time Smart Grids Control for Preventing Cascading Failures and Blackout using Neural Networks: Experimental Approach for N-1-1 Contingency

    Zarrabian, Sina; Belkacemi, Rabie; Babalola, Adeniyi A.


    In this paper, a novel intelligent control is proposed based on Artificial Neural Networks (ANN) to mitigate cascading failure (CF) and prevent blackout in smart grid systems after N-1-1 contingency condition in real-time. The fundamental contribution of this research is to deploy the machine learning concept for preventing blackout at early stages of its occurrence and to make smart grids more resilient, reliable, and robust. The proposed method provides the best action selection strategy for adaptive adjustment of generators' output power through frequency control. This method is able to relieve congestion of transmission lines and prevent consecutive transmission line outage after N-1-1 contingency condition. The proposed ANN-based control approach is tested on an experimental 100 kW test system developed by the authors to test intelligent systems. Additionally, the proposed approach is validated on the large-scale IEEE 118-bus power system by simulation studies. Experimental results show that the ANN approach is very promising and provides accurate and robust control by preventing blackout. The technique is compared to a heuristic multi-agent system (MAS) approach based on communication interchanges. The ANN approach showed more accurate and robust response than the MAS algorithm.

  13. Approaches to experimental validation of high-temperature gas-cooled reactor components

    Belov, S.E.; Borovkov, M.N.; Golovko, V.F.; Dmitrieva, I.V.; Drumov, I.V.; Znamensky, D.S.; Kodochigov, N.G. [Joint Stock Company 'Afrikantov OKB Mechanical Engineering', Burnakovsky Proezd, 15, Nizhny Novgorod 603074 (Russian Federation)]; Baxi, C.B.; Shenoy, A.; Telengator, A.; Razvi, J. [General Atomics, 3550 General Atomics Court, CA (United States)]


    Highlights: • Computational and experimental investigations of thermal and hydrodynamic characteristics of the equipment. • Vibroacoustic investigations. • Studies of the electromagnetic suspension system on GT-MHR turbomachine rotor models. • Experimental investigations of the catcher bearing design. - Abstract: The special feature of high-temperature gas-cooled reactors (HTGRs) is the stressed operating conditions of the equipment due to the high temperature of the primary-circuit helium, up to 950 °C, as well as acoustic and hydrodynamic loads upon the gas-path elements. Therefore, great significance is given to reproducing real operating conditions in tests. Experimental investigation of full-size nuclear power plant (NPP) primary-circuit components is not practically feasible, because costly test facilities would have to be developed for powers of up to hundreds of megawatts. Under such conditions, the only possible way to validate designs under development is representative testing of smaller-scale models and fragmentary models. At the same time, in order to account in a validated way for the effect of various physical factors, it is necessary to reproduce both individual processes and integrated tests incorporating the needed integrated investigations. Presented are approaches to the experimental validation of thermohydraulic and vibroacoustic characteristics of main equipment components and primary-circuit path elements under standard loading conditions, which take account of their operation in the HTGR. Within the framework of the modular helium reactor project, which includes a turbomachine in the primary circuit, a new and difficult problem is the creation of a multiple-bearing flexible vertical rotor. Presented are approaches to the analytical and experimental validation of the rotor electromagnetic bearings, catcher bearings, and flexible rotor.

  14. A novel approach to the control of experimental environments: the ESCA microscopy data-acquisition system at ELETTRA.

    Pugliese, R; Gregoratti, L; Krempska, R; Billè, F; Krempasky, J; Marsi, M; Abrami, A


    An efficient control system is today one of the key points for the successful operation of a beamline at third-generation synchrotron radiation sources. The high cost of these ultra-bright light sources and the limited beam time requires effective instrument handling in order to reduce any waste of measurement time. The basic requirements for such control software are reliability, user-friendliness, modularity, upgradability, as well as the capability of integrating a horde of different instruments, commercial tools and independent pre-existing systems in a possibly distributed environment. A novel approach has been adopted to implement the data-acquisition system of the ESCA microscopy beamline at ELETTRA. The system is based on YASB, a software bus, i.e. an underlying control model to coordinate information exchanges and networking software to implement that model. This 'middleware' allows the developer to model applications as a set of interacting agents, i.e. independent software machines. Agents can be implemented using different programming languages and be executed on heterogeneous operating environments, which promotes an effective collaboration between software engineers and experimental physicists.

  15. Improved microbial conversion of de-oiled Jatropha waste into biohydrogen via inoculum pretreatment: process optimization by experimental design approach

    Gopalakrishnan Kumar


    In this study, various pretreatment methods for a sewage sludge inoculum and the statistical process optimization of de-oiled jatropha waste are reported. A peak hydrogen production rate (HPR) and hydrogen yield (HY) of 0.36 L H2/L-d and 20 mL H2/g volatile solid (VS) were obtained when heat-shock pretreatment (95 °C, 30 min) was employed. Afterwards, an experimental design was applied to find the optimal conditions for H2 production using the heat-pretreated seed culture. The optimal substrate concentration, pH, and temperature were determined by response surface methodology as 205 g/L, 6.53, and 55.1 °C, respectively. Under these conditions, a peak HPR of 1.36 L H2/L-d was predicted. Verification tests proved the reliability of the statistical approach. As a result of the heat pretreatment and fermentation optimization, a significant (~4-fold) increase in HPR was achieved. PCR-DGGE results revealed that Clostridium sp. were predominantly present under the optimal conditions.
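    The response-surface step described here — fitting a second-order polynomial to designed experiments and solving for its stationary point — can be sketched as follows. The synthetic response function and 3 × 3 design are placeholders (with the optimum planted near the reported pH and temperature), not the study's data.

```python
import numpy as np

# Toy "experiments": HPR as a function of pH and temperature, with a planted
# optimum at pH 6.5 and 55 °C (synthetic, not the paper's measurements).
def hpr(ph, temp):
    return 1.36 - 0.05 * (ph - 6.5) ** 2 - 0.001 * (temp - 55.0) ** 2

pts = [(p, t) for p in (5.5, 6.5, 7.5) for t in (45.0, 55.0, 65.0)]
y = np.array([hpr(p, t) for p, t in pts])

# Second-order response surface: y = b0 + b1*p + b2*t + b3*p^2 + b4*t^2 + b5*p*t
X = np.array([[1.0, p, t, p * p, t * t, p * t] for p, t in pts])
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point: grad = 0  =>  [[2*b3, b5], [b5, 2*b4]] @ [p, t] = -[b1, b2]
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
opt = np.linalg.solve(H, -b[1:3])
print(f"predicted optimum: pH {opt[0]:.2f}, {opt[1]:.1f} °C")
```

Checking that the Hessian H is negative definite confirms the stationary point is a maximum rather than a saddle, a verification step that actual RSM studies also perform.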

  16. Software reliability

    Bendell, A


    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion of failure rates in software reliability growth models.

  17. Integrated experimental and computational approach for residual stress investigation near through-silicon vias

    Deluca, Marco; Hammer, René; Keckes, Jozef; Kraft, Jochen; Schrank, Franz; Todt, Juraj; Robach, Odile; Micha, Jean-Sébastien; Defregger, Stefan


    The performance of three-dimensional integrated circuits is decisively influenced by the thermo-mechanical behavior of through-silicon vias (TSVs), which are subjected to stresses formed during the fabrication process as well as during cyclic operation, as a result of the mismatch in the coefficients of thermal expansion (CTEs) between the silicon substrate, passivation layers, and metallic conduction paths. In this work, we adopted an integrated approach combining micro-Raman spectroscopy, wafer curvature experiments, and finite element (FE) modeling to study the triaxial residual stresses in silicon in the vicinity of W-coated hollow TSVs. A comparison of the experimental and calculated Raman shifts from a TSV cross section allowed a validation of the FE model, which was then extended to a non-sliced TSV. In the next step, the calculated bulk strains were compared with the ones measured using synchrotron X-ray micro-diffraction in order to specifically assess the stress decrease in Si as a function of the distance from the TSV wall within ~25 μm. The experimental verification of the FE model demonstrates the importance of combined experimental-computational approaches for studying stresses in micro-scale devices with complex morphology.

  18. Mechanistic insight into degradation of endocrine disrupting chemical by hydroxyl radical: An experimental and theoretical approach.

    Xiao, Ruiyang; Gao, Lingwei; Wei, Zongsu; Spinney, Richard; Luo, Shuang; Wang, Donghong; Dionysiou, Dionysios D; Tang, Chong-Jian; Yang, Weichun


    Advanced oxidation processes (AOPs), based on the formation of free radicals at ambient temperature and pressure, are effective for treating endocrine disrupting chemicals (EDCs) in waters. In this study, we systematically investigated the degradation kinetics of bisphenol A (BPA), a representative EDC, by the hydroxyl radical (OH) with a combination of experimental and theoretical approaches. The second-order rate constant (k) of BPA with OH was experimentally determined to be (7.2 ± 0.34) × 10^9 M^-1 s^-1 at pH 7.55. We also calculated the thermodynamic and kinetic behaviors of the bimolecular reactions by density functional theory (DFT), using the M05-2X method with the 6-311++G** basis set and the solvation model based on density (SMD). The results revealed that H-abstraction on the phenol group is the most favorable pathway for OH. The theoretical k value corrected by the Collins-Kimball approach was determined to be 1.03 × 10^10 M^-1 s^-1, which is in reasonable agreement with the experimental observation. These results are of fundamental and practical importance in understanding the chemical interactions between OH and BPA, and they aid further AOP design for treating EDCs during wastewater treatment.
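    The Collins-Kimball correction used above combines the activation-limited rate constant (e.g., from DFT/transition-state theory) with the diffusion-limited one harmonically. The numeric inputs below are hypothetical placeholders, not the study's intermediate values.

```python
# Collins-Kimball correction: 1/k_obs = 1/k_act + 1/k_d, i.e. the slower of
# chemical activation and diffusional encounter limits the observed rate.
def collins_kimball(k_act, k_d):
    return k_act * k_d / (k_act + k_d)

k_d = 1.9e10     # assumed diffusion-limited rate constant, M^-1 s^-1
k_act = 2.3e10   # hypothetical activation-limited (TST) rate constant
k_obs = collins_kimball(k_act, k_d)
print(f"k_obs = {k_obs:.2e} M^-1 s^-1")
```

When k_act greatly exceeds k_d the observed rate saturates at the diffusion limit, which is why near-diffusion-controlled radical reactions like OH + BPA require this correction before comparing theory with experiment.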

  19. A Sustainable, Reliable Mission-Systems Architecture that Supports a System of Systems Approach to Space Exploration

    Watson, Steve; Orr, Jim; O'Neil, Graham


    A mission-systems architecture based on a highly modular "systems of systems" infrastructure utilizing open-standards hardware and software interfaces as the enabling technology is absolutely essential for an affordable and sustainable space exploration program. This architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimum sustaining engineering. This paper proposes such an architecture. Lessons learned from the space shuttle program are applied to help define and refine the model.

  20. Stability of vaccines - bridging from stability data to continuous safety and efficacy throughout shelf life - an always reliable approach?

    Pfleiderer, Michael


    Stability studies are important tools to reliably ensure that efficacy and safety of medicinal products will remain unchanged from release of drug product until the end of shelf life. For complex medicinal products such as biological medicinal products, including vaccines, design and conduct of such studies requires particularly careful considerations in order to ensure that technical data resulting from stability studies are indeed indicative for unchanged clinical performance. Ideally, relevance of specifications controlled by stability studies as well as definition of shelf life should be justified by acceptable clinical data obtained with product at the end of the shelf life claimed.

  1. A Probabilistic Approach of Incorporating Safety and Reliability in System Designs for a Manned Mission to Mars

    Railsback, Jan W.; Simion, George P.; Himel, Malcolm (Technical Monitor)


    Conceptual stages in mission design often lack the input of quantitative safety and reliability assessments, simply because failure rates or other data are not yet available for systems that have not yet been designed. Absence of such data should not, however, prevent the development of quantitative risk models with placeholders for the missing data. Functions (that is, actions the systems must perform) in the mission design will eventually require system probabilities of success, and much could be learned from surrogate data, adequately bounded in uncertainty, used in a large event tree model of a complex mission.
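    The idea of surrogate data "adequately bounded in uncertainty" can be sketched as a toy Monte Carlo over a series of mission functions, each carrying a placeholder success-probability interval rather than a point estimate. The function list and intervals below are invented for illustration, not mission data.

```python
import random

# Placeholder mission functions with surrogate success-probability intervals.
functions = {
    "launch":  (0.97, 0.995),
    "transit": (0.95, 0.99),
    "landing": (0.90, 0.98),
}

random.seed(1)
N = 100_000
samples = []
for _ in range(N):
    p = 1.0
    for lo, hi in functions.values():
        p *= random.uniform(lo, hi)   # sample each bounded placeholder
    samples.append(p)

samples.sort()
print(f"mission success: median {samples[N // 2]:.3f}, "
      f"90% interval [{samples[int(0.05 * N)]:.3f}, {samples[int(0.95 * N)]:.3f}]")
```

Even with crude intervals, the propagated distribution shows which function's uncertainty dominates overall mission risk, which is the main value of placeholder models at the conceptual design stage.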

  2. Two-Photon Polarization Dependent Spectroscopy in Chirality: A Novel Experimental-Theoretical Approach to Study Optically Active Systems

    Florencio E. Hernández


    Many phenomena, including life itself and its biochemical foundations, are fundamentally rooted in chirality. Combinatorial methodologies for catalyst discovery and optimization remain an invaluable tool for gaining access to enantiomerically pure compounds in the development of pharmaceuticals, agrochemicals, and flavors. Some exotic metamaterials exhibiting a negative refractive index at optical frequencies are based on chiral structures. Chiroptical activity is commonly quantified in terms of circular dichroism (CD) and optical rotatory dispersion (ORD). However, the linear nature of these effects limits their application in the far- and near-UV region in highly absorbing and scattering biological systems. In order to surmount this barrier, in recent years we made important advancements on a novel nonlinear, low-scatter, long-wavelength CD approach called two-photon absorption circular dichroism (TPACD). Herein we present a descriptive analysis of the optical principles behind the experimental measurement of TPACD, i.e., the double L-scan technique, and its significance when using pulsed lasers. We also make an instructive examination of, and discuss the reliability of, our theoretical-computational approach, which uses modern analytical response theory within a Time-Dependent Density Functional Theory (TD-DFT) framework. In order to illustrate the potential of this novel spectroscopic tool, we first present the experimental and theoretical results obtained for C2-symmetric, axially chiral R-(+)-1,1'-bi(2-naphthol), R-BINOL, a molecule studied at the beginning of our investigation in this field. Next, we reveal some preliminary results obtained for (R)-3,3′-diphenyl-2,2′-bi-1-naphthol, R-VANOL, and (R)-2,2′-diphenyl-3,3′-(4-biphenanthrol), R-VAPOL. This family of optically active compounds has been proven to be a suitable model for structure-property relationship studies of TPACD, because its members are highly conjugated yet photo-stable, and easily derivatized.

  3. Two-photon polarization dependent spectroscopy in chirality: a novel experimental-theoretical approach to study optically active systems.

    Hernández, Florencio E; Rizzo, Antonio


    Many phenomena, including life itself and its biochemical foundations, are fundamentally rooted in chirality. Combinatorial methodologies for catalyst discovery and optimization remain an invaluable tool for gaining access to enantiomerically pure compounds in the development of pharmaceuticals, agrochemicals, and flavors. Some exotic metamaterials exhibiting a negative refractive index at optical frequencies are based on chiral structures. Chiroptical activity is commonly quantified in terms of circular dichroism (CD) and optical rotatory dispersion (ORD). However, the linear nature of these effects limits their application in the far- and near-UV region in highly absorbing and scattering biological systems. In order to surmount this barrier, in recent years we made important advancements on a novel nonlinear, low-scatter, long-wavelength CD approach called two-photon absorption circular dichroism (TPACD). Herein we present a descriptive analysis of the optical principles behind the experimental measurement of TPACD, i.e., the double L-scan technique, and its significance when using pulsed lasers. We also make an instructive examination of, and discuss the reliability of, our theoretical-computational approach, which uses modern analytical response theory within a Time-Dependent Density Functional Theory (TD-DFT) framework. In order to illustrate the potential of this novel spectroscopic tool, we first present the experimental and theoretical results obtained for C2-symmetric, axially chiral R-(+)-1,1'-bi(2-naphthol), R-BINOL, a molecule studied at the beginning of our investigation in this field. Next, we reveal some preliminary results obtained for (R)-3,3'-diphenyl-2,2'-bi-1-naphthol, R-VANOL, and (R)-2,2'-diphenyl-3,3'-(4-biphenanthrol), R-VAPOL. This family of optically active compounds has been proven to be a suitable model for structure-property relationship studies of TPACD, because its members are highly conjugated yet photo-stable, and easily derivatized at the 5

  4. [Artificial Inversion of the Left-Right Visceral Asymmetry in Vertebrates: Conceptual Approaches and Experimental Solutions].

    Truleva, A S; Malashichev, E B; Ermakov, A S


    Externally, vertebrates are bilaterally symmetrical; however, left-right asymmetry is observed in the structure of their internal organs and systems of organs (circulatory, digestive, and respiratory). In addition to the asymmetry of internal organs (visceral), there is also functional (i.e., asymmetrical functioning of organs on the left and right sides of the body) and behavioral asymmetry. The question of a possible association between different types of asymmetry is still open. The study of the mechanisms of such association, in addition to the fundamental interest, has important applications for biomedicine, primarily for the understanding of the brain functioning in health and disease and for the development of methods of treatment of certain mental diseases, such as schizophrenia and autism, for which the disturbance of left-right asymmetry of the brain was shown. To study the deep association between different types of asymmetry, it is necessary to obtain adequate animal models (primarily animals with inverted visceral organs, situs inversus totalis). There are two main possible approaches to obtaining such model organisms: mutagenesis followed by selection of mutant strains with mutations in the genes that affect the formation of the left-right visceral asymmetry and experimental obtaining of animals with inverted internal organs. This review focuses on the second approach. We describe the theoretical models for establishing left-right asymmetry and possible experimental approaches to obtaining animals with inverted internal organs.

  5. Experimental approaches for the development of gamma spectroscopy well logging system

    Shin, Jehyun; Hwang, Seho; Kim, Jongman [Korea Institute of Geoscience and Mineral Resources (124 Gwahang-no, Yuseong-gu, Daejeon, Korea) (Korea, Republic of); Won, Byeongho [Heesong Geotek Co., Ltd (146-8 Sangdaewon-dong, Jungwon-gu, Seongnam-si, Gyeonggi-do, Korea) (Korea, Republic of)


    This article discusses experimental approaches to the development of a gamma spectroscopy well logging system. Considering the size of the borehole sonde, we customize 2 × 2 inch inorganic scintillators and the accompanying system, including the high-voltage supply, preamplifier, amplifier, and multichannel analyzer (MCA). A calibration chart is made from tests with standard radioactive sources, so that the measured count rates can be expressed as an energy spectrum. The optimum high-voltage settings and measurement parameters of each detector are established by experimental investigation. The responses of the scintillation detectors are also examined as a function of the distance between source and detector. Because gamma spectroscopy well logging requires a broad spectrum together with high sensitivity and resolution, the energy resolution and sensitivity as functions of gamma-ray energy are investigated by analyzing the gamma-ray activities of the radioactive sources.

  6. Experimental approach and techniques for the evaluation of wet flue gas desulfurization scrubber fluid mechanics

    Strock, T.W. [Babcock and Wilcox Co., Alliance, OH (United States). Research and Development Div.; Gohara, W.F. [Babcock and Wilcox Co., Barberton, OH (United States)


    The fluid mechanics within wet flue gas desulfurization (FGD) scrubbers involve several complex two-phase gas/liquid interactions. The fluid flow directly affects scrubber pressure drop, mist eliminator water removal, and the SO{sub 2} mass transfer/chemical reaction process. Current industrial efforts to develop cost-effective high-efficiency wet FGD scrubbers are focusing, in part, on optimizing the fluid mechanics. The development of an experimental approach and test facility for understanding and optimizing wet scrubber flow characteristics is discussed in this paper. Specifically, scaling procedures for downsizing a wet scrubber for the laboratory environment with field data comparisons are summarized. Furthermore, experimental techniques for the measurement of wet scrubber flow distribution, pressure drop, spray nozzle droplet size characteristics and wet scrubber liquid-to-gas ratio are discussed. Finally, the characteristics and capabilities of a new hydraulic test facility for wet FGD scrubbers are presented. (author)

  7. Paul Baillon presents the book "Differential manifolds: a basic approach for experimental physicists" | 25 March

    CERN Library


    Tuesday 25 March 2014 at 4 p.m. in the Library, bldg. 52-1-052 "Differential manifolds: a basic approach for experimental physicists" by Paul Baillon,  World Scientific, 2013, ISBN 978-981-4449-56-4. Differential manifold is the framework of particle physics and astrophysics nowadays. It is important for all research physicists to be accustomed to it, and even experimental physicists should be able to manipulate equations and expressions in this framework. This book gives a comprehensive description of the basics of differential manifold with a full proof of elements. A large part of the book is devoted to the basic mathematical concepts, which are all necessary for the development of the differential manifold. This book is self-consistent; it starts from first principles. The mathematical framework is the set theory with its axioms and its formal logic. No special knowledge is needed. Coffee will be served from 3.30 p.m.

  8. Approach of internetware optimization meeting reliability expectation

    马华; 张红宇


    The development of internetware is a process of component composition in an Internet environment, so traditional reliability analysis techniques cannot be applied to it directly. This paper defines abstract and physical models of internetware and, on the basis of a formal definition of reliability, analyzes measurement methods for five types of component composition structure. It then proposes an internetware optimization approach that meets a given reliability expectation: the complex internetware structure is transformed into a serial composition, and an improved ant colony algorithm realizes the mapping between the abstract and physical models. The algorithm uses the reliability expectation thresholds of the abstract and physical models to filter invalid branches and selects better paths based on the reliability expectation of the internetware as a whole. Experiments show that the method is feasible and effective for solving the reliability optimization problem of internetware in open environments and performs better than traditional methods.
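
    The reliability arithmetic underlying the serialization step above is simple: a serial composition works only if every component works, so its reliability is the product of component reliabilities. A minimal sketch of that measure and of the threshold-based branch filtering (illustrative only; this is not the paper's ant colony implementation):

```python
from math import prod

def serial_reliability(reliabilities):
    """A serial composition works only if every component works."""
    return prod(reliabilities)

def parallel_reliability(reliabilities):
    """A parallel composition fails only if every component fails."""
    return 1 - prod(1 - r for r in reliabilities)

def filter_branches(candidate_reliabilities, threshold):
    """Prune candidate components whose reliability falls below the
    expectation threshold, mirroring the invalid-branch filtering above."""
    return [r for r in candidate_reliabilities if r >= threshold]
```

    Note that a serial composition is always less reliable than its weakest component, which is why pruning weak candidates early shrinks the search space without discarding viable mappings.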

  9. Interpreting People Interpreting Things: A Heideggerian Approach to ‘Experimental Reconstruction’

    Steve Townend


    Full Text Available This paper represents some preliminary thoughts on what one area of experimental archaeology might begin to look like if approached through the early philosophy of Martin Heidegger. The broader remit of this research seeks to re-draw experimental archaeology as a practice understood for its ‘interpretative’ character rather than as narrowly ‘scientific’, as conventionally portrayed. The specific subject of this paper is a development of Heidegger’s notion of ‘skilled coping’ and the relationship between people and things in the context of the physical reconstruction of the later prehistoric roundhouse in Britain. In this paper I will argue that understandings of the reconstruction and construction of the later prehistoric roundhouse may be significantly enhanced by examining them in relation to a series of phenomena interpreted from the early work of Martin Heidegger. This perspective is intended to re-conceptualise the way in which reconstruction as an exercise is theorised by centring such projects on their human element. It gives practitioners a range of phenomena to consider or include in their research aims and projects, beyond the normal considerations of technology, material constraints, etc. In so doing it will be possible to counter some of the failings of experimental archaeology. This approach is seen as an augmentation to current theory and practice. It aims to make a broader contribution to the theory, practice and role of other ‘field-based’ or replicative experiments and to understandings of a human element that has been largely unexplored within experimental archaeology.

  10. Measurement System Reliability Assessment

    Kłos Ryszard


    Full Text Available Decision-making in problem situations is based on up-to-date and reliable information. A great deal of information is subject to rapid change; it may therefore be outdated or manipulated, leading to erroneous decisions. It is crucial to be able to assess the information obtained, and the best way to ensure its reliability is to acquire it through one's own measurement process. In such a case, assessing the reliability of the measurement system becomes crucial. The article describes a general approach to assessing the reliability of measurement systems.

  11. Reliable knowledge discovery

    Dai, Honghua; Smirnov, Evgueni


    Reliable Knowledge Discovery focuses on theory, methods, and techniques for RKDD, a new sub-field of KDD. It studies the theory and methods needed to assure the reliability and trustworthiness of discovered knowledge and to maintain the stability and consistency of knowledge discovery processes. RKDD has a broad spectrum of applications, especially in critical domains like medicine, finance, and the military. Reliable Knowledge Discovery also presents methods and techniques for designing robust knowledge-discovery processes. Approaches to assessing the reliability of the discovered knowledge are introduced.

  12. Experimental approaches to study plant cell walls during plant-microbe interactions.

    Xia, Ye; Petti, Carloalberto; Williams, Mark A; DeBolt, Seth


    Plant cell walls provide physical strength, regulate the passage of bio-molecules, and act as the first barrier of defense against biotic and abiotic stress. In addition to providing structural integrity, plant cell walls serve an important function in connecting cells to their extracellular environment by sensing and transducing signals to activate cellular responses, such as those that occur during pathogen infection. This mini review will summarize current experimental approaches used to study cell wall functions during plant-pathogen interactions. Focus will be paid to cell imaging, spectroscopic analyses, and metabolic profiling techniques.

  13. Experimental approaches to study plant cell walls during plant-microbe interactions

    Ye eXia


    Full Text Available Plant cell walls provide physical strength, regulate the passage of bio-molecules, and act as the first barrier of defense against biotic and abiotic stress. In addition to providing structural integrity, plant cell walls serve an important function in connecting cells to their extracellular environment by sensing and transducing signals to activate cellular responses, such as those that occur during pathogen infection. This mini review will summarize current experimental approaches used to study cell wall functions during plant-pathogen interactions. Focus will be paid to cell imaging, spectroscopic analyses, and metabolic profiling techniques.

  14. ANN Approach for State Estimation of Hybrid Systems and Its Experimental Validation

    Shijoh Vellayikot


    Full Text Available A novel artificial neural network based state estimator has been proposed to ensure robustness in the state estimation of autonomous switching hybrid systems under various uncertainties. Taking an autonomous switching three-tank system as the benchmark hybrid model, operating under various additive and multiplicative uncertainties such as process noise, measurement error, process-model parameter variation, initial state mismatch, and hand-valve faults, a real-time performance evaluation was carried out comparing the proposed estimator with other state estimators such as the extended Kalman filter and the unscented Kalman filter. The experimental results show considerable improvement in the robustness of performance under the considered uncertainties.

  15. An alternative experimental approach for subcritical configurations of the IPEN/MB-01 nuclear reactor

    Gonnelli, E.; Lee, S. M.; Pinto, L. N.; Landim, H. R.; Diniz, R.; Jerez, R.; dos Santos, A.


    This work presents an alternative approach to the analysis of reactivity worth experiments in the IPEN/MB-01 reactor for highly subcritical arrays. To reach the subcritical levels, the removal of a specific number of fuel rods is proposed; twenty-three configurations were carried out for this purpose. The control bank insertion experiment was used only as a reference for the fuel rod experiment, and the control banks were kept completely withdrawn throughout the fuel rod experiments. The theoretical simulation results, obtained using the MCNP5 code with ENDF/B-VII.0 library neutron data, are in very good agreement with the experimental results.

  16. An experimental approach to validating a theory of human error in complex systems

    Morris, N. M.; Rouse, W. B.


    The problem of 'human error' is pervasive in engineering systems in which the human is involved. In contrast to the common engineering approach of dealing with error probabilistically, the present research seeks to alleviate problems associated with error by gaining a greater understanding of causes and contributing factors from a human information processing perspective. The general approach involves identifying conditions which are hypothesized to contribute to errors, and experimentally creating the conditions in order to verify the hypotheses. The conceptual framework which serves as the basis for this research is discussed briefly, followed by a description of upcoming research. Finally, the potential relevance of this research to design, training, and aiding issues is discussed.

  17. A Statistical Approach for Selecting Buildings for Experimental Measurement of HVAC Needs

    Malinowski Paweł


    Full Text Available This article presents a statistical methodology for selecting representative buildings for experimentally evaluating the performance of HVAC systems, especially in terms of energy consumption. The proposed approach is based on the k-means method. The algorithm for this method is conceptually simple, allowing it to be easily implemented. The method can be applied to large quantities of data with unknown distributions. The method was tested using numerical experiments to determine the hourly, daily, and yearly heat values and the domestic hot water demands of residential buildings in Poland. Due to its simplicity, the proposed approach is very promising for use in engineering applications and is applicable to testing the performance of many HVAC systems.
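
    The selection idea above, clustering buildings with k-means and then instrumenting one building per cluster, can be sketched minimally as follows. This is an illustration only: the paper's feature set (hourly, daily, and yearly heat and domestic hot water demands) is replaced here by generic numeric feature vectors.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on a list of equal-length feature tuples."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each building to the nearest centroid (squared distance).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Recompute centroids; keep the old one if a cluster went empty.
        centroids = [tuple(sum(d) / len(c) for d in zip(*c)) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

def representatives(points, centroids):
    """Pick the actual building nearest each centroid as the one to instrument."""
    return [min(points, key=lambda p: sum((a - b) ** 2 for a, b in zip(p, c)))
            for c in centroids]

# Two hypothetical groups of buildings with (heat demand, hot-water demand) features.
buildings = [(0.0, 0.0), (0.0, 1.0), (10.0, 10.0), (10.0, 11.0)]
cents, groups = kmeans(buildings, 2)
reps = representatives(buildings, cents)
```

    Measuring only the representatives then stands in for measuring the whole stock, at the cost of within-cluster variance.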

  18. Experimental approaches for identification of indigenous lactococci isolated from traditional dairy products

    Tomislav Pogačić


    Full Text Available Indigenous lactic acid bacteria contribute to the taste and flavour of traditional dairy products. Therefore, the traditional dairy products might be an interesting reservoir of indigenous lactococcal strains responsible for development of the specific flavour compounds. Consequently, characterized indigenous isolates might be used as a starter culture. The development of molecular techniques provides a new perspective for characterization of the “new lactococcal” strains. However, there is no unique approach suggested for molecular characterization of the indigenous strains associated with the traditional products. The aim of this review is to provide an insight into varieties of experimental approaches applied for molecular characterization of indigenous lactococci associated with traditional dairy products.

  19. Multiplicity of experimental approaches to therapy for genetic muscle diseases and necessity for population screening.

    Laing, Nigel G


    Currently a multiplicity of experimental approaches to therapy for genetic muscle diseases is being investigated. These include replacement of the missing gene, manipulation of the gene message, repair of the mutation, upregulation of an alternative gene and pharmacological interventions targeting a number of systems. A number of these approaches are in current clinical trials. There is considerable anticipation that perhaps more than one of the approaches will finally prove of clinical benefit, but there are many voices of caution. No matter which approaches might ultimately prove effective, there is a consensus that for most benefit to the patients it will be necessary to start treatment as early as possible. A consensus is also developing that the only way to do this is to implement population-based newborn screening to identify affected children shortly after birth. Population-based newborn screening is currently practised in very few places in the world and it brings with it implications for prevention rather than cure of genetic muscle diseases.

  20. Breaking Through the Glass Ceiling: Recent Experimental Approaches to Probe the Properties of Supercooled Liquids near the Glass Transition.

    Smith, R. Scott; Kay, Bruce D.


    Experimental measurements of the properties of supercooled liquids at temperatures near their respective glass transition temperatures, Tg, are requisite for understanding the behavior of glasses and amorphous solids. Unfortunately, many supercooled molecular liquids rapidly crystallize at temperatures far above their Tg, making such measurements difficult to nearly impossible. In this Perspective we discuss some recent alternative approaches to obtain experimental data in the temperature regime near Tg. These new approaches may yield the additional experimental data necessary to test current theoretical models of the dynamical slowdown that occurs in supercooled liquids approaching the glass transition.

  1. Breaking Through the Glass Ceiling: Recent Experimental Approaches to Probe the Properties of Supercooled Liquids near the Glass Transition.

    Smith, R Scott; Kay, Bruce D


    Experimental measurements of the properties of supercooled liquids at temperatures near their glass transition temperatures, Tg, are requisite for understanding the behavior of glasses and amorphous solids. Unfortunately, many supercooled molecular liquids rapidly crystallize at temperatures far above their Tg, making such measurements difficult to nearly impossible. In this Perspective, we discuss some recent alternative approaches to obtain experimental data in the temperature regime near Tg. These new approaches may yield the additional experimental data necessary to test current theoretical models of the dynamical slowdown that occurs in supercooled liquids approaching the glass transition.

  2. Interactions among biotic and abiotic factors affect the reliability of tungsten microneedles puncturing in vitro and in vivo peripheral nerves: A hybrid computational approach.

    Sergi, Pier Nicola; Jensen, Winnie; Yoshida, Ken


    Tungsten is an elective material to produce slender and stiff microneedles able to enter soft tissues and minimize puncture wounds. In particular, tungsten microneedles are used to puncture peripheral nerves and insert neural interfaces, bridging the gap between the nervous system and robotic devices (e.g., hand prostheses). Unfortunately, microneedles fail during the puncture process, and this failure does not depend on the stiffness or fracture toughness of the constituent material. In addition, the microneedles' performance decreases during in vivo trials with respect to in vitro ones. This further effect is independent of internal biotic effects and seems instead to be related to external biotic causes. Since the exact synergy of phenomena decreasing in vivo reliability is still not known, this work explored the connection between the in vitro and in vivo behavior of tungsten microneedles through the study of interactions between biotic and abiotic factors. A hybrid computational approach, simultaneously using theoretical relationships and in silico models of nerves, was implemented to model the change in reliability with microneedle diameter and to predict in vivo performance from in vitro reliability and local differences between the in vivo and in vitro mechanical response of nerves.

  3. Novel Experimental-Modeling Approach for Characterizing Perfluorinated Surfactants in Soils.

    Courtier-Murias, Denis; Michel, Eric; Rodts, Stéphane; Lafolie, François


    Soil contamination is still poorly understood and modeled in part because of the difficulties of looking inside the "black box" constituted by soils. Here, we investigated the application of a recently developed (1)H NMR technique to (19)F NMR relaxometry experiments and utilized the results as inputs for an existing model. This novel approach yields (19)F T2 NMR relaxation values of any fluorinated contaminant, which are among the most dangerous contaminants, allowing us to noninvasively and directly monitor their fate in soils. Using this protocol, we quantified the amount of a fluorinated xenobiotic (heptafluorobutyric acid, HFBA) in three different environments in soil aggregate packings and monitored contaminant exchange dynamics between these compartments. A model computing HFBA partition dynamics between different soil compartments showed that these three environments corresponded to HFBA in solution (i) between and (ii) inside the soil aggregates and (iii) to HFBA adsorbed to (or strongly interacting with) the soil constituents. In addition to providing a straightforward way of determining the sorption kinetics of any fluorinated contaminant, this work also highlights the strengths of a combined experimental-modeling approach to unambiguously understand experimental data and more generally to study contaminant fate in soils.

  4. Dust growth in protoplanetary disks - a comprehensive experimental/theoretical approach

    Blum, Jürgen


    More than a decade of dedicated experimental work on the collisional physics of protoplanetary dust has brought us to a point at which the growth of dust aggregates can, for the first time, be self-consistently and reliably modeled. In this article, the emergent collision model for protoplanetary dust aggregates, as well as the numerical model for the evolution of dust aggregates in protoplanetary disks, is reviewed. It turns out that, after a brief period of rapid collisional growth of fluffy dust aggregates to sizes of a few centimeters, the protoplanetary dust particles are subject to bouncing collisions, in which their porosity is considerably decreased. The model results also show that low-velocity fragmentation can reduce the final mass of the dust aggregates but that it does not trigger a new growth mode as discussed previously. According to the current stage of our model, the direct formation of kilometer-sized planetesimals by collisional sticking seems unlikely, implying that collective effects, such as the streaming instability and the gravitational instability in dust-enhanced regions of the protoplanetary disk, are the best candidates for the processes leading to planetesimals.

  5. Biophysical Interactions in Porous Media: an Integrated Experimental and Modelling Approach

    Otten, W.; Baveye, P.; Falconer, R.


    A critical feature of porous media is that their geometry provides habitats of a complexity not seen above ground, offering shelter, food, water, and gases to microorganisms, whose spatial and temporal dynamics are shaped by this microscopic heterogeneity. The microbial dynamics, including fungal and bacterial growth, in turn affect the geometry and the hydrological properties, shaping the pore architecture at short and long time scales, altering surface tension, and (partially) blocking pores, thereby affecting the flow paths of water through a structure. Crucially, the majority of these processes occur at microscopic scales with impact on larger-scale ecological processes and ecosystem services. The complexity of these interacting processes and feedback mechanisms may at first seem overwhelming. However, we show in this paper how, with recent progress in experimental techniques integrated with a modelling approach, we can begin to understand the importance of microscopic heterogeneity for microbial dynamics. We demonstrate this integrated approach for fungal dynamics in porous media such as soil through a combination of novel experimental tools and mathematical modelling. Using benchtop X-ray CT systems we present the finer detail of pore structure at scales relevant to microbial processes, present our recent advances in the use of theoretical tools to rigorously characterise and quantitatively describe this environment, and present the state of the art with respect to visualization of water in porous media. Severe challenges remain in studying the spatial distribution of microorganisms. We present how biological thin-sectioning techniques offer a way forward to study the spatial distribution of fungi and bacteria within pore geometries. Through a series of detailed experiments we show how pathways, resulting from connected pore volumes and the distribution of water within them, are preferentially explored during fungal invasion. We complement the

  6. Integrating experimental and analytic approaches to improve data quality in genome-wide RNAi screens.

    Zhang, Xiaohua Douglas; Espeseth, Amy S; Johnson, Eric N; Chin, Jayne; Gates, Adam; Mitnaul, Lyndon J; Marine, Shane D; Tian, Jenny; Stec, Eric M; Kunapuli, Priya; Holder, Dan J; Heyse, Joseph F; Strulovici, Berta; Ferrer, Marc


    RNA interference (RNAi) not only plays an important role in drug discovery but can also be developed directly into drugs. RNAi high-throughput screening (HTS) biotechnology allows us to conduct genome-wide RNAi research. A central challenge in genome-wide RNAi research is to integrate both experimental and computational approaches to obtain high quality RNAi HTS assays. Based on our daily practice in RNAi HTS experiments, we propose the implementation of 3 experimental and analytic processes to improve the quality of data from RNAi HTS biotechnology: (1) select effective biological controls; (2) adopt appropriate plate designs to display and/or adjust for systematic errors of measurement; and (3) use effective analytic metrics to assess data quality. The applications in 5 real RNAi HTS experiments demonstrate the effectiveness of integrating these processes to improve data quality. Due to the effectiveness in improving data quality in RNAi HTS experiments, the methods and guidelines contained in the 3 experimental and analytic processes are likely to have broad utility in genome-wide RNAi research.
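
    The abstract does not name the analytic metrics it recommends for assessing data quality; two standard HTS quality metrics that serve this role are the Z′-factor and the strictly standardized mean difference (SSMD). A sketch with made-up control readings:

```python
import statistics

def z_prime(pos, neg):
    """Z'-factor assay-quality metric (Zhang, Chung & Oldenburg, 1999);
    values above 0.5 are conventionally taken to indicate an excellent assay."""
    return 1 - 3 * (statistics.stdev(pos) + statistics.stdev(neg)) / abs(
        statistics.mean(pos) - statistics.mean(neg))

def ssmd(pos, neg):
    """Strictly standardized mean difference between the two control groups."""
    return (statistics.mean(pos) - statistics.mean(neg)) / (
        statistics.variance(pos) + statistics.variance(neg)) ** 0.5

# Hypothetical positive/negative control readings from one assay plate.
pos_controls = [100.0, 101.0, 99.0, 100.0]
neg_controls = [10.0, 11.0, 9.0, 10.0]
```

    Computing such a metric per plate makes it easy to flag plates whose controls failed before any hit-selection analysis.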

  7. An experimental approach for the determination of axial and flexural wavenumbers in circular exponentially tapered bars

    Kalkowski, Michał K.; Muggleton, Jen M.; Rustighi, Emiliano


    Whilst the dynamics of tapered structures have been extensively studied numerically and analytically, very few experimental results have been presented to date. The main aim of this paper is to derive and demonstrate an experimental method enabling both axial and flexural wavenumbers in exponentially tapered bars to be estimated. Our particular interest in this type of tapering is motivated by its occurrence in naturally grown structures such as tree roots, with an outlook towards remote root mapping. Decomposing a dynamic response into a sum of contributing waves, we propose a method in which two independent wavenumbers can be calculated from five equispaced measurements. The approach was demonstrated in an experiment on a freely suspended wooden specimen, supported by theoretical modelling. For axial waves we used the well-established elementary rod theory, whereas for flexural waves we built a piecewise-uniform model based on the Timoshenko beam theory. The estimates calculated from the experimental data were compared with the analytical and numerical results and showed good agreement. The limitations of the method include the need for an appropriate choice of sensor spacing, the effect of sensor misalignments, and the assumption of small wavenumber variation for flexural waves.

  8. Reliable Resource Selection in Grid Environment

    Rajesh Kumar Bawa


    Full Text Available The primary concern in the area of computational grids is security and resources. Most existing grids address this problem by authenticating users, hosts, and their interactions in an appropriate manner. A secure system is essential for the efficient utilization of grid services. A high degree of strangeness has been identified as a problem factor in secure grid selection: without the assurance of a sufficient degree of trust, competent resource selection and utilization cannot be achieved. In this paper we propose an approach that provides reliability- and reputation-aware security for resource selection in a grid environment. In this approach, the self-protection capability and reputation weightage are combined to obtain a Reliability Factor (RF) value, and jobs are allocated to the resources that possess the higher RF values. Extensive experimental evaluation shows that, as nodes with higher trust and reliability are selected, the chances of failure decrease drastically.
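
    The allocation rule described above can be sketched as follows. The weighting of self-protection capability against reputation is an illustrative assumption, since the abstract does not give the exact combination formula, and the node names are hypothetical:

```python
def reliability_factor(self_protection, reputation, w_sp=0.5, w_rep=0.5):
    """Combine self-protection capability and reputation weightage into an RF
    value; inputs and weights are assumed to lie in [0, 1]."""
    return w_sp * self_protection + w_rep * reputation

def select_resources(resources, n_jobs):
    """Allocate jobs to the n resources with the highest RF values."""
    ranked = sorted(resources, key=lambda r: r["rf"], reverse=True)
    return ranked[:n_jobs]

# Hypothetical grid nodes with (self-protection, reputation) scores.
grid = [{"name": "node-a", "rf": reliability_factor(0.9, 0.8)},
        {"name": "node-b", "rf": reliability_factor(0.4, 0.5)},
        {"name": "node-c", "rf": reliability_factor(0.7, 0.9)}]
chosen = select_resources(grid, 2)
```

    Ranking by RF rather than raw reputation means a well-reputed but poorly protected node can still be passed over.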

  9. Reliable Resource Selection in Grid Environment

    Bawa, Rajesh Kumar


    The primary concern in the area of computational grids is security and resources. Most existing grids address this problem by authenticating users, hosts, and their interactions in an appropriate manner. A secure system is essential for the efficient utilization of grid services. A high degree of strangeness has been identified as a problem factor in secure grid selection: without the assurance of a sufficient degree of trust, competent resource selection and utilization cannot be achieved. In this paper we propose an approach that provides reliability- and reputation-aware security for resource selection in a grid environment. In this approach, the self-protection capability and reputation weightage are combined to obtain a Reliability Factor (RF) value, and jobs are allocated to the resources that possess the higher RF values. Extensive experimental evaluation shows that, as nodes with higher trust and reliability are selected, the chances of failure decrease drastically.

  10. The Multilevel Latent Covariate Model: A New, More Reliable Approach to Group-Level Effects in Contextual Studies

    Ludtke, Oliver; Marsh, Herbert W.; Robitzsch, Alexander; Trautwein, Ulrich; Asparouhov, Tihomir; Muthen, Bengt


    In multilevel modeling (MLM), group-level (L2) characteristics are often measured by aggregating individual-level (L1) characteristics within each group so as to assess contextual effects (e.g., group-average effects of socioeconomic status, achievement, climate). Most previous applications have used a multilevel manifest covariate (MMC) approach,…

  11. Human Bone Marrow Stromal Cells: A Reliable, Challenging Tool for In Vitro Osteogenesis and Bone Tissue Engineering Approaches

    Ute Hempel


    Full Text Available Adult human bone marrow stromal cells (hBMSC) are important for many scientific purposes because of their multipotency, availability, and relatively easy handling. They are frequently used to study osteogenesis in vitro. Most commonly, hBMSC are isolated from bone marrow aspirates collected in clinical routine and cultured on the basis of plastic adherence without any further selection. Owing to the random donor population, they show a broad heterogeneity. Here, the osteogenic differentiation potential of 531 hBMSC preparations was analyzed. The data were subjected to correlation analysis involving donor age, gender, and body mass index. The hBMSC preparations were characterized with respect to (a) the number of passages over which their osteogenic characteristics remain stable and (b) the influence of supplements and culture duration on osteogenic parameters (tissue-nonspecific alkaline phosphatase (TNAP), octamer-binding transcription factor 4, core-binding factor alpha-1, parathyroid hormone receptor, bone gla protein, and peroxisome proliferator-activated protein γ). The results show that no strong prediction of the osteogenic differentiation potential could be made from donor data; only the ratio of induced TNAP to endogenous TNAP could be a reliable criterion. The results give evidence that hBMSC cultures are stable until passage 7 without substantial loss of differentiation potential and that established differentiation protocols lead to osteoblast-like cells but not to fully authentic osteoblasts.

  12. Optimal Configuration for Satellite PEPs using a Reliable Service on Top of a Routers-Assisted Approach

    Lopez-Pacheco, Dino Martin; Lochin, Emmanuel


    Routers-assisted congestion control protocols, also known as Explicit Rate Notification (ERN) protocols, implement complex algorithms inside a router in order to provide both high link utilization and high fairness. Thus, routers-assisted approaches overcome most of the end-to-end protocol problems in large bandwidth-delay product networks. Today, routers-assisted protocols cannot be deployed in heterogeneous networks (e.g., Internet) due to their non-compliance with c...

  13. Frequency-Dependent Streaming Potential of Porous Media—Part 1: Experimental Approaches and Apparatus Design

    P. W. J. Glover


    Full Text Available Electrokinetic phenomena link fluid flow and electrical flow in porous and fractured media such that a hydraulic flow will generate an electrical current and vice versa. Such a link is likely to be extremely useful, especially in the development of the electroseismic method. However, surprisingly few experimental measurements have been carried out, particularly as a function of frequency, because of their difficulty. Here we have considered six different approaches to make laboratory determinations of the frequency-dependent streaming potential coefficient. In each case, we have analyzed the mechanical, electrical, and other technical difficulties involved in each method. We conclude that the electromagnetic drive is currently the only approach that is practicable, while the piezoelectric drive may be useful for low permeability samples and at specified high frequencies. We have used the electromagnetic drive approach to design, build, and test an apparatus for measuring the streaming potential coefficient of unconsolidated and disaggregated samples such as sands, gravels, and soils with a diameter of 25.4 mm and lengths between 50 mm and 300 mm.

  14. Three experimental approaches to measure the social context dependence of prejudice communication and discriminatory behavior.

    Beyer, Heiko; Liebe, Ulf


    Empirical research on discrimination is faced with crucial problems stemming from the specific character of its object of study. In democratic societies the communication of prejudices and other forms of discriminatory behavior is considered socially undesirable and depends on situational factors such as whether a situation is considered private or whether a discriminatory consensus can be assumed. Regular surveys can thus offer only a blurred picture of the phenomenon. Even survey experiments intended to decrease the social desirability bias (SDB) have so far failed to implement situational variables systematically. This paper introduces three experimental approaches to improve the study of discrimination and other topics subject to social (un-)desirability. First, we argue in favor of cognitive context framing in surveys in order to operationalize the salience of situational norms. Second, factorial surveys offer a way to take situational contexts and substitute behavior into account. And third, choice experiments - a rather new method in sociology - offer a more valid way of measuring behavioral characteristics than simple survey items. All three approaches - which may be combined - are easy to implement in large-scale surveys. Results of empirical studies demonstrate the fruitfulness of each of these approaches. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Accelerated Wafer-Level Integrated Circuit Reliability Testing for Electromigration in Metal Interconnects with Enhanced Thermal Modeling, Structure Design, Control of Stress, and Experimental Measurements.

    Shih, Chih-Ching

    Wafer-level electromigration tests have been developed recently to fulfill the need for rapid testing in integrated circuit production facilities. We have developed an improved thermal model, TEARS (Thermal Energy Accounts for the Resistance of the System), that supports these tests. Our model is enhanced by treatments for determining the thermal conductivity of the metal, K_{m}, the heat-sinking effects of the voltage probes and current lead terminations, and thermoelectric power. Our TEARS analysis of multi-element SWEAT (Standard Wafer-level Electromigration Acceleration Test) structures yields design criteria for the length of current injection leads and the choice of voltage probe locations to isolate test units from the heat-sinking effect of current lead terminations. This also provides greater insight into the current for thermal runaway. From our TEARS model and Black's equation for lifetime prediction, we have developed an algorithm for fast and accurate control of stress in SWEAT tests. We have developed a lookup-table approach for precise electromigration characterization without complicated calculations. It determines the peak temperature in the metal, T_{max}, and the thermal conductivity of the insulator, K_{i}, from an experimental resistance measurement at a given current. We introduce a characteristic temperature, T_{EO}, which is much simpler to use than the conventional temperature coefficient of the electrical resistivity of metal for calibration and for the transfer of calibration data of metallic films as their own temperature sensors. The use of T_{EO} also allows us to establish system specifications for a desired accuracy in temperature measurement. Our experimental results are the first to show the effects of series elemental SWEAT units on the system failure distribution, spatial failure distribution in SWEAT structures, and bimodal distributions for straight-line structures. 
The adaptive approach of our TEARS based SWEAT test decides the value of Black

  16. Can Small Islands Protect Nearby Coasts From Tsunamis? An Active Experimental Design Approach

    Stefanakis, Themistoklis; Contal, Emile; Vayatis, Nicolas; Dias, Frédéric; Synolakis, Costas


    In recent years we have witnessed the dreadful damage tsunamis have caused in coastal areas around the globe. In some of these locations, small islands in the vicinity of the mainland offer protection from wind-generated waves, and thus communities developed there. But do these islands act as natural barriers to tsunamis? Recent post-tsunami survey data reveal that in certain cases the run-up in coastal areas behind small offshore islands was significantly higher than in neighboring locations. To study the conditions of this run-up amplification, we solve the nonlinear shallow water equations numerically. We use the simplified geometry of a conical island sitting on a flat bed in front of a uniform sloping beach. Hence, the experimental setup is controlled by five physical parameters, namely the island slope, the beach slope, the water depth, the distance between the island and the plane beach, and the incoming wavelength, while the wave height was kept fixed. An active experimental design approach was adopted in order to find, with the least number of simulations, the maximum run-up amplification on the area of the beach behind the island with respect to a lateral location on the beach not directly affected by the presence of the island. For this purpose, a statistical emulator was built to guide the selection of the query points in the input space, and a stopping criterion was used to signal when no further simulations were needed. We have found that in all cases explored the run-up amplification was larger than unity, and on certain occasions it reached up to a 70% increase. The presence of the island delays the run-up of the wave on the plane beach behind it, while edge waves generated by the run-up at lateral locations on the beach converge towards the center. The synchronous arrival of the three waves (2 edge waves and the tsunami from the lee side of the island) is responsible for the run-up amplification in these areas. 
The use of the active experimental design approach can

  17. Comparison of two model approaches in the Zambezi river basin with regard to model reliability and identifiability

    H. C. Winsemius


    Full Text Available Variations of water stocks in the upper Zambezi river basin have been determined by 2 different hydrological modelling approaches. The purpose was to provide preliminary terrestrial storage estimates in the upper Zambezi, which will be compared with estimates derived from the Gravity Recovery And Climate Experiment (GRACE) in a future study. The first modelling approach is GIS-based, distributed and conceptual (STREAM). The second approach uses Lumped Elementary Watersheds identified and modelled conceptually (LEW). The STREAM model structure has been assessed using GLUE (Generalized Likelihood Uncertainty Estimation) a posteriori to determine parameter identifiability. The LEW approach could, in addition, be tested for model structure, because the computational efforts of LEW are low. Both models are threshold models, where the non-linear behaviour of the Zambezi river basin is explained by a combination of thresholds and linear reservoirs. The models were forced by time series of gauged and interpolated rainfall. Where available, runoff station data were used to calibrate the models. Ungauged watersheds were generally given the same parameter sets as their neighbouring calibrated watersheds. It appeared that the LEW model structure could be improved by applying GLUE iteratively. Eventually, this led to better identifiability of parameters and consequently a better model structure than the STREAM model. Hence, the final model structure obtained better represents the true hydrology. After calibration, both models show a comparable efficiency in representing discharge. However, the LEW model shows a far greater storage amplitude than the STREAM model. This emphasizes the storage uncertainty related to hydrological modelling in data-scarce environments such as the Zambezi river basin. It underlines the need and potential for independent observations of terrestrial storage to enhance our understanding and modelling capacity of the hydrological processes. 
GRACE
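    The GLUE procedure applied above — Monte Carlo sampling of parameters, scoring each set with an informal likelihood, and retaining only "behavioral" sets — can be sketched as follows. This is a minimal illustration with an invented one-parameter linear-reservoir model and the Nash–Sutcliffe efficiency as the informal likelihood, not the STREAM or LEW models themselves:

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_reservoir(k, rain):
    """Toy one-parameter rainfall-runoff model: storage S, outflow Q = k * S."""
    S, Q = 0.0, []
    for r in rain:
        S += r
        q = k * S          # outflow proportional to storage
        S -= q
        Q.append(q)
    return np.array(Q)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the informal likelihood."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Synthetic "observations" generated with k = 0.3 plus measurement noise
rain = rng.gamma(2.0, 1.0, size=200)
Q_obs = linear_reservoir(0.3, rain) + 0.05 * rng.standard_normal(200)

# GLUE: sample the parameter space, keep sets whose score exceeds a threshold
ks = rng.uniform(0.01, 0.99, size=2000)
scores = np.array([nse(linear_reservoir(k, rain), Q_obs) for k in ks])
behavioral = ks[scores > 0.7]
# A narrow behavioral range around k = 0.3 indicates an identifiable parameter
```

    A wide behavioral range after this filtering is exactly the identifiability problem the abstract describes; iterating the procedure with revised model structures narrows it.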

  18. Reinforced concrete structures loaded by snow avalanches : numerical and experimental approaches.

    Ousset, I.; Bertrand, D.; Brun, M.; Limam, A.; Naaim, M.


    Today, due to the extension of occupied areas in mountainous regions, new strategies for risk mitigation have to be developed. In the framework of risk analysis, the latter have to take into account not only the natural hazard description but also the physical vulnerability of the exposed structures. From a civil engineering point of view, the dynamic behavior of columns and porticos has been widely investigated, especially in the case of reinforced concrete and steel. However, this is not the case for reinforced concrete walls, for which only the in-plane dynamic behavior (shear behavior) has been studied in detail, in the field of earthquake engineering. Therefore, the aim of this project is to study the behavior of reinforced concrete civil engineering structures subjected to out-of-plane dynamic loadings coming from snow avalanche interaction. Numerical simulations in 2D or 3D by the finite element method (FEM) are presented. The approach allows solving mechanical problems under dynamic conditions involving nonlinearities (especially nonlinear materials). Thus, the structure's mechanical response can be explored under controlled conditions. First, a reinforced concrete wall with an L-like shape is considered. The structure is supposed to represent a French defense structure dedicated to protecting people against snow avalanches. Experimental pushover tests have been performed on a physical model. The experimental tests consisted of applying a uniform distribution of pressure until the total collapse of the wall. A 2D numerical model has been developed to simulate the mechanical response of the structure under quasi-static loading. Numerical simulations have been compared to experimental data, and the results gave a better understanding of the failure mode of the wall. Moreover, the influence of several parameters (geometry and mechanical properties) is also presented. Secondly, punching shear experimental tests have also been carried out. 
Reinforced concrete slabs simply supported have



    Collaborative filtering recommender systems often suffer from the "Matchmaker" problem, which comes from the false assumption that users are evaluated only on the basis of their similarity, and that high similarity means good advisers. In order to find good advisers for every user, a matchmaker reliability model based on an algorithm derived from HITS is constructed, and it is applied in the proposed World Wide Web (WWW) collaborative recommendation system. Comparative experimental results also show that our approach substantially improves performance.

  20. An experimental design approach to the chemical characterisation of pectin polysaccharides extracted from Cucumis melo Inodorus.

    Denman, Laura J; Morris, Gordon A


    Extracted pectins have been utilised in a number of applications in both the food and pharmaceutical industries where they are generally used as gelling agents, thickeners and stabilisers, although a number of pectins have been shown to be bioactive. These functional properties will depend upon extraction conditions. A statistical experimental design approach was used to study the effects of extraction conditions pH, time and temperature on pectins extracted from Cucumis melo Inodorus. The results show that the chemical composition is very sensitive to these conditions and that this has a great influence on for example the degree of branching. Higher temperatures, lower pHs and longer extraction times lead to a loss of the more acid labile arabinofuranose residues present on the pectin side chain. The fitting of regression equations relating yield and composition to extraction conditions can therefore lead to tailor-made pectins for specific properties and/or applications.

  1. Experimental approaches to studying the nature and impact of splicing variation in zebrafish.

    Keightley, M C; Markmiller, S; Love, C G; Rasko, J E J; Lieschke, G J; Heath, J K


    From a fixed number of genes carried in all cells, organisms create considerable diversity in cellular phenotype through differential regulation of gene expression. One prevalent source of transcriptome diversity is alternative pre-mRNA splicing, which is manifested in many different forms. Zebrafish models of splicing dysfunction due to mutated spliceosome components provide opportunity to link biochemical analyses of spliceosome structure and function with whole organism phenotypic outcomes. Drawing from experience with two zebrafish mutants: cephalophŏnus (a prpf8 mutant, isolated for defects in granulopoiesis) and caliban (a rnpc3 mutant, isolated for defects in digestive organ development), we describe the use of glycerol gradient sedimentation and native gel electrophoresis to resolve components of aberrant splicing complexes. We also describe how RNAseq can be employed to examine relatively rare alternative splicing events including intron retention. Such experimental approaches in zebrafish can promote understanding of how splicing variation and dysfunction contribute to phenotypic diversity and disease pathogenesis.

  2. A Bayesian approach to quantifying uncertainty from experimental noise in DEER spectroscopy

    Edwards, Thomas H.; Stoll, Stefan


    Double Electron-Electron Resonance (DEER) spectroscopy is a solid-state pulse Electron Paramagnetic Resonance (EPR) experiment that measures distances between unpaired electrons, most commonly between protein-bound spin labels separated by 1.5-8 nm. From the experimental data, a distance distribution P (r) is extracted using Tikhonov regularization. The disadvantage of this method is that it does not directly provide error bars for the resulting P (r) , rendering correct interpretation difficult. Here we introduce a Bayesian statistical approach that quantifies uncertainty in P (r) arising from noise and numerical regularization. This method provides credible intervals (error bars) of P (r) at each r . This allows practitioners to answer whether or not small features are significant, whether or not apparent shoulders are significant, and whether or not two distance distributions are significantly different from each other. In addition, the method quantifies uncertainty in the regularization parameter.
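    The Tikhonov step that the Bayesian approach builds on can be sketched as follows. This is a minimal illustration on a synthetic ill-posed problem with an invented Gaussian kernel and smoothness penalty — not the actual DEER kernel — showing why the point estimate alone carries no error bars:

```python
import numpy as np

def tikhonov(K, S, alpha):
    """Solve min ||K @ P - S||^2 + alpha^2 * ||L @ P||^2,
    with L a second-difference (smoothness) operator."""
    n = K.shape[1]
    L = np.diff(np.eye(n), n=2, axis=0)            # (n-2, n) second differences
    A = K.T @ K + alpha ** 2 * (L.T @ L)
    return np.linalg.solve(A, K.T @ S)

# Synthetic ill-posed problem: a smoothing kernel hides P_true in noisy data S
rng = np.random.default_rng(0)
r = np.linspace(0.0, 1.0, 50)
t = np.linspace(0.0, 1.0, 80)
K = np.exp(-np.subtract.outer(t, r) ** 2 / 0.02)   # Gaussian smoothing kernel
P_true = np.exp(-(r - 0.5) ** 2 / 0.005)           # the "distance distribution"
S = K @ P_true + 0.01 * rng.standard_normal(t.size)
P_hat = tikhonov(K, S, alpha=0.1)                  # point estimate, no error bars
```

    The Bayesian method in the abstract replaces the single `P_hat` with a posterior over distributions, from which credible intervals at each `r` follow.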

  3. Approaches to enhance the teaching quality of experimental biochemistry for MBBS students in TSMU, China.

    Yu, Lijuan; Yi, Shuying; Zhai, Jing; Wang, Zhaojin


    With the internationalization of medical education in China, the importance of international students' education in medical schools is also increasing. Except for foreign students majoring in Chinese language, English-taught Bachelor of Medicine, Bachelor of Surgery (MBBS) students are the largest group of international students. Based on problems in the teaching process for experimental biochemistry, we designed teaching models adapted to the background of international students and strengthened teachers' teaching ability at Taishan Medical University. Several approaches were used in combination to promote teaching effects and increase the benefit of teaching to teachers. The primary data showed an increased passion for basic medical biochemistry and an improved theoretical background for MBBS students, which will be helpful for their later clinical medicine studies. © 2017 by The International Union of Biochemistry and Molecular Biology, 2017.

  4. An Experimental Approach for the Identification of Conserved Secreted Proteins in Trypanosomatids

    Rosa M. Corrales


    Full Text Available Extracellular factors produced by Leishmania spp., Trypanosoma cruzi, and Trypanosoma brucei are important in the host-parasite relationship. Here, we describe a genome-based approach to identify putative extracellular proteins conserved among trypanosomatids that are likely involved in the classical secretory pathway. Potentially secreted proteins were identified by bioinformatic analysis of the T. cruzi genome. A subset of thirteen genes encoding unknown proteins with orthologs containing a signal peptide sequence in L. infantum, L. major, and T. brucei were transfected into L. infantum. Tagged proteins detected in the extracellular medium confirmed computer predictions in about 25% of the hits. Secretion was confirmed for two L. infantum ortholog proteins using the same experimental system. Infectivity studies of transgenic Leishmania parasites suggest that one of the secreted proteins increases parasite replication inside macrophages. This methodology can identify conserved secreted proteins involved in the classical secretory pathway, and they may represent potential virulence factors in trypanosomatids.

  5. Experimentation and Optimization of Surface Roughness in WEDM Process using Full Factorial Design integrated PCA Approach

    Rismaya Kumar Mishra


    Full Text Available Application of WEDM has grown rapidly over the last three decades due to its several advantages and the applicability of the process to produce complicated intrinsic and extrinsic shapes of miniaturized size, so there is a need to analyze and optimize the process. In this research work the experiments were conducted using the general full factorial design methodology with 48 experimental runs. The values of the response parameters Ra, Rq, and Rz were measured, and the effects of the process parameters wire type, wire tension, power, pulse-on time, and discharge current on these responses were studied qualitatively and quantitatively using main effect plots, interaction plots, and ANOVA. Finally, the optimal process parameter settings for the responses were found by using the full factorial design integrated PCA approach.
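    The core of a PCA-integrated approach — collapsing several correlated roughness responses into one composite index and picking the run that minimizes it — can be sketched as follows. The run labels and roughness values are hypothetical placeholders, not the paper's data:

```python
import numpy as np

# Hypothetical factorial subset: 6 runs, 3 correlated responses (Ra, Rq, Rz)
runs = ["A1B1", "A1B2", "A1B3", "A2B1", "A2B2", "A2B3"]
Y = np.array([[2.1, 2.6,  9.8],
              [1.7, 2.1,  8.0],
              [2.4, 3.0, 11.2],
              [1.2, 1.5,  5.9],
              [1.5, 1.9,  7.1],
              [2.0, 2.5,  9.5]])

Z = (Y - Y.mean(axis=0)) / Y.std(axis=0, ddof=1)   # standardize each response
eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
v = eigvec[:, -1]                                  # loadings of the first PC
v = v if v.sum() > 0 else -v                       # fix sign: higher PC1 = rougher
pc1 = Z @ v                                        # composite roughness index
best = runs[int(np.argmin(pc1))]                   # smaller-the-better
```

    Because the three roughness measures are strongly correlated, the first principal component captures most of their joint variance, so minimizing it is a reasonable single-objective stand-in for minimizing all three.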

  6. Using a Web-Based Approach to Assess Test-Retest Reliability of the "Hypertension Self-Care Profile" Tool in an Asian Population: A Validation Study.

    Koh, Yi Ling Eileen; Lua, Yi Hui Adela; Hong, Liyue; Bong, Huey Shin Shirley; Yeo, Ling Sui Jocelyn; Tsang, Li Ping Marianne; Ong, Kai Zhi; Wong, Sook Wai Samantha; Tan, Ngiap Chuan


    Essential hypertension often requires affected patients to self-manage their condition most of the time. Besides seeking regular medical review of their life-long condition to detect vascular complications, patients have to maintain healthy lifestyles in between physician consultations via diet and physical activity, and to take their medications according to their prescriptions. Their self-management ability is influenced by their self-efficacy capacity, which can be assessed using questionnaire-based tools. The "Hypertension Self-Care Profile" (HTN-SCP) is one such questionnaire, assessing self-efficacy in the domains of "behavior," "motivation," and "self-efficacy." This study aims to determine the test-retest reliability of the HTN-SCP in an English-literate Asian population using a web-based approach. Multiethnic Asian patients, aged 40 years and older, with essential hypertension were recruited from a typical public primary care clinic in Singapore. The investigators guided the patients in filling out the web-based 60-item HTN-SCP in English using a tablet or smartphone on the first visit, and the patients completed the instrument again 2 weeks later in the retest. Internal consistency and test-retest reliability were evaluated using Cronbach's alpha and intraclass correlation coefficients (ICC), respectively. The t test was used to determine the relationship between the overall HTN-SCP scores of the patients and their self-reported self-management activities. A total of 160 patients completed the HTN-SCP during the initial test, from which 71 test-retest responses were completed. No floor or ceiling effect was found for the scores of the 3 subscales. Cronbach's alpha coefficients were 0.857, 0.948, and 0.931 for the "behavior," "motivation," and "self-efficacy" domains respectively, indicating high internal consistency. The item-total correlation ranges for the 3 scales were 0.105 to 0.656 for Behavior, 0.401 to 0.808 for Motivation, and 0.349 to 0.789 for Self-efficacy. The corresponding
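    The internal-consistency statistic reported above can be computed directly from an item-score matrix; a minimal sketch with simulated respondents follows (the data are invented, not HTN-SCP responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical data: 100 respondents, 5 items driven by one common trait
rng = np.random.default_rng(7)
trait = rng.normal(size=(100, 1))
scores = trait + 0.3 * rng.normal(size=(100, 5))   # items share the trait + noise
alpha = cronbach_alpha(scores)
```

    Because the five simulated items share a common trait with little noise, the resulting alpha lands well above the conventional 0.7 acceptability threshold, mirroring the high coefficients reported for the three HTN-SCP domains.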

  7. Experimental and statistical approaches in method cross-validation to support pharmacokinetic decisions.

    Thway, Theingi M; Ma, Mark; Lee, Jean; Sloey, Bethlyn; Yu, Steven; Wang, Yow-Ming C; Desilva, Binodh; Graves, Tom


    A case study of experimental and statistical approaches for cross-validating and examining the equivalence of two ligand binding assay (LBA) methods that were employed in pharmacokinetic (PK) studies is presented. The impact of changes in methodology based on the intended use of the methods was assessed. The cross-validation processes included an experimental plan, sample size selection, and statistical analysis with a predefined criterion of method equivalence. The two methods were deemed equivalent if the ratio of mean concentration fell within the 90% confidence interval (0.80-1.25). Statistical consideration of method imprecision was used to choose the number of incurred samples (collected from study animals) and conformance samples (spiked controls) for equivalence tests. The difference of log-transformed mean concentration and the 90% confidence interval for two methods were computed using analysis of variance. The mean concentration ratios of the two methods for the incurred and spiked conformance samples were 1.63 and 1.57, respectively. The 90% confidence limit was 1.55-1.72 for the incurred samples and 1.54-1.60 for the spiked conformance samples; therefore, the 90% confidence interval was not contained within the (0.80-1.25) equivalence interval. When the PK parameters of two studies using each of these two methods were compared, we determined that the therapeutic exposure, AUC(0-168) and Cmax, from Study A/Method 1 was approximately twice that of Study B/Method 2. We concluded that the two methods were not statistically equivalent and that the magnitude of the difference was reflected in the PK parameters in the studies using each method. This paper demonstrates the need for method cross-validation whenever there is a switch in bioanalytical methods, statistical approaches in designing the cross-validation experiments and assessing results, or interpretation of the impact of PK data.
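    The equivalence criterion described above — a 90% confidence interval for the mean-concentration ratio that must fall inside (0.80, 1.25) — can be sketched as follows. The concentrations are invented, and a normal critical value stands in for the t quantile used in practice for small samples:

```python
import numpy as np
from statistics import NormalDist

def equivalence_test(x, y, limits=(0.80, 1.25), level=0.90):
    """CI for the geometric-mean ratio of paired concentrations from two methods.
    Equivalent if the whole CI lies inside the acceptance limits."""
    d = np.log(np.asarray(x, float)) - np.log(np.asarray(y, float))
    mean, se = d.mean(), d.std(ddof=1) / np.sqrt(d.size)
    z = NormalDist().inv_cdf(0.5 + level / 2)      # normal approx. to t
    lo, hi = np.exp(mean - z * se), np.exp(mean + z * se)
    return lo, hi, bool(limits[0] < lo and hi < limits[1])

# Hypothetical incurred-sample concentrations measured by two methods
rng = np.random.default_rng(3)
y = rng.lognormal(4.0, 0.2, size=40)
x_equiv = y * 1.05 * rng.lognormal(0.0, 0.02, size=40)  # ~5% bias: equivalent
x_diff  = y * 1.60 * rng.lognormal(0.0, 0.02, size=40)  # ~60% bias, as in the case study
lo1, hi1, ok1 = equivalence_test(x_equiv, y)
lo2, hi2, ok2 = equivalence_test(x_diff, y)
```

    With a ~1.6 ratio, as reported for the two LBA methods, the interval sits far outside the (0.80, 1.25) limits, so the equivalence claim fails.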


    Goodarz Ahmadi


    In this project, a computational modeling approach for analyzing flow and ash transport and deposition in filter vessels was developed. An Eulerian-Lagrangian formulation for studying the hot-gas filtration process was established. The approach uses an Eulerian analysis of gas flows in the filter vessel and makes use of Lagrangian trajectory analysis for the particle transport and deposition. Particular attention was given to the Siemens-Westinghouse filter vessel at the Power System Development Facility in Wilsonville, Alabama. Details of the hot-gas flow in this tangential-flow filter vessel are evaluated. The simulation results show that the rapidly rotating flow in the spacing between the shroud and the vessel refractory acts as a cyclone that leads to the removal of a large fraction of the larger particles from the gas stream. Several alternate designs for the filter vessel are considered. These include a vessel with a short shroud, a filter vessel with no shroud, and a vessel with a deflector plate. The hot-gas flow and particle transport and deposition in the various vessels are evaluated, and the deposition patterns in the various vessels are compared. It is shown that certain filter vessel designs allow the large particles to remain suspended in the gas stream and to deposit on the filters. The presence of the larger particles in the filter cake leads to lower mechanical strength, thus allowing the back-pulse process to more easily remove the filter cake. A laboratory-scale filter vessel for testing under cold-flow conditions was designed and fabricated. A laser-based flow visualization technique was used, and the gas flow condition in the laboratory-scale vessel was experimentally studied. A computer model for the experimental vessel was also developed, and the gas flow and particle transport patterns are evaluated.

  9. Experimental Validation of Modeled Fe Opacities at Conditions Approaching the Base of the Solar Convection Zone

    Nagayama, Taisuke


    Knowledge of the Sun is a foundation for other stars. However, after the solar abundance revision in 2005, standard solar models disagree with helioseismic measurements particularly at the solar convection zone base (CZB, r ≈ 0.7 × RSun) [Basu, et al., Physics Reports 457, 217 (2008)]. One possible explanation is an underestimate in the Fe opacity at the CZB [Bailey et al., Phys. Plasmas 16, 058101 (2009)]. Modeled opacities are important physics inputs for plasma simulations (e.g. standard solar models). However, modeled opacities are not experimentally validated at high temperatures because of three challenging criteria required for reliable opacity measurements: 1) smooth and strong backlighter, 2) plasma condition uniformity, and 3) simultaneous measurements of plasma condition and transmission. Fe opacity experiments are performed at the Sandia National Laboratories (SNL) Z-machine aiming at conditions close to those at the CZB (i.e., Te = 190 eV, ne = 1 × 10^23 cm^-3). To verify the quality of the experiments, it is critical to investigate how well the three requirements are satisfied. The smooth and strong backlighter is provided by the SNL Z-pinch dynamic hohlraum. Fe plasma condition is measured by mixing Mg into the Fe sample and employing Mg K-shell line transmission spectroscopy. Also, an experiment is designed and performed to measure the level of non-uniformity in the Fe plasma by mixing Al and Mg dopants on the opposite side of the Fe sample and analyzing their spectra. We will present quantitative results on these investigations as well as the comparison of the measured opacity to modeled opacities. Sandia is a multi-program laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under contract DE-AC04-94AL85000.

  10. Investigation of sonar transponders for offshore wind farms: modeling approach, experimental setup, and results.

    Fricke, Moritz B; Rolfes, Raimund


    The installation of offshore wind farms in the German Exclusive Economic Zone requires the deployment of sonar transponders to prevent collisions with submarines. The general requirements for these systems have been previously worked out by the Research Department for Underwater Acoustics and Marine Geophysics of the Bundeswehr. In this article, the major results of the research project "Investigation of Sonar Transponders for Offshore Wind Farms" are presented. For theoretical investigations a hybrid approach was implemented using the boundary element method to calculate the source directivity and a three-dimensional ray-tracing algorithm to estimate the transmission loss. The angle-dependence of the sound field as well as the weather-dependence of the transmission loss are compared to experimental results gathered at the offshore wind farm alpha ventus, located 45 km north of the island Borkum. While theoretical and experimental results are in general agreement, the implemented model slightly underestimates scattering at the rough sea surface. It is found that the source level of 200 dB re 1 μPa at 1 m is adequate to satisfy the detectability of the warning sequence at distances up to 2 NM (≈3.7 km) within a horizontal sector of ±60° if realistic assumptions about signal-processing and noise are made. An arrangement to enlarge the angular coverage is discussed.

  11. An interdisciplinary and experimental approach applied to an analysis of the communication of influence

    Brigitte JUANALS


    Full Text Available This paper describes the added value of an interdisciplinary and experimental approach applied to an analysis of the inter-organizational communication of influence. The field analyzed is the international industrial standardization of societal security. A communicational problem has been investigated with an experimental method based on natural language processing and knowledge management tools. The purpose of the methodological framework is to clarify the way international standards are designed and the policies that are supported by these standards. Furthermore, the strategies of influence of public and private stakeholders involved in the NGOs which produce these texts have also been studied. The means of inter-organizational communication between organizations (companies or governmental authorities) and NGOs can be compared to the lobbying developed in the context of the construction of Europe and globalization. Understanding the prescriptive process has become a crucial issue for States, organizations and citizens. This research contributes to the critical assessment of the new industrial policies currently being developed from the point of view of their characteristics and the way they have been designed.

  12. Regulation of cellular function via electromagnetic field frequency and extracellular environment: A theoretical- experimental approach

    Taghian, Toloo; Sheikh, Abdul; Narmoneva, Daria; Kogan, Andrei


    Application of external electric field (EF) as a non-pharmacological, non-invasive tool to control cell function is of great therapeutic interest. We developed a theoretical-experimental approach to investigate the biophysical mechanisms of EF interaction with cells in electrode-free physiologically-relevant configuration. Our numerical results demonstrated that EF frequency is the major parameter to control cell response to EF. Non-oscillating or low-frequency EF leads to charge accumulation on the cell surface membrane that may mediate membrane initiated cell responses. In contrast, high-frequency EF penetrates the cell membrane and reaches cell cytoplasm, where it may directly activate intracellular responses. The theoretical predictions were confirmed in our experimental studies of the effects of applied EF on vascular cell function. Results show that non-oscillating EF increases vascular endothelial growth factor (VEGF) expression while field polarity controls cell adhesion rate. High-frequency, but not low frequency, EF provides differential regulation of cytoplasmic focal adhesion kinase and VEGF expression depending on the substrate, with increased expression in cells cultured on RGD-rich synthetic hydrogels, and decreased expression for matrigel culture. The authors acknowledge the financial support from the NSF (DMR-1206784 & DMR-0804199 to AK); the NIH (1R21 DK078814-01A1 to DN) and the University of Cincinnati (Interdisciplinary Faculty Research Support Grant to DN and AK).

  13. Macroscopic analysis of gas-jet wiping: Numerical simulation and experimental approach

    Lacanette, Delphine; Gosset, Anne; Vincent, Stéphane; Buchlin, Jean-Marie; Arquis, Éric


    Coating techniques are frequently used in industrial processes such as paper manufacturing, wire sleeving, and in the iron and steel industry. Depending on the application considered, the thickness of the resulting substrate is controlled by mechanical (scraper), electromagnetic (if the entrained fluid is appropriate), or hydrodynamic (gas-jet wiping) operations. This paper deals with the latter process, referred to as gas-jet wiping, in which a turbulent slot jet is used to wipe the coating film dragged by a moving substrate. This mechanism relies on the gas-jet-liquid film interaction taking place on the moving surface. The aim of this study is to compare the results obtained by a one-dimensional lubrication model, numerical volume of fluid-large eddy simulation (VOF-LES) modeling, and an experimental approach. The investigation emphasizes the effect of the controlling wiping parameters, i.e., the pressure gradient and shear stress distributions induced by the jet, on the shape of the liquid film. The profiles obtained experimentally and numerically for a jet impinging on a dry fixed surface are compared. The effect of the substrate motion and the presence of the dragged liquid film on these actuators are analyzed through numerical simulations. Good agreement is found between the film thickness profile in the wiping zone obtained from the VOF-LES simulations and the analytical model, provided that a good model for the wiping actuators is used. The effect of the gas-jet nozzle to substrate standoff distance on the final coating thickness is analyzed; the experimental and predicted values are compared for a wide set of conditions. Finally, the occurrence of the splashing phenomenon, which is characterized by the ejection of droplets from the runback film flow at jet impingement, thus limiting the wiping process, is investigated through experiments and numerical simulations.

  14. Influence of the reinforcement corrosion on the bending moment capacity of reinforced concrete beams: a structural reliability approach

    E. A. P. Liberati

    Full Text Available Reinforced concrete structures are certainly among the most widely used types of structure around the world. When located in non-aggressive environments, they generally achieve their predicted structural life, unless the structure is used improperly. However, the durability of these structures is strongly connected to degradation processes of environmental and/or functional origin. Among these processes, those related to corrosion of the reinforcement are worth mentioning. Reinforcement corrosion is directly related to the durability and safety of concrete structures. Moreover, chloride diffusion is recognized as one of the major factors that trigger corrosion. Therefore, by modelling chloride diffusion accurately, the corrosion of the reinforcement can be better evaluated. Consequently, design criteria can be proposed more realistically in order to ensure safety and economy in reinforced concrete structures. Due to the inherent randomness present in chloride diffusion and corrosion, these phenomena can only be properly modelled with probabilistic approaches. In this paper, the durability of a beam designed using the criteria proposed by ABNT NBR 6118:2003 [1] is assessed using probabilistic approaches. The corrosion initiation time is determined using Fick's diffusion law, whereas Faraday's corrosion laws are adopted to model the steel loss. The probability of structural failure is determined using Monte Carlo simulation. The beam is analysed considering different failure scenarios in order to study the influence of the water/cement ratio and environmental aggressiveness on the probability of failure. Based on these results, some remarks are made concerning the NBR recommendations and the resulting probability of failure.
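
    The Fick's-law initiation time plus Monte Carlo scheme described above can be sketched in a few lines; the cover-depth and diffusivity distributions and the critical chloride ratio below are illustrative assumptions, not the paper's calibrated values.

```python
import math
import random

def inv_erf(y, lo=0.0, hi=6.0, tol=1e-12):
    """Invert erf on [lo, hi] by bisection (stdlib only)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if math.erf(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def initiation_time(cover_m, diff_m2s, c_crit_ratio):
    """Corrosion initiation time from Fick's second law for a semi-infinite
    medium: C(x,t)/Cs = 1 - erf(x / (2*sqrt(D*t))); solved for t at x = cover."""
    z = inv_erf(1.0 - c_crit_ratio)
    return (cover_m / (2.0 * z)) ** 2 / diff_m2s

def prob_of_corrosion(service_life_s, n=20000, seed=1):
    """Monte Carlo failure probability: sample cover depth and chloride
    diffusivity, count samples whose initiation time falls within the life."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        cover = max(rng.gauss(0.04, 0.005), 1e-3)        # m (illustrative)
        diff = rng.lognormvariate(math.log(1e-12), 0.4)  # m^2/s (illustrative)
        if initiation_time(cover, diff, 0.5) < service_life_s:
            fails += 1
    return fails / n

fifty_years = 50 * 365.25 * 24 * 3600.0
pf = prob_of_corrosion(fifty_years)
```

A longer service life can only increase the failure probability, which makes a convenient sanity check on the sampler.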

  15. Meta-analysis method for discovering reliable biomarkers by integrating statistical and biological approaches: An application to liver toxicity.

    Cho, Hyeyoung; Kim, Hyosil; Na, Dokyun; Kim, So Youn; Jo, Deokyeon; Lee, Doheon


    Biomarkers identified from a single study often turn out to be biologically irrelevant or false positives. Meta-analysis techniques integrate data from multiple related but independent studies in order to identify biomarkers that hold across multiple conditions. However, existing biomarker meta-analysis methods tend to be sensitive to the dataset being analyzed. Here, we propose a meta-analysis method, iMeta, which integrates the t-statistic and the fold-change ratio for improved robustness. To evaluate the predictive performance of the biomarkers identified by iMeta, we compare our method with other meta-analysis methods. iMeta outperforms the other methods in terms of sensitivity and specificity, and is especially robust to increases in study variance: it consistently shows higher classification accuracy on diverse datasets, while the performance of the others is highly affected by the dataset being analyzed. Application of iMeta to 59 drug-induced liver injury studies identified three key biomarker genes: Zwint, Abcc3, and Ppp1r3b. Experimental evaluation using RT-PCR and qRT-PCR shows that their expression changes in response to drug toxicity are concordant with the results of our method. iMeta is available at
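
    The two ingredients that iMeta integrates, a per-study t-statistic and a fold-change ratio, can be illustrated with a toy sketch; the equal-weight blend below is an illustrative stand-in, not iMeta's actual scoring function, and the data are made up.

```python
import math
import statistics as st

def t_and_lfc(case, ctrl):
    """Welch t-statistic and log2 fold-change for one gene in one study."""
    m1, m2 = st.mean(case), st.mean(ctrl)
    v1, v2 = st.variance(case), st.variance(ctrl)
    t = (m1 - m2) / math.sqrt(v1 / len(case) + v2 / len(ctrl))
    return t, math.log2(m1 / m2)

def combined_score(per_study):
    """Average |t| and |log2FC| across studies and blend them equally
    (a hypothetical combination rule, not the published iMeta formula)."""
    ts = [abs(t) for t, _ in per_study]
    fs = [abs(f) for _, f in per_study]
    return 0.5 * st.mean(ts) + 0.5 * st.mean(fs)

# Gene with a strong, consistent change across two hypothetical studies:
score_strong = combined_score([t_and_lfc([10, 12, 11], [5, 6, 5]),
                               t_and_lfc([20, 22, 21], [10, 11, 10])])
# Gene with a weak change in a single study:
score_weak = combined_score([t_and_lfc([10, 11, 10.5], [10, 10.2, 10.1])])
```

A gene that changes strongly and consistently across studies should score higher than one with a weak, single-study signal.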

  16. An Integrated Experimental and Modeling Approach to Predict Sediment Mixing from Benthic Burrowing Behavior.

    Roche, Kevin R; Aubeneau, Antoine F; Xie, Minwei; Aquino, Tomás; Bolster, Diogo; Packman, Aaron I


    Bioturbation is the dominant mode of sediment transport in many aquatic environments and strongly influences both sediment biogeochemistry and contaminant fate. Available bioturbation models rely on highly simplified biodiffusion formulations that inadequately capture the behavior of many benthic organisms. We present a novel experimental and modeling approach that uses time-lapse imagery to directly relate burrow formation to resulting sediment mixing. We paired white-light imaging of burrow formation with fluorescence imaging of tracer particle redistribution by the oligochaete Lumbriculus variegatus. We used the observed burrow formation statistics and organism density to parametrize a parsimonious model for sediment mixing based on fundamental random walk theory. Worms burrowed over a range of times and depths, resulting in homogenization of sediments near the sediment-water interface, rapid nonlocal transport of tracer particles to deep sediments, and large areas of unperturbed sediments. Our fundamental, parsimonious random walk model captures the central features of this highly heterogeneous sediment bioturbation, including evolution of the sediment-water interface coupled with rapid near-surface mixing and anomalous late-time mixing resulting from infrequent, deep burrowing events. This approach provides a general, transferable framework for explicitly linking sediment transport to governing biophysical processes.
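
    The burrow statistics feed a random-walk picture that can be caricatured in a few lines: frequent local mixing near the interface plus rare nonlocal jumps to a random burrow depth. The parameters below are illustrative, not values calibrated from the imaging experiments.

```python
import random

def mix_sediment(n_particles=2000, steps=500, p_burrow=0.002,
                 local_step=0.05, max_depth=10.0, seed=7):
    """Toy random-walk bioturbation model: each tracer particle usually takes
    a small Gaussian (biodiffusive) step, but occasionally a burrowing event
    relocates it nonlocally to a uniformly random depth."""
    rng = random.Random(seed)
    depths = [0.0] * n_particles          # all tracer starts at the interface
    for _ in range(steps):
        for i in range(n_particles):
            if rng.random() < p_burrow:   # rare deep-burrowing event
                depths[i] = rng.uniform(0.0, max_depth)
            else:                          # local near-surface shuffle
                d = depths[i] + rng.gauss(0.0, local_step)
                depths[i] = min(max(d, 0.0), max_depth)
    return depths

depths = mix_sediment()
deep_fraction = sum(d > 5.0 for d in depths) / len(depths)
```

The resulting depth distribution is heavy-tailed: most tracer stays near the interface while a minority is carried rapidly to depth, mirroring the anomalous late-time mixing described above.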


    Wang Chao; Wang Pei-fang; Li Yong


    A composite modeling approach is presented for simulating the three-dimensional (3-D) subsurface transport of dissolved contaminants with transformation products. The approach is based on vertical infiltration and contaminant transport in the unsaturated zone and on 3-D groundwater flow and contaminant migration in the saturated zone. Moisture movement and groundwater flow are considered steady, but contaminant transport is treated as continuous and transient. The transformed unsaturated-zone and saturated-zone transport equations are solved numerically with different techniques. The model contains a 3-D solution for flow and transport in the saturated zone, as well as two-dimensional solutions for vertical cross-sectional and areal scenarios. To verify the composite model, extensive experiments were conducted in two large-scale variable-slope soil tanks, 1200 cm in length, 150 cm in width, and 150 cm in height. The tanks were filled with sandy soil as the porous medium. Solutions of KBr or NH4Cl were introduced into each soil tank from 80 × 37 cm² area sources located on top of the porous medium. Results from the numerical simulations were compared with the test data, and the predicted results are in good agreement with the experimental data.
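
    A minimal numerical sketch of the kind of transient transport solved in such models is an explicit upwind finite-difference scheme for the 1-D advection-dispersion equation, a toy stand-in for one component of the composite model; the grid and parameters below are illustrative.

```python
def advect_disperse(nx=100, nt=2000, dx=0.1, dt=0.002, v=1.0, D=0.05):
    """Explicit finite differences for dC/dt = D d2C/dx2 - v dC/dx with a
    constant-concentration inlet. Upwind advection keeps the scheme stable
    under the usual CFL restrictions (v*dt/dx and D*dt/dx^2 both small)."""
    C = [0.0] * nx
    C[0] = 1.0                              # continuous source at the inlet
    for _ in range(nt):
        new = C[:]
        for i in range(1, nx - 1):
            diff = D * (C[i + 1] - 2 * C[i] + C[i - 1]) / dx ** 2
            adv = -v * (C[i] - C[i - 1]) / dx   # first-order upwind
            new[i] = C[i] + dt * (diff + adv)
        new[-1] = new[-2]                   # zero-gradient outlet
        C = new
    return C

profile = advect_disperse()
```

The concentration front decays monotonically away from the source, as expected for a continuous inlet boundary.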

  18. Comparison of Modeling and Experimental Approaches for Improved Modeling of Filtration in Granular and Consolidated Media

    Mirabolghasemi, M.; Prodanovic, M.; DiCarlo, D. A.


    Filtration is relevant to many disciplines, from colloid transport in environmental engineering to formation damage in petroleum engineering. In this study we compare the results of novel pore-scale numerical modeling of filtration with complementary laboratory-scale experimental observations, and discuss how this comparison can be used to improve macroscale filtration models for different porous media. The water suspension contained glass beads of 200 micron diameter and flowed through a packing of 1 mm diameter glass beads, so the main filtration mechanisms are straining and jamming of particles. The numerical model simulates the flow of suspension through the realistic 3D structure of an imaged, disordered sphere pack, which acts as the filter medium. Particle capture through size exclusion and jamming is modeled via a coupled Discrete Element Method (DEM) and Computational Fluid Dynamics (CFD) approach, which captures the majority of particle-particle, particle-wall, and particle-fluid interactions. Note that most traditional approaches require spherical particles both in the suspension and in the filtration medium; we instead represent the interface between the pore space and the grains as a triangulated surface, which allows extension to any imaged medium. The numerical and experimental results show that the filtration coefficient of the sphere pack is a function of the flow rate and concentration of the suspension, even for constant total particle flow rate. An increase in the suspension flow rate results in a decrease in the filtration coefficient, which suggests that the hydrodynamic drag force plays the key role in hindering particle capture in random sphere packs. Further, similar simulations of suspension flow through a sandstone sample, which has a tighter pore space, show that the filtration coefficient remains almost constant at different suspension flow rates.
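
    The filtration coefficient referred to above comes from the classical deep-bed filtration law C(L) = C_in · exp(-λL); backing λ out of inlet/outlet concentrations shows how a larger escape fraction at higher flow rate maps to a lower coefficient. The breakthrough numbers below are hypothetical, not data from this study.

```python
import math

def filtration_coefficient(c_in, c_out, length):
    """Deep-bed filtration law C(L) = C_in * exp(-lambda * L);
    solve for lambda from inlet/outlet particle concentrations."""
    return math.log(c_in / c_out) / length

# Hypothetical breakthrough data: at higher flow rate more particles escape,
# so the back-calculated filtration coefficient is lower.
lam_slow = filtration_coefficient(1.0, 0.20, 0.3)   # low flow rate
lam_fast = filtration_coefficient(1.0, 0.55, 0.3)   # high flow rate
```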

  19. A theoretical and experimental approach for correlating nanoparticle structure and electrocatalytic activity.

    Anderson, Rachel M; Yancey, David F; Zhang, Liang; Chill, Samuel T; Henkelman, Graeme; Crooks, Richard M


    The objective of the research described in this Account is the development of high-throughput computational-based screening methods for discovery of catalyst candidates and subsequent experimental validation using appropriate catalytic nanoparticles. Dendrimer-encapsulated nanoparticles (DENs), which are well-defined 1-2 nm diameter metal nanoparticles, fulfill the role of model electrocatalysts. Effective comparison of theory and experiment requires that the theoretical and experimental models map onto one another perfectly. We use novel synthetic methods, advanced characterization techniques, and density functional theory (DFT) calculations to approach this ideal. For example, well-defined core@shell DENs can be synthesized by electrochemical underpotential deposition (UPD), and the observed deposition potentials can be compared to those calculated by DFT. Theory is also used to learn more about structure than can be determined by analytical characterization alone. For example, density functional theory molecular dynamics (DFT-MD) was used to show that the core@shell configuration of Au@Pt DENs undergoes a surface reconstruction that dramatically affects its electrocatalytic properties. A separate Pd@Pt DENs study also revealed reorganization, in this case a core-shell inversion to a Pt@Pd structure. Understanding these types of structural changes is critical to building correlations between structure and catalytic function. Indeed, the second principal focus of the work described here is correlating structure and catalytic function through the combined use of theory and experiment. For example, the Au@Pt DENs system described earlier is used for the oxygen reduction reaction (ORR) as well as for the electro-oxidation of formic acid. The surface reorganization predicted by theory enhances our understanding of the catalytic measurements. In the case of formic acid oxidation, the deformed nanoparticle structure leads to reduced CO binding energy and therefore

  20. Maximum entropy approach for batch-arrival queue under N policy with an un-reliable server and single vacation

    Ke, Jau-Chuan; Lin, Chuen-Horng


    We consider the M[x]/G/1 queueing system in which the server operates under an N policy with a single vacation. As soon as the system becomes empty, the server leaves for a vacation of random length V. When he returns from the vacation and the system size is greater than or equal to a threshold value N, he starts to serve the waiting customers. If he finds fewer customers than N, he waits in the system until the system size reaches or exceeds N. The server is subject to breakdowns according to a Poisson process, and his repair time obeys an arbitrary distribution. We use the maximum entropy principle to derive approximate formulas for the steady-state probability distribution of the queue length. We perform a comparative analysis between the approximate results and established exact results for various batch-size, vacation-time, service-time, and repair-time distributions. We demonstrate that the maximum entropy approach is efficient enough for practical purposes and is a feasible method for approximating the solution of complex queueing systems.
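
    For intuition, when the only constraint is the mean queue length, the maximum entropy principle yields a geometric distribution in closed form; the paper's method generalizes this base case to the richer constraint set of the M[x]/G/1 queue. A sketch of the textbook result:

```python
def maxent_queue_dist(mean_len, n_max=200):
    """Maximum-entropy pmf for a nonnegative queue length given only its
    mean L: the constrained optimum is geometric, p_n = (1 - r) * r**n
    with r = L / (1 + L). Truncated at n_max for practical evaluation."""
    r = mean_len / (1.0 + mean_len)
    return [(1 - r) * r ** n for n in range(n_max + 1)]

p = maxent_queue_dist(mean_len=2.0)
recovered_mean = sum(n * pn for n, pn in enumerate(p))
```

The truncated pmf sums to one and reproduces the imposed mean, confirming the closed form.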

  1. Reliability and Construct Validity of the Psychopathic Personality Inventory-Revised in a Swedish Non-Criminal Sample - A Multimethod Approach including Psychophysiological Correlates of Empathy for Pain.

    Sörman, Karolina; Nilsonne, Gustav; Howner, Katarina; Tamm, Sandra; Caman, Shilan; Wang, Hui-Xin; Ingvar, Martin; Edens, John F; Gustavsson, Petter; Lilienfeld, Scott O; Petrovic, Predrag; Fischer, Håkan; Kristiansson, Marianne


    Cross-cultural investigation of psychopathy measures is important for clarifying the nomological network surrounding the psychopathy construct. The Psychopathic Personality Inventory-Revised (PPI-R) is one of the most extensively researched self-report measures of psychopathic traits in adults. To date, however, it has been examined primarily in North American criminal or student samples. To address this gap in the literature, we examined the PPI-R's reliability, construct validity and factor structure in non-criminal individuals (N = 227) in Sweden, using a multimethod approach including psychophysiological correlates of empathy for pain. PPI-R construct validity was investigated in subgroups of participants by exploring its degree of overlap with (i) the Psychopathy Checklist: Screening Version (PCL:SV), (ii) self-rated empathy and behavioral and physiological responses in an experiment on empathy for pain, and (iii) additional self-report measures of alexithymia and trait anxiety. The PPI-R total score was significantly associated with PCL:SV total and factor scores. The PPI-R Coldheartedness scale demonstrated significant negative associations with all empathy subscales and with rated unpleasantness and skin conductance responses in the empathy experiment. The PPI-R higher order Self-Centered Impulsivity and Fearless Dominance dimensions were associated with trait anxiety in opposite directions (positively and negatively, respectively). Overall, the results demonstrated solid reliability (test-retest and internal consistency) and promising but somewhat mixed construct validity for the Swedish translation of the PPI-R.
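
    Internal consistency of the kind reported here is typically summarized by Cronbach's alpha; a minimal stdlib sketch (with made-up item scores, not PPI-R data) is:

```python
import statistics as st

def cronbach_alpha(items):
    """Internal-consistency estimate for a scale. `items` is a list of
    per-item score lists over the same respondents:
        alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(st.variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / st.variance(totals))

# Hypothetical scores of five respondents on three items:
alpha = cronbach_alpha([[3, 4, 2, 5, 4],
                        [2, 4, 3, 5, 3],
                        [3, 5, 2, 4, 4]])
```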

  2. Reliability and Construct Validity of the Psychopathic Personality Inventory-Revised in a Swedish Non-Criminal Sample - A Multimethod Approach including Psychophysiological Correlates of Empathy for Pain.

    Karolina Sörman

    Full Text Available Cross-cultural investigation of psychopathy measures is important for clarifying the nomological network surrounding the psychopathy construct. The Psychopathic Personality Inventory-Revised (PPI-R) is one of the most extensively researched self-report measures of psychopathic traits in adults. To date, however, it has been examined primarily in North American criminal or student samples. To address this gap in the literature, we examined the PPI-R's reliability, construct validity and factor structure in non-criminal individuals (N = 227) in Sweden, using a multimethod approach including psychophysiological correlates of empathy for pain. PPI-R construct validity was investigated in subgroups of participants by exploring its degree of overlap with (i) the Psychopathy Checklist: Screening Version (PCL:SV), (ii) self-rated empathy and behavioral and physiological responses in an experiment on empathy for pain, and (iii) additional self-report measures of alexithymia and trait anxiety. The PPI-R total score was significantly associated with PCL:SV total and factor scores. The PPI-R Coldheartedness scale demonstrated significant negative associations with all empathy subscales and with rated unpleasantness and skin conductance responses in the empathy experiment. The PPI-R higher order Self-Centered Impulsivity and Fearless Dominance dimensions were associated with trait anxiety in opposite directions (positively and negatively, respectively). Overall, the results demonstrated solid reliability (test-retest and internal consistency) and promising but somewhat mixed construct validity for the Swedish translation of the PPI-R.

  3. The juvenile face as a suitable age indicator in child pornography cases: a pilot study on the reliability of automated and visual estimation approaches.

    Ratnayake, M; Obertová, Z; Dose, M; Gabriel, P; Bröker, H M; Brauckmann, M; Barkus, A; Rizgeliene, R; Tutkuviene, J; Ritz-Timme, S; Marasciuolo, L; Gibelli, D; Cattaneo, C


    In cases of suspected child pornography, the age of the victim is a crucial factor for legal prosecution. Conventional methods provide unreliable age estimates, particularly where teenage victims are concerned. In this pilot study, the potential of age estimation for screening purposes is explored for juvenile faces. In addition to a visual approach, an automated procedure is introduced, which can rapidly scan large volumes of suspicious image data in order to trace juvenile faces. Age estimates were produced by experts, non-experts, and the Demonstrator of the developed software on frontal facial images of 50 females aged 10-19 years from Germany, Italy, and Lithuania. To test accuracy, the mean absolute error (MAE) between the estimates and the real ages was calculated for each examiner and for the Demonstrator. The Demonstrator achieved the lowest MAE (1.47 years) for the 50 test images, and decreased image quality had no significant impact on its performance and classification results. The experts delivered slightly less accurate estimates (MAE 1.63 years). Throughout the tested age range, both the manual and the automated approach produced reliable age estimates within the limits of natural biological variability. Visual analysis of the face produces reasonably accurate age estimates up to the age of 18 years, the legally relevant age threshold for victims in cases of pedo-pornography. This approach can be applied in conjunction with conventional methods for a preliminary age estimation of juveniles depicted in images.
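
    The accuracy metric is straightforward to reproduce; a small sketch with hypothetical estimates (not the study's data):

```python
def mean_absolute_error(estimates, true_ages):
    """MAE used to score each examiner and the Demonstrator:
    the average absolute gap between estimated and real ages."""
    assert len(estimates) == len(true_ages)
    return sum(abs(e, ) if False else abs(e - t)
               for e, t in zip(estimates, true_ages)) / len(estimates)

# Hypothetical estimates for five test images:
mae = mean_absolute_error([12.0, 15.5, 17.0, 11.0, 18.5],
                          [13.0, 14.0, 17.5, 12.0, 19.0])
```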

  4. Health Monitoring of a Rotating Disk Using a Combined Analytical-Experimental Approach

    Abdul-Aziz, Ali; Woike, Mark R.; Lekki, John D.; Baaklini, George Y.


    Rotating disks undergo rigorous mechanical loading conditions that subject them to a variety of failure mechanisms leading to structural deformities and cracking. During operation, periodic loading fluctuations and other related factors cause fractures and hidden internal cracks that can only be detected via noninvasive health monitoring and/or nondestructive evaluation, which further inspects material discontinuities and other irregularities that have grown into critical defects that can lead to failure. Hence, the objective of this work is to conduct a combined analytical and experimental study presenting a well-rounded structural assessment of a rotating disk by means of a health monitoring approach, and to appraise the capabilities of an in-house rotor spin system. The analyses utilized the finite element method to analyze the disk, with and without an induced crack, at different loading levels (rotational speeds from 3000 to 10,000 rpm). A parallel experiment was conducted to spin the disk at the desired speeds in an attempt to correlate the experimental findings with the analytical results. The testing involved spin experiments covering the rotor in both damaged and undamaged (i.e., notched and unnotched) states. Damaged disks had artificially induced through-thickness flaws in the web region ranging from 2.54 to 5.08 cm (1 to 2 in.) in length. The study aims to identify defects greater than 1.27 cm (0.5 in.) using available means of structural health monitoring and nondestructive evaluation, and to document failure mechanisms experienced by the rotor system under typical turbine engine operating conditions.

  5. Numerical and experimental approaches to study soil transport and clogging in granular filters

    Kanarska, Y.; Smith, J. J.; Ezzedine, S. M.; Lomov, I.; Glascoe, L. G.


    Failure of a dam by erosion ranks among the most serious accidents in civil engineering. The best way to prevent internal erosion is to use adequate granular filters in the transition areas where large hydraulic gradients can appear. In case of cracking and erosion, if the filter is capable of retaining the eroded particles, the crack will seal and dam safety will be ensured. Numerical modeling has proved to be a cost-effective tool for improving our understanding of physical processes. Traditionally, the consideration of flow and particle transport in porous media has focused on treating the media as a continuum. Practical models typically address flow and transport based on Darcy's law as a function of a pressure gradient and a medium-dependent permeability parameter, with additional macroscopic constitutive relations describing porosity and permeability changes during the migration of a suspension through porous media. However, most of these relations rely on empirical correlations, which often need to be recalibrated for each application. Grain-scale modeling can be used to gain insight into the scale dependence of continuum macroscale parameters. A finite element numerical solution of the Navier-Stokes equations for fluid flow, together with a Lagrange multiplier technique for solid particles, was applied to the simulation of soil filtration in the filter layers of a gravity dam. The numerical approach was validated through comparison of numerical simulations with the experimental results of base soil particle clogging in the filter layers performed at ERDC. The numerical simulation correctly predicted the flow and pressure decay due to particle clogging, and the base soil particle distribution was almost identical to that measured in the laboratory experiment. The agreement between simulations and experimental data demonstrates the applicability of the proposed approach for predicting soil transport and clogging in embankment dams.

  6. Synthesis of designed materials by laser-based direct metal deposition technique: Experimental and theoretical approaches

    Qi, Huan

    Direct metal deposition (DMD), a laser-cladding-based solid freeform fabrication technique, is capable of depositing multiple materials at a desired composition, which makes it a flexible method for fabricating heterogeneous components or functionally-graded structures. The inherently rapid cooling rate associated with the laser cladding process enables extended solid solubility in nonequilibrium phases, offering the possibility of tailoring new materials with advanced properties. This technical advantage opens the area of synthesizing a new class of materials, designed by the topology optimization method, that have performance-based material properties. For a better understanding of the fundamental phenomena occurring in multi-material laser cladding with coaxial powder injection, a self-consistent 3-D transient model was developed. Physical phenomena including laser-powder interaction, heat transfer, melting, solidification, mass addition, liquid metal flow, and species transport were modeled and solved with a controlled-volume finite difference method. The level-set method was used to track the evolution of the liquid free surface. The distribution of species concentration in the cladding layer was obtained using a nonequilibrium partition coefficient model. Simulation results were compared with experimental observations and found to agree reasonably well. Multi-phase material microstructures with negative coefficients of thermal expansion were studied for their DMD manufacturability. The pixel-based topology-optimal designs are boundary-smoothed with Bezier functions to facilitate toolpath design. It is found that the inevitable diffusion interface between different material phases degrades the negative thermal expansion property of the whole microstructure. A new design method is proposed for DMD manufacturing. Experimental approaches include identification of laser beam characteristics during different laser-powder-substrate interaction conditions, an

  7. Geomechanics of penetration : experimental and computational approaches : final report for LDRD project 38718.

    Hardy, Robert Douglas; Holcomb, David Joseph; Gettemy, Glen L.; Fossum, Arlo Frederick; Rivas, Raul R.; Bronowski, David R.; Preece, Dale S.


    The purpose of the present work is to increase our understanding of which properties of geomaterials most influence the penetration process, with the goal of improving our predictive ability. Two primary approaches were followed: development of a realistic constitutive model for geomaterials, and design of an experimental approach to study penetration from the target's point of view. A realistic constitutive model, with parameters based on measurable properties, can be used for sensitivity analysis to determine the properties that most influence the penetration process. An immense literature is devoted to the problem of predicting penetration into geomaterials or similar man-made materials such as concrete. Various formulations use an analytic or, more commonly, numerical solution for the spherical or cylindrical cavity expansion as a sort of Green's function to establish the forces acting on a penetrator. This approach has had considerable success in modeling the behavior of penetrators, both as to path and depth of penetration. However, the approach is not well adapted to the problem of understanding what is happening to the material being penetrated. Without a picture of the stress and strain state imposed on the highly deformed target material, it is not easy to determine which properties of the target are important in influencing the penetration process. We developed an experimental arrangement that allows greater control of the deformation than is possible in actual penetrator tests, yet approximates the deformation processes imposed by a penetrator. Using explosive line charges placed in a central borehole, we loaded cylindrical specimens in a manner equivalent to an increment of penetration, allowing the measurement of the associated strains and accelerations and the retrieval of specimens from the more-or-less intact cylinder. Results show clearly that the deformation zone is highly concentrated.

  8. Comprehensive multiphase NMR spectroscopy: Basic experimental approaches to differentiate phases in heterogeneous samples

    Courtier-Murias, Denis; Farooq, Hashim; Masoom, Hussain; Botana, Adolfo; Soong, Ronald; Longstaffe, James G.; Simpson, Myrna J.; Maas, Werner E.; Fey, Michael; Andrew, Brian; Struppe, Jochem; Hutchins, Howard; Krishnamurthy, Sridevi; Kumar, Rajeev; Monette, Martine; Stronks, Henry J.; Hume, Alan; Simpson, André J.


    Heterogeneous samples, such as soils, sediments, plants, tissues, foods and organisms, often contain liquid-, gel- and solid-like phases, and it is the synergism between these phases that determines their environmental and biological properties. Studying each phase separately can perturb the sample, removing important structural information such as chemical interactions at the gel-solid interface, kinetics across boundaries, and conformation in the natural state. To overcome these limitations, a Comprehensive Multiphase Nuclear Magnetic Resonance (CMP-NMR) probe has been developed, and is introduced here, that permits all bonds in all phases to be studied and differentiated in whole, unaltered natural samples. The CMP-NMR probe is built with high-power circuitry and Magic Angle Spinning (MAS), is fitted with a lock channel and pulsed field gradients, and is fully susceptibility matched. Consequently, this novel NMR probe covers all HR-MAS aspects without compromising power handling, permitting the full range of solution-, gel- and solid-state experiments available today. Using this technology, structures and interactions can be studied independently in each phase, as well as transfer/interactions between phases within a heterogeneous sample. This paper outlines some basic experimental approaches using a model heterogeneous multiphase sample containing liquid-, gel- and solid-like components in water, yielding separate 1H and 13C spectra for the different phases. In addition, 19F performance is also addressed: soil samples containing two different contaminants are used to demonstrate a preliminary, but real-world, application of this technology. This novel NMR approach possesses great potential for the in situ study of natural samples in their native state.

  9. Does Improved Water Access Increase Child School Attendance? A Quasi-Experimental Approach From Rural Ethiopia

    Masuda, Y.; Cook, J.


    not measure the portion of children that engage in both activities. Indeed, children may very well be "attending" school according to an enrollment measure, but they may be doing so at low rates that prevent them from advancing to higher grade levels. Although enrollment rates may remain constant pre- and post-water access, school attendance may increase with the provision of water. This paper overcomes previous limitations by utilizing panel data from a quasi-experimental study and a continuous measure for school attendance collected over one year via random school attendance checks. In total, we collected data on 642 children from randomly selected households. Using a difference-in-difference estimator, our preliminary analysis finds that water access increases school attendance by 6% and is statistically significant at the 5% significance level. When using school enrollment as the outcome variable preliminary analysis finds that water access increases enrollment by 3%, although it is only marginally significant at the 10% significance level. Data on schooling via random school attendance checks provide a more reliable measure for the true impact of water access on schooling, and our preliminary findings suggest that the impact may be higher than previously estimated.
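
    The difference-in-difference estimator used above nets out common time trends by subtracting the control group's change from the treated group's change; with hypothetical attendance fractions (not the study's data) it reduces to a few lines.

```python
import statistics as st

def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences estimate of the treatment effect:
    (change in the treated group) minus (change in the control group)."""
    return ((st.mean(treat_post) - st.mean(treat_pre))
            - (st.mean(ctrl_post) - st.mean(ctrl_pre)))

# Hypothetical attendance rates (fraction of random checks present):
effect = diff_in_diff(treat_pre=[0.70, 0.72, 0.68],
                      treat_post=[0.78, 0.80, 0.76],
                      ctrl_pre=[0.71, 0.69, 0.70],
                      ctrl_post=[0.73, 0.71, 0.72])
```

Here the treated group's attendance rises by 8 points and the control group's by 2, so the estimated effect is a 6-point increase, matching the magnitude reported above only because the toy numbers were chosen that way.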

  10. A Dynamical Reliability Prediction Algorithm for Composite Service

    Chunli Xie


    Full Text Available Dynamic selection and dynamic binding and rebinding at runtime are new characteristics of composite services, and traditional static reliability prediction models are unsuitable for them. A new reliability prediction algorithm for composite services is proposed in this paper. First, a composite service is decomposed into composition units (execution paths, composite modules, and atomic services) according to its constituents. Next, a hierarchical graph of all composition units is constructed. Finally, a new dynamic reliability prediction algorithm is presented. Compared with traditional reliability models, the new dynamic approach is more flexible: it does not recompute reliability for all composition units, but only for the affected ones. In addition, an example showing how to compute reliability with the algorithm is given. The experimental results show that the proposed method gives an accurate estimation of reliability. Furthermore, a flexible sensitivity analysis is performed to determine which service component has the most significant impact on improving composite service reliability.
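
    The idea of recomputing only the affected composition units can be sketched with a memoised path-product model, a simplification of the paper's hierarchical algorithm; the service names, path probabilities, and reliabilities below are made up.

```python
def path_reliability(services, reliabilities):
    """Reliability of one execution path = product of its atomic services."""
    r = 1.0
    for s in services:
        r *= reliabilities[s]
    return r

def composite_reliability(paths, probs, reliabilities, cache=None):
    """Expected reliability over execution paths, memoising per-path results
    so that changing one service only forces recomputation of the paths
    that contain it."""
    cache = {} if cache is None else cache
    total = 0.0
    for path, p in zip(paths, probs):
        key = tuple(path)
        if key not in cache:
            cache[key] = path_reliability(path, reliabilities)
        total += p * cache[key]
    return total, cache

paths = [["a", "b"], ["a", "c"]]
probs = [0.6, 0.4]
rel = {"a": 0.99, "b": 0.95, "c": 0.90}
r, cache = composite_reliability(paths, probs, rel)

# Service "c" is rebound to a more reliable instance: invalidate only the
# cached paths that use it, then recompute.
rel["c"] = 0.95
cache = {k: v for k, v in cache.items() if "c" not in k}
r2, _ = composite_reliability(paths, probs, rel, cache)
```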

  11. Surface enhanced Raman spectroscopic studies on aspirin : An experimental and theoretical approach

    Premkumar, R.; Premkumar, S.; Rekha, T. N.; Parameswari, A.; Mathavan, T.; Benial, A. Milton Franklin


    Surface enhanced Raman scattering (SERS) studies of the aspirin molecule adsorbed on silver nanoparticles (AgNPs) were carried out using an experimental and density functional theory (DFT) approach. The AgNPs were synthesized by the solution-combustion method and characterized by X-ray diffraction and high-resolution transmission electron microscopy; the average particle size of the synthesized AgNPs was ~55 nm. The normal Raman spectrum (nRs) and SERS spectrum of aspirin were recorded. The molecular structures of aspirin and of aspirin adsorbed on a silver cluster were optimized by the DFT/B3PW91 method with the LanL2DZ basis set. The vibrational frequencies were calculated and assigned on the basis of a potential energy distribution calculation. The calculated nRs and SERS frequencies correlated well with the observed frequencies. A flat-on orientation of aspirin adsorbed on the AgNPs was predicted from the nRs and SERS spectra. Hence, the present studies lead to an understanding of the adsorption process of aspirin on AgNPs, which paves the way for biomedical applications.

  12. Mechanisms-based viscoplasticity: Theoretical approach and experimental validation for steel 304L

    Zubelewicz, Aleksander; Oliferuk, Wiera


    We propose a mechanisms-based viscoplasticity approach for metals and alloys. First, we derive a stochastic model for thermally activated motion of dislocations and then introduce power-law flow rules. The overall plastic deformation includes local plastic slip events, each taken with an appropriate weight assigned to the angle of the plane misorientation from the direction of maximum shear stress. As deformation progresses, the material experiences successive reorganizations of the slip systems. As a result of this microstructural evolution, a portion of the energy expended on plastic deformation is dissipated while the rest is stored in defect structures. We show that the reorganizations are stable in a homogeneously deformed material. The concept is tested on steel 304L: we reproduce experimentally obtained stress-strain responses, construct the Frost-Ashby deformation map, and predict the rate of energy storage. Energy storage is assessed via synchronized measurements of temperature and displacement distributions on the specimen surface during tensile loading.

  13. Experimental approaches for the study of oxytocin and vasopressin gene expression in the central nervous system

    Scordalakes, Elka M.; Yue, Chunmei; Gainer, Harold


    Intron-specific probes measure heteronuclear RNA (hnRNA) levels and thus approximate the transcription rates of genes, in part because of the rapid turnover of this intermediate form of RNA in the cell nucleus. Previously, we used oxytocin (Oxt)- and vasopressin (Avp)- intron-specific riboprobes to measure changes in Oxt and Avp hnRNA levels in the supraoptic nucleus (SON) by quantitative in situ hybridization (ISH) after various classical physiological perturbations, including acute and chronic salt loading, and lactation. In the present experiments, we used a novel experimental model to study the neurotransmitter regulation of Oxt and Avp gene expression in the rat SON in vivo. Bilateral cannulae connected via tubing to Alzet osmotic mini-pumps were positioned over the SON. In every experiment, one SON was infused with PBS and served as the control SON in each animal, and the contralateral SON received infusions of various neurotransmitter agonists and antagonists. Using this approach, we found that Avp but not Oxt gene expression increased after acute (2–5 h) combined excitatory amino acid agonist and GABA antagonist treatment, similar to what we found after an acute hyperosmotic stimulus. Since both OXT and AVP are known to be comparably and robustly secreted in response to acute osmotic stimuli in vivo and glutamate agonists in vitro, our results indicate a dissociation between OXT secretion and Oxt gene transcription in vivo. PMID:18655870

  15. Meeting Report on Experimental Approaches to Evolution and Ecology Using Yeast and Other Model Systems.

    Jarosz, Daniel; Dudley, Aimée M


    The fourth EMBO-sponsored conference on Experimental Approaches to Evolution and Ecology Using Yeast and Other Model Systems was held at the EMBL in Heidelberg, Germany, October 19-23, 2016. The conference was organized by Judith Berman (Tel Aviv University), Maitreya Dunham (University of Washington), Jun-Yi Leu (Academia Sinica), and Lars Steinmetz (EMBL Heidelberg and Stanford University). The meeting attracted ~120 researchers from 28 countries and covered a wide range of topics in the fields of genetics, evolutionary biology, and ecology with a unifying focus on yeast as a model system. Attendees enjoyed the Keith Haring-inspired yeast fluorescence microscopy artwork (Figure 1), a unique feature of the meeting since its inception, and the one-minute flash talks that catalyzed discussions at two vibrant poster sessions. The meeting coincided with the 20th anniversary of the publication describing the sequence of the first eukaryotic genome, that of Saccharomyces cerevisiae (Goffeau et al. 1996). Many of the conference talks focused on important questions about what is contained in the genome, how genomes evolve, and the architecture and behavior of communities of phenotypically and genotypically diverse microorganisms. Here, we summarize highlights of the research talks around these themes. Nearly all presentations focused on novel findings, and we refer the reader to relevant manuscripts that have subsequently been published. Copyright © 2017, G3: Genes, Genomes, Genetics.

  16. A systematic experimental approach to the resin transfer molding of graphite reinforced tubes

    Senibi, S.; Sadler, R. [North Carolina A and T State Univ., Greensboro, NC (United States)]


    The aim of this paper is to explore a systematic approach to experimentally quantifying process variables in resin transfer molding (RTM) of composite panels. Experience from the resulting experiments was then used to aid the fabrication of graphite/epoxy tubes. Composite panels were fabricated from 8-harness graphite weave preforms using RTM. The process parameters for individual panels were varied in order to understand their effect on the quality of the panels produced. Three variations of the RTM technique were employed for this study: regular RTM (RRTM), vacuum-assist RTM (VARTM), and controlled-leak vacuum-assist RTM (CLVRTM). The mold unit employs a steel cavity and a top plate made of transparent Plexiglass. The resin matrix was Dow Chemical Tactix 123, with Millamine 5260 as the curing agent. The resin flow and void-formation mechanisms during resin injection were recorded with a video camera through the transparent mold face. Void and microvoid content were determined by nondestructive ultrasonic C-scanning. The tube-making process employed some of the useful techniques resulting from the fabrication of the panels. Its mold unit consists of a single tubular mold and two end flanges made of steel. Internal pressure was provided through an inflatable silicone rubber bladder, and resin was injected into the mold with the aid of a peristaltic pump. The tubes produced using vacuum-assist RTM were void-free, but there was a problem of nonuniform wall thickness.

  17. Comparison of chemical and thermal protein denaturation by combination of computational and experimental approaches. II

    Wang, Qian; Christiansen, Alexander; Samiotakis, Antonios; Wittung-Stafshede, Pernilla; Cheung, Margaret S.


    Chemical and thermal denaturation methods have been widely used to investigate folding processes of proteins in vitro. However, a molecular understanding of the relationship between these two perturbation methods is lacking. Here, we combined computational and experimental approaches to investigate denaturing effects on three structurally different proteins. We derived a linear relationship between thermal denaturation at temperature Tb and chemical denaturation at another temperature Tu using the stability change of a protein (ΔG). For this, we related the dependence of ΔG on temperature, in the Gibbs-Helmholtz equation, to that of ΔG on urea concentration in the linear extrapolation method, assuming that there is a temperature pair from the urea (Tu) and the aqueous (Tb) ensembles that produces the same protein structures. We tested this relationship on apoazurin, cytochrome c, and apoflavodoxin using coarse-grained molecular simulations. We found a linear correlation between the temperature for a particular structural ensemble in the absence of urea, Tb, and the temperature of the same structural ensemble at a specific urea concentration, Tu. The in silico results agreed with in vitro far-UV circular dichroism data on apoazurin and cytochrome c. We conclude that chemical and thermal unfolding processes correlate in terms of thermodynamics and structural ensembles at most conditions; however, deviations were found at high concentrations of denaturant.
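The relationship described above can be sketched in standard textbook form. The expressions below are a hedged reconstruction using the usual folding parameters (ΔH_m, T_m, ΔC_p, and the m-value), not the paper's fitted values:

```latex
% Gibbs-Helmholtz temperature dependence of protein stability:
\Delta G(T) \;=\; \Delta H_m\left(1 - \frac{T}{T_m}\right)
  \;-\; \Delta C_p\left[\,T_m - T + T\ln\frac{T}{T_m}\right]

% Linear extrapolation method (LEM) for chemical denaturation:
\Delta G(T_u, [\mathrm{urea}]) \;=\; \Delta G(T_u, 0) \;-\; m\,[\mathrm{urea}]

% Equating the stability of the same structural ensemble under the two
% perturbations, \Delta G(T_b, 0) = \Delta G(T_u, [\mathrm{urea}]),
% yields the linear T_b--T_u correlation tested in the study.
```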

  18. Theoretical and Experimental Approaches to the Dark Energy and the Cosmological Constant Problem

    Borzou, Ahmad


    Theoretical and Experimental Approaches to the Dark Energy and the Cosmological Constant Problem. Ahmad Borzou, Ph.D. Advisor: Kenichi Hatakeyama, Ph.D. The cosmological constant problem is one of the most pressing problems of physics at this time. In this dissertation the problem and a set of widely discussed theoretical solutions to it are reviewed. It is shown that a recently developed Lorentz gauge theory of gravity can provide a natural solution. In the theory presented here, the metric is not dynamical, and it is shown that the Schwarzschild metric is an exact solution. It is also proven that de Sitter space is an exact vacuum solution; as a result, the theory is able to explain the expansion of the universe with no need for dark energy. Renormalizability of the theory is studied as well, and it is shown that, under a certain condition, the theory is power-counting renormalizable. Supersymmetry provides an alternative solution to the cosmological constant problem as well. The idea behind supersymmetry is rev...

  19. Experimental validation of the filtering approach for dose monitoring in proton therapy at low energy.

    Attanasi, F; Belcari, N; Camarda, M; Del Guerra, A; Moehrs, S; Rosso, V; Vecchio, S; Lanconelli, N; Cirrone, G A P; Di Rosa, F; Russo, G


    The higher physical selectivity of proton therapy demands higher accuracy in monitoring the delivered dose, especially when the target volume is located next to critical organs and a fractionated therapy is applied. One way to verify a treatment plan and ensure the high quality of hadrontherapy is to use positron emission tomography (PET), which takes advantage of the nuclear reactions between protons and nuclei in the tissue during irradiation that produce beta(+)-emitting isotopes. Unfortunately, the PET image is not directly proportional to the delivered radiation dose distribution; this is why, at present, the verification of depth-dose profiles with PET techniques is limited to a comparison between the measured activity and the one predicted for the planned treatment by a Monte Carlo model. In this paper we test the feasibility of a different scheme, which permits reconstructing the expected PET signal from the planned radiation dose distribution along the beam direction in a simpler and more direct way. The filter model considered, based on the description of the PET image as a convolution of the dose distribution with a filter function, has already demonstrated its potential applicability to beam energies above 70 MeV. Our experimental investigation supports the possibility of extending the same approach to the lower energy range ([40, 70] MeV), in the perspective of its clinical application in eye proton therapy.
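The convolution idea behind the filter model can be illustrated with a minimal 1-D sketch. Everything here is a placeholder: the toy depth-dose profile and the Gaussian filter width are illustrative assumptions, not the paper's fitted kernel or clinical data.

```python
# Toy 1-D illustration of the filtering approach: the expected PET
# activity profile is modeled as the depth-dose profile convolved with
# a filter function. The Gaussian filter below is a placeholder.
import numpy as np

z = np.linspace(0.0, 40.0, 401)            # depth axis (mm)
dose = np.zeros_like(z)
dose[(z >= 0.5) & (z <= 30.0)] = 1.0       # entrance plateau (arbitrary units)
dose[(z >= 28.0) & (z <= 30.0)] = 3.0      # crude Bragg-peak enhancement

sigma_mm = 2.0                             # assumed filter width (mm)
kernel = np.exp(-0.5 * ((z - 20.0) / sigma_mm) ** 2)
kernel /= kernel.sum()                     # normalized filter function

activity = np.convolve(dose, kernel, mode="same")   # predicted PET profile
```

The predicted activity is a smoothed version of the dose: the sharp distal dose fall-off becomes a gradual activity decline, which is the feature the filter function is designed to capture.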

  20. Experimental testicular tissue banking to generate spermatogenesis in the future: A multidisciplinary team approach.

    Sadri-Ardekani, Hooman; McLean, Thomas W; Kogan, Stanley; Sirintrapun, Joseph; Crowell, Kathryn; Yousif, Mustafa Q; Hodges, Steve J; Petty, John; Pranikoff, Thomas; Sieren, Leah; Zeller, Kristen; Atala, Anthony


    Spermatogonial stem cell (SSC) loss due to cancer treatment, developmental disorder or genetic abnormality may cause permanent infertility. Cryopreservation of ejaculated sperm is an effective method of fertility preservation in adult males at risk of infertility. However, this is not an option for pre-pubertal boys, in whom spermatogenesis has not yet started, and it is difficult in adolescents who are not sexually mature. Therefore testicular tissue cryopreservation to preserve SSCs for future generation of spermatogenesis, either in vivo or in vitro, could be an option for these groups of patients. Although SSC transplantation has been successful in several species including non-human primates, it is still experimental in humans. Several remaining concerns need to be addressed before initiating trials of human SSC autotransplantation. Establishment of a testicular tissue banking system is a fundamental step towards using SSC technology as a fertility preservation method, and it is important to understand the consultation, testicular tissue harvesting, histological evaluation, cryopreservation, and long-term storage aspects. We describe here a multidisciplinary approach to establishing testicular tissue banking for males at risk of infertility.

  1. TWSME of a NiTi strip in free bending conditions: experimental and theoretical approach

    A. Fortini


    This paper deals with the two-way shape memory effect (TWSME) induced in a strip of a near-equiatomic NiTi alloy by means of the shape-memory-cycling training method. This procedure is based on deformation in the martensitic state to reach the desired cold shape, followed by cycling the temperature from above Af to below Mf. To this end, the sample was thermally treated to memorise a bent shape, thermomechanically trained as described, and thermally cycled in unloaded conditions in order to study the stability of the induced TWSME. Heating to Af was achieved with a hot air stream, whereas cooling to Mf occurred through natural convection. The evolution of the curvature with increasing number of cycles was evaluated. The thermomechanical behaviour of the strip undergoing uniform bending was simulated using a one-dimensional phenomenological model with stress and temperature as external control variables. The martensite and austenite volume fractions were chosen as internal parameters, and kinetic laws were used to describe their evolution during phase transformations. The experimental findings are compared with the model simulation and with a numerical prediction based on the approach proposed in [25].

  2. Development of the Digestive System-Experimental Challenges and Approaches of Infant Lipid Digestion.

    Abrahamse, Evan; Minekus, Mans; van Aken, George A; van de Heijning, Bert; Knol, Jan; Bartke, Nana; Oozeer, Raish; van der Beek, Eline M; Ludwig, Thomas


    At least during the first 6 months after birth, the nutrition of infants should ideally consist of human milk which provides 40-60 % of energy from lipids. Beyond energy, human milk also delivers lipids with a specific functionality, such as essential fatty acids (FA), phospholipids, and cholesterol. Healthy development, especially of the nervous and digestive systems, depends fundamentally on these. Epidemiological data suggest that human milk provides unique health benefits during early infancy that extend to long-lasting benefits. Preclinical findings show that qualitative changes in dietary lipids, i.e., lipid structure and FA composition, during early life may contribute to the reported long-term effects. Little is known in this respect about the development of digestive function and the digestion and absorption of lipids by the newborn. This review gives a detailed overview of the distinct functionalities that dietary lipids from human milk and infant formula provide and the profound differences in the physiology and biochemistry of lipid digestion between infants and adults. Fundamental mechanisms of infant lipid digestion can, however, almost exclusively be elucidated in vitro. Experimental approaches and their challenges are reviewed in depth.

  3. Approaching prehistoric skills: experimental drilling in the context of bead manufacturing

    Maria Gurova


    From the very Early Neolithic in the Balkans, two categories of objects are recognized as having been involved in prehistoric drilling activities. The first is beads and other decorative and prestigious items made of bone, shell, pottery and various minerals. The second comprises toolkits of micro-perforators/borers found among the flint assemblages of several sites. This paper presents experiments in drilling different materials with the aim of testing several practical issues. A series of micro-borers were produced and used for manual and mechanical drilling (with a pump drill). Various samples (mainly prepared thin plates of minerals and rocks) were used, ranging in hardness on the Mohs scale from 3 (marble, limestone, calcite) to 6.5 (amazonite, nephrite). Biominerals were also used: aragonite (shells) and apatite (bones). Actual bead production was approached by manufacturing 16 delicate beads of 5 different materials using fine sand and water abrasion. Though not conclusive, the experimental work was instructive regarding many of the parameters, procedures and technical details of prehistoric drilling.

  4. Optogenetic Evocation of Field Inhibitory Postsynaptic Potentials in Hippocampal Slices: A Simple and Reliable Approach for Studying Pharmacological Effects on GABAA and GABAB Receptor-Mediated Neurotransmission

    Julien Dine


    The GABAergic system is the main source of inhibition in the mammalian brain. Consequently, much effort is still made to develop new modulators of GABAergic synaptic transmission. In contrast to glutamatergic postsynaptic potentials (PSPs), accurate monitoring of GABA receptor-mediated PSPs (GABAR-PSPs) and their pharmacological modulation in brain tissue invariably requires the use of intracellular recording techniques. However, these techniques are expensive, time- and labor-consuming, and, in the case of the frequently employed whole-cell patch-clamp configuration, impact intracellular ion concentrations, signaling cascades, and pH buffering systems. Here, we describe a novel approach to circumvent these drawbacks. In particular, we demonstrate in mouse hippocampal slices that selective optogenetic activation of interneurons leads to prominent field inhibitory GABAAR- and GABABR-PSPs in area CA1 which are easily and reliably detectable by a single extracellular recording electrode. The field PSPs exhibit typical temporal and pharmacological characteristics, display pronounced paired-pulse depression, and remain stable over many consecutive evocations. Further validating the methodological value of this approach, we show that the neuroactive steroid 5-THDOC (5 µM) shifts the inhibitory GABAAR-PSPs towards excitatory ones.

  5. A comprehensive theoretical, numerical and experimental approach for crack detection in power plant rotating machinery

    Stoisser, C. M.; Audebert, S.


    In order to describe the state of the art on cracked-rotor problems, the current work presents the comprehensive theoretical, numerical and experimental approach adopted by EDF for crack detection in power plant rotating machinery. The work mainly focuses on the theoretical cracked beam model developed in past years by S. Andrieux and C. Varé, and associates both numerical and experimental aspects related to the crack detection problem in either turboset or turbo pump units. The theoretical part consists of the derivation of a lumped cracked beam model from the three-dimensional formulation of the general problem of elasticity with unilateral contact conditions on the crack lips, valid for any shape and number of cracks in the beam section and extended to cracks not located in a cross-section. This leads to the assessment of the cracked beam rigidity as a function of the rotation angle, in the case of pure bending load or bending plus shear load, so that the function can be implemented in a 1D rotordynamics code. An extension of the cracked beam model taking into account torsion behaviour is also proposed. It is based on the assumption of full adherence between crack lips when the crack closes, and on an incremental formulation of the deformation energy. An experimental validation has been carried out using different cracked samples, in both static and dynamic configurations, considering one or three elliptic cracks in the same cross-section as well as helix-shaped cracks. Concerning the static configuration, good agreement between numerical and experimental results is found, with a maximal discrepancy in beam deflection of about 1%. Concerning the dynamic analysis, the main well-known indicator, the 2× rev. bending vibration component at half the critical speed, is approximated to within 18% near the crack position. Our experiments also allowed observation of the bending and torsion resonance frequency shifts determined by the extra

  6. A microstructure sensitive study of rolling contact fatigue in bearing steels: A numerical and experimental approach

    Pandkar, Anup Surendra

    Bearings are an integral part of machine components that transmit rotary power, such as cars, motors, and engines. Safe bearing operation is essential to avoid serious failures and accidents, which necessitates timely bearing replacement. This calls for accurate bearing life prediction methods. Based on the Lundberg-Palmgren (LP) model, current life models consistently underpredict bearing lives. Improvement in life prediction requires understanding of the bearing failure mechanism, i.e., rolling contact fatigue (RCF). The goal of this research is to develop the mechanistic framework required for an improved bearing life prediction model. Such a model should account for metal plasticity, the influence of microstructural features, and the cyclically evolving stress-strain fields induced during RCF. To achieve this, an elastic-plastic finite element (FE) study is undertaken to investigate the response of M50-NiL bearing steel during RCF. Specifically, a microstructure-sensitive study of the influence of non-metallic inclusions on the RCF response of bearings is presented. The M50-NiL microstructure contains carbides which are orders of magnitude smaller than the bearing dimensions. To account for this size difference, a multi-scale FE modeling approach is employed. The FE results reveal that hard carbide particles act as local stress risers, alter the surrounding stress-strain fields, and cause micro-scale yielding of the steel matrix. Moreover, they introduce a shear stress cycle with non-zero mean stress, which promotes micro-plastic strain accumulation via a ratcheting mechanism. Localized ratcheting is primarily responsible for cyclic hardening within the RCF-affected region. Such evolution of subsurface hardness can be used to quantify RCF-induced damage. To investigate this further, the cyclic hardening response of the RCF-affected region is simulated. The results show good agreement with the experimental observations.
The cyclic stress-strain fields obtained from these simulations and the knowledge of

  7. Radio frequency broadband amplifiers performance optimization: an experimental approach

    Vanessa Borsato de Souza Lima


    This paper evaluates the behavior of radio frequency (RF) power amplifiers for telecommunications applications. An experimental strategy was employed, resulting in the optimization of a number of factors responsible for increasing the overall efficiency and linearity of the amplifiers during the production process, thereby reducing total intermodulation and interference in adjacent channels. These results enabled the manufacture of high-efficiency broadband amplifiers, ensuring increased productivity and reliability.

  8. Microelectronics Reliability


    This report was cleared for public release. It presents testing for reliability prediction of devices exhibiting multiple failure mechanisms; also presented was an integrated accelerating and measuring ...

  9. Efficient Discovery of Novel Multicomponent Mixtures for Hydrogen Storage: A Combined Computational/Experimental Approach

    Wolverton, Christopher [Northwestern Univ., Evanston, IL (United States). Dept. of Materials Science and Engineering]; Ozolins, Vidvuds [Univ. of California, Los Angeles, CA (United States). Dept. of Materials Science and Engineering]; Kung, Harold H. [Northwestern Univ., Evanston, IL (United States). Dept. of Chemical and Biological Engineering]; Yang, Jun [Ford Scientific Research Lab., Dearborn, MI (United States)]; Hwang, Sonjong [California Inst. of Technology (CalTech), Pasadena, CA (United States). Dept. of Chemistry and Chemical Engineering]; Shore, Sheldon [The Ohio State Univ., Columbus, OH (United States). Dept. of Chemistry and Biochemistry]


    The objective of the proposed program is to discover novel mixed hydrides for hydrogen storage, which enable the DOE 2010 system-level goals. Our goal is to find a material that desorbs 8.5 wt.% H2 or more at temperatures below 85°C. The research program will combine first-principles calculations of reaction thermodynamics and kinetics with material and catalyst synthesis, testing, and characterization. We will combine materials from distinct categories (e.g., chemical and complex hydrides) to form novel multicomponent reactions. Systems to be studied include mixtures of complex hydrides and chemical hydrides [e.g. LiNH2+NH3BH3] and nitrogen-hydrogen based borohydrides [e.g. Al(BH4)3(NH3)3]. The 2010 and 2015 FreedomCAR/DOE targets for hydrogen storage systems are very challenging, and cannot be met with existing materials. The vast majority of the work to date has delineated materials into various classes, e.g., complex and metal hydrides, chemical hydrides, and sorbents. However, very recent studies indicate that mixtures of storage materials, particularly mixtures between various classes, hold promise to achieve technological attributes that materials within an individual class cannot reach. Our project involves a systematic, rational approach to designing novel multicomponent mixtures of materials with fast hydrogenation/dehydrogenation kinetics and favorable thermodynamics using a combination of state-of-the-art scientific computing and experimentation. We will use the accurate predictive power of first-principles modeling to understand the thermodynamic and microscopic kinetic processes involved in hydrogen release and uptake and to design new material/catalyst systems with improved properties. Detailed characterization and atomic-scale catalysis experiments will elucidate the effect of dopants and nanoscale catalysts in achieving fast kinetics and reversibility. And

  10. Development and Comparison of Techniques for Generating Permeability Maps using Independent Experimental Approaches

    Hingerl, Ferdinand; Romanenko, Konstantin; Pini, Ronny; Balcom, Bruce; Benson, Sally


    We have developed and evaluated methods for creating voxel-based 3D permeability maps of a heterogeneous sandstone sample using independent experimental data from single-phase flow (magnetic resonance imaging, MRI) and two-phase flow (X-ray computed tomography, CT) measurements. Fluid velocities computed from the generated permeability maps using computational fluid dynamics simulations fit the measured velocities very well and significantly outperform empirical porosity-permeability relations such as the Kozeny-Carman equation. Acquiring meso-scale images of porous rocks using MRI had until recently been a great challenge, due to short spin relaxation times and large field gradients within the sample. The combination of the 13-interval Alternating-Pulsed-Gradient Stimulated-Echo (APGSTE) scheme with three-dimensional Single Point Ramped Imaging with T1 Enhancement (SPRITE) - a technique recently developed at the UNB MRI Center - can overcome these challenges and enables quantitative three-dimensional maps of porosities and fluid velocities. Using porosity and (single-phase) velocity maps from MRI and (multi-phase) saturation maps from CT measurements, we employed three different techniques to obtain permeability maps. In the first approach, we applied the Kozeny-Carman relationship to porosities measured using MRI. In the second approach, we computed permeabilities using a J-Leverett scaling method, which is based on saturation maps obtained from N2-H2O multi-phase experiments. The third set of permeabilities was generated using a new inverse iterative-updating technique, which is based on porosities and velocities measured in single-phase flow experiments. The resulting three permeability maps then provided input for computational fluid dynamics simulations - employing the Stanford CFD code AD-GPRS - to generate velocity maps, which were compared to velocity maps measured by MRI.
    The J-Leverett scaling method and the iterative-updating method
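The first approach above, a voxel-wise Kozeny-Carman estimate from a porosity map, can be sketched as follows. The grain diameter, the 1/180 prefactor, and the toy porosity field are illustrative assumptions, not the study's calibrated values.

```python
# Minimal sketch: voxel-wise Kozeny-Carman permeability from a porosity
# map (standing in for the MRI data); parameters are illustrative.
import numpy as np

def kozeny_carman(porosity, d_grain=2.0e-4):
    """Permeability (m^2) from porosity via Kozeny-Carman; d_grain in m."""
    phi = np.asarray(porosity, dtype=float)
    return (d_grain ** 2 / 180.0) * phi ** 3 / (1.0 - phi) ** 2

# Toy 3-D porosity map standing in for the MRI voxel data.
rng = np.random.default_rng(0)
phi_map = rng.uniform(0.15, 0.25, size=(4, 4, 4))
k_map = kozeny_carman(phi_map)
```

Because permeability grows as phi^3/(1-phi)^2, even modest voxel-to-voxel porosity contrasts translate into strong permeability heterogeneity, which is why the abstract notes that such empirical relations can be outperformed by velocity-constrained methods.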

  11. Stochastic Approach for Modeling of DNAPL Migration in Heterogeneous Aquifers: Model Development and Experimental Data Generation

    Dean, D. W.; Illangasekare, T. H.; Turner, A.; Russell, T. F.


    Modeling of the complex behavior of DNAPLs in naturally heterogeneous subsurface formations poses many challenges. Even though considerable progress has been made in developing improved numerical schemes to solve the governing partial differential equations, most of these methods still rely on a deterministic description of the processes. This research explores the use of stochastic differential equations to model multiphase flow in heterogeneous aquifers, specifically the flow of DNAPLs in saturated soils. The models developed are evaluated using experimental data generated in two-dimensional test systems. A fundamental assumption used in the model formulation is that the movement of a fluid particle in each phase is described by a stochastic process and that the positions of all fluid particles over time are governed by a specific law. It is this law that we seek to determine. The approach results in a nonlinear stochastic differential equation describing the position of a non-wetting-phase fluid particle. The nonlinearity in the stochastic differential equation arises because both the drift and diffusion coefficients depend on the volumetric fraction of the phase, which in turn depends on the positions of the fluid particles in the problem domain. The concept of a fluid particle is central to the development of the proposed model, and expressions for both saturation and volumetric fraction are developed using it. Darcy's law and the continuity equation are used to derive a Fokker-Planck equation governing flow. The Ito calculus is then applied to derive a stochastic differential equation (SDE) for the non-wetting phase. This SDE has both drift and diffusion terms which depend on the volumetric fraction of the non-wetting phase. Standard stochastic theories based on the Ito calculus, the Wiener process, and the equivalent Fokker-Planck PDEs are typically used to model diffusion processes. However, these models, in their usual form
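An SDE of the kind described, with state-dependent drift and diffusion, is typically simulated with the Euler-Maruyama scheme. The sketch below is generic: the drift and diffusion functions are illustrative placeholders, not the volumetric-fraction-dependent coefficients derived in the work.

```python
# Generic Euler-Maruyama integrator for dX = a(X) dt + b(X) dW, in the
# spirit of the particle-position SDE described above; the coefficient
# functions here are illustrative placeholders.
import numpy as np

def euler_maruyama(a, b, x0, t_end, n_steps, n_paths, seed=0):
    """Simulate n_paths trajectories of dX = a(X) dt + b(X) dW."""
    rng = np.random.default_rng(seed)
    dt = t_end / n_steps
    x = np.full(n_paths, float(x0))
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)  # Wiener increments
        x = x + a(x) * dt + b(x) * dw
    return x

# Placeholder state-dependent (nonlinear) coefficients.
drift = lambda x: -0.5 * x                       # relaxation toward zero
diffusion = lambda x: 0.1 * np.sqrt(1.0 + x**2)  # state-dependent noise

x_final = euler_maruyama(drift, diffusion, x0=1.0, t_end=2.0,
                         n_steps=1000, n_paths=5000)
```

A histogram of `x_final` approximates the solution of the corresponding Fokker-Planck equation at t = 2, which is how such particle ensembles connect back to saturation or volumetric-fraction fields.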

  12. Steam-driven explosions at Solfatara volcano, Campi Flegrei: new insights from an experimental approach

    Montanaro, Cristian; Scheu, Bettina; Mayer, Klaus; Orsi, Giovanni; Moretti, Roberto; Dingwell, Donald B.


    The Solfatara crater is a highly active hydrothermal site located in the central part of the Campi Flegrei Caldera (south-central Italy). Campi Flegrei is one of the most active calderas in the world, characterized by intense unrest episodes involving massive ground deformation, high seismicity and continuous gas emissions from the Solfatara crater. These episodes are thought to be driven by the complex interaction between a deep magmatic source and a shallow hydrothermal system [Orsi et al., 1999]. The most recent unrest episode started in 2006, exhibiting an increase in the degassing activity, especially in the Pisciarelli field (SE of Solfatara crater). In such an active magmato-hydrothermal system, steam-driven explosive eruptions (phreatic or hydrothermal) are a likely potential hazard - one that is difficult to predict in terms of timing and magnitude. Here we present an experimental approach based on rapid decompression experiments to investigate the different scenarios likely for steam explosions in the Solfatara area. The experimental setup produces fragmentation precipitated by the release of argon gas overpressure and assisted by water-to-steam flashing within the connected pore space of the tested samples. We have investigated varying P-T conditions and varying gas-to-liquid ratios. The experimental conditions used in this case study mimic those of a mixing zone present at the base of the hydrothermal system below Solfatara, at a depth between 1000 and 1500 m (15-25 MPa) and temperatures from 270°C to 300°C [Caliro et al., 2007]. Neapolitan Yellow Tuff is used as sample material for the study, as it is the stratigraphic unit expected at this depth in this region [Orsi et al. 1996]. Sensors monitor temperature and pressure evolution during the experiments, enabling the determination of the speed of fragmentation. A high-speed camera (10000 fps) is used to measure the ejection velocities of the gas-particle mixtures.
The fragments generated are recovered and

  13. Percutaneous transcholecystic approach for biliary stent placement: an experimental study in dogs

    Seo, Tae Seok [Medical School of Gachon, Inchon (Korea, Republic of); Song, Ho Young; Lim, Jin Oh; Ko, Gi Young; Sung, Kyu Bo; Kim, Tae Hyung; Lee, Ho Jung [College of Medicine, Ulsan Univ., Seoul (Korea, Republic of)


    To determine, in an experimental study of biliary stent placement, the usefulness and safety of the percutaneous transcholecystic approach and the patency of a newly designed biliary stent. A stent made of 0.15-mm-thick nitinol wire, 10 mm in diameter and 2 cm in length, was loaded in an introducer with an 8-F outer diameter. The gallbladders of seven mongrel dogs were punctured with a 16-G angiocath needle under sonographic guidance, and cholangiography was performed. After anchoring the anterior wall of the gallbladder to the abdominal wall using a T-fastener, the gallbladder body was punctured again under fluoroscopic guidance. The cystic and common bile ducts were selected using a 0.035-inch guide wire and a cobra catheter, and the stent was placed in the common bile duct. Post-stenting cholangiography was undertaken, and an 8.5-F drainage tube was inserted in the gallbladder. Two dogs each were followed up and sacrificed at 2, 4 and 8 weeks after stent placement, respectively; the seventh expired 2 days after stent placement. Follow-up cholangiograms were obtained before each animal was sacrificed, and a pathologic examination was performed. Stent placement was technically successful in all cases. One dog expired 2 days after placement because of bile peritonitis due to migration of the drainage tube into the peritoneal cavity, but the other six remained healthy during the follow-up period. Cholangiography performed before the sacrifice of each dog showed that the stents were patent. Pathologic examination revealed the proliferation of granulation tissue at 2 weeks, and complete endothelialization of the stents by granulation tissue at 8 weeks. Percutaneous transcholecystic biliary stent placement appears to be safe, easy and useful. After placement, the stents remained patent during the follow-up period.

  14. MYRRHA, a Pb-Bi experimental ADS: specific approach to radiation protection aspects.

    Abderrahim, H Aït; Aoust, Th; Malambu, E; Sobolev, V; Van Tichelen, K; De Bruyn, D; Maes, D; Haeck, W; Van den Eynde, G


    Since 1998, SCK·CEN, in partnership with IBA s.a. and many European research laboratories, has been designing a multipurpose accelerator-driven system (ADS) for Research and Development (R&D) applications-MYRRHA-and has been conducting an associated R&D support programme. MYRRHA is an ADS under development at Mol in Belgium, aiming to serve as a basis for the European experimental ADS and to provide protons and neutrons for various R&D applications. It consists of a proton accelerator delivering a 350 MeV x 5 mA proton beam to a liquid Pb-Bi spallation target that in turn couples to a Pb-Bi-cooled, subcritical fast core. In the first stage, the project focuses mainly on demonstration of the ADS concept, safety research on sub-critical systems, and nuclear waste transmutation studies. In a later stage, the device will also be dedicated to research on structural materials, nuclear fuel, liquid metal technology and associated aspects, and on sub-critical reactor physics. Subsequently, it will be used for research on applications such as radioisotope production. A first preliminary conceptual design file of MYRRHA was completed by the end of 2001 and has been reviewed by an International Technical Guidance Committee, which concluded that there are no show-stoppers in the project, even though some topics, such as the safety studies and fuel qualification, need to be addressed more deeply. In this paper, we report on the state of the art of the MYRRHA project at the beginning of 2004, in particular on the radiation shielding assessment and on the particular radiation protection aspects of a remote handling approach intended to minimise personnel exposure to radiation.

  15. An experimental approach for science laboratories in the context of the Science of Primary Education degree course

    Giacomo Bozzo


    International literature in science education has shown the importance of introducing scientific studies in primary school, in order to give pupils the competences and skills necessary for their lives. Consequently, prospective primary teachers need to improve their scientific knowledge and to plan new experimental activities for primary school students. In this context, we have planned an educational learning path for prospective primary teachers, focused on specific conceptual nodes of kinematics. The proposed activities are based on an empirical approach, avoiding in the first step any formal introduction of the observed phenomena, which could be difficult to understand, especially for students of the Science of Primary Education degree course. In this context, educational technologies have given fundamental support, since they have offered prospective teachers the possibility of focusing their attention only on the physics concepts and principles involved. [Italian abstract, translated: Recent studies have highlighted the need to introduce the scientific disciplines from primary school onwards, to enable future generations to live critically and consciously in the real world. Prospective teachers therefore need to deepen their knowledge of the sciences and to design sound laboratory activities for primary school pupils. Our training activity for students of Science of Primary Education (SFP) fits into this context, aimed at addressing some well-known difficulties in learning physics. The proposed teaching activity is based on direct observation of physical phenomena, avoiding, in the initial phase, any kind of formal introduction that could be difficult for SFP students to understand. The support of educational technologies has]

  16. Trust, reciprocity and collective action to fight antibiotic resistance. An experimental approach.

    Rönnerstrand, Björn; Andersson Sundell, Karolina


    Antibiotic resistance is a collective action dilemma. Individuals may request antibiotics, but an overall reduction in use is necessary to limit resistance. A recurring theoretical claim is that social capital increases cooperation in social dilemmas. The aim of this paper is to investigate the link between generalized trust and reciprocity and the willingness to postpone antibiotic treatment in order to limit overuse, in a scenario-based study. A between-subject experimental approach with hypothetical scenarios was utilized. Participants were asked to imagine that they were seeing a doctor for a respiratory infection. The doctor prescribes antibiotics but advises postponing therapy to see if the disease resolves by itself, for the sake of limiting overuse. Respondents were asked how long they could accept postponing antibiotic treatment, from 0 to 7 days. The number of days that most people would be able to accept postponing treatment was the between-subject factor. In total, the study sample included 981 respondents with a mean age of 51 years. A majority of respondents were men (65.7%). The mean number of days that the respondents stated they were willing to postpone antibiotic treatment was positively associated with the number of days the respondents were told that most people were willing to postpone antibiotic treatment, p < 0.001. A positive association was also found between willingness to postpone antibiotic treatment and generalized trust, p = 0.001. In conclusion, the results showed that the proclaimed public willingness to postpone therapy influenced a respondent's willingness to postpone antibiotic therapy in the different scenarios. Also, generalized trust was positively associated with the willingness to postpone therapy.

  17. Bedform genesis and evolution in bedrock substrates: a new experimental approach

    Parsons, D. R.; Yin, N.; Peakall, J.


    Most previous studies on the genesis and evolution of bedforms have focused on aggradational bedforms within cohesionless sediments, with very few investigations concerning either erosive bedform genesis and evolution or bedrock channel abrasion processes. The study presented here details experiments on the genesis and formation of erosional bedform features within natural (soft clay) cohesive sediment beds and analogue bedrock substrates of modelling clay, under the effect of both open-channel plain water flows and sediment-laden flows. A new approach, developed without using plaster-of-Paris or real bedrock, provides a feasible method for simulating the genesis and evolution of erosional bedforms in cohesive sediment beds, and of sculpted forms in bedrock channels, on relatively short time-scales in the laboratory by using a realistic substrate substitute. A series of flume experiments is presented herein in which the undrained shear strength of two different kinds of substrate material is systematically varied under constant flow conditions. Experiments using plain water flow indicated that erosive bedforms in cohesive sediment substrates cannot be produced under the effect of sediment-free flow alone. Particulate-laden flows do form erosional bedforms in both kinds of clay beds, and the shear strength of the bed material plays a key role in determining the diversity of erosional features forming on such substrates. Optimisation of the modelling-clay beds has enabled us to successfully replicate a suite of bedrock bedforms, including potholes, flutes and longitudinal furrows, that have clear equivalents to those observed in bedrock rivers, and has contributed, for the first time, to investigating their genesis and evolution and to exploring the flow structures within and above them in an experimental analogue bedrock substrate.

  18. Sensory perception in cetaceans: Part II – Promising experimental approaches to study chemoreception in dolphins

    Dorothee Kremers


    Chemosensory perception in cetaceans remains an intriguing issue, as morphological, neuroanatomical and genetic studies draw unclear conclusions, while behavioral data suggest that dolphins may use it for food selection or socio-sexual interactions. Experimental approaches have been scarce due to the practical difficulties of testing chemoreception in wild dolphins. Go/no-go tasks are one elegant way to investigate discrimination abilities; however, they require training the animals, thus preventing spontaneous responses and hence the expression of preferences. Here, we aimed to test potential spontaneous responses to chemical stimuli and developed novel procedures. First, we conducted a study to test whether captive dolphins respond to a biologically relevant smell. We placed dead fish within an opaque barrel at the border of the pool and counted the number of respirations at proximity as an indicator of investigation. The same dead fish were presented several times during experiments lasting three consecutive days. From the second day on (i.e. when the odor composition changed), dolphins breathed more often close to the fish-smelling barrel than close to the visually identical but empty control barrel. Second, we conducted a study to test whether dolphins are able to discriminate food flavors. Captive dolphins are commonly provided with ice cubes as a source of enrichment. We took this opportunity to provide ice cubes with different flavors and to compare the reactions to these flavors as a measure of discrimination. Hence, we used the latency of return to the ice-cube begging spot as a measure of discrimination from the previous ice-cube flavor. Thus, our method used a non-invasive and easily replicable technique based on the spontaneous begging responses of dolphins toward more or less attractive items bearing biological relevance.
The procedures used enabled us to show that dolphins may discriminate odors and flavors

  19. Open Experimentation on Phenomena of Chemical Reactions via the Learning Company Approach in Early Secondary Chemistry Education

    Beck, Katharina; Witteck, Torsten; Eilks, Ingo


    Presented is a case study on the implementation of open and inquiry-type experimentation in early German secondary chemistry education. The teaching strategy discussed follows the learning company approach. Originally adopted from vocational education, the learning company method is used to redirect lab-oriented classroom practice towards a more…

  20. Uncertainty-accounted calculational-experimental approach for improved conservative evaluations of VVER RPV radiation loading parameters

    Borodkin, P.G.; Borodkin, G.I.; Khrennikov, N.N. [Scientific and Engineering Centre for Nuclear and Radiation Safety SEC NRS, Building 5, Malaya Krasnoselskaya Street, 2/8, 107140 Moscow (Russian Federation)


    An improved, uncertainty-accounted conservative evaluation of vodo-vodyanoi energetichesky reactor (VVER) reactor pressure vessel (RPV) radiation loading parameters is proposed. The approach is based on a calculational-experimental procedure that takes into account the C/E ratio, depending on over- or underestimation, and the uncertainties of both measured and calculated results. An application of the elaborated approach to full-scale ex-vessel neutron dosimetry experiments on Russian VVERs, combined with neutron-transport calculations, is demonstrated in the paper. (authors)
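
    As an illustration only, one generic way to fold a measured C/E ratio and the two uncertainty terms into a conservative estimate might look as follows; the bias-correction and the k-sigma margin shown are hypothetical, not the authors' licensed procedure.

```python
import math

def conservative_estimate(calc, c_over_e, u_calc, u_meas, k=2.0):
    """Bias-correct a calculated value by the observed C/E ratio and
    add a k-sigma margin from the combined relative uncertainty."""
    corrected = calc / c_over_e                 # remove the observed bias
    u_comb = math.sqrt(u_calc**2 + u_meas**2)   # combine relative uncertainties
    return corrected * (1.0 + k * u_comb)       # conservative upper bound

# E.g. a calculated fluence 10% above measurement, with 5%/7% uncertainties
upper = conservative_estimate(1.0e19, 1.10, 0.05, 0.07)
```

    With perfect agreement (C/E = 1) and zero uncertainties, the estimate reduces to the calculated value itself.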

  1. How reliable are gas-phase proton affinity values of small carbanions? A comparison of experimental data with values calculated using Gaussian-3 and CBS compound methods

    Danikiewicz, Witold


    Gas-phase proton affinities (PA) of a series of 25 small, aliphatic carbanions were computed using different Gaussian-3 methods: G3, G3(B3LYP), G3(MP2) and G3(MP2, B3LYP), and Complete Basis Set extrapolation methods: CBS-4M, CBS-Q, CBS-QB3, and CBS-APNO. The results were compared with critically selected experimental data. The analysis of the results shows that, for the majority of the studied molecules, all compound methods (Gaussian-3 and CBS), except for CBS-4M, give comparable results, which differ by no more than ±2 kcal mol-1 from the experimental data. Taking into account the calculation time, the G3(MP2) and G3(MP2, B3LYP) methods offer the best compromise between accuracy and computational cost. As an additional proof, the results obtained by these two methods were compared with values obtained using the CCSD(T) ab initio method with a large basis set. It was also found that some of the published experimental data are erroneous and should be corrected. The results described in this work show that, for the majority of the studied compounds, PA values calculated using compound methods can be used with the same or even higher confidence than the experimental ones, because even the largest differences between the Gaussian-3 and CBS methods listed above are still comparable with the accuracy of typical PA measurements.
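
    The ±2 kcal mol-1 comparison described above amounts to computing per-species deviations between calculated and experimental PA values; a minimal sketch with made-up numbers (not the paper's data):

```python
def mean_abs_dev(calc, expt):
    """Mean absolute deviation (kcal/mol) between calculated and
    experimental proton affinities, paired by species."""
    assert len(calc) == len(expt)
    return sum(abs(c - e) for c, e in zip(calc, expt)) / len(calc)

# Hypothetical PA values (kcal/mol) for three carbanions
g3mp2 = [416.8, 407.2, 390.5]
exptl = [416.0, 409.0, 391.0]
mad = mean_abs_dev(g3mp2, exptl)
within_2 = all(abs(c - e) <= 2.0 for c, e in zip(g3mp2, exptl))
```

    The `within_2` flag mirrors the abstract's acceptance criterion of deviations no larger than 2 kcal mol-1.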

  2. Biological motivations in the neurodynamic structure of psychopathological states (experimental approach).

    Kotov, Aleksandar V


    Some trigger mechanisms of pathological and biological motivations were investigated in experimental models of animal behavior (feeding, drinking, stereotypic acts, experimental alcoholism, and so on), as was the high level of brain-specific molecular synthesis responsible for the development of pathological motivations. Neurophysiological processes transforming some biological drives into pathological motivations are described and discussed.

  3. Water adsorption on charcoal: New approach in experimental studies and data representation

    Geynisman, M.; Walker, R.


    An experimental apparatus was built to study H{sub 2}O adsorption on charcoal at very low concentrations and to collect the data in the form of isosteres. The experimental method is discussed, and a global three-dimensional fit is constructed to predict the post-regeneration conditions of charcoal adsorbers. 11 refs.
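
    Adsorption isosteres are conventionally represented as straight lines in ln P versus 1/T coordinates (a Clausius-Clapeyron form); the least-squares sketch below, with synthetic data rather than the report's measurements, shows one way such a representation can be fitted per loading.

```python
import math

def fit_isostere(T, P):
    """Fit ln P = a - b / T by ordinary least squares.
    Returns (a, b); b is related to the isosteric heat of adsorption."""
    x = [1.0 / t for t in T]
    y = [math.log(p) for p in P]
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
            sum((xi - xbar) ** 2 for xi in x)
    a = ybar - slope * xbar
    return a, -slope  # b = -slope, so that ln P = a - b / T

# Synthetic isostere generated exactly from ln P = 10 - 2000 / T
T = [80.0, 90.0, 100.0, 110.0]
P = [math.exp(10 - 2000 / t) for t in T]
a, b = fit_isostere(T, P)
```

    A global three-dimensional fit, as in the report, would additionally make `a` and `b` smooth functions of the adsorbed amount.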

  4. Experimental Investigation Related To Some Predicted Results Of Reliable High Frequency Radio Communication Links Between Benghazi-Libya And Cairo-Egypt.

    Mohamed Yousef Ahmed Abou-Hussein


    In this study, the Central Radio Propagation Laboratory (CRPL) method of ionospheric prediction of the National Bureau of Standards (NBS) in the U.S.A. was used in practical calculations of the optimal working frequencies for reliable high-frequency (HF) radio communication links between Benghazi, Libya and Cairo, Egypt. The results were drawn in the form of curves by computer. The computer was also used to measure the received signal-level variation of the frequencies 11.980 MHz and 11.785 MHz, which were transmitted with powers of 250 kW and 100 kW, respectively, from the Egyptian broadcasting station in Cairo city, directed to the North Africa and South Europe regions. The measurements were taken during daytime for the winter (December, January and February) and summer (June, July and August) seasons.

  5. An experimental approach to constrain steam explosions at Solfatara volcano, Campi Flegrei

    Montanaro, C.; Scheu, B.; Mayer, K.; Orsi, G.; Dingwell, D. B.


    The Solfatara crater is a highly active hydrothermal site of the Campi Flegrei caldera, one of the most dangerous volcanic areas in the world, extending to the west of the city of Naples. Currently Campi Flegrei is in a phase of volcanic unrest. Since 2006 an increase of the degassing activity, especially in the Pisciarelli field (SE of Solfatara crater), has been recorded. In addition, ground uplift at increasing rates (about 1 cm/month), with relatively low seismic activity, is observed in the nearby area of Pozzuoli [Osservatorio Vesuviano bulletins]. This ongoing phase of unrest, like most of the recent unrest episodes, is thought to be driven by the complex interaction between a deep magmatic source and the shallow hydrothermal system [Orsi et al., 1999]. In such an active hydrothermal and magmatic site, steam-driven explosive eruptions (phreatic or hydrothermal) are likely to occur, representing a potential hazard, especially as they are difficult to predict in terms of timing and magnitude. Here we present an experimental approach based on rapid decompression experiments to investigate different scenarios likely for steam explosions in the Solfatara area. The setup produces fragmentation from a combination of argon gas overpressure and steam flashing within the connected pore space of the tested samples, at varying P-T conditions and varying gas-to-liquid ratios. The experimental conditions used in this case study mimic those of a mixing zone present at the base of the hydrothermal system below Solfatara; here, at a depth between 1000 and 1500 m (15-25 MPa), the temperature conditions range between 270°C and 375°C [Caliro et al., 2007]. Neapolitan Yellow Tuff is used as sample material for the study, as it is the stratigraphic unit of the expected source region for steam explosions [Orsi et al. 1996]. Sensors monitor temperature and pressure evolution during the experiments, allowing the speed of fragmentation of the samples to be determined. A high-speed camera

  6. Spherical mechanism analysis of a surgical robot for minimally invasive surgery -- analytical and experimental approaches.

    Rosen, Jacob; Lum, Mitch; Trimble, Denny; Hannaford, Blake; Sinanan, Mika


    Recent advances in technology have led to the fusion of MIS techniques and robotic devices. However, current systems are large and cumbersome. Optimizing the surgical robot mechanism will eventually lead to its integration into the operating room (OR) of the future, becoming the extended presence of the surgeon and nurses in a room occupied by the patient alone. By optimizing a spherical mechanism using data collected in vivo during MIS procedures, this study takes a bottom-up approach to developing a new class of surgical robotic arms, maximizing their performance while minimizing their size. The spherical mechanism is a rotational manipulator with all axes intersecting at the center of the sphere. Locating the rotation center of the mechanism at the MIS port makes this class of mechanism a suitable candidate for the first two links of a surgical robot for MIS. The required dexterous workspace (DWS) is defined as the region in which 95% of the tool motions are contained, based on in-vivo measurements. The extended dexterous workspace (EDWS) is defined as the entire abdominal cavity reachable by MIS instruments. The DWS is defined by a right circular cone with a vertex angle of 60 degrees, and the EDWS by a cone with an elliptical cross section created by two orthogonal vertex angles of 60 and 90 degrees. A compound function based on the mechanism's isotropy and stiffness was used as the performance metric cost function. Optimization across both the DWS and the EDWS led to a serial mechanism configuration with link angles of 74 degrees and 60 degrees. This configuration maximized the kinematic performance in the DWS while keeping the EDWS as its reachable workspace. Surgeons, using a mockup of two mechanisms in a MIS setup, validated these results experimentally.
From these experiments the serial configuration was deemed most applicable for MIS robotic applications compared
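
    The containment test implied by a right circular cone with a 60-degree vertex angle can be written down directly; the cone axis chosen below is an arbitrary assumption for illustration, not the mechanism's actual port axis.

```python
import math

def in_cone(v, axis=(0.0, 0.0, 1.0), vertex_angle_deg=60.0):
    """Return True if direction v lies inside a right circular cone
    about `axis` whose *full* vertex angle is vertex_angle_deg."""
    dot = sum(a * b for a, b in zip(v, axis))
    nv = math.sqrt(sum(c * c for c in v))
    na = math.sqrt(sum(c * c for c in axis))
    angle = math.degrees(math.acos(dot / (nv * na)))
    return angle <= vertex_angle_deg / 2.0  # half-angle from the axis

inside = in_cone((0.1, 0.0, 1.0))   # about 5.7 degrees off-axis
outside = in_cone((1.0, 0.0, 1.0))  # 45 degrees off-axis
```

    A workspace-coverage metric like the 95% criterion above can then be computed by applying this test to each recorded tool direction.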

  7. Experimental approach and modelling of the mechanical behaviour of graphite fuel elements subjected to compression pulses

    Forquin, P.


    -graphite and the corresponding standard deviation. Because the behaviour is nonlinear before failure, a numerical simulation was conducted to build the relation between the applied load and the maximum tensile stress. A statistical approach applied to the experimental data allows the mean tensile strength (about 2.5 MPa) and the scatter of failure stresses (Weibull modulus m = 12) to be deduced.
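
    The quoted Weibull statistics (mean strength about 2.5 MPa, modulus m = 12) can be recovered from a set of failure stresses by regressing ln(-ln(1 - F)) on ln(stress); the sketch below uses synthetic data drawn from a Weibull law, not the paper's measurements.

```python
import math
import random

def weibull_modulus(stresses):
    """Estimate the Weibull modulus m by linear regression of
    ln(-ln(1 - F_i)) against ln(sigma_i), using median-rank
    plotting positions F_i = (i - 0.3) / (n + 0.4)."""
    s = sorted(stresses)
    n = len(s)
    x = [math.log(v) for v in s]
    y = [math.log(-math.log(1.0 - (i + 1 - 0.3) / (n + 0.4)))
         for i in range(n)]
    xbar = sum(x) / n
    ybar = sum(y) / n
    return sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
           sum((xi - xbar) ** 2 for xi in x)

# Synthetic sample drawn from a Weibull law with m = 12, scale 2.6 MPa
rng = random.Random(1)
sample = [2.6 * (-math.log(1.0 - rng.random())) ** (1.0 / 12.0)
          for _ in range(200)]
m_est = weibull_modulus(sample)
```

    The regression slope equals the modulus because, for a Weibull law, ln(-ln(1 - F)) is linear in ln(stress) with slope m.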


  9. Experimental Systems-Biology Approaches for Clostridia-Based Bioenergy Production

    Papoutsakis, Elefterios [Univ. of Delaware, Newark, DE (United States)


    This is the final project report for the project "Experimental Systems-Biology Approaches for Clostridia-Based Bioenergy Production" for the funding period of 9/1/12 to 2/28/2015 (three years with a 6-month no-cost extension). OVERVIEW AND PROJECT GOALS: The bottleneck of achieving higher rates and titers of toxic metabolites (such as solvents and carboxylic acids that can be used as biofuels or biofuel precursors) can be overcome by engineering the stress-response system. Thus, understanding and modeling the response of cells to toxic metabolites is a problem of great fundamental and practical significance. In this project, our goal is to dissect at the molecular-systems level, and to build models (conceptual and quantitative) for, the stress response of C. acetobutylicum (Cac) to its two toxic metabolites: butanol (BuOH) and butyrate (BA). Transcriptional (RNAseq- and microarray-based), proteomic and fluxomic data and their analysis are key requirements for this goal. Transcriptional data from mid-exponential cultures of Cac under 4 different levels of BuOH and BA stress were obtained using both microarrays (Papoutsakis group) and deep sequencing (RNAseq; Meyers and Papoutsakis groups). These two sets of data not only serve to validate each other, but are also used for identification of stress-induced changes in transcript levels, small regulatory RNAs, and transcriptional start sites. Quantitative proteomic data (Lee group), collected using the iTRAQ technology, are essential for understanding protein levels and turnover under stress and the various protein-protein interactions that orchestrate the stress response. Metabolic flux changes (Antoniewicz group) of core pathways, which provide important information on the re-allocation of energy and carbon resources under metabolite stress, were examined using 13C-labelled chemicals. Omics data are integrated at different levels and scales.
At the metabolic-pathway level, omics data are integrated into a 2nd generation genome

  10. Soil pH controls the environmental availability of phosphorus: Experimental and mechanistic modelling approaches

    Devau, Nicolas [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Cadre, Edith Le [Supagro, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Hinsinger, Philippe; Jaillard, Benoit [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France); Gerard, Frederic, E-mail: [INRA, UMR 1222 Eco and Sols - Ecologie Fonctionnelle et Biogeochimie des Sols (INRA-IRD-SupAgro), Place Viala, F-34060 Montpellier (France)


    Inorganic P is the least mobile major nutrient in most soils and is frequently the prime limiting factor for plant growth in terrestrial ecosystems. In this study, the extraction of soil inorganic P with CaCl{sub 2} (P-CaCl{sub 2}) and geochemical modelling were combined in order to unravel the processes controlling the environmentally available P (EAP) of a soil over a range of pH values (pH {approx} 4-10). Mechanistic descriptions of the adsorption of cations and anions by the soil constituents were used (1-pK Triple Plane, ion-exchange and NICA-Donnan models). These models are implemented in the geochemical code Visual MINTEQ. An additive approach was used for their application to the surface horizon of a Cambisol. The geochemical code accurately reproduced the concentration of extracted P at the different soil pH values (R{sup 2} = 0.9, RMSE = 0.03 mg kg{sup -1}). Model parameters were either found directly in the literature or estimated by fitting published experimental results in single-mineral systems. The strong agreement between measurements and modelling results demonstrated that adsorption processes exerted a major control on the EAP of the soil over a large range of pH values. An influence of the precipitation of P-containing minerals is discounted on the basis of thermodynamic calculations. Modelling results indicated that the variations in P-CaCl{sub 2} with soil pH were controlled by the deprotonation/protonation of the surface hydroxyl groups, the distribution of P surface complexes, and the adsorption of Ca and Cl from the electrolyte background. Iron oxides and gibbsite were found to be the major P-adsorbing soil constituents at acidic and alkaline pH, whereas P was mainly adsorbed by clay minerals at intermediate pH values. This study demonstrates the efficacy of geochemical modelling for understanding soil processes, and the applicability of mechanistic adsorption models to a 'real' soil, with its mineralogical complexity and the additional
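
    The goodness-of-fit figures quoted (R{sup 2} = 0.9, RMSE = 0.03 mg kg{sup -1}) follow from the standard definitions; a minimal sketch with hypothetical extracted-P data, not the study's measurements:

```python
import math

def r2_rmse(measured, modelled):
    """Coefficient of determination and root-mean-square error
    between measured and modelled values."""
    n = len(measured)
    mbar = sum(measured) / n
    ss_res = sum((m, p) == () or (m - p) ** 2 for m, p in zip(measured, modelled))
    ss_tot = sum((m - mbar) ** 2 for m in measured)
    return 1.0 - ss_res / ss_tot, math.sqrt(ss_res / n)

# Hypothetical extracted-P values (mg/kg) at several pH points
meas = [0.42, 0.31, 0.18, 0.10, 0.15, 0.27]
pred = [0.40, 0.33, 0.17, 0.12, 0.14, 0.29]
r2, rmse = r2_rmse(meas, pred)
```

    R{sup 2} close to 1 and a small RMSE together indicate that the model captures both the trend and the magnitude of the extracted-P data.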

  11. Identifying and characterising the different structural length scales in liquids and glasses: an experimental approach.

    Salmon, Philip S; Zeidler, Anita


    The structure of several network-forming liquids and glasses is considered, with a focus on the detailed information made available by the method of neutron diffraction with isotope substitution (NDIS). In the case of binary network glass-forming materials with the MX2 stoichiometry (e.g. GeO2, GeSe2, ZnCl2), two different length scales at distances greater than the nearest-neighbour distance manifest themselves as peaks in the measured diffraction patterns. The network properties are influenced by a competition between the ordering on these "intermediate" and "extended" length scales, which can be manipulated by changing the chemical identity of the atomic constituents or by varying state parameters such as the temperature and pressure. The extended-range ordering, which describes the decay of the pair-correlation functions at large r, can be represented by making a pole analysis of the Ornstein-Zernike equations, an approach that can also be used to describe the large-r behaviour of the pair-correlation functions for liquid and amorphous metals where packing constraints are important. The first applications of the NDIS method are then described: measurements of the detailed structure of aerodynamically levitated, laser-heated droplets of "fragile" glass-forming liquid oxides (CaAl2O4 and CaSiO3) at high temperatures (~2000 K), and of the structure of a "strong" network-forming glass (GeO2) under pressures ranging from ambient to ~8 GPa. The high-temperature experiments show structural changes on multiple length scales when the oxides are vitrified. The high-pressure experiment offers insight into the density-driven mechanisms of network collapse in GeO2 glass, and parallels are drawn with the high-pressure behaviour of silica glass.
Finally, the hydrogen-bonded network of water is considered, where the first application of the method of oxygen NDIS is used to measure the structures of light versus heavy water and a difference of approximately equal
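The pole analysis mentioned in this record has a compact asymptotic form. As a hedged sketch using the notation common in this line of work (not taken verbatim from the paper): if the complex poles of the structure factor lying closest to the real axis sit at q = ±a1 + i·a0, the leading large-r contribution to the total pair-correlation function h(r) is an exponentially damped oscillation,

```latex
r\,h(r) \;\sim\; 2\,|A|\, e^{-a_0 r}\, \cos\!\left(a_1 r - \varphi\right), \qquad r \to \infty ,
```

so that 1/a0 sets the decay length of the extended-range ordering and 2*pi/a1 its oscillation period.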

  12. ATLAS reliability analysis

    Bartsch, R.R.


    Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability, which is required to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with probability of failure increasing with the number of shots. For the transmission line insulation, a minimum thickness is obtained, and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.
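The combination step described above can be sketched numerically. This is an illustrative sketch only: it assumes a series system in which every component must survive each shot, with hypothetical Weibull (scale, shape) parameters per component class; the actual ATLAS component data are not reproduced here.

```python
import math

def weibull_survival(n_shots, scale, shape):
    """Probability that a component survives n_shots, with Weibull life (in shots)."""
    return math.exp(-(n_shots / scale) ** shape)

def system_reliability(n_shots, components):
    """Series system: reliability is the product of component survival probabilities."""
    total = 1.0
    for scale, shape in components.values():
        total *= weibull_survival(n_shots, scale, shape)
    return total

# Hypothetical (scale, shape) parameters per component class, in shots:
components = {
    "capacitor_module": (5000.0, 1.5),
    "railgap_switch": (2000.0, 2.0),
    "transmission_line": (8000.0, 1.2),
}

r = system_reliability(100, components)
print(f"System reliability after 100 shots: {r:.3f}")
```

With these illustrative numbers the system stays above the 95% target; the same structure lets a maintenance interval be read off as the shot count at which reliability first drops below threshold.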

  13. Mathematical reliability: an expository perspective

    Mazzuchi, Thomas; Singpurwalla, Nozer


    In this volume consideration was given to more advanced theoretical approaches and novel applications of reliability to ensure that topics having a futuristic impact were specifically included. Topics like finance, forensics, information, and orthopedics, as well as the more traditional reliability topics were purposefully undertaken to make this collection different from the existing books in reliability. The entries have been categorized into seven parts, each emphasizing a theme that seems poised for the future development of reliability as an academic discipline with relevance. The seven parts are networks and systems; recurrent events; information and design; failure rate function and burn-in; software reliability and random environments; reliability in composites and orthopedics, and reliability in finance and forensics. Embedded within the above are some of the other currently active topics such as causality, cascading, exchangeability, expert testimony, hierarchical modeling, optimization and survival...

  14. Thermodynamics of protein-ligand interactions as a reference for computational analysis: how to assess accuracy, reliability and relevance of experimental data.

    Krimmer, Stefan G; Klebe, Gerhard


    For a conscientious interpretation of thermodynamic parameters (Gibbs free energy, enthalpy and entropy) obtained by isothermal titration calorimetry (ITC), it is necessary to first evaluate the experimental setup and conditions at which the data were measured. The data quality must be assessed and the precision and accuracy of the measured parameters must be estimated. This information provides the basis at which level discussion of the data is appropriate, and allows insight into the significance of comparisons with other data. The aim of this article is to provide the reader with basic understanding of the ITC technique and the experimental practices commonly applied, in order to foster an appreciation for how much measured thermodynamic parameters can deviate from ideal, error-free values. Particular attention is paid to the shape of the recorded isotherm (c-value), the influence of the applied buffer used for the reaction (protonation reactions, pH), the chosen experimental settings (temperature), impurities of protein and ligand, sources of systematic errors (solution concentration, solution activity, and device calibration) and to the applied analysis software. Furthermore, we comment on enthalpy-entropy compensation, heat capacities and van't Hoff enthalpies.
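The isotherm shape parameter mentioned above (the c-value, often called the Wiseman parameter) is the product of stoichiometry, association constant and cell concentration. A minimal sketch with hypothetical binding parameters; the commonly quoted usable window is roughly 1 < c < 1000:

```python
def wiseman_c(n_sites, K_a, macromolecule_conc_M):
    """Wiseman c parameter for ITC: c = n * K_a * [M]_cell."""
    return n_sites * K_a * macromolecule_conc_M

# Hypothetical titration: 1:1 binding, Ka = 1e6 M^-1, 50 uM protein in the cell.
c = wiseman_c(1, 1.0e6, 50e-6)
print(c)  # 50.0 -> sigmoidal isotherm, well suited for fitting
```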

  15. Differential Gender Selection on Floral Size: An Experimental Approach Using Cistus salvifolius

    Montserrat Arista; Pedro Luis Ortiz


    ...-specific selection is scarce. 2. We experimentally altered flower size of the hermaphrodite Cistus salvifolius in three sites differing in physical characteristics, and measured the effect of this manipulation on both male and female success. 3...

  16. Consistency from the perspective of an experimental systems approach to the sciences and their epistemic objects

    Hans-Jörg Rheinberger


    It is generally accepted that the development of the modern sciences is rooted in experiment. Yet for a long time experimentation occupied a prominent role neither in the philosophy nor in the history of science. With the 'practical turn' in studying the sciences and their history, this has begun to change. This paper is concerned with systems and cultures of experimentation and the consistencies that are generated within such systems and cultures. The first part of the paper exposes the forms of historical and structural coherence that characterize the experimental exploration of epistemic objects. In the second part, a particular experimental culture in the life sciences is briefly described as an example. A survey is given of what it means and what it takes to analyze biological functions in the test tube.

  17. Statistical Approaches in Analysis of Variance: from Random Arrangements to Latin Square Experimental Design


    Background: The choices of experimental design as well as of statistical analysis are of huge importance in field experiments. These must be made correctly in order to obtain the best possible precision of the results. The random arrangements, randomized blocks and Latin square designs were reviewed and analyzed from the statistical perspective of error analysis. Material and Method: Random arrangements, randomized block and Latin square experimental designs were used as field experiments. ...
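The Latin square layout reviewed above guarantees that each treatment appears exactly once per row and per column. A minimal sketch of the cyclic construction (treatment labels hypothetical; in a real field experiment rows, columns and labels would additionally be randomized):

```python
def latin_square(treatments):
    """Cyclic Latin square: each treatment occurs once in every row and column."""
    n = len(treatments)
    return [[treatments[(r + c) % n] for c in range(n)] for r in range(n)]

square = latin_square(["A", "B", "C", "D"])
for row in square:
    print(" ".join(row))
```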

  19. Ultra reliability at NASA

    Shapiro, Andrew A.


    Ultra reliable systems are critical to NASA, particularly as consideration is being given to extended lunar missions and manned missions to Mars. NASA has formulated a program designed to improve the reliability of NASA systems. The long-term goal of the NASA ultra reliability program is to improve NASA systems by an order of magnitude. The approach outlined in this presentation involves the steps used in developing a strategic plan to achieve the long-term objective of ultra reliability. Consideration is given to: complex systems, hardware (including aircraft, aerospace craft and launch vehicles), software, human interactions, long-life missions, infrastructure development, and cross-cutting technologies. Several NASA-wide workshops have been held, identifying issues for reliability improvement and providing mitigation strategies for these issues. In addition to representation from all of the NASA centers, experts from government (NASA and non-NASA), universities and industry participated. Highlights of a strategic plan, which is being developed using the results from these workshops, will be presented.

  20. Linking soil chemistry, treeline shifts and climate change: scenario modeling using an experimental approach

    Mavris, Christian; Furrer, Gerhard; Anderson, Susanne; Blum, Alex; Wells, Aaron; Dahms, Dennis; Egli, Markus


    Climate change and global warming have a strong influence on landscape development. As cold areas become warmer, both flora and fauna must adapt to the new conditions (a). It is widely accepted that climate change deeply influences treeline shifts. In addition, wildfires, plant diseases and insect infestation (i.e. mountain pine beetle) can promote a selective replacement of plants, inhibiting some and favoring others, thus modifying the ecosystem in diverse ways. Little is known about the behavior of soil chemistry when such changes occur. Will elemental availability become a crucial factor as the climate changes? Sinks Canyon and Stough Basin - on the SE flank of the Wind River Range, Wyoming, USA - offer an ideal case study. Conceptually, the areas were divided into three main subsets: tundra, forest and a subarid environment. All soils were developed on granitoid moraines (b, c). From each subset, a liquid topsoil extract was produced and mixed with the solid subsoil samples in batch reactors at 50 °C. The batch experiments were carried out over 1800 h, and the progress of the dissolution was regularly monitored by analyzing liquid aliquots using IC and ICP-OES. The nutrients were mostly released within the first hours of the experiment. Silicon and Al were continuously released into the solution, while some alkali elements - e.g. Na - showed a more complex trend. Organic acids (acetic, citric) and other ligands produced during biodegradation played an active role in mineral dissolution and nutrient release. The mineral colloids detected in the extract (X-ray diffraction) can significantly control surface reactions (adsorption/desorption) and contributed to specific cationic concentrations. The experimental set-up was then compared to a computed dissolution model using the SerialSTEADYQL software (d, e). Decoding the mechanisms driving mineral weathering is the key to understanding the main geochemical aspects of adaptation during climate…

  1. Breakthrough behavior of granular ferric hydroxide (GFH) fixed-bed adsorption filters: modeling and experimental approaches.

    Sperlich, Alexander; Werner, Arne; Genz, Arne; Amy, Gary; Worch, Eckhard; Jekel, Martin


    Breakthrough curves (BTC) for the adsorption of arsenate and salicylic acid onto granular ferric hydroxide (GFH) in fixed-bed adsorbers were experimentally determined and modeled using the homogeneous surface diffusion model (HSDM). The input parameters for the HSDM, the Freundlich isotherm constants and the mass transfer coefficients for film and surface diffusion, were experimentally determined. The BTC for salicylic acid revealed a shape typical of trace organic compound adsorption onto activated carbon, and model results agreed well with the experimental curves. Unlike salicylic acid, arsenate BTCs showed a non-ideal shape with a leveling off at c/c0 of approximately 0.6. Model results based on the experimentally derived parameters over-predicted the point of arsenic breakthrough for all simulated curves, lab-scale or full-scale, and were unable to capture the shape of the curve. The use of a much lower surface diffusion coefficient D(S) for modeling led to an improved fit of the later stages of the BTC shape, pointing to a time-dependent D(S). The mechanism for this time dependence is still unknown. Surface precipitation was discussed as one possible removal mechanism for arsenate besides pure adsorption, interfering with the determination of the Freundlich constants and D(S). Rapid small-scale column tests (RSSCT) proved to be a powerful experimental alternative to the modeling procedure for arsenic.
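The Freundlich isotherm constants used as HSDM inputs are typically obtained by linearizing batch equilibrium data. A minimal sketch of that fitting step with synthetic, noise-free data (all numbers hypothetical, not the GFH parameters from the study):

```python
import math

def freundlich_q(c, K_F, n):
    """Freundlich isotherm: solid-phase loading q = K_F * c**(1/n)."""
    return K_F * c ** (1.0 / n)

def fit_freundlich(c_data, q_data):
    """Linearize log q = log K_F + (1/n) log c and fit by ordinary least squares."""
    x = [math.log10(c) for c in c_data]
    y = [math.log10(q) for q in q_data]
    m = len(x)
    xbar, ybar = sum(x) / m, sum(y) / m
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
        / sum((xi - xbar) ** 2 for xi in x)
    intercept = ybar - slope * xbar
    return 10 ** intercept, 1.0 / slope  # K_F, n

# Synthetic batch isotherm generated with K_F = 2.0, n = 2.5 (hypothetical units):
c_data = [0.1, 0.5, 1.0, 5.0, 10.0]
q_data = [freundlich_q(c, 2.0, 2.5) for c in c_data]
K_F, n = fit_freundlich(c_data, q_data)
print(K_F, n)  # recovers ~2.0 and ~2.5 on noise-free data
```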

  2. Hydromechanical behavior of quasi-saturated compacted soils on drying-wetting paths: experimental and numerical approaches

    Andriantrehina Soanarivo Rinah


    This paper presents an experimental and numerical investigation funded by the French National Project "Terredurable", which is devoted to the study of soils in the quasi-saturated state. The experimental study is focused on the behavior of compacted soils on drying-wetting paths and the macroscopic effect of the drying path on shrinkage and cracking. Furthermore, a protocol for image analysis of cracks in drying tests was developed. Two approaches are used: the measurement of surface strains and identification of the ultimate stress before the formation of the first crack, using VIC-2D software, and the monitoring of crack evolution, using ImageJ software. The aim of the numerical approach is to reproduce the drying experiments with a finite difference code (FLAC 3D), in order to understand the stress conditions that can explain crack initiation, without modeling the crack formation itself.

  3. Experimental studies of low salinity water flooding in carbonate reservoirs: A new promising approach

    Zahid, Adeel; Shapiro, Alexander; Skauge, Arne


    additional oil recovery can be achieved when successively flooding composite carbonate core plugs with various diluted versions of seawater. The experimental data on carbonates is very limited, so more data and better understanding of the mechanisms involved is needed to utilize this method for carbonate...... reservoirs. In this paper, we have experimentally investigated the oil recovery potential of low salinity water flooding for carbonate rocks. We used both reservoir carbonate and outcrop chalk core plugs. The flooding experiments were carried out initially with the seawater, and afterwards additional oil...... of experimental results, discussions are made about possible mechanisms for improving oil recovery in carbonate reservoir as a function of change in brine salinity. Copyright 2012, Society of Petroleum Engineers....

  4. Experimental and thermodynamic approach on proton exchange membrane fuel cell performance

    Miansari, Me. [Islamic Azad University Ghaemshahr, P.O. Box 163, Ghaemshahr (Iran); Sedighi, K.; Alizadeh, E.; Miansari, Mo. [Department of Mechanical Engineering, Noushirvani University of Technology, P.O. Box 484, Babol (Iran); Amidpour, M. [Department of Mechanical Engineering, K.N. Toosi University, Box 15875-4416, Tehran (Iran)


    The present work comprises two parts. First, the effect of different parameters such as pressure, temperature, and anode and cathode channel depth on the performance of the proton exchange membrane (PEM) fuel cell was studied experimentally. The experimental results show good accuracy compared to other works. Second, a semi-empirical model of the PEM fuel cell has been developed. This model was used to study the effect of different operating conditions such as temperature, pressure and air stoichiometry on the exergy efficiencies and irreversibilities of the cell. The results show that the predicted polarization curves are in good agreement with the experimental data, and a high performance was observed at a channel depth of 1.5 mm for the anode and 1 mm for the cathode. Furthermore, the results show that an increase in the operating temperature and pressure can enhance the cell performance and exergy efficiencies and reduce the irreversibilities of the cell. (author)

  6. Effect of Fast Moving Object on RSSI in WSN: An Experimental Approach

    Ahmed, Syed Hassan; Mehmood, Amjad; Javaid, Nadeem; Iwao, Sasase


    In this paper, we experimentally investigate the effect of a fast moving object on the RSSI in wireless sensor networks in the presence of the ground effect and antenna orientation in the elevation direction. In the experimental setup, a MICAz mote pair was placed on the ground, where one mote acts as a transmitter and the other as a receiver. The transmitter mote's antenna was oriented in the elevation direction with respect to the receiver mote's antenna. The fast moving object, i.e. a car, was passed between the motes and the fluctuations in the RSSI were observed. The experimental results show a sequential pattern in the RSSI fluctuations when the car moves at a relatively slow speed. However, some irregularities were also observed when the antenna was oriented at 45° and 90° in the elevation direction.

  7. Comparing different approaches to visualizing light waves: An experimental study on teaching wave optics

    Mešić, Vanes; Hajder, Erna; Neumann, Knut; Erceg, Nataša


    Research has shown that students have tremendous difficulties developing a qualitative understanding of wave optics, at all educational levels. In this study, we investigate how three different approaches to visualizing light waves affect students' understanding of wave optics. In the first, the conventional, approach light waves are represented by sinusoidal curves. The second teaching approach includes representing light waves by a series of static images, showing the oscillating electric field vectors at characteristic, subsequent instants of time. Within the third approach phasors are used for visualizing light waves. A total of N =85 secondary school students were randomly assigned to one of the three teaching approaches, each of which lasted a period of four class hours. Students who learned with phasors and students who learned from the series of static images outperformed the students learning according to the conventional approach, i.e., they showed a much better understanding of basic wave optics, as measured by a conceptual survey administered to the students one week after the treatment. Our results suggest that visualizing light waves with phasors or oscillating electric field vectors is a promising approach to developing a deeper understanding of wave optics for students enrolled in conceptual level physics courses.

  8. Innovations in power systems reliability

    Santora, Albert H; Vaccaro, Alfredo


    Electrical grids are among the world's most reliable systems, yet they still face a host of issues, from aging infrastructure to questions of resource distribution. Here is a comprehensive and systematic approach to tackling these contemporary challenges.

  9. A Combined Numerical-Experimental Approach to Quantify the Thermal Contraction of A356 During Solidification

    Macht, J. P.; Maijer, D. M.; Phillion, A. B.


    A process for generating thermal contraction coefficients for use in the solidification modeling of aluminum castings is presented. Sequentially coupled thermal-stress modeling is used in conjunction with experimentation to empirically generate the thermal contraction coefficients for a strontium-modified A356 alloy. The impact of cooling curve analysis on the modeling procedure is studied. Model results are in good agreement with experimental findings, indicating a sound methodology for quantifying the thermal contraction. The technique can be applied to other commercially relevant aluminum alloys, increasing the utility of solidification modeling in the casting industry.

  10. A novel experimental approach for the detection of the dynamical Casimir effect

    Braggio, C. [Ferrara Univ. (Italy). Dipt. di Fisica; Bressi, G. [Istituto Nazionale di Fisica Nucleare, Pavia (Italy); Carugno, G.; Del Noce, C. [Istituto Nazionale di Fisica Nucleare, Padova (Italy); Galeazzi, G.; Lombardi, A.; Palmieri, A.; Ruoso, G. [Istituto Nazionale di Fisica Nucleare, LNL, Legnaro (Italy); Zanello, D. [Istituto Nazionale di Fisica Nucleare, Rome (Italy)


    In order to observe the Casimir radiation we propose a new experimental scheme with no mechanically moving mirror. In fact we estimate that the power required for a sustained mechanical vibration would be beyond present experimental possibilities. Our apparatus consists of a superconducting electromagnetic resonant cavity with a wall covered by a semiconductor layer whose reflectivity is driven by a laser at gigahertz frequencies. The semiconductor thus acts as a moving mirror. Preliminary laboratory tests showed that a semiconductor can indeed reflect microwaves as efficiently as a conductor. In this paper we present the complete scheme that we intend to set up for the detection of the Casimir radiation. (authors)

  11. Response of cork compounds subjected to impulsive blast loads. An experimental and numerical approach

    Sousa-Martins, J.; Kakogiannis, D.; Coghe, F.; Reymen, B.; Teixeira-Dias, F.


    The experimental characterisation of the dynamic (blast) behaviour of micro-agglomerated cork is presented. A 4-cable ballistic pendulum system was used to measure the impulse transmitted to a target subjected to a shock wave originating from the detonation of an explosive. The displacement of the pendulum and the transmitted impulse were measured, and the coefficient of restitution of the whole system when the cork compound is used was determined. A numerical study of the problem using the finite element method gave very good correlation with the experimental analysis, leading to the development of a material constitutive model suited to the dynamic behaviour of these cork compounds under such loading conditions.

  12. An experimental approach to determine the heat transfer coefficient in directional solidification furnaces

    Banan, Mohsen; Gray, Ross T.; Wilcox, William R.


    The heat transfer coefficient between a molten charge and its surroundings in a Bridgman furnace was experimentally determined using in-situ temperature measurement. The ampoule containing an isothermal melt was suddenly moved from a higher temperature zone to a lower temperature zone. The temperature-time history was used in a lumped-capacity cooling model to evaluate the heat transfer coefficient between the charge and the furnace. The experimentally determined heat transfer coefficient was of the same order of magnitude as the theoretical value estimated by standard heat transfer calculations.
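The lumped-capacity cooling model referred to above leads to an exponential temperature decay, from which h can be back-calculated. A minimal sketch under that assumption (all numbers hypothetical, not the furnace data from the study):

```python
import math

def h_from_cooling_curve(times, temps, T_furnace, mass, c_p, area):
    """
    Lumped-capacity model: ln[(T - T_inf)/(T0 - T_inf)] = -(h*A/(m*c_p)) * t.
    Fit the slope of ln(theta) vs t by least squares through the origin,
    then convert the slope to h. Valid only when the Biot number h*L/k << 1,
    i.e. the charge is nearly isothermal.
    """
    theta0 = temps[0] - T_furnace
    y = [math.log((T - T_furnace) / theta0) for T in temps]
    slope = sum(t * yi for t, yi in zip(times, y)) / sum(t * t for t in times)
    return -slope * mass * c_p / area

# Synthetic cooling history generated with h = 150 W/(m^2 K) (hypothetical charge):
m, cp, A, T_inf, T0 = 0.5, 400.0, 0.01, 900.0, 1200.0
tau = m * cp / (150.0 * A)  # time constant in seconds
times = [0.0, 30.0, 60.0, 120.0, 240.0]
temps = [T_inf + (T0 - T_inf) * math.exp(-t / tau) for t in times]
print(h_from_cooling_curve(times, temps, T_inf, m, cp, A))  # recovers ~150
```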

  13. Experimental and theoretical analyses of gate oxide and junction reliability for 4H-SiC MOSFET under short-circuit operation

    An, Junjie; Namai, Masaki; Iwamuro, Noriyuki


    In this study, the experimental evaluation and numerical analysis of the short-circuit capability of a 1200 V SiC MOSFET with a thin gate oxide layer were carried out. Two different failures, gate oxide breakdown and thermal runaway of the device, caused by the high gate electric field and the elevated lattice temperature, were investigated, and the critical temperature points for the two failure modes were accurately extrapolated by solving the thermal diffusion equation; the obtained results are in good agreement with simulation results. It was confirmed that short-circuit robustness depends not only on the thermal properties of the material but also on the dimensional parameters of the device, and that heat is the dominant factor causing device failure during the short-circuit transient.

  14. Grid reliability

    Saiz, P; Rocha, R; Andreeva, J


    We offer a system to track the efficiency of the different components of the GRID. We can study the performance of both the WMS and the data transfers. At the moment, we have set up different parts of the system for ALICE, ATLAS, CMS and LHCb. None of the components that we have developed are VO-specific, therefore it would be very easy to deploy them for any other VO. Our main goal is to improve the reliability of the GRID. The main idea is to discover as soon as possible the different problems that have happened, and inform those responsible. Since we study the jobs and transfers issued by real users, we see the same problems that users see. As a matter of fact, we see even more problems than the end user does, since we are also interested in following up the errors that GRID components can overcome by themselves (for instance, in case of a job failure, resubmitting the job to a different site). This kind of information is very useful to site and VO administrators. They can find out the efficien...

  15. An integrated approach for non-periodic dynamic response prediction of complex structures: Numerical and experimental analysis

    Rahneshin, Vahid; Chierichetti, Maria


    In this paper, a combined numerical and experimental method, called Extended Load Confluence Algorithm, is presented to accurately predict the dynamic response of non-periodic structures when little or no information about the applied loads is available. This approach, which falls into the category of Shape Sensing methods, inputs limited experimental information acquired from sensors to a mapping algorithm that predicts the response at unmeasured locations. The proposed algorithm consists of three major cores: an experimental core for data acquisition, a numerical core based on Finite Element Method for modeling the structure, and a mapping algorithm that improves the numerical model based on a modal approach in the frequency domain. The robustness and precision of the proposed algorithm are verified through numerical and experimental examples. The results of this paper demonstrate that without a precise knowledge of the loads acting on the structure, the dynamic behavior of the system can be predicted in an effective and precise manner after just a few iterations.

  16. Identification and induction of human, social, and cultural capitals through an experimental approach to stormwater management

    Decentralized stormwater management is based on the dispersal of stormwater management practices (SWMP) throughout a watershed to manage stormwater runoff volume and potentially restore natural hydrologic processes. This approach to stormwater management is increasingly popular b...

  18. Validation of engineering dynamic inflow models by experimental and numerical approaches

    Yu, W.; Hong, V. W.; Ferreira, C.; van Kuik, G. A. M.


    The state-of-the-art engineering dynamic inflow models of Pitt-Peters, Øye and ECN have been used for two decades to correct Blade Element Momentum theory for unsteady load prediction of a wind turbine. However, their accuracy is unknown. This paper benchmarks the performance of these engineering models against experimental and numerical methods. The experimental load and flow measurements of an unsteady actuator disc were performed in the Open Jet Facility at Delft University of Technology. The unsteady load was generated by a ramp-type variation of the porosity of the disc. A Reynolds-Averaged Navier-Stokes (RANS) model, a Free Wake Vortex Ring (FWVR) model and a Vortex Tube Model (VTM) simulate the same transient load changes. The velocity fields obtained from the experimental and numerical methods are compared with the engineering dynamic inflow models. Velocity comparison aft of the disc between the experimental and numerical methods shows that the RANS and FWVR models are capable of predicting the transient velocity behaviour during transient disc loading. Velocity comparison at the disc between the engineering models and the numerical methods further shows that the engineering models predict a much faster velocity decay, which implies the need for more advanced or better tuned dynamic inflow models.

  19. A combined experimental-numerical approach for two-phase flow boiling in a minichannel

    Hożejowska Sylwia


    The paper addresses experimental and numerical modeling of two-phase flows in an asymmetrically heated horizontal minichannel. Experimental measurements concerned flows of evaporating ethanol in a minichannel with a rectangular cross section of 1.8 mm × 2 mm. In order to observe the flows, a measuring system was designed and built. The system measured and recorded the basic heat and flow parameters of the flowing fluid, recorded the temperature of the external surface of the heater using an infrared camera, and recorded images of the flow with a high-speed camera. The second aim of the paper was to formulate an appropriate flow boiling heat transfer model that minimises the use of experimentally determined constants. The procedure for calculating the temperature of the ethanol is coupled with the concurrent process of determining the temperature distributions in the insulating foil and the heating surface. The two-dimensional temperature distributions in three subsequent domains were calculated with the Trefftz method. Due to the Robin condition, the heat transfer coefficient at the heating surface-ethanol interface was calculated based on the known temperature distributions of the foil and liquid. Additionally, the paper describes the relation between the two sets of functions used in the calculation. Numerical calculations by the Trefftz method were performed using experimental data.

  1. An engineering approach to business model experimentation – an online investment research startup case study

    Kijl, Björn; Boersma, Durk


    Every organization needs a viable business model. Strikingly, most of the current literature is focused on business model design, whereas there is almost no attention for business model validation and implementation and related business model experimentation. The goal of the research as described in this…

  2. Biofiltration of methyl tert-butyl ether vapors by cometabolism with pentane: modeling and experimental approach.

    Dupasquier, David; Revah, Sergio; Auria, Richard


    Degradation of methyl tert-butyl ether (MTBE) vapors by cometabolism with pentane using a culture of pentane-oxidizing bacteria (Pseudomonas aeruginosa) was studied in a 2.4-L biofilter packed with vermiculite, an inert mineral support. Experimental pentane elimination capacity (EC) of approximately 12 g m(-3) h(-1) was obtained for an empty bed residence time (EBRT) of 1.1 h and inlet concentration of 18.6 g m(-3). For these experimental conditions, EC of MTBE between 0.3 and 1.8 g m(-3) h(-1) were measured with inlet MTBE concentration ranging from 1.1 to 12.3 g m(-3). The process was modeled with general mass balance equations that consider a kinetic model describing cross-competitive inhibition between MTBE (cosubstrate) and pentane (substrate). The experimental data of pentane and MTBE removal efficiencies were compared to the theoretical predictions of the model. The predicted pentane and MTBE concentration profiles agreed with the experimental data for steady-state operation. Inhibition by MTBE of the pentane EC was demonstrated. Increasing the inlet pentane concentration improved the EC of MTBE but did not significantly change the EC of pentane. MTBE degradation rates obtained in this study were much lower than those using consortia or pure strains that can mineralize MTBE. Nevertheless, the system can be improved by increasing the active biomass.
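The performance measures quoted in this record (EBRT and elimination capacity, EC) follow from simple mass-balance definitions over the packed bed. A minimal sketch, with the outlet concentration chosen hypothetically so the numbers land near the pentane figures quoted above:

```python
def ebrt_h(bed_volume_m3, flow_m3_h):
    """Empty bed residence time in hours: EBRT = V_bed / Q."""
    return bed_volume_m3 / flow_m3_h

def elimination_capacity(c_in, c_out, bed_volume_m3, flow_m3_h):
    """EC in g m^-3 h^-1 for inlet/outlet concentrations given in g m^-3."""
    return flow_m3_h * (c_in - c_out) / bed_volume_m3

# Hypothetical operating point: 2.4 L bed, flow chosen to give EBRT = 1.1 h,
# inlet pentane 18.6 g/m^3, outlet 5.4 g/m^3 (outlet value assumed, not measured).
V, Q = 2.4e-3, 2.4e-3 / 1.1
print(ebrt_h(V, Q))                           # 1.1 h
print(elimination_capacity(18.6, 5.4, V, Q))  # (18.6 - 5.4) / 1.1 = 12 g m^-3 h^-1
```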

  3. A Transverse Oscillation Approach for Estimation of Three-Dimensional Velocity Vectors, Part II: Experimental Validation

    Pihl, Michael Johannes; Stuart, Matthias Bo; Tomov, Borislav Gueorguiev


    The 3-D transverse oscillation method is investigated by estimating 3-D velocities in an experimental flow rig system. Measurements of the synthesized transverse oscillating fields are presented as well. The method employs a 2-D transducer; decouples the velocity estimation; and estimates the axial,...

  4. A Musical Approach to Reading Fluency: An Experimental Study in First-Grade Classrooms

    Leguizamon, Daniel F.


    The purpose of this quantitative, quasi-experimental study was to investigate the relationship between Kodaly-based music instruction and reading fluency in first-grade classrooms. Reading fluency and overall reading achievement were measured for 109 participants at the midpoint of the academic year, pre- and post-treatment. Tests were carried out to…

  5. Charge cluster distribution in nanosites traversed by a single ionizing particle: An experimental approach

    Pszona, S.; Bantsar, A.; Kula, J.


    A method for modeling charge cluster formation by a single ionizing particle in nanoelectronic structures a few nanometres in size is presented. The method is based on experimental modeling of charge formation in equivalent gaseous nanosites irradiated by single charged particles and a subsequent scaling procedure to the medium of interest. Propane irradiated by alpha particles is presented as an example.

  6. [Antoine Deidier, his experimental approach to the contagious nature of plague in Marseille in 1720].

    Dutour, Olivier


    Born in 1670, Deidier became a medical doctor at the age of 21. The son-in-law of Vieussens, he took care of the inhabitants of Marseille during the plague of 1720. A contagionist and an experimenter, he was considered a strange scientist by his colleagues. It is time now to rehabilitate his memory.

  7. A statistical approach to the experimental design of the sulfuric acid leaching of gold-copper ore

    Mendes F.D.


    The high grade of copper in the Igarapé Bahia (Brazil) gold-copper ore prevents the direct application of the classic cyanidation process. Copper oxides and sulfides react with cyanides in solution, causing a high consumption of leach reagent and thereby raising processing costs and decreasing the recovery of gold. Studies have shown that a feasible route for this ore would be a pretreatment for copper mineral removal prior to the cyanidation stage. The goal of this experimental work was to study the experimental conditions required for copper removal from Igarapé Bahia gold-copper ore by sulfuric acid leaching, applying a statistical approach to the experimental design. By using the Plackett-Burman method, it was possible to select the variables that had the largest influence on the percentage of copper extracted at the sulfuric acid leaching stage. These were the temperature of the leach solution, stirring speed, concentration of sulfuric acid in the leach solution, and particle size of the ore. The influence of the individual effects of these variables and their interactions on the experimental response were analyzed by applying the replicated full factorial design method. Finally, the selected variables were optimized by the ascending path statistical method, which determined the best experimental conditions for leaching to achieve the highest percentage of copper extracted. Using the optimized conditions, the best leaching results showed a copper extraction of 75.5%.
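    The screening-then-factorial logic in this record can be sketched numerically: build a coded two-level design, evaluate a response, and rank factors by their main effects. The design below is a 2^4 full factorial rather than the paper's Plackett-Burman matrix, and the response function is invented purely for illustration:

```python
from itertools import product

# Coded two-level (-1/+1) factorial in four screening factors
factors = ["temperature", "stirring", "acid_conc", "particle_size"]
design = list(product((-1, 1), repeat=len(factors)))     # 2^4 = 16 runs

def synthetic_response(run):
    """Hypothetical % Cu extracted; temperature and acid dominate (illustration only)."""
    t, s, a, p = run
    return 60 + 8 * t + 1 * s + 6 * a - 2 * p

responses = [synthetic_response(run) for run in design]

# Main effect of each factor = mean response at +1 minus mean response at -1
effects = {}
for j, name in enumerate(factors):
    hi = [y for run, y in zip(design, responses) if run[j] == +1]
    lo = [y for run, y in zip(design, responses) if run[j] == -1]
    effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)

print(effects)   # temperature and acid_conc stand out, so they would be kept
```

    In a real screening study the large effects identified this way would then be carried into a replicated full factorial and an ascending-path optimization, as the abstract describes.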

  8. Intranasally administered oxytocin affects how dogs (Canis familiaris) react to the threatening approach of their owner and an unfamiliar experimenter.

    Hernádi, Anna; Kis, Anna; Kanizsár, Orsolya; Tóth, Katinka; Miklósi, Bernadett; Topál, József


    Fear and aggression are among the most prominent behavioural problems in dogs. Oxytocin has been shown to play a role in regulating social behaviours in humans including fear and aggression. As intranasal oxytocin has been found to have some analogous effects in dogs and humans, here we investigated the effect of oxytocin on dogs' behaviour in the Threatening Approach Test. Dogs, after having received intranasal administration of oxytocin (OT) or placebo (PL), showed the same reaction to an unfamiliar experimenter, but OT pretreated dogs showed a less friendly first reaction compared to the PL group when the owner was approaching. Individual differences in aggression (measured via questionnaire) also modulated dogs' first reaction. Moreover, subjects that received OT looked back more at the human (owner/experimenter) standing behind them during the threatening approach. These results suggest that oxytocin has an effect on dogs' response to the threatening cues of a human, but this effect is in interaction with other factors such as the identity of the approaching human and the 'baseline' aggression of the dogs.

  9. Nuclear weapon reliability evaluation methodology

    Wright, D.L. [Sandia National Labs., Albuquerque, NM (United States)]


    This document provides an overview of the activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon's stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of opportunities for review that occur throughout the entire process, which ensure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  10. Investigation on Experimental Simulation System for Mechanism Motion Reliability Based on the Multibody Simulation Model

    王慧; 宋笔锋; 喻天翔


    Based on Monte Carlo simulation, a hybrid experimental simulation system for mechanism motion reliability is developed using LMS Virtual.Lab and Visual Basic .NET. In the hybrid simulation experiment system, LMS Virtual.Lab is used to perform the kinematic and dynamic analysis; Visual Basic .NET is used to develop the user interface, build the uncertainty models used for reliability analysis, and extract the useful result data from the output file of LMS Virtual.Lab. Finally, two examples, i.e., the mechanism motion reliability analysis of a crank-slider mechanism in which geometry errors and joint clearance errors are considered, and the multi-failure-mode reliability analysis and sensitivity analysis of a lock mechanism in which the effect of dynamic characteristics on the reliability is considered, are adopted to illustrate the feasibility and efficiency of the hybrid simulation system.
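    The Monte Carlo core of such a system can be sketched independently of LMS Virtual.Lab: sample the geometry and clearance errors, push each sample through the mechanism kinematics, and count tolerance violations. The crank-slider below uses hypothetical dimensions, error distributions, and tolerance, not the paper's:

```python
import math, random

random.seed(42)

def slider_position(r, l, theta):
    """Slider displacement of an in-line crank-slider (crank r, rod l, angle theta)."""
    s = r * math.sin(theta)
    return r * math.cos(theta) + math.sqrt(l * l - s * s)

# Nominal geometry and tolerance (all values hypothetical)
r0, l0, theta = 0.10, 0.30, math.radians(60.0)   # metres, radians
x_nominal = slider_position(r0, l0, theta)
tol = 0.4e-3                                     # allowed position error, m

N, failures = 100_000, 0
for _ in range(N):
    r = random.gauss(r0, 0.05e-3)                # crank-length manufacturing error
    l = random.gauss(l0, 0.05e-3)                # rod-length manufacturing error
    clearance = random.uniform(-0.2e-3, 0.2e-3)  # lumped joint-clearance effect
    if abs(slider_position(r, l, theta) + clearance - x_nominal) > tol:
        failures += 1

reliability = 1.0 - failures / N
print(f"estimated motion-accuracy reliability: {reliability:.4f}")
```

    In the hybrid system described above, the closed-form kinematics would be replaced by a call into the multibody solver, but the sampling-and-counting structure is the same.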

  11. An experimental detrending approach to attributing change of pan evaporation in comparison with the traditional partial differential method

    Wang, Tingting; Sun, Fubao; Xia, Jun; Liu, Wenbin; Sang, Yanfang


    In predicting how droughts and hydrological cycles would change in a warming climate, change in atmospheric evaporative demand measured by pan evaporation (Epan) is one crucial element to be understood. Over the last decade, the derived partial differential (PD) form of the PenPan equation has been the prevailing approach to attributing changes in Epan worldwide. However, the independence among climatic variables required by the PD approach cannot be met using long-term observations. Here we designed a series of numerical experiments to attribute changes in Epan over China by detrending each climatic variable, i.e., an experimental detrending approach, to address the inter-correlation among climate variables, and made a comparison with the traditional PD method. The results show that the detrending approach is superior to the traditional PD method not only for a complicated system with multiple variables and a mixed algorithm, like the aerodynamic component (Ep,A) and Epan, but also for a simple case like the radiative component (Ep,R). The major reason for this is the strong and significant inter-correlation of the input meteorological forcing. Very similar and fine-grained attribution results were achieved with the detrending approach and the PD method after eliminating the inter-correlation of the input through a randomization approach. The contributions of Rh and Ta to net radiation, and thus Ep,R, which are overlooked by the PD method but successfully detected by the detrending approach, provide some explanation of the comparison results. We adopted the control run from the detrending approach and applied it to adjust the PD method. Much improvement was made, proving this adjustment an effective way of attributing changes in Epan. Hence, the detrending approach and the adjusted PD method are recommended for attributing changes in hydrological models to better understand and predict the water and energy cycles.
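    The basic operation behind such detrending experiments, removing the least-squares linear trend from one climatic variable while keeping its mean, can be sketched in a few lines (the series below is synthetic):

```python
def detrend_linear(series):
    """Remove the least-squares linear trend over the index, keeping the mean."""
    n = len(series)
    t_mean = (n - 1) / 2.0
    y_mean = sum(series) / n
    sxx = sum((t - t_mean) ** 2 for t in range(n))
    sxy = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    slope = sxy / sxx
    return [y - slope * (t - t_mean) for t, y in enumerate(series)]

# Synthetic 30-year series with an imposed declining trend plus variability
raw = [3.0 - 0.02 * year + 0.1 * ((-1) ** year) for year in range(30)]
flat = detrend_linear(raw)

residual_slope = sum((t - 14.5) * y for t, y in enumerate(flat))
print(abs(residual_slope) < 1e-9)   # trend removed; the series mean is preserved
```

    In the attribution experiments, one forcing variable at a time would be detrended like this before re-running the PenPan model, so that the change in simulated Epan isolates that variable's contribution.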

  12. Evaluating Approaches to Teaching and Learning Chinese Vocabulary from the Learning Theories Perspective: An Experimental Case Study

    Katja SIMONČIČ


    With the Chinese language gaining more and more popularity among Slovenian students, and with the growing numbers of learners of Chinese as a foreign language in Slovenia and elsewhere, it is crucial to find an approach that will lead to high-quality, long-term knowledge of Chinese and that will motivate learners to continue learning. We can speak of two basic approaches to teaching Chinese vocabulary: the approach that first introduces pronunciation and the approach that simultaneously introduces pronunciation and character. The key question that arises is which of the two approaches leads to high-quality, long-term knowledge. To answer this question, an experimental case study was carried out at Ljubljana's Faculty of Arts in the academic year 2011/2012. The case study showed that the approach that simultaneously introduces pronunciation and character, and is based on the key principles of constructivist learning theory, had beneficial effects on the students in terms of motivation and quality of knowledge of Chinese vocabulary.

  13. Compound-nuclear reactions with unstable nuclei: Constraining theory through innovative experimental approaches

    Escher J. E.


    Cross sections for compound-nuclear reactions involving unstable targets are important for many applications, but often cannot be measured directly. Several indirect methods have recently been proposed to determine neutron capture cross sections for unstable isotopes. We consider three approaches that aim at constraining statistical calculations of capture cross sections with data obtained from the decay of the compound nucleus relevant to the desired reaction. Each method produces this compound nucleus in a different manner (via a light-ion reaction, a photon-induced reaction, or β-decay) and requires additional ingredients to yield the sought-after cross section. We give a brief outline of the approaches and employ preliminary results from recent measurements to illustrate the methods. We discuss the main advantages and challenges of each approach.

  14. Impacts of radiation exposure on the experimental microbial ecosystem: a particle-based model simulation approach

    Doi, M.; Tanaka, N.; Fuma, S.; Kawabata, Z.


    A well-designed experimental model ecosystem can be a simple reference for the actual environment and complex ecological systems. For ecological toxicity tests of radiation and other environmental toxicants, we investigated an aquatic microbial ecosystem (closed microcosm) in a test tube with initial substrates: autotrophic flagellate algae (Euglena G.), heterotrophic ciliate protozoa (Tetrahymena T.), and saprotrophic bacteria (E. coli). These species self-organize to construct an ecological system that maintains sustainable population dynamics for more than 2 years after inoculation, requiring only diurnal light and temperature control at 25 degrees Celsius. The objective of the study is to develop a particle-based computer simulation by reviewing interactions among the microbes and their environment, and to analyze the ecological toxicity of radiation on the microcosm by replicating experimental results in the computer simulation. (Author) 14 refs.

  15. Exploring SiSn as a performance enhancing semiconductor: A theoretical and experimental approach

    Hussain, Aftab M.


    We present a novel semiconducting alloy, silicon-tin (SiSn), as a channel material for complementary metal oxide semiconductor (CMOS) circuit applications. The material has been studied theoretically using first-principles analysis as well as experimentally by fabricating MOSFETs. Our study suggests that the alloy offers interesting possibilities in the realm of silicon band gap tuning. We have explored diffusion of tin (Sn) into the industry's most widely used substrate, silicon (100), as it is the most cost-effective, scalable, and CMOS-compatible way of obtaining SiSn. Our theoretical model predicts a higher mobility for p-channel SiSn MOSFETs, due to a lower effective mass of the holes, which has been experimentally validated using the fabricated MOSFETs. We report an increase of 13.6% in the average field-effect hole mobility for SiSn devices compared to silicon control devices.

  16. Fatigue study on the actuation performance of macro fiber composite (MFC): theoretical and experimental approach

    Pandey, Akash; Arockiarajan, A.


    Macro fiber composite (MFC) is extensively used in vibration control and actuation applications due to its high flexibility and enhanced coupling coefficients. During these applications, MFCs are subjected to continuous cyclic electrical loading, which may lead to degradation of their actuation performance. In order to predict the life cycle of MFCs, an experimental setup has been devised and experiments are performed under cyclic loading conditions. The effort involved in the experiments is huge in terms of time and cost. Hence, an attempt has been made to develop a theoretical model to predict the fatigue behavior of MFCs. A nonlinear finite element method has been formulated based on Kirchhoff plate theory, wherein a fatigue failure criterion based on strain energy is embedded. Simulated results based on the proposed model are compared with experimental observations and are in good agreement with each other. Variation in the life cycle of MFCs is also studied for different operating temperatures as well as structural/geometric configurations.

  17. Therapeutic approach by Aloe vera in experimental model of multiple sclerosis.

    Mirshafiey, A; Aghily, B; Namaki, S; Razavi, A; Ghazavi, A; Ekhtiari, P; Mosayebi, G


    Multiple sclerosis (MS) is an autoimmune disease of the central nervous system (CNS) that leads to inflammatory demyelination, axonal damage, and progressive neurologic disability, affecting approximately 2.5 million people worldwide. The aim of the present research was to test the therapeutic effect of Aloe vera in an experimental model of MS. All experiments were conducted on C57BL/6 male mice aged 6-8 weeks. To induce experimental autoimmune encephalomyelitis (EAE), 250 microg of the myelin oligodendrocyte glycoprotein 35-55 peptide emulsified in complete Freund's adjuvant was injected subcutaneously on day 0 over the two flank areas. In addition, 200 ng of pertussis toxin in 100 microL phosphate-buffered saline was injected intraperitoneally on days 0 and 2. The therapeutic protocol was carried out intragastrically using 120 mg/kg/day Aloe vera from 7 days before to 21 days after EAE induction. The mice were killed 21 days after EAE induction. The brains of the mice were removed for histological analysis and their isolated splenocytes were cultured. The results indicated that treatment with Aloe vera caused a significant reduction in the severity of the disease in this experimental model of MS. Histological analysis showed 3 +/- 2 plaques in Aloe vera-treated mice compared with 5 +/- 1 plaques in the control group. The density of mononuclear infiltration in the CNS of Aloe vera-treated mice (500 +/- 200) was significantly lower than the 700 +/- 185 cells in the control group. Moreover, the serum level of nitric oxide in the treatment group was significantly lower than in control animals. The level of interferon-gamma in the cell culture supernatant of treated mice splenocytes was lower than in the control group, whereas the decrease in the serum level of interleukin-10 in the treatment group was not significant in comparison with control mice. These data indicate that Aloe vera therapy can attenuate disease progression in an experimental model of MS.

  18. Experimental estimation of mutation rates in a wheat population with a gene genealogy approach.

    Raquin, Anne-Laure; Depaulis, Frantz; Lambert, Amaury; Galic, Nathalie; Brabant, Philippe; Goldringer, Isabelle


    Microsatellite markers are extensively used to evaluate genetic diversity in natural or experimental evolving populations. Their high degree of polymorphism reflects their high mutation rates. Estimates of the mutation rates are therefore necessary when characterizing diversity in populations. As a complement to the classical experimental designs, we propose to use experimental populations, where the initial state is entirely known and some intermediate states have been thoroughly surveyed, thus providing a short timescale estimation together with a large number of cumulated meioses. In this article, we derived four original gene genealogy-based methods to assess mutation rates with limited bias due to relevant model assumptions incorporating the initial state, the number of new alleles, and the genetic effective population size. We studied the evolution of genetic diversity at 21 microsatellite markers, after 15 generations in an experimental wheat population. Compared to the parents, 23 new alleles were found in generation 15 at 9 of the 21 loci studied. We provide evidence that they arose by mutation. Corresponding estimates of the mutation rates ranged from 0 to 4.97 x 10(-3) per generation (i.e., year). Sequences of several alleles revealed that length polymorphism was only due to variation in the core of the microsatellite. Among different microsatellite characteristics, both the motif repeat number and an independent estimation of the Nei diversity were correlated with the novel diversity. Despite a reduced genetic effective size, global diversity at microsatellite markers increased in this population, suggesting that microsatellite diversity should be used with caution as an indicator in biodiversity conservation issues.
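    For orientation, a back-of-the-envelope version of the count-based estimate reads as follows. Unlike the paper's genealogy-based estimators it ignores drift and allele loss, and the effective population size used here is a made-up placeholder; only the allele, locus, and generation counts come from the abstract:

```python
# Counts from the abstract; the effective population size is a made-up placeholder
new_alleles = 23          # new alleles observed at generation 15
loci = 21                 # microsatellite loci surveyed
generations = 15
effective_size = 150      # hypothetical genetic effective population size (Ne)

meioses = 2 * effective_size * generations      # cumulated meioses, two per diploid birth
mu_per_locus = new_alleles / (loci * meioses)   # naive rate per locus per generation

print(f"naive mutation-rate estimate: {mu_per_locus:.2e} per locus per generation")
```

    With these illustrative numbers the naive estimate lands inside the 0 to 4.97 x 10(-3) per-generation range reported above; the genealogy-based methods correct this kind of raw count for alleles lost to drift before being observed.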

  19. The determination of resonance energy in conjugated benzenoids: A combined experimental and theoretical approach

    Schmalz, Thomas G.


    The conjugated circuits model of aromaticity in benzenoids is reconsidered. Values for the parameters R1, R2, and R3 are rederived from experimental enthalpies of formation. It is shown that the value of R3 depends on the shape as well as the size of a 14-cycle. R3 values are found to obey the relation R3(pyrene) > R3(anthracene) > R3(phenanthrene).

  20. Experimental approaches for measuring pKa's in RNA and DNA.

    Thaplyal, Pallavi; Bevilacqua, Philip C


    RNA and DNA carry out diverse functions in biology, including catalysis, splicing, gene regulation, and storage of genetic information. Interest has grown in understanding how nucleic acids perform such sophisticated functions given their limited molecular repertoire. RNA can fold into diverse shapes that often perturb pKa values and allow it to ionize appreciably under biological conditions, thereby extending its molecular diversity. The goal of this chapter is to enable experimental measurement of pKa's in RNA and DNA. A number of experimental methods for measuring pKa values in RNA and DNA have been developed over the last 10 years, including RNA cleavage kinetics; UV-, fluorescence-, and NMR-detected pH titrations; and Raman crystallography. We begin with general considerations for choosing a pKa assay and then describe experimental conditions, advantages, and disadvantages of these assays. Potential pitfalls in measuring a pKa are described, including apparent pKa's arising from a kinetic pKa or from coupled acid- and alkali-promoted RNA unfolding, as well as degradation of RNA, precipitation of metal hydroxides, and poor baselines. The use of multiple data-fitting procedures and the study of appropriate mutants are described as ways to avoid some of these pitfalls. Application of these experimental methods to RNA and DNA will increase the number of available nucleic acid pKa values in the literature, which should deepen insight into biology and provide benchmarks for pKa calculations. Future directions for measuring pKa's in nucleic acids are discussed.
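    As a concrete illustration of one of the pH-titration assays, the observed signal is commonly fit to a Henderson-Hasselbalch curve. The sketch below generates a synthetic titration from a known pKa and recovers it by a grid-search least-squares fit; a real analysis would also fit baseline and amplitude parameters:

```python
def protonated_fraction(pH, pKa):
    """Henderson-Hasselbalch: fraction of the protonated form at a given pH."""
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

# Synthetic UV-detected titration curve generated with a known pKa of 6.4
true_pKa = 6.4
pH_points = [4.0 + 0.25 * i for i in range(21)]          # pH 4 to 9
signal = [protonated_fraction(pH, true_pKa) for pH in pH_points]

# Recover pKa by least-squares grid search over candidate values
best_pKa, best_sse = None, float("inf")
for i in range(1001):
    pKa = 4.0 + 0.005 * i
    sse = sum((s - protonated_fraction(pH, pKa)) ** 2
              for pH, s in zip(pH_points, signal))
    if sse < best_sse:
        best_pKa, best_sse = pKa, sse

print(f"fitted pKa = {best_pKa:.2f}")
```

    The kinetic and unfolding pitfalls mentioned above show up in practice as titration curves that fit this simple two-state form poorly, which is one reason multiple fitting procedures are recommended.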

  1. Scanning X-ray nanodiffraction: from the experimental approach towards spatially resolved scattering simulations

    Dubslaff, Martin; Hanke, Michael; Patommel, Jens; Hoppe, Robert; Schroer, Christian G.; Schöder, Sebastian; Burghammer, Manfred


    An enhancement of the method of X-ray diffraction simulations for applications using nanofocused hard X-ray beams is presented. We combine the finite element method, kinematical scattering calculations, and a spot profile of the X-ray beam to simulate the diffraction of definite parts of semiconductor nanostructures. The spot profile could be acquired experimentally by X-ray ptychography. Simulation results are discussed and compared with corresponding X-ray nanodiffraction experiments on single ...

  2. Management of Systemic Sclerosis-Related Skin Disease: A Review of Existing and Experimental Therapeutic Approaches.

    Volkmann, Elizabeth R; Furst, Daniel E


    The skin is the most common organ system involved in patients with systemic sclerosis (SSc). Nearly all patients experience cutaneous symptoms, including sclerosis, Raynaud's phenomenon, digital ulcers, telangiectasias, and calcinosis. In addition to posing functional challenges, cutaneous symptoms are often a major cause of pain, psychological distress, and body image dissatisfaction. The present article reviews the main features of SSc-related cutaneous manifestations and highlights an evidence-based treatment approach for treating each manifestation. This article also describes novel treatment approaches and opportunities for further research in managing this important clinical dimension of SSc.

  3. Experimental support for an immunological approach to the search for life on other planets.

    Schweitzer, Mary Higby; Wittmeyer, Jennifer; Avci, Recep; Pincus, Seth


    We propose a three-phase approach to test for evidence of life in extraterrestrial samples. The approach capitalizes on the flexibility, sensitivity, and specificity of antibody-antigen interactions. Data are presented to support the first phase, in which various extraction protocols are compared for efficiency, and in which a preliminary suite of antibodies are tested against various antigens. The antigens and antibodies were chosen on the basis of criteria designed to optimize the detection of extraterrestrial biomarkers unique to living or once-living organisms.

  4. Interfacial modification to optimize stainless steel photoanode design for flexible dye sensitized solar cells: an experimental and numerical modeling approach

    Salehi Taleghani, Sara; Zamani Meymian, Mohammad Reza; Ameri, Mohsen


    In the present research, we report the fabrication, experimental characterization, and theoretical analysis of semi- and fully flexible dye-sensitized solar cells (DSSCs) manufactured on bare and roughened stainless steel type 304 (SS304) substrates. The morphological, optical, and electrical characterizations confirm the advantage of roughened SS304 over bare SS304 and even common transparent conducting oxides (TCOs). A significant enhancement of about 51% in power conversion efficiency is obtained for the flexible device (5.51%) based on the roughened SS304 substrate compared to the bare SS304. The effect of roughening the SS304 substrates on electrical transport characteristics is also investigated by means of numerical modeling, with regard to the metal-semiconductor and interfacial resistance arising from the contact between the metallic substrate and the nanocrystalline semiconductor. The numerical modeling results provide a reliable theoretical backbone to be combined with the experimental implications, highlighting the stronger effect of series resistance, compared to the Schottky barrier, in lowering the fill factor of the SS304-based DSSCs. The findings of the present study nominate roughened SS304 as a promising replacement for conventional DSSC substrates, as well as introducing a highly accurate modeling framework to design and diagnose treated metallic- or non-metallic-based DSSCs.
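    The qualitative conclusion that series resistance degrades the fill factor can be illustrated with a generic one-diode cell model solved by bisection; all device parameters below are illustrative values, not the paper's fitted ones:

```python
import math

def cell_current(V, Rs, Iph=0.02, I0=1e-9, n=1.5, Vt=0.02585):
    """Solve I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) for I by bisection.
    The right-hand side is monotone decreasing in I, so the root is unique."""
    lo, hi = 0.0, Iph
    for _ in range(100):
        I = 0.5 * (lo + hi)
        f = Iph - I0 * (math.exp((V + I * Rs) / (n * Vt)) - 1.0) - I
        if f > 0.0:
            lo = I
        else:
            hi = I
    return 0.5 * (lo + hi)

def fill_factor(Rs, Isc=0.02):
    """Fill factor of the illustrative cell for a given series resistance (ohms)."""
    Voc = next(v / 1000 for v in range(1, 1000) if cell_current(v / 1000, Rs) < 1e-6)
    Pmax = max(v / 1000 * cell_current(v / 1000, Rs) for v in range(int(Voc * 1000)))
    return Pmax / (Voc * Isc)

ff_ideal = fill_factor(0.0)   # no series resistance
ff_lossy = fill_factor(5.0)   # 5 ohm lumped series resistance

print(ff_ideal > ff_lossy)    # series resistance lowers the fill factor
```

    A Schottky barrier at the substrate contact would instead enter as an opposing diode term; separating the two contributions in the fitted I-V curve is what the paper's numerical modeling does.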

  5. Microbial Experimental Evolution as a Novel Research Approach in the Vibrionaceae and Squid-Vibrio Symbiosis

    William eSoto


    The Vibrionaceae are a genetically and metabolically diverse family living in aquatic habitats with a great propensity toward developing interactions with eukaryotic microbial and multicellular hosts (as commensals, pathogens, or mutualists). The Vibrionaceae frequently possess a life history cycle in which bacteria are attached to a host in one phase and then free from their host in another, as either part of the bacterioplankton or adhered to a solid substrate such as marine sediment, riverbeds, lakebeds, or floating particulate debris. These two stages in their life history exert quite distinct and separate selection pressures. When bound to solid substrates or to host cells, the Vibrionaceae can also exist as complex biofilms. The association between bioluminescent Vibrio spp. and sepiolid squids (Cephalopoda: Sepiolidae) is an experimentally tractable model for studying bacteria and animal host interactions, since the symbionts and squid hosts can be maintained in the laboratory independently of one another. The bacteria can be grown in pure culture and the squid hosts raised gnotobiotically with sterile light organs. The partnership between free-living Vibrio symbionts and axenic squid hatchlings emerging from eggs must be renewed every generation of the cephalopod host. Thus, symbiotic bacteria and animal host can each be studied alone and together in union. Despite the virtues provided by the Vibrionaceae and the sepiolid squid-Vibrio symbiosis, these assets to evolutionary biology have yet to be fully utilized for microbial experimental evolution. Experimental evolution studies already completed are reviewed, along with exploratory topics for future study.

  6. Enthalpy-Entropy Compensation Effect in Chemical Kinetics and Experimental Errors: A Numerical Simulation Approach.

    Perez-Benito, Joaquin F; Mulero-Raichs, Mar


    Many kinetic studies concerning homologous reaction series report the existence of an activation enthalpy-entropy linear correlation (compensation plot), its slope being the temperature at which all the members of the series have the same rate constant (the isokinetic temperature). Unfortunately, it has been demonstrated by statistical methods that the experimental errors associated with the activation enthalpy and entropy are mutually interdependent. Therefore, the possibility that some of those correlations might be caused by accidental errors has been explored by numerical simulations. As a result of this study, a computer program has been developed to evaluate the probability that experimental errors might lead to a linear compensation plot starting from an initially randomly scattered set of activation parameters (p-test). Application of this program to kinetic data for 100 homologous reaction series extracted from bibliographic sources has allowed the conclusion that most of the reported compensation plots can hardly be explained by the accumulation of experimental errors, thus requiring the existence of a pre-existing, physically meaningful correlation.
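    The error-propagation mechanism behind spurious compensation can be reproduced in a few lines: if rate constants for a series are measured at only two temperatures with small random errors, the Eyring-derived activation enthalpies and entropies come out almost perfectly correlated even when the true activation parameters are identical for every member. All numbers below are hypothetical:

```python
import math, random

random.seed(7)
R, KB_H = 8.314, 2.084e10            # gas constant; Boltzmann/Planck ratio, s^-1 K^-1
T1, T2 = 298.0, 308.0                # two measurement temperatures, K
dH0, dS0 = 60_000.0, -40.0           # ONE common set of true activation parameters

def lnk(T, dH, dS):
    """Eyring equation: ln k = ln(kB*T/h) + dS/R - dH/(R*T)."""
    return math.log(KB_H * T) + dS / R - dH / (R * T)

dH_est, dS_est = [], []
for _ in range(100):                 # 100 "reactions" differing only by random error
    lnk1 = lnk(T1, dH0, dS0) + random.gauss(0.0, 0.05)
    lnk2 = lnk(T2, dH0, dS0) + random.gauss(0.0, 0.05)
    # invert the two-temperature Eyring equations for dH and dS
    dH = R * T1 * T2 / (T2 - T1) * (lnk2 - lnk1 - math.log(T2 / T1))
    dS = R * (lnk1 - math.log(KB_H * T1)) + dH / T1
    dH_est.append(dH)
    dS_est.append(dS)

def pearson(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

print(f"error-induced compensation: r = {pearson(dH_est, dS_est):.3f}")
```

    The near-perfect correlation arises purely from the shared measurement noise; the p-test described above quantifies how probable such an accidental plot is for a given data set.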

  7. A strategic approach to physico-chemical analysis of bis (thiourea) lead chloride - A reliable semi-organic nonlinear optical crystal

    Rajagopalan, N. R.; Krishnamoorthy, P.; Jayamoorthy, K.


    Good-quality crystals of bis(thiourea) lead chloride (BTLC) have been grown by the slow evaporation method from aqueous solution. The orthorhombic structure and Pna21 space group of the crystals have been identified by single-crystal X-ray diffraction. Studies on the nucleation kinetics of grown BTLC have been carried out, from which the metastable zone width, induction period, free energy change, critical radius, critical number, and growth rate have been calculated. The experimental values of interfacial surface energy for the crystal growth process have been compared with theoretical models. Ultraviolet transmittance studies showed high transmittance, and the wide band gap energy indicated the optical transparency required of the crystal. The second harmonic generation (SHG) and phase-matching nature of the crystal have been established by the Kurtz-Perry method. The SHG nature of the crystal has been further attested by the higher values of theoretical hyperpolarizability. The dielectric nature of the crystals at different temperatures and varying frequencies has been thoroughly studied. The activation energy values of the electrical process have been calculated from the ac conductivity study. Solid-state parameters including valence electron plasma energy, Penn gap, Fermi energy, and polarisability have been obtained by a theoretical approach and correlated with the crystal's SHG efficiency. The values of hardness number, elastic stiffness constant, Meyer's index, minimum level of indentation load, load-dependent constant, fracture toughness, brittleness index, and corrected hardness obtained from the Vickers hardness test clearly show that the BTLC crystal has the good mechanical stability required for NLO device fabrication.

  8. Integrating Formal and Functional Approaches to Language Teaching in French Immersion: An Experimental Study.

    Day, Elaine M.; Shapson, Stan M.


    Evaluated the effect on French language proficiency of an integrated formal, analytic and functional, communicative approach to second language teaching in the immersion classroom. Impetus for the study arises from previous research indicating that immersion children show persistent weaknesses in their grammatical skills despite the fluent,…

  9. Elements of a flexible approach for conceptual hydrological modeling: 2. Application and experimental insights

    Kavetski, D.; Fenicia, F.


    In this article's companion paper, flexible approaches for conceptual hydrological modeling at the catchment scale were motivated, and the SUPERFLEX framework, based on generic model components, was introduced. In this article, the SUPERFLEX framework and the “fixed structure” GR4H model (an hourly

  10. Experimental validation of the Higher-Order Theory approach for sandwich panels with flexible core materials

    Straalen, IJ.J. van


    During the 1990s the higher-order theory was developed by Frostig to enable detailed stress analyses of sandwich panel structures. To investigate the potential of this approach, experiments were performed on sandwich panels made of thin steel faces and mineral wool or polystyrene core material. A p

  11. Holistic Experimentation for Emergence: A Creative Approach to Postgraduate Entrepreneurship Education and Training

    Mitra, Jay


    This article explores the development of a comprehensive and systemic approach to entrepreneurship education at a research-intensive university in the United Kingdom. The exploration is based on two key conceptual challenges: (a) taking entrepreneurship to mean something more than new business creation and (b) differentiating between…

  13. The Bayesian approximation error approach for electrical impedance tomography—experimental results

    Nissinen, A.; Heikkinen, L. M.; Kaipio, J. P.


    Inverse problems can be characterized as problems that tolerate measurement and modelling errors poorly. While the measurement error issue has been widely considered as a solved problem, the modelling errors have remained largely untreated. The approximation and modelling errors can, however, be argued to dominate the measurement errors in most applications. There are several applications in which the temporal and memory requirements dictate that the computational complexity of the forward solver be radically reduced. For example, in process tomography the reconstructions have to be carried out typically in a few tens of milliseconds. Recently, a Bayesian approach for the treatment of approximation and modelling errors for inverse problems has been proposed. This approach has proven to work well in several classes of problems, but the approach has not been verified in any problem with real data. In this paper, we study two different types of modelling errors in the case of electrical impedance tomography: one related to model reduction and one concerning partially unknown geometry. We show that the approach is also feasible in practice and may facilitate the reduction of the computational complexity of the nonlinear EIT problem at least by an order of magnitude.
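The premarginalization at the heart of the Bayesian approximation error approach can be sketched in a few lines: sample the unknowns from the prior, evaluate both an accurate and a reduced forward model, and collect the statistics of the discrepancy. The scalar "models" below are invented stand-ins, not the EIT solvers used in the paper.

```python
import random
import statistics

def accurate_model(x):
    """Stand-in for the fine-discretization forward solver."""
    return x + 0.05 * x ** 2

def reduced_model(x):
    """Stand-in for the cheap, reduced forward solver."""
    return x

# Monte Carlo estimate of the approximation-error statistics:
# e = A_accurate(x) - A_reduced(x), summarized by mean and variance.
random.seed(0)
prior_samples = [random.gauss(1.0, 0.2) for _ in range(5000)]
errors = [accurate_model(x) - reduced_model(x) for x in prior_samples]
e_mean = statistics.fmean(errors)
e_var = statistics.pvariance(errors)
# In the inversion, the measurement noise n is replaced by n + e with
# these statistics, so the reduced solver can be used without biasing
# the reconstruction.
```

This is how the approach buys an order-of-magnitude cheaper forward solve: the model-reduction error is treated as extra, statistically characterized noise rather than being eliminated.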

  14. New concepts, experimental approaches, and dereplication strategies for the discovery of novel phytoestrogens from natural sources.

    Michel, Thomas; Halabalaki, Maria; Skaltsounis, Alexios-Leandros


    Phytoestrogens constitute an attractive research topic due to their estrogenic profile and their biological involvement in women's health. Numerous studies are therefore being performed in the area of natural products chemistry aiming at the discovery of novel phytoestrogens. The main classes of phytoestrogens are flavonoids (flavonols, flavanones), isoflavonoids (isoflavones, coumestans), lignans and stilbenoids, as well as miscellaneous chemical groups abundant in several edible and/or medicinal plants, belonging mostly to the Leguminosae family. As for other bioactives, the detection of new structures and more potent plant-derived phytoestrogens typically follows the general approaches currently available in the natural product discovery process. Plant-based approaches selected from traditional medicine knowledge and bioguided concepts are routinely employed. However, these approaches suffer from serious disadvantages: they are time-consuming, repetitive and labor-intensive, and they lack specificity and reproducibility. In recent years, natural products chemistry has become more technology-driven, and several different strategies have been developed. Structure-oriented procedures and miniaturized approaches employing advanced hyphenated analytical platforms have recently emerged. They significantly facilitate not only the discovery of novel phytoestrogens but also the dereplication procedure, allowing major drawbacks of natural product discovery to be anticipated. In this review, apart from the traditional concepts followed in phytochemistry for the discovery of novel biologically active compounds, recent applications in the fields of extraction, analysis, fractionation and identification of phytoestrogens are discussed.
Moreover, specific methodologies combining identification of actives and biological evaluation in parallel, such as liquid chromatography-biochemical detection, frontal affinity chromatography-mass spectrometry and pulsed

  15. Scale-dependent mechanisms of habitat selection for a migratory passerine: an experimental approach

    Donovan, Therese M.; Cornell, Kerri L.


    Habitat selection theory predicts that individuals choose breeding habitats that maximize fitness returns on the basis of indirect environmental cues at multiple spatial scales. We performed a 3-year field experiment to evaluate five alternative hypotheses regarding whether individuals choose breeding territories in heterogeneous landscapes on the basis of (1) shrub cover within a site, (2) forest land-cover pattern surrounding a site, (3) conspecific song cues during prebreeding settlement periods, (4) a combination of these factors, and (5) interactions among these factors. We tested hypotheses with playbacks of conspecific song across a gradient of landscape pattern and shrub density and evaluated changes in territory occupancy patterns in a forest-nesting passerine, the Black-throated Blue Warbler (Dendroica caerulescens). Our results support the hypothesis that vegetation structure plays a primary role during presettlement periods in determining occupancy patterns in this species. Further, both occupancy rates and territory turnover were affected by an interaction between local shrub density and amount of forest in the surrounding landscape, but not by interactions between habitat cues and social cues. Although previous studies of this species in unfragmented landscapes found that social postbreeding song cues played a key role in determining territory settlement, our prebreeding playbacks were not associated with territory occupancy or turnover. Our results suggest that in heterogeneous landscapes during spring settlement, vegetation structure may be a more reliable signal of reproductive performance than the physical location of other individuals.

  16. The effect of oxygen tension on human articular chondrocyte matrix synthesis: integration of experimental and computational approaches.

    Li, S; Oreffo, R O C; Sengers, B G; Tare, R S


    Significant oxygen gradients occur within tissue engineered cartilaginous constructs. Although oxygen tension is an important limiting parameter in the development of new cartilage matrix, its precise role in matrix formation by chondrocytes remains controversial, primarily due to discrepancies in the experimental setup applied in different studies. In this study, the specific effects of oxygen tension on the synthesis of cartilaginous matrix by human articular chondrocytes were studied using a combined experimental-computational approach in a "scaffold-free" 3D pellet culture model. Key parameters including cellular oxygen uptake rate were determined experimentally and used in conjunction with a mathematical model to estimate oxygen tension profiles in 21-day cartilaginous pellets. A threshold oxygen tension (pO2 ≈ 8% atmospheric pressure) for human articular chondrocytes was estimated from these inferred oxygen profiles and histological analysis of pellet sections. Human articular chondrocytes that experienced oxygen tension below this threshold demonstrated enhanced proteoglycan deposition. Conversely, oxygen tension higher than the threshold favored collagen synthesis. This study has demonstrated a close relationship between oxygen tension and matrix synthesis by human articular chondrocytes in a "scaffold-free" 3D pellet culture model, providing valuable insight into the understanding and optimization of cartilage bioengineering approaches.
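The kind of diffusion-consumption balance used to infer oxygen profiles in a pellet can be illustrated with the classical steady-state solution for zero-order uptake in a sphere. The parameter values below are hypothetical; the study's actual model used experimentally measured cellular uptake rates.

```python
def o2_profile(r, R, C_s, Q, D):
    """Steady-state O2 concentration at radius r (m) inside a spherical
    pellet of radius R, for zero-order uptake rate Q (mol m^-3 s^-1),
    diffusivity D (m^2 s^-1) and surface concentration C_s:
        C(r) = C_s - Q * (R**2 - r**2) / (6 * D)
    """
    return C_s - Q * (R ** 2 - r ** 2) / (6.0 * D)

# Hypothetical pellet parameters:
R, C_s, Q, D = 0.5e-3, 0.20, 5.0e-3, 2.0e-9  # m, mol/m^3, mol/(m^3 s), m^2/s
center = o2_profile(0.0, R, C_s, Q, D)
surface = o2_profile(R, R, C_s, Q, D)
# The radius at which C(r) crosses a threshold (the study estimates
# pO2 ~ 8% for human articular chondrocytes) separates the
# proteoglycan-favoring core from the collagen-favoring rim.
```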

  17. Experimental Approaches to Understanding Surficial Processes on Mars: The Stony Brook Experience 2000-2016

    McLennan, S. M.; Dehouck, E.; Hurowitz, J.; Lindsley, D. H.; Schoonen, M. A.; Tosca, N. J.; Zhao, Y. Y. S.


    Starting with Pathfinder and Global Surveyor, recent missions to Mars have provided great opportunity for low-temperature experimental geochemistry investigations of the Martian sedimentary record by providing geochemical and mineralogical data that can be used as meaningful tests for experiments. These missions have documented a long-lived, complex and dynamic sedimentary rock cycle, including "source-to-sink" sedimentary systems and global paleoenvironmental transitions through time. We designed and constructed an experimental facility, beginning in 2000, specifically to evaluate surficial processes on Mars. Our experimental philosophy has been to (1) keep apparatus simple and flexible, and if feasible maintain sample access during experiments; (2) use starting materials (minerals, rocks) close to known Mars compositions (often requiring synthesis); (3) address sedimentary processes supported by geological investigations at Mars; (4) begin with experiments at standard conditions so they are best supported by thermodynamics; (5) support experiments with thermodynamic-kinetic-mass balance modeling in both design and interpretation, and by high quality chemical, mineralogical and textural lab analyses; (6) interpret results in the context of measurements made at Mars. Although eliciting much comment in proposal and manuscript reviews, we have not attempted to slavishly maintain "Mars conditions", doing so only to the degree required by variables being tested in any given experiments. 
Among the problems we have addressed are (1) Amazonian alteration of rock surfaces; (2) Noachian-Hesperian chemical weathering; (3) epithermal alteration of `evolved' igneous rocks; (4) mineral surface chemical reactivity from aeolian abrasion; (5) evaporation of mafic brines; (6) early diagenesis of sedimentary iron mineralogy; (7) trace element and halogen behavior during chemical weathering and diagenesis; (8) photochemical influences on halogen distribution and speciation; (9) post

  18. The Teaching of Enzymes: A Low Cost Experimental Approach with High School Students.



    The association of experimental methods with traditional lecturing on science themes is encouraged by several authors. The importance of conducting experimental classes is undeniable, as practical activities motivate students to search for knowledge. But this is not the reality in most public schools in the interior of Rio de Janeiro State, where several factors prevent teachers from using such a didactic strategy. The aim of this work was to demonstrate low-cost experimental activities addressing the subject of enzymes. The practice was designed to be held in a common classroom: lab glassware was replaced by alternative materials, and only potato extract (a catalase source) and commercial hydrogen peroxide were required. Concepts such as kinetic data related to substrate concentration, time of reaction, and pH and temperature effects on enzymatic activity were explored with the students, and graphical representations of enzyme activity as a function of these parameters were also elaborated. The practice was held in 10 schools distributed among 7 different cities. Questionnaires applied to the students before and after the practice revealed that 86% of the students had never participated in a practical lesson and 49% were unaware of the function of enzymes. After the practice, 68% of the students said that enzymes catalyze biological reactions and 73% considered the practice important to their cognitive achievement. The information obtained from the students confirmed the lack of practical activities in public education, as well as the capacity of practical activities to motivate students in the knowledge formation process.

  19. Optical waveguiding and applied photonics technological aspects, experimental issue approaches and measurements

    Massaro, Alessandro


    Optoelectronics (technology based on applications of light, such as micro/nano quantum electronics, photonic devices, and lasers for measurement and detection) has become an important field of research. Many applications and physical problems concerning optoelectronics are analyzed in Optical Waveguiding and Applied Photonics. The book is organized to explain how to implement innovative sensors starting from basic physical principles. Applications such as cavity resonance, filtering, tactile sensors, robotic sensors, oil spill detection, small antennas and experimental setups using lasers are a

  20. N2C2M2 Experimentation and Validation: Understanding Its C2 Approaches and Implications



  1. Numerical and Experimental Approach for Identifying Elastic Parameters in Sandwich Plates

    Sergio Ferreira Bastos


    This article deals with the identification of elastic parameters (engineering constants) in sandwich honeycomb orthotropic rectangular plates. A non-destructive method is introduced to identify the elastic parameters from experimental measurements of the natural frequencies of a plate undergoing free vibration. Four elastic constants are identified. The elastic parameter estimation problem is solved by minimizing the differences between the measured and calculated natural frequencies. The numerical method for calculating the natural frequencies involves a Rayleigh-Ritz formulation using a series of characteristic orthogonal polynomials to properly model the free-edge boundary conditions. The analysis of the results indicates the efficiency of the method.
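The inverse step this record describes, adjusting elastic constants until computed frequencies match measured ones, can be sketched with a one-parameter toy problem. The closed-form "forward model" below is an invented stand-in for the paper's Rayleigh-Ritz solver, and the real method fits four constants by least squares rather than one by bisection.

```python
def identify_parameter(f_measured, model, lo, hi, tol=1e-9):
    """Bisection on a monotone one-parameter forward model until the
    predicted natural frequency matches the measured one; stands in
    for the paper's least-squares fit of four elastic constants."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if model(mid) < f_measured:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Toy forward model: first natural frequency grows as sqrt(stiffness).
model = lambda E: 10.0 * E ** 0.5
E_hat = identify_parameter(100.0, model, 1.0, 1000.0)  # recovers E = 100
```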

  2. A systematic multiscale modeling and experimental approach to protect grain boundaries in magnesium alloys from corrosion

    Horstemeyer, Mark R. [Mississippi State Univ., Mississippi State, MS (United States); Chaudhuri, Santanu [Univ. of Illinois, Urbana-Champaign, IL (United States)


    A multiscale modeling Internal State Variable (ISV) constitutive model was developed that captures the fundamental structure-property relationships. The macroscale ISV model used lower length scale simulations (Butler-Volmer and Electronics Structures results) in order to inform the ISVs at the macroscale. The chemomechanical ISV model was calibrated and validated from experiments with magnesium (Mg) alloys that were investigated under corrosive environments coupled with experimental electrochemical studies. Because the ISV chemomechanical model is physically based, it can be used for other material systems to predict corrosion behavior. As such, others can use the chemomechanical model for analyzing corrosion effects on their designs.

  3. Radiation pressure forces on individual micron-size dust particles: a new experimental approach

    Krauss, Oliver; Wurm, Gerhard [Institute for Planetology, University of Muenster, Wilhelm-Klemm-Str. 10, D-48149 Muenster (Germany)]


    We present a newly developed experimental setup for the measurement of radiation pressure forces on individual dust particles. The principle of measurement is to observe the momentum transfer from a high-power laser pulse to a particle that is levitated in a quadrupole trap. Microscopic observation of the particle motion provides information on the forces that act on the particle in the directions parallel and perpendicular to the incident laser beam. First measurements with micron-size graphite grains that serve as analog particles for carbonaceous dust grains in various astrophysical environments reveal that such highly irregularly shaped particles show very high ratios of transversal to radial radiation pressure forces.

  4. Direct Determination of Absolute Configuration of Methyl-Substituted Phenyloxiranes: A Combined Experimental and Theoretical Approach

    Fristrup, Peter; Lassen, Peter Rygaard; Johannessen, Christian;


    obtained from quantum mechanical calculations (density functional theory with the B3LYP hybrid exchange correlation functional with 6-31++G**, aug-cc-pVDZ, or aug-cc-pVTZ basis set) and related to the physical structure of the compounds. The absolute configuration could be established directly in each case...... by comparing experimental and theoretical spectra. In addition, we have been able to document the changes that occur both in structures and in the VA and VCD spectra due to substituent effects on the oxirane ring....

  5. Investigation of wing crack formation with a combined phase-field and experimental approach

    Lee, Sanghyun; Reber, Jacqueline E.; Hayman, Nicholas W.; Wheeler, Mary F.


    Fractures that propagate off of weak slip planes are known as wing cracks and often play important roles in both tectonic deformation and fluid flow across reservoir seals. Previous numerical models have produced the basic kinematics of wing crack openings but generally have not been able to capture fracture geometries seen in nature. Here we present both a phase-field modeling approach and a physical experiment using gelatin for wing crack formation. By treating the fracture surfaces as diffusive zones instead of as discontinuities, the phase-field model does not require consideration of unpredictable rock properties or stress inhomogeneities around crack tips. It is shown by benchmarking the models with physical experiments that the numerical assumptions in the phase-field approach do not affect the final model predictions of wing crack nucleation and growth. With this study, we demonstrate that it is feasible to implement the formation of wing cracks in large-scale phase-field reservoir models.

  6. The trigonometric responder approach: a new method for detecting responders to pharmacological or experimental challenges.

    Reuter, M; Siegmund, A; Netter, P


    The paper presents a newly developed response measure that is particularly suitable for the evaluation of pharmacokinetic data. This method is based on trigonometric considerations, defining a hormone response as the difference between the angle of the slope of the curve before and after drug intake. In addition, the size of this difference is compared to the difference obtained in placebo conditions. In this way, the trigonometric response measure overcomes one of the most problematic shortcomings of the 'area under the curve' (AUC) approach, the problem of the initial value. We will present the mathematical background of the trigonometric method and demonstrate its usefulness by evaluating empirical data (a pharmacological challenge test using the dopamine agonist lisuride) and comparing it to classical AUC measures. This has been achieved by contrasting both approaches with responder definitions according to binary time series analysis and the peak value of the curve.
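The geometric idea, comparing the angles of the response curve's slopes before and after intake, can be sketched as follows. The sampling times and hormone values are hypothetical, and the paper's exact angle definition may differ in detail.

```python
import math

def slope(times, values):
    """Least-squares slope of values over times."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

def trig_response(t_pre, y_pre, t_post, y_post):
    """Difference (degrees) between the angles of the hormone curve's
    slope after and before intake -- the trigonometric response measure."""
    a_pre = math.degrees(math.atan(slope(t_pre, y_pre)))
    a_post = math.degrees(math.atan(slope(t_post, y_post)))
    return a_post - a_pre

# Hypothetical sampling times (min) and hormone concentrations:
pre_t, pre_y = [0, 15, 30], [10.0, 10.2, 10.1]
post_t, post_y = [30, 45, 60], [10.1, 12.0, 14.2]
delta_drug = trig_response(pre_t, pre_y, post_t, post_y)
# A subject counts as a responder when delta_drug clearly exceeds the
# corresponding angle difference obtained under placebo.
```

Because the measure is a difference of slope angles, it is insensitive to the absolute initial hormone level, which is the AUC shortcoming the method is designed to avoid.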

  7. Characterization of epitopes recognized by monoclonal antibodies: experimental approaches supported by freely accessible bioinformatic tools.

    Clementi, Nicola; Mancini, Nicasio; Castelli, Matteo; Clementi, Massimo; Burioni, Roberto


    Monoclonal antibodies (mAbs) have been used successfully both in research and for clinical purposes. The possible use of protective mAbs directed against different microbial pathogens is currently being considered. The fine definition of the epitope recognized by a protective mAb is an important aspect to be considered for possible development in epitope-based vaccinology. The most accurate approach to this is the X-ray resolution of mAb/antigen crystal complex. Unfortunately, this approach is not always feasible. Under this perspective, several surrogate epitope mapping strategies based on the use of bioinformatics have been developed. In this article, we review the most common, freely accessible, bioinformatic tools used for epitope characterization and provide some basic examples of molecular visualization, editing and computational analysis.

  8. Combining computer algorithms with experimental approaches permits the rapid and accurate identification of T cell epitopes from defined antigens.

    Schirle, M; Weinschenk, T; Stevanović, S


    The identification of T cell epitopes from immunologically relevant antigens remains a critical step in the development of vaccines and methods for monitoring of T cell responses. This review presents an overview of strategies that employ computer algorithms for the selection of candidate peptides from defined proteins and subsequent verification of their in vivo relevance by experimental approaches. Several computer algorithms are currently being used for epitope prediction of various major histocompatibility complex (MHC) class I and II molecules, based either on the analysis of natural MHC ligands or on the binding properties of synthetic peptides. Moreover, the analysis of proteasomal digests of peptides and whole proteins has led to the development of algorithms for the prediction of proteasomal cleavages. In order to verify the generation of the predicted peptides during antigen processing in vivo as well as their immunogenic potential, several experimental approaches have been pursued in the recent past. Mass spectrometry-based bioanalytical approaches have been used specifically to detect predicted peptides among isolated natural ligands. Other strategies employ various methods for the stimulation of primary T cell responses against the predicted peptides and subsequent testing of the recognition pattern towards target cells that express the antigen.
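The matrix-based prediction step such algorithms share can be illustrated with a toy position-specific scoring scan over all 9-mers of a protein. The anchor weights and the sequence below are invented, not taken from any published matrix or antigen.

```python
# Toy position-specific scoring matrix (PSSM): only two anchor positions
# of a 9-mer carry weight here; real matrices assign experimentally
# derived scores to every position.
ANCHOR_SCORES = {
    1: {"L": 10, "M": 8},   # anchor at peptide position 2 (index 1)
    8: {"V": 10, "L": 8},   # C-terminal anchor (index 8)
}

def score_peptide(peptide):
    """Sum the matrix scores of a peptide's residues."""
    return sum(ANCHOR_SCORES.get(i, {}).get(aa, 0)
               for i, aa in enumerate(peptide))

def top_epitopes(protein, k=9, n=3):
    """Rank all k-mers of a protein and return the n best candidates."""
    peptides = [protein[i:i + k] for i in range(len(protein) - k + 1)]
    return sorted(peptides, key=score_peptide, reverse=True)[:n]

sequence = "MSLLTEVETYVLSIVPSGPLKAEIAQRLEDV"  # hypothetical fragment
candidates = top_epitopes(sequence)
```

The experimental approaches reviewed then take over: the top-ranked candidates are synthesized and verified, e.g. by mass-spectrometric detection among natural ligands or by T cell stimulation assays.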

  9. Frontiers of reliability

    Basu, Asit P; Basu, Sujit K


    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibul

  10. Modelling the experimental electron density: only the synergy of various approaches can tackle the new challenges.

    Macchi, Piero; Gillet, Jean-Michel; Taulelle, Francis; Campo, Javier; Claiser, Nicolas; Lecomte, Claude


    Electron density is a fundamental quantity that enables understanding of the chemical bonding in a molecule or in a solid and the chemical/physical property of a material. Because electrons have a charge and a spin, two kinds of electron densities are available. Moreover, because electron distribution can be described in momentum or in position space, charge and spin density have two definitions and they can be observed through Bragg (for the position space) or Compton (for the momentum space) diffraction experiments, using X-rays (charge density) or polarized neutrons (spin density). In recent years, we have witnessed many advances in this field, stimulated by the increased power of experimental techniques. However, an accurate modelling is still necessary to determine the desired functions from the acquired data. The improved accuracy of measurements and the possibility to combine information from different experimental techniques require even more flexibility of the models. In this short review, we analyse some of the most important topics that have emerged in the recent literature, especially the most thought-provoking at the recent IUCr general meeting in Montreal.

  12. Advanced Techniques for Seismic Protection of Historical Buildings: Experimental and Numerical Approach

    Mazzolani, Federico M.


    The seismic protection of historical and monumental buildings, namely those dating from the ancient age up to the 20th century, is attracting ever greater interest, above all in the Euro-Mediterranean area, whose cultural heritage is strongly susceptible to severe damage or even collapse due to earthquakes. The cultural importance of historical and monumental constructions limits, in many cases, the possibility of upgrading them from the seismic point of view, owing to the fear that intervention techniques could have detrimental effects on their cultural value. Consequently, great interest is growing in the development of sustainable methodologies for the use of Reversible Mixed Technologies (RMTs) in the seismic protection of existing constructions. RMTs are conceived to exploit the peculiarities of innovative materials and special devices, and they allow ease of removal when necessary. This paper deals with the experimental and numerical studies, framed within the EC PROHITECH research project, on the application of RMTs to historical and monumental constructions, mainly belonging to the cultural heritage of the Euro-Mediterranean area. The experimental tests and numerical analyses are carried out at five different levels, namely full-scale models, large-scale models, sub-systems, devices, and materials and elements.

  13. Grafted natural polymer as new drag reducing agent: An experimental approach

    Abdulbari Hayder A.


    The present investigation introduces a new natural drag-reducing agent with the ability to improve flow in pipelines carrying aqueous or hydrocarbon liquids in turbulent flow. The drag reduction performance of okra (Abelmoschus esculentus) mucilage was tested in water and hydrocarbon (gas-oil) media after grafting. The drag reduction test was conducted in a purpose-built closed-loop liquid circulation system consisting of two pipes of 0.0127 and 0.0381 m inside diameter (ID), four testing sections in each pipe (0.5 to 2.0 m), a tank, a pump and pressure transmitters. Reynolds number (Re), additive concentration and the transported medium (water or gas-oil) were the major drag reduction variables investigated. The experimental results show that the drag reduction ability of the new additive is high, with a maximum percentage of drag reduction (%Dr) of up to 60% achieved. The drag reduction ability increased with increasing additive concentration. The %Dr was found to increase with increasing Re for the water-soluble additive but to decrease with increasing Re for the oil-soluble additive, and it was higher in the 0.0381 m ID pipe. Finally, the grafted and natural mucilage showed high resistance to shear forces when circulated continuously for 200 seconds in the closed-loop system.
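The headline figure can be reproduced from the usual working definition of drag reduction, the relative decrease in pressure drop over a test section at fixed flow rate. The pressure readings below are hypothetical, since the abstract reports no raw data.

```python
def percent_drag_reduction(dp_solvent, dp_additive):
    """%Dr from pressure drops measured over the same test section at
    the same flow rate: the relative decrease caused by the additive."""
    return (dp_solvent - dp_additive) / dp_solvent * 100.0

def reynolds_number(rho, velocity, diameter, mu):
    """Re = rho * V * D / mu for pipe flow."""
    return rho * velocity * diameter / mu

# Hypothetical pressure-transmitter readings (Pa) for water in the
# 0.0381 m ID pipe, with and without the grafted mucilage:
dr = percent_drag_reduction(dp_solvent=2500.0, dp_additive=1000.0)  # 60.0 %
re = reynolds_number(rho=998.0, velocity=2.0, diameter=0.0381, mu=1.0e-3)
```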

  14. The dream as a model for psychosis: an experimental approach using bizarreness as a cognitive marker.

    Scarone, Silvio; Manzone, Maria Laura; Gambini, Orsola; Kantzas, Ilde; Limosani, Ivan; D'Agostino, Armando; Hobson, J Allan


    Many previous observers have reported some qualitative similarities between the normal mental state of dreaming and the abnormal mental state of psychosis. Recent psychological, tomographic, electrophysiological, and neurochemical data appear to confirm the functional similarities between these 2 states. In this study, the hypothesis of the dreaming brain as a neurobiological model for psychosis was tested by focusing on cognitive bizarreness, a distinctive property of the dreaming mental state defined by discontinuities and incongruities in the dream plot, thoughts, and feelings. Cognitive bizarreness was measured in written reports of dreams and in verbal reports of waking fantasies in 30 schizophrenics and 30 normal controls. Seven pictures of the Thematic Apperception Test (TAT) were administered as a stimulus to elicit waking fantasies, and all participating subjects were asked to record their dreams upon awakening. A total of 420 waking fantasies plus 244 dream reports were collected to quantify the bizarreness features in the dream and waking state of both subject groups. Two-way analysis of covariance for repeated measures showed that cognitive bizarreness was significantly lower in the TAT stories of normal subjects than in those of schizophrenics and in the dream reports of both groups. The differences between the 2 groups indicated that, under experimental conditions, the waking cognition of schizophrenic subjects shares a common degree of formal cognitive bizarreness with the dream reports of both normal controls and schizophrenics. Though very preliminary, these results support the hypothesis that the dreaming brain could be a useful experimental model for psychosis.

  15. Elastic properties of RCC under flexural loading-experimental and analytical approach

    S K Kulkarni; M R Shiyekar; S M Shiyekar; B Wagh


    In structural analysis, especially of indeterminate structures, it is essential to know the material and geometrical properties of members. The codal provisions recommend elastic properties of concrete and steel, and these are fairly accurate. The stress-strain curve is plotted for a concrete cylinder or cube specimen; the slope of this curve is the modulus of elasticity of plain concrete. Another method of determining the modulus of elasticity of concrete is the flexural test of a beam specimen. The modulus most commonly used for concrete is the secant modulus, while the modulus of elasticity of steel is obtained from a tension test of a steel bar. When analysis of a high-rise building is performed by software, the cross-sectional area of plain concrete is taken into consideration, whereas the effects of reinforcement bars and of concrete confined by stirrups are neglected. The aim of this study is to determine the elastic properties of reinforced cement concrete as a material. Two important stiffness properties, AE and EI, play an important role in the analysis of a high-rise RCC building idealized as a plane frame. The experimental programme consists of testing beams (model size 150 × 150 × 700 mm) with the percentage of reinforcement varying from 0.54 to 1.63%. The experimental results are verified using 3D finite element techniques. The study addresses the effect of varying the percentage of main longitudinal reinforcement and the concrete grade; the effect of confinement is not considered here and is addressed in a separate study.

  16. Computational and experimental approaches for investigating nanoparticle-based drug delivery systems.

    Ramezanpour, M; Leung, S S W; Delgado-Magnero, K H; Bashe, B Y M; Thewalt, J; Tieleman, D P


    Most therapeutic agents suffer from poor solubility, rapid clearance from the blood stream, a lack of targeting, and often poor translocation ability across cell membranes. Drug/gene delivery systems (DDSs) are capable of overcoming some of these barriers to enhance the delivery of drugs to their intended site of action, e.g. inside cancer cells. In this review, we focus on nanoparticles as DDSs. Complementary experimental and computational studies have enhanced our understanding of the mechanism of action of nanocarriers and their underlying interactions with drugs, biomembranes and other biological molecules. We review key biophysical aspects of DDSs and discuss how computer modeling can assist in the rational design of DDSs with improved and optimized properties. We summarize commonly used experimental techniques for the study of DDSs. Then we review computational studies for several major categories of nanocarriers, including dendrimers and dendrons, polymer-, peptide-, nucleic acid-, lipid-, and carbon-based DDSs, and gold nanoparticles. This article is part of a Special Issue entitled: Membrane Proteins edited by J.C. Gumbart and Sergei Noskov.

  17. Vibrational spectroscopy and aromaticity investigation of squarate salts: A theoretical and experimental approach

    Georgopoulos, Stéfanos L.; Diniz, Renata; Yoshida, Maria I.; Speziali, Nivaldo L.; Santos, Hélio F. Dos; Junqueira, Geórgia Maria A.; de Oliveira, Luiz F. C.


    Experimental and theoretical investigations of squarate salts [M 2(C 4O 4)] (M=Li, Na, K and Rb) were performed aiming to correlate the structures, vibrational analysis and aromaticity. Powder X-ray diffraction data show that these compounds are not isostructural, indicating that the metal–squarate interactions and the hydrogen bonds to water molecules play a significant role in the crystal packing. The infrared and Raman assignments suggest an equalization of the C–C bond lengths with increasing counter-ion size. This result is interpreted as an enhancement in the electronic delocalization, and consequently in the degree of aromaticity, for salts with larger ions. Quantum mechanical calculations of the structures, vibrational spectra and aromaticity index are in agreement with the experimental findings, giving molecular-level insight into the role played by distinct complexation modes in the observed properties. Comparison between our results and the literature, regarding molecular dynamics in different chemical environments, shows that aromaticity and hydrogen bonds are the most important forces driving the interactions in the solid structures of the squarate ion.

  18. Application of Lean-Six Sigma Approach in a Laboratory Experimental Case Study

    Hashim Raza Rizvi


    Laboratory experiments are a conventional activity performed at academic institutions and at government and private organizations. These experimental studies provide the basis for new inventions in science and engineering. Laboratory experiments are conducted on the basis of guidelines already established by standards organizations such as ASTM and AASHTO. This article is based on a case study in which the process of an experiment is examined using Value Stream Maps (VSM) and potential improvement possibilities are identified. After determining the potential waste, appropriate Lean tools are selected, implemented, and the improvements observed. The process is examined after application of the Lean tools and a comparison is performed. The university laboratory environment can be improved considerably by applying Lean tools. The MUDA application reduced the total work time from 90.75 hours and 10 CD to 63.75 hours and 7 CD, saving 27 hours and 3 CD for one experiment, a remarkable achievement. The Heijunka application gave the students an equal workload, and they performed markedly better than before. The 5-S tool gave the students the opportunity to manage the laboratory in an effective and clean way. Safety of the students is a major concern in a university laboratory environment; 5-S not only upgraded the laboratory's overall performance but also significantly raised its safety standards. Further application of Lean tools should be explored to achieve a more effective and efficient university laboratory experimental environment.

  19. Timer-based mechanisms in reliable transport protocol connection management: a comparison of the TCP and Delta-t protocol approaches

    Watson, R.W.


    There is a need for timer-based mechanisms (other than retransmission timers) to achieve reliable connection management in transport protocols. This need is illustrated by comparing the timer mechanisms in the Department of Defense Transmission Control Protocol (TCP), initially designed using only a message-exchange-based mechanism, and in the Lawrence Livermore Laboratory Delta-t protocol, designed explicitly to be timer based. The bounding of maximum packet lifetime and related parameters for achieving transport protocol reliability is important, and a mechanism is outlined for enforcing such a bound.
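
    The timer-based idea can be sketched minimally: if maximum packet lifetime (MPL) and the sender's retransmission period are bounded, per-connection state can be discarded purely on a timer, with no closing handshake. This is a hedged illustration under assumed bounds, not the actual Delta-t state machine; the constants and class below are invented.

```python
# Hedged sketch of the timer-based idea behind Delta-t (not the actual
# protocol state machine). MPL and the retransmission bound are assumed
# illustrative values; the "safe to discard" rule below is the simplest
# possible form of the bound described in the abstract.

MPL = 30.0          # assumed maximum packet lifetime, seconds
MAX_RETRANS = 10.0  # assumed bound on sender retransmission period, seconds

class ConnectionRecord:
    def __init__(self, now):
        self.last_activity = now

    def touch(self, now):
        # Any packet activity refreshes the record.
        self.last_activity = now

    def can_discard(self, now):
        # Safe once no packet from this connection can still be in flight.
        return now - self.last_activity > MPL + MAX_RETRANS

rec = ConnectionRecord(0.0)
print(rec.can_discard(10.0), rec.can_discard(41.0))
```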

  20. Experimental approach to domino-style basement fault systems with evaporites during extension and subsequent inversion

    Ferrer, Oriol; McClay, Ken


    Salt is mechanically weaker than other sedimentary rocks in rift basins. During extension it commonly acts as a strain localizer, decoupling supra- and sub-salt deformation. In this scenario the movement of the subsalt faults, combined with salt migration, commonly constrains the development of syncline basins. The shape of these synclines is basically controlled by the thickness and strength of the overlying salt section, as well as by the shapes of the extensional faults and the magnitudes and slip rates along the faults. The inherited extensional structure, and particularly the continuity of the salt section, plays a key role if the rift basin is subsequently inverted. This research utilizes scaled physical models to analyse the interplay between subsalt structures and suprasalt units during both extension and inversion in domino-style basement fault systems. The experimental program includes twelve analogue models to analyze how the thickness and stratigraphy of the salt unit, as well as the thickness of the pre-extensional cover, constrain the structural style during extension and subsequent inversion. Different models with the same setup have been used to examine the kinematic evolution. Model kinematics was documented and analyzed by combining high-resolution photographs and sub-millimeter resolution scanners. The vertical sections cut at the end of the experiments have been used to characterize the variations of the structures along strike using new methodologies (3D voxel models in image-processing software and 3D seismic). The experimental results show that, after extension, rift systems with salt affected by domino-style basement faults do not show the classical growth stratal wedges. In this case synclinal basins develop above the salt on the hanging wall of the basement faults. The evolution of supra- and subsalt deformation is initially decoupled by the salt layer. 
Salt migrates from the main depocenters towards the edges of the basin constraining

  1. Temperature evaluation by simultaneous emission and saturated fluorescence measurements: A critical theoretical and experimental appraisal of the approach

    Shelby, Daniel E., E-mail: [Department of Chemistry, University of Florida, Gainesville, FL 32611 (United States); Merk, Sven [BAM, Federal Institute for Materials Research and Testing, Berlin (Germany); Smith, Benjamin W. [Department of Chemistry, University of Florida, Gainesville, FL 32611 (United States); Gornushkin, Igor B. [BAM, Federal Institute for Materials Research and Testing, Berlin (Germany); Panne, Ulrich [BAM, Federal Institute for Materials Research and Testing, Berlin (Germany); Department of Chemistry, Humboldt Universität, Berlin (Germany); Omenetto, Nicoló [Department of Chemistry, University of Florida, Gainesville, FL 32611 (United States)


    Temperature is one of the most important physical parameters of plasmas induced by a focused laser beam on solid targets, and its experimental evaluation has received considerable attention. An intriguing approach, first proposed by Kunze (H.-J. Kunze, Experimental check of local thermodynamic equilibrium in discharges, Appl. Opt., 25 (1986) 13–13.) as a check of the existence of local thermodynamic equilibrium, is based upon the simultaneous measurement of the thermal emission and the optically saturated fluorescence of the same selected atomic transition. The approach, whose appealing feature is that neither the calibration of the set-up nor the spontaneous radiative probability of the transitions is needed, has not yet been applied, to our knowledge, to analytical flames and plasmas. A critical discussion of the basic requirements for the application of the method, its advantages, and its experimental limitations is therefore presented here. For our study, Ba⁺ transitions in a plasma formed by focusing a pulsed Nd:YAG laser (1064 nm) on a glass sample containing BaO are selected. At various delay times from the plasma initiation, a pulsed, excimer-pumped dye laser tuned to the center of two Ba transitions (6s ²S₁/₂ → 6p ²P°₃/₂, 455.403 nm, and 6p ²P°₁/₂ → 6d ²S₁/₂, 452.493 nm) is used to enhance the populations of the excited levels (6p ²P°₃/₂ and 6d ²S₁/₂) above their thermal values. The measured ratio of the emission and direct line fluorescence signals observed at 614.171 nm (6p ²P°₃/₂ → 5d ²D₅/₂) and 489.997 nm (6d ²S₁/₂ → 6p ²P°₃/₂) is then related to the excitation temperature of the plasma. Our conclusion is that the approach, despite being indeed attractive and clever, does not seem to be easily applicable to flames and plasmas, in particular to transient and inhomogeneous plasmas such as those induced by lasers on
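
    The Boltzmann relation underlying any excitation-temperature measurement from a population ratio can be illustrated as follows. This is a generic textbook relation, not the full Kunze emission/saturated-fluorescence derivation; the statistical weights and ratio below are assumed example values.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light, m/s

def excitation_temperature(ratio, g_u, g_l, wavelength_nm):
    """
    Illustrative Boltzmann two-level relation (not the full Kunze method):
    n_u/n_l = (g_u/g_l) exp(-dE/kT)  =>  T = dE / (k * ln(g_u/(g_l*ratio))).
    'ratio' is the measured upper-to-lower population ratio; the level
    spacing dE is taken from the transition wavelength.
    """
    d_e = H * C / (wavelength_nm * 1e-9)
    return d_e / (K_B * math.log(g_u / (g_l * ratio)))

# Assumed example: ratio measured for the 455.403 nm Ba II transition
print(excitation_temperature(0.0849, 4.0, 2.0, 455.403))  # temperature in K
```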

  2. Experimental Comparison of two Active Vibration Control Approaches: Velocity Feedback and Negative Capacitance Shunt Damping

    Beck, Benjamin; Schiller, Noah


    This paper outlines a direct, experimental comparison between two established active vibration control techniques. Active vibration control methods, many of which rely upon piezoelectric patches as actuators and/or sensors, have been widely studied, showing many advantages over passive techniques. However, few direct comparisons between different active vibration control methods have been made to determine the performance benefit of one method over another. For the comparison here, the first control method, velocity feedback, is implemented using four accelerometers that act as sensors along with an analog control circuit which drives a piezoelectric actuator. The second method, negative capacitance shunt damping, consists of a basic analog circuit which utilizes a single piezoelectric patch as both a sensor and actuator. Both of these control methods are implemented individually using the same piezoelectric actuator attached to a clamped Plexiglas window. To assess the performance of each control method, the spatially averaged velocity of the window is compared to an uncontrolled response.
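
    The velocity feedback law is simply a control force proportional to the negative of the measured velocity, which acts as added viscous damping. The sketch below applies it to a 1-DOF oscillator as an illustrative stand-in for the panel/actuator system; the mass, stiffness, and gain values are assumed.

```python
# Minimal sketch of direct velocity feedback on a 1-DOF oscillator
# (illustrative stand-in for the panel/actuator system; parameters assumed).
def simulate(gain, steps=20000, dt=1e-4):
    m, k, c = 1.0, 1000.0, 0.5      # mass, stiffness, light inherent damping
    x, v = 1e-3, 0.0                # initial displacement (m)
    peak = 0.0
    for i in range(steps):
        u = -gain * v               # velocity feedback control force
        a = (u - c * v - k * x) / m
        v += a * dt                 # semi-implicit Euler integration
        x += v * dt
        if i > steps // 2:          # track late-time amplitude
            peak = max(peak, abs(x))
    return peak

open_loop = simulate(0.0)
closed_loop = simulate(5.0)
print(open_loop, closed_loop)       # feedback adds damping, so amplitude drops
```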

  3. New approach for understanding experimental NMR relaxivity properties of magnetic nanoparticles: focus on cobalt ferrite.

    Rollet, Anne-Laure; Neveu, Sophie; Porion, Patrice; Dupuis, Vincent; Cherrak, Nadine; Levitz, Pierre


    Relaxivities r1 and r2 of cobalt ferrite magnetic nanoparticles (MNPs) have been investigated with the aim of improving the models of NMR relaxation induced by magnetic nanoparticles. On the one hand, a large set of relaxivity data has been collected for cobalt ferrite MNP dispersions. On the other hand, the relaxivity has been calculated for dispersions of cobalt ferrite MNPs with sizes ranging from 5 to 13 nm, without using any fitting procedure. The model is based on the magnetic dipolar interaction between the magnetic moments of the MNPs and the ¹H nuclei. It takes into account both the longitudinal and transversal contributions of the magnetic moments of the MNPs, leading to three contributions in the relaxation equations. The comparison of the experimental and theoretical data shows good agreement for the NMR profiles as well as the temperature dependence.

  4. Impact of drilling fluids on seagrasses: an experimental community approach (journal version)

    Morton, R.D.; Duke, T.W.; Macauley, J.M.; Clark, J.R.; Price, W.A.


    Effects of a used drilling fluid on an experimental seagrass community (Thalassia testudinum Konig et Sims) were measured by exposing the community to the suspended particulate phase (SPP) in laboratory microcosms. The structure of the macroinvertebrate assemblage, growth and chlorophyll content of grass and associated epiphytes, and rates of decomposition, as indicated by weight loss of grass leaves, in treated and untreated microcosms were compared. There were statistically significant differences in community structure and function between untreated microcosms and those receiving the clay and drilling fluid. For example, drilling fluid and clay caused a significant decrease in the numbers of the ten most numerically abundant (dominant) macroinvertebrates, and drilling fluid decreased the rate at which Thalassia leaves decomposed.


    M. R. Monazzam, A. Nezafat


    Noise is one of the most serious challenges in modern communities. In some industries, owing to the nature of the process, this challenge is more threatening. This paper describes a means of noise control for a spinning machine based on experimental measurements; the advantages and disadvantages of the control procedure are also discussed, as are the different factors that may affect the performance of the barrier in this situation. To provide a good estimate of the control measure, a theoretical formula is also described and compared with the field data. Good agreement between the results of the field measurements and the presented theoretical model was achieved. No obvious noise reduction was produced by partial indoor barriers in low-absorbent enclosed spaces, since reflection from multiple hard surfaces is the dominant factor in the tested environment. Finally, the environmental conditions and standards necessary to attain the ideal results are explained.

  6. Effects of organic enrichment on macrofauna community structure: an experimental approach

    Rodrigo Riera


    The determination of the resilience of benthic assemblages is a central issue for the offshore aquaculture industry in its attempts to minimize environmental disturbances. Experimental studies are an important tool for establishing thresholds for macrofaunal assemblages inhabiting sandy seabeds. An experiment was conducted with three treatments (control, 1x and 3x), in which organic load (fish pellets) was added at 10 g (1x) and 30 g (3x). A reduction in the abundance of individuals and in species richness was found between the control and the organically enriched treatments. Significant changes in assemblage structure were also found, mainly due to the decrease of the sensitive tanaid Apseudes talpa in the organically enriched treatments. AMBI and M-AMBI indices were calculated, and a decrease in ecological status was observed in treatment 3x.
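
    Assemblage-structure change of the kind reported here is typically quantified with the Bray–Curtis dissimilarity on abundance vectors. A minimal sketch, with invented abundance numbers (not the study's data):

```python
def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two abundance vectors (0 = identical,
    1 = no shared abundance)."""
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(x + y for x, y in zip(a, b))
    return num / den if den else 0.0

# Hypothetical abundances (e.g. a sensitive tanaid dominating the control):
control = [40, 10, 5]
enriched = [5, 8, 2]
print(bray_curtis(control, enriched))
```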

  7. NLOphoric studies in thiazole containing symmetrical push-pull fluorophores - Combined experimental and DFT approach

    Telore, Rahul D.; Satam, Manjaree A.; Sekar, Nagaiyan


    A series of donor-π-acceptor extended styryl chromophores having carbazole as the electron donor, with a thiazole bridge and cyano groups as the electron acceptors, is investigated for nonlinear optical properties. The geometries of the extended styryls were optimized and their electronic excitation properties were estimated using TD-DFT. The experimental solvatochromic shifts and density functional theory (DFT) computations are employed to understand the nonlinear optical properties of these new-generation push-pull fluorophores containing carbazole and thiazole cores. The solvatochromic polarizability values are calculated with the two-level quantum mechanical model. DFT is used to determine the linear polarizability (α0), static first hyperpolarizability (β0) and second-order hyperpolarizability (γ) of the extended styryls. The calculations were carried out with the B3LYP method and the 6-31G(d) basis set for the ground and excited states in solvents of different polarity. The thiazole bridge in the extended styryls enhances the nonlinear optical properties.

  8. Gaussian process surrogates for failure detection: A Bayesian experimental design approach

    Wang, Hongqiao; Lin, Guang; Li, Jinglai


    An important task of uncertainty quantification is to identify the probability of undesired events, in particular, system failures, caused by various sources of uncertainties. In this work we consider the construction of Gaussian process surrogates for failure detection and failure probability estimation. In particular, we consider the situation in which the underlying computer models are extremely expensive, and in this setting, determining the sampling points in the state space is of essential importance. We formulate the problem as an optimal experimental design for Bayesian inferences of the limit state (i.e., the failure boundary) and propose an efficient numerical scheme to solve the resulting optimization problem. In particular, the proposed limit-state inference method is capable of determining multiple sampling points at a time, and thus it is well suited for problems where multiple computer simulations can be performed in parallel. The accuracy and performance of the proposed method are demonstrated by both academic and practical examples.
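
    The core idea, replacing an expensive model with a GP and thresholding the surrogate to estimate a failure probability, can be sketched in a toy 1-D setting. Everything here is assumed for illustration: a cheap stand-in limit-state function, a hand-rolled GP with fixed hyperparameters, and a fixed design, not the authors' sequential Bayesian design criterion.

```python
import math

# Toy sketch of a GP surrogate for failure-probability estimation (assumed
# 1-D limit state and hand-rolled GP; a real study would use an optimized
# library and a sequential design criterion for choosing sample points).

def g(x):                      # stand-in "computer model"; failure when g(x) > 0
    return math.sin(3.0 * x) - 0.5

def rbf(x1, x2, ell=0.3):      # squared-exponential kernel, fixed length scale
    return math.exp(-0.5 * ((x1 - x2) / ell) ** 2)

def solve(A, b):               # Gaussian elimination with partial pivoting
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_mean(xs, ys, xq, noise=1e-8):
    # Zero-mean GP posterior mean at query point xq.
    K = [[rbf(a, b) + (noise if i == j else 0.0) for j, b in enumerate(xs)]
         for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    return sum(rbf(xq, xi) * ai for xi, ai in zip(xs, alpha))

xs = [0.0, 0.5, 1.0, 1.5, 2.0]             # design points (model evaluations)
ys = [g(x) for x in xs]
grid = [i / 200.0 * 2.0 for i in range(201)]
p_fail = sum(1 for xq in grid if gp_mean(xs, ys, xq) > 0.0) / len(grid)
print(p_fail)                              # surrogate-based failure probability
```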

  9. Sustainable Approaches for Stormwater Quality Improvements with Experimental Geothermal Paving Systems

    Kiran Tota-Maharaj


    This research assesses the next generation of permeable pavement systems (PPS) incorporating ground source heat pumps (geothermal paving systems). Twelve experimental pilot-scale pavement systems were assessed for their stormwater treatment performance in Edinburgh, UK. The relatively high variability of temperatures during the heating and cooling cycle of a ground source heat pump system embedded in the pavement structure did not permit the expansion and survival of pathogenic microbes. Carbon dioxide monitoring indicated relatively high microbial activity on a geotextile layer and within the pavement structure. Anaerobic degradation processes were concentrated around the geotextile zone, where carbon dioxide concentrations reached up to 2000 ppm. The overall water treatment potential was high, with up to 99% biochemical oxygen demand removal. The pervious pavement systems reduced the ecological risk of stormwater discharges and presented a low risk of pathogen growth.

  10. On the impact of non-Gaussian wind statistics on wind turbines - an experimental approach

    Schottler, Jannik; Reinke, Nico; Hoelling, Agnieszka; Whale, Jonathan; Peinke, Joachim; Hoelling, Michael


    The effect of intermittent and Gaussian inflow conditions on wind energy converters is studied experimentally. Two different flow situations were created in a wind tunnel using an active grid. Both flows exhibit nearly equal mean velocity values and turbulence intensities, but strongly differ in the two-point statistics of their velocity increments uτ = u(t + τ) − u(t) on a variety of time scales τ, one being Gaussian distributed, the other strongly intermittent. A horizontal-axis model wind turbine is exposed to both flows, isolating the effect on the turbine of the differences not captured by mean values and turbulence intensities. Thrust, torque and power data were recorded and analyzed, showing that the model turbine does not smooth out intermittency. Intermittent inflow is converted to similarly intermittent turbine data on all scales considered, reaching down to sub-rotor scales in space, indicating that it is not correct to assume a smoothing of wind speed fluctuations below the size of the rotor.
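
    Intermittency in increment statistics is commonly diagnosed via the flatness (kurtosis) of uτ: about 3 for Gaussian increments, well above 3 for intermittent ones. A self-contained sketch on synthetic data (not the wind-tunnel measurements):

```python
# Sketch: two-point velocity increments u_tau = u(t+tau) - u(t) and their
# flatness (kurtosis), a standard intermittency diagnostic. The time series
# below is synthetic, constructed to have rare large jumps.

def increments(u, lag):
    return [u[i + lag] - u[i] for i in range(len(u) - lag)]

def flatness(x):
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    return m4 / m2 ** 2

# A signal with rare large bursts has heavy-tailed increments:
u = [0.0] * 100
for i in range(0, 100, 25):
    u[i] = 5.0                      # intermittent bursts
print(flatness(increments(u, 1)))   # >> 3 signals intermittency
```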

  11. Temporal and Spatio-Temporal Dynamic Instabilities: Novel Computational and Experimental approaches

    Doedel, Eusebius J.; Panayotaros, Panayotis; Lambruschini, Carlos L. Pando


    This special issue contains a concise account of significant research results presented at the international workshop on Advanced Computational and Experimental Techniques in Nonlinear Dynamics, which was held in Cusco, Peru in August 2015. The meeting gathered leading experts, as well as new researchers, who have contributed to different aspects of Nonlinear Dynamics. Particularly significant was the presence of many active scientists from Latin America. The topics covered in this special issue range from advanced numerical techniques to novel physical experiments, and reflect the present state of the art in several areas of Nonlinear Dynamics. It contains seven review articles, followed by twenty-one regular papers that are organized in five categories, namely (1) Nonlinear Evolution Equations and Applications, (2) Numerical Continuation in Self-sustained Oscillators, (3) Synchronization, Control and Data Analysis, (4) Hamiltonian Systems, and (5) Scaling Properties in Maps.


    Su Xihong; Liu Hongwei; Wu Zhibo; Yang Xiaozong; Zuo Decheng


    Reliability is one of the most critical properties of a software system. System deployment architecture is the allocation of system software components to host nodes. Software architecture (SA)-based software deployment models help to analyze the reliability of different deployments. Though many approaches for architecture-based reliability estimation exist, little work has incorporated the influence of system deployment and hardware resources into reliability estimation. Many factors influence system deployment. By translating these multi-dimensional factors into a degree matrix of component dependence, we provide a definition of component dependence and propose a method for calculating the system reliability of deployments. Additionally, the parameters that influence the optimal deployment may change during system execution. The existing software deployment architecture may then be ill-suited for the given environment, and the system needs to be redeployed to improve reliability. An approximate algorithm, A*_D, to increase system reliability is presented. When the number of components and host nodes is relatively large, experimental results show that this algorithm obtains better deployments than stochastic and greedy algorithms.
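
    The deployment-scoring idea can be sketched in its simplest form: score a placement of components onto hosts by a product of per-component reliabilities and search for the best placement. This is a hedged illustration, not the authors' A*_D algorithm or their dependence-matrix formulation, and the reliability numbers are invented; exhaustive search stands in for the approximate algorithm on this tiny instance.

```python
from itertools import product

# Hedged sketch (not the authors' A*_D algorithm): score a deployment as the
# product of per-component reliabilities, which depend on the host each
# component is placed on, and find the best placement by exhaustive search.

def system_reliability(placement, rel):
    """placement[i] = host of component i; rel[i][h] = reliability there."""
    r = 1.0
    for comp, host in enumerate(placement):
        r *= rel[comp][host]
    return r

def best_deployment(rel, n_hosts):
    n_comps = len(rel)
    return max(product(range(n_hosts), repeat=n_comps),
               key=lambda p: system_reliability(p, rel))

rel = [[0.99, 0.90],    # component 0 on host 0 or host 1 (assumed values)
       [0.85, 0.97],
       [0.95, 0.95]]
best = best_deployment(rel, 2)
print(best, system_reliability(best, rel))
```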

  13. Hydrogenation reactions in interstellar CO ice analogues. A combined experimental/theoretical approach

    Fuchs, G. W.; Cuppen, H. M.; Ioppolo, S.; Romanzin, C.; Bisschop, S. E.; Andersson, S.; van Dishoeck, E. F.; Linnartz, H.


    Context: Hydrogenation reactions of CO in inter- and circumstellar ices are regarded as an important starting point in the formation of more complex species. Previous laboratory measurements of the hydrogenation of CO ices by two groups provided controversial results about the formation rate of methanol. Aims: Our aim is to resolve this controversy by an independent investigation of the reaction scheme for a range of H-atom fluxes and different ice temperatures and thicknesses. To fully understand the laboratory data, the results are interpreted theoretically by means of continuous-time, random-walk Monte Carlo simulations. Methods: Reaction rates are determined by using a state-of-the-art ultra-high-vacuum experimental setup to bombard an interstellar CO ice analog with H atoms at room temperature. The hydrogenation of CO to H2CO and subsequently CH3OH is monitored by a Fourier transform infrared spectrometer in reflection absorption mode. In addition, after each completed measurement, a temperature-programmed desorption experiment is performed to identify the produced species according to their mass spectra and to determine their abundance. Different H-atom fluxes, morphologies, and ice thicknesses are tested. The experimental results are interpreted using Monte Carlo simulations; this technique takes into account the layered structure of CO ice. Results: The formation of both formaldehyde and methanol via CO hydrogenation is confirmed at low temperature (T = 12-20 K). We confirm that the discrepancy between the two Japanese studies is caused mainly by a difference in the applied hydrogen-atom flux, as proposed by Hidaka and coworkers. The production rate of formaldehyde is found to decrease, and the penetration column to increase, with temperature. Temperature-dependent reaction barriers and diffusion rates are inferred using a Monte Carlo physical-chemical model. 
The model is extended to interstellar conditions to compare with observational H2CO/CH3OH data.
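
    The stochastic treatment of the two-step chain CO → H2CO → CH3OH can be illustrated with a minimal Gillespie-style simulation. This is a heavily simplified sketch: the rate constants are arbitrary illustrative numbers, not the temperature-dependent barriers inferred in the study, and the real model is a continuous-time random-walk Monte Carlo over a layered ice, not a well-mixed system.

```python
import math
import random

# Minimal stochastic (Gillespie) sketch of the two-step hydrogenation chain
# CO -> H2CO -> CH3OH, assuming a well-mixed system with H in excess so each
# step is pseudo-first-order. Rates k1, k2 are invented example values.

def gillespie(n_co, k1, k2, t_end, seed=1):
    random.seed(seed)
    co, h2co, ch3oh, t = n_co, 0, 0, 0.0
    while t < t_end:
        a1, a2 = k1 * co, k2 * h2co        # propensities of the two reactions
        a0 = a1 + a2
        if a0 == 0.0:
            break                          # everything converted to CH3OH
        t += -math.log(1.0 - random.random()) / a0  # exponential waiting time
        if random.random() * a0 < a1:
            co, h2co = co - 1, h2co + 1        # CO + 2H -> H2CO
        else:
            h2co, ch3oh = h2co - 1, ch3oh + 1  # H2CO + 2H -> CH3OH
    return co, h2co, ch3oh

print(gillespie(1000, 0.5, 0.2, 20.0))
```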

  14. Successional colonization of temporary streams: An experimental approach using aquatic insects

    Godoy, Bruno Spacek; Queiroz, Luciano Lopes; Lodi, Sara; Nascimento de Jesus, Jhonathan Diego; Oliveira, Leandro Gonçalves


    The metacommunity concept concerns the processes that structure communities on local and regional scales and is useful for assessing spatial variability. However, temporal patterns (e.g., ecological succession and colonization) are neglected in metacommunity studies, since such patterns require temporally extensive, hard-to-execute studies. We used experimental habitats in temporary streams located within the Brazilian Cerrado to evaluate the importance of succession for the aquatic insect metacommunity. Five artificial habitats consisting of wrapped crushed rock were set transversally to the water flow in five streams. The habitats were sampled weekly to assess community composition, and replaced after sampling to identify new potential colonizers. We analyzed the accumulation of new colonizers after each week using a logistic model. We selected pairs of experimental habitats and estimated the Bray-Curtis dissimilarity index to assess the trajectory of community composition during the experiment. We used the dissimilarity values in ANOVA tests to identify the importance of time and space for the community. The number of new taxa stabilized in the third week, and we estimated a weekly increase of 1.61 new taxa in the community after stabilization. The overall pattern was a small change in community composition, but one stream had a higher weekly turnover. Our results showed a relevant influence of time on the initial communities of aquatic insects of temporary streams. However, the temporal pattern must be viewed in a spatial context, since different streams have different successional histories regarding the number of taxa and community turnover. We highlight aerial dispersal and movement to seek oviposition sites as important factors in determining colonization patterns.
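
    A logistic accumulation model of the kind used for the colonizer counts can be sketched as below. The parameter values are invented for illustration (the study itself estimated about 1.61 new taxa per week after stabilization in week 3); a real analysis would fit k, r and t0 to the weekly counts.

```python
import math

# Sketch of a logistic taxa-accumulation curve and the implied weekly gain.
# Parameters k (asymptotic taxa richness), r (rate) and t0 (inflection week)
# are assumed example values, not the study's fitted estimates.

def logistic(t, k, r, t0):
    """Cumulative number of taxa at week t: k / (1 + exp(-r*(t - t0)))."""
    return k / (1.0 + math.exp(-r * (t - t0)))

def weekly_gain(t, k, r, t0):
    """New taxa accumulated between week t and week t+1."""
    return logistic(t + 1, k, r, t0) - logistic(t, k, r, t0)

for week in range(1, 7):
    print(week, round(logistic(week, 30.0, 1.2, 2.0), 2))
```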

  15. Estimation of break-lock in PLL synthesizers for monopulse radar applications: Experimental and simulation approach

    Harikrishna Paik


    This work presents and estimates the break-lock in phase-locked loop (PLL) synthesizers for monopulse radar applications through experimental measurements and computer simulation. Sinusoidal continuous wave (CW) and linear frequency modulated (LFM) signals are used as repeater jamming signals. The CW jamming signal power as a function of radar echo signal power at break-lock is estimated for different values of the frequency difference between these two signals, and from these results the jammer-to-echo signal power ratio (J/S, in dB) is computed. Break-lock is achieved at a J/S ratio of 1.9 dB (measured: 1.8 dB) for a typical echo signal power of −5 dBm with a 1 MHz frequency difference. The frequency deviation required to break lock as a function of J/S ratio is estimated for different modulation rates in the presence of the LFM jamming signal. Break-lock is achieved at a frequency deviation of 0.34 MHz (measured: 0.32 MHz) for a J/S ratio of 2 dB and a 200 kHz modulation rate. The simulation models are built according to the data obtained from the experimental setups. Good, consistent agreement between the measured and simulated results is observed, which can be useful in the design of CW and LFM jammers in the target platform.
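
    The J/S figures quoted in the abstract are simple decibel differences; a small helper makes the arithmetic explicit (standard definitions, with the example jammer power inferred from the quoted numbers for illustration):

```python
import math

def dbm(power_mw):
    """Power in dBm from power in milliwatts."""
    return 10.0 * math.log10(power_mw)

def j_to_s_db(jammer_dbm, echo_dbm):
    """Jammer-to-echo signal power ratio in dB (difference of dBm levels)."""
    return jammer_dbm - echo_dbm

# Echo at -5 dBm; break-lock reported at J/S = 1.9 dB, i.e. jammer ~ -3.1 dBm
print(j_to_s_db(-3.1, -5.0))   # ~1.9 dB
```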

  16. Factors influencing the extraction of pharmaceuticals from sewage sludge and soil: an experimental design approach.

    Ferhi, Sabrina; Bourdat-Deschamps, Marjolaine; Daudin, Jean-Jacques; Houot, Sabine; Nélieu, Sylvie


    Pharmaceuticals can enter the environment when organic waste products are recycled on agricultural soils. The extraction of pharmaceuticals is a challenging step in their analysis, and the very different extraction conditions proposed in the literature make the choice of the right method for multi-residue analysis difficult. This study aimed at evaluating, with an experimental design methodology, the influence of the nature, pH and composition of the extraction medium on the extraction recovery of 14 pharmaceuticals, including 8 antibiotics, from soil and sewage sludge. Preliminary experimental designs showed that acetonitrile and citrate-phosphate buffer were the best extractants. Then, a response surface design demonstrated that many cross-product and squared terms had significant effects, explaining the shapes of the response surfaces; it also allowed optimising the pharmaceutical recoveries in soil and sludge. The optimal conditions were interpreted considering the ionisation states of the compounds, their solubility in the extraction medium and their interactions with the solid matrix. To perform the analysis, a compromise was made for each matrix. After a QuEChERS purification, the samples were analysed by online SPE-UHPLC-MS-MS. Both methods were simple and economical. They were validated with the accuracy profile methodology for soil and sludge and characterised for another type of soil, digested sludge and composted sludge. Trueness globally ranged between 80 and 120 % recovery, and inter- and intra-day precisions were globally below 20 % relative standard deviation. Various pharmaceuticals were present in environmental samples, with concentration levels ranging from a few micrograms per kilogramme up to thousands of micrograms per kilogramme. Graphical abstract: influence of the extraction medium on the extraction recovery of 14 pharmaceuticals; influence of the ionisation state, the solubility and the interactions of pharmaceuticals with the solid matrix. 
Analysis

  17. Porcine liver: experimental model for the intra-hepatic glissonian approach

    Antonio Cavalcanti de Albuquerque Martins


    PURPOSE: The aim of this study is to evaluate the porcine liver as a teaching and training model for the glissonian approach. METHODS: Ten livers were removed from domestic adult white pigs weighing 35 to 45 kg. Based on anatomical landmarks, the glissonian pedicles of each liver segment were dissected, and biopsies were taken for histological examination to analyze the presence of the glissonian sheath. RESULTS: On microscopic examination, a sheath of connective tissue was observed wrapping each segmental pedicle in the porcine liver. This could be clearly seen in histological preparations for connective tissue (Masson technique). CONCLUSION: The morphological arrangement of the glissonian pedicles in the porcine liver makes this model a useful tool for training the intra-hepatic glissonian approach.

  18. Phase equilibrium of liquid mixtures: Experimental and modeled data using statistical associating fluid theory for potential of variable range approach

    Giner, Beatriz; Bandrés, Isabel; Carmen López, M.; Lafuente, Carlos; Galindo, Amparo


    A study of the phase equilibrium (experimental and modeled) of mixtures formed by a cyclic ether and haloalkanes has been carried out. Experimental data for the isothermal vapor-liquid equilibrium of mixtures of tetrahydrofuran or tetrahydropyran with isomeric chlorobutanes at temperatures of 298.15, 313.15, and 328.15 K are presented. The experimental results are discussed in terms of both the molecular characteristics of the pure compounds and the potential intermolecular interactions between them, using thermodynamic information on the mixtures obtained earlier. The statistical associating fluid theory for potential of variable range (SAFT-VR) approach, together with standard combining rules without adjustable parameters, has been used to model the phase equilibrium. Good agreement between experiment and prediction is found with this model: mean absolute deviations are of the order of 1 kPa for pressures and less than 0.013 mole fraction for vapor-phase compositions. To improve these results, a new modeling was carried out by introducing a single transferable parameter, k_ij, which modifies the strength of the dispersion interaction between unlike components in the mixtures and is valid for all the studied mixtures, being neither temperature nor pressure dependent. This parameter, together with the SAFT-VR approach, provides a description of the vapor-liquid equilibrium that is in excellent agreement with the experimental data in most cases. The absolute deviations are of the order of 0.005 mole fraction for vapor-phase compositions and less than 0.3 kPa for pressure, except for mixtures containing 2-chloro-2-methylpropane, for which the pressure deviations are larger. The results obtained in this work in modeling the phase equilibrium with the SAFT-VR equation of state have been compared with those of a previous study in which the approach was used to model similar mixtures with clear differences in thermodynamic behavior.
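The role of the transferable parameter can be sketched with a modified Berthelot-type combining rule of the kind used in SAFT-VR, where k_ij scales the unlike-pair dispersion energy. The well depths and pressure values below are hypothetical, chosen only to illustrate how a small k_ij shifts the cross interaction and how mean absolute deviations are computed:

```python
import math

def cross_dispersion_energy(eps_ii, eps_jj, k_ij=0.0):
    """Unlike-pair dispersion energy: eps_ij = (1 - k_ij) * sqrt(eps_ii * eps_jj)."""
    return (1.0 - k_ij) * math.sqrt(eps_ii * eps_jj)

def mean_abs_dev(predicted, experimental):
    """Mean absolute deviation between modeled and measured values."""
    return sum(abs(p - e) for p, e in zip(predicted, experimental)) / len(predicted)

# Hypothetical segment well depths (in K) for the two components
eps_ether, eps_chloro = 250.0, 360.0
print(cross_dispersion_energy(eps_ether, eps_chloro))        # standard rule, k_ij = 0
print(cross_dispersion_energy(eps_ether, eps_chloro, 0.05))  # dispersion weakened by k_ij

# Hypothetical isothermal pressures (kPa): the adjusted model tracks experiment more closely
p_exp = [20.1, 25.4, 31.0]
p_standard = [21.3, 26.5, 32.4]
p_adjusted = [20.3, 25.5, 31.2]
print(mean_abs_dev(p_standard, p_exp), mean_abs_dev(p_adjusted, p_exp))
```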

  19. Experimental demonstration of a classical approach for flexible space structure control: NASA CSI testbeds

    Wie, Bong


    The results of active control experiments performed on the Mini-Mast truss structure are presented. The primary research objectives were: (1) to develop active structural control concepts and/or techniques; (2) to verify the concept of robust non-minimum-phase compensation for a certain class of non-colocated structural control problems through ground experiments; (3) to verify a 'dipole' concept for persistent-disturbance rejection control of flexible structures; and (4) to identify CSI (Control Structure Interaction) issues and areas of emphasis for the next generation of large flexible spacecraft. The classical SISO (single-input, single-output) control design approach was employed.
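The basic effect such a classical SISO loop exploits, adding damping to a lightly damped flexible mode through rate feedback, can be sketched with a single-mode simulation. The modal parameters, feedback gain, and initial condition below are hypothetical, not values from the Mini-Mast experiments:

```python
def simulate(feedback_gain, steps=50000, dt=0.001):
    """Integrate one lightly damped flexible mode, m*x'' + c*x' + k*x = u,
    under rate feedback u = -g*x'; returns the final mechanical energy."""
    m, c, k = 1.0, 0.01, 1.0      # hypothetical modal mass, damping, stiffness
    x, v = 1.0, 0.0               # initial modal deflection, zero velocity
    for _ in range(steps):
        a = (-c * v - k * x - feedback_gain * v) / m
        v += dt * a               # semi-implicit Euler keeps the oscillator stable
        x += dt * v
    return 0.5 * m * v * v + 0.5 * k * x * x

open_loop = simulate(0.0)
closed_loop = simulate(0.2)
print(open_loop, closed_loop)     # rate feedback drains vibration energy far faster
```

With colocated rate feedback the control term acts as extra viscous damping, which is why the closed-loop energy decays orders of magnitude faster over the same interval.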

  20. Reliability of Serum Metabolites over a Two-Year Period : A Targeted Metabolomic Approach in Fasting and Non-Fasting Samples from EPIC

    Carayol, Marion; Licaj, Idlir; Achaintre, David; Sacerdote, Carlotta; Vineis, Paolo; Key, Timothy J; Onland Moret, N Charlotte; Scalbert, Augustin; Rinaldi, Sabina; Ferrari, Pietro


    OBJECTIVE: Although metabolic profiles have been associated with chronic disease risk, lack of temporal stability of metabolite levels could limit their use in epidemiological investigations. The present study aims to evaluate the reliability over a two-year period of 158 metabolites and compare rel
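Reliability over repeated collections of this kind is commonly quantified with an intraclass correlation coefficient (ICC). A minimal one-way random-effects sketch, using hypothetical metabolite concentrations measured twice per subject (the data and units are illustrative, not from EPIC):

```python
def icc_oneway(subjects):
    """One-way random-effects ICC(1,1) = (MSB - MSW) / (MSB + (k-1)*MSW)
    for n subjects each measured k times."""
    n = len(subjects)
    k = len(subjects[0])
    grand = sum(sum(s) for s in subjects) / (n * k)
    means = [sum(s) / k for s in subjects]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)          # between-subject mean square
    msw = sum((x - m) ** 2 for s, m in zip(subjects, means) for x in s) / (n * (k - 1))  # within-subject
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical concentrations (umol/L) for four subjects, two collections each
data = [(10.0, 12.0), (20.0, 19.0), (30.0, 33.0), (40.0, 38.0)]
print(icc_oneway(data))  # near 1: between-subject variance dominates within-subject noise
```

An ICC near 1 indicates that metabolite levels rank subjects consistently across the two collections, which is the temporal stability such studies assess.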