WorldWideScience

Sample records for test based models

  1. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    The growing complexity of software and the demand for shorter time to market are two important challenges facing today's IT industry. These challenges demand increases in both the productivity and the quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models help to navigate from one model to another, and to trace back to the respective requirements and the design model when a test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose a relation definition markup language (RDML) for defining the relationships between models.

  2. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, enable guidance on test identification and specification, and support automated test generation. Model-based security testing (MBST) is a relatively new field, especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes, e.g., security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey on MBST techniques and the related models as well as samples of new methods and tools that are under development in the European ITEA2 project DIAMONDS.

  3. Model-based security testing

    OpenAIRE

    Schieferdecker, Ina; Großmann, Jürgen; Schneider, Martin

    2012-01-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, enable guidance on test identification and specification, and support automated test generation. Model-based security...

  4. Model-based testing for software safety

    NARCIS (Netherlands)

    Gurbuz, Havva Gulay; Tekinerdogan, Bedir

    2017-01-01

    Testing safety-critical systems is crucial since a failure or malfunction may result in death or serious injury to people, or damage to equipment or the environment. An important challenge in testing is the derivation of test cases that can identify the potential faults. Model-based testing adopts models of a

  5. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  6. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It provides better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  7. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  8. Bayesian models based on test statistics for multiple hypothesis testing problems.

    Science.gov (United States)

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
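To make the record's approach concrete, here is a minimal sketch in Python, assuming a standard normal null component, a single Gaussian alternative, and a fixed mixture weight pi0; the authors' actual distributions, estimation procedure, and FDR calibration are not reproduced here.

```python
# A minimal sketch, not the authors' exact model: test statistics are modeled
# directly as a two-component mixture (null vs. alternative), and hypotheses
# are rejected while the running mean posterior null probability stays below
# the Bayesian FDR level alpha. pi0, alt_mean and alt_sd are illustrative.
import numpy as np
from scipy import stats

def bayesian_fdr_rejections(z, pi0=0.8, alt_mean=2.5, alt_sd=1.0, alpha=0.05):
    f0 = pi0 * stats.norm.pdf(z, 0.0, 1.0)                  # null component
    f1 = (1.0 - pi0) * stats.norm.pdf(z, alt_mean, alt_sd)  # alternative component
    post_null = f0 / (f0 + f1)                              # P(null | z_i)
    order = np.argsort(post_null)                           # most significant first
    running_fdr = np.cumsum(post_null[order]) / np.arange(1, z.size + 1)
    k = np.searchsorted(running_fdr, alpha, side="right")   # largest set with FDR <= alpha
    reject = np.zeros(z.size, dtype=bool)
    reject[order[:k]] = True
    return post_null, reject

rng = np.random.default_rng(0)
z = np.concatenate([rng.normal(0, 1, 900), rng.normal(3, 1, 100)])
post_null, reject = bayesian_fdr_rejections(z)
print(reject.sum(), "hypotheses rejected")
```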

  9. Testing R&D-Based Endogenous Growth Models

    DEFF Research Database (Denmark)

    Kruse-Andersen, Peter Kjær

    2017-01-01

    R&D-based growth models are tested using US data for the period 1953-2014. A general growth model is developed which nests the model varieties of interest. The model implies a cointegrating relationship between multifactor productivity, research intensity, and employment. This relationship...... is estimated using cointegrated VAR models. The results provide evidence against the widely used fully endogenous variety and in favor of the semi-endogenous variety. Forecasts based on the empirical estimates suggest that the slowdown in US productivity growth will continue. Particularly, the annual long...

  10. Towards model-based testing of electronic funds transfer systems

    OpenAIRE

    Asaadi, H.R.; Khosravi, R.; Mousavi, M.R.; Noroozi, N.

    2010-01-01

    We report on our first experience with applying model-based testing techniques to an operational Electronic Funds Transfer (EFT) switch. The goal is to test the conformance of the EFT switch to the standard flows described by the ISO 8583 standard. To this end, we first make a formalization of the transaction flows specified in the ISO 8583 standard in terms of a Labeled Transition System (LTS). This formalization paves the way for model-based testing based on the formal notion of Input-Outpu...

  11. Design Of Computer Based Test Using The Unified Modeling Language

    Science.gov (United States)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis through interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN) and the independent route (UM-Polbeng) was conducted using a paper-based test (PBT). The paper-based test model has some weaknesses: it wastes too much paper, the questions can leak to the public, and the test results can be manipulated. This research aimed to create a computer-based test (CBT) model using the Unified Modeling Language (UML), consisting of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, it is important to pay attention to how the test questions are protected before they are shown, through an encryption and decryption process; the RSA cryptography algorithm was used for this. The questions drawn from the question bank were then randomized using the Fisher-Yates shuffle method. The network architecture used in the computer-based test application was a client-server model on a Local Area Network (LAN). The result of the design was a computer-based test application for admission selection at Politeknik Negeri Bengkalis.
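For illustration, the Fisher-Yates shuffle named in this record can be sketched in a few lines; the question IDs below are placeholders, not data from the study.

```python
# Minimal sketch of the Fisher-Yates shuffle used to randomize the question
# bank; question IDs are illustrative placeholders.
import random

def fisher_yates_shuffle(items):
    """In-place unbiased shuffle: swap each position with a random earlier-or-equal one."""
    for i in range(len(items) - 1, 0, -1):
        j = random.randint(0, i)          # 0 <= j <= i
        items[i], items[j] = items[j], items[i]
    return items

questions = list(range(1, 51))            # e.g. 50 question IDs
print(fisher_yates_shuffle(questions)[:10])
```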

  12. Observation-Based Modeling for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.G.

    2009-01-01

    One of the single most important reasons that modeling and model-based testing are not yet common practice in industry is the perceived difficulty of building the models to the level of detail and quality required for their automated processing. Models unleash their full potential only through

  13. Automated model-based testing of hybrid systems

    NARCIS (Netherlands)

    Osch, van M.P.W.J.

    2009-01-01

    In automated model-based input-output conformance testing, tests are automatically generated from a specification and automatically executed on an implementation. Input is applied to the implementation and output is observed from the implementation. If the observed output is allowed according to

  14. Test-Driven, Model-Based Systems Engineering

    DEFF Research Database (Denmark)

    Munck, Allan

    Hearing systems have evolved over many years from simple mechanical devices (horns) to electronic units consisting of microphones, amplifiers, analog filters, loudspeakers, batteries, etc. Digital signal processors replaced analog filters to provide better performance and new features. Central....... This thesis concerns methods for identifying, selecting and implementing tools for various aspects of model-based systems engineering. A comprehensive method was proposed that includes several novel steps, such as techniques for analyzing the gap between requirements and tool capabilities. The method...... was verified with good results in two case studies for selection of a traceability tool (single-tool scenario) and a set of modeling tools (multi-tool scenarios). Models must be subjected to testing to allow engineers to predict functionality and performance of systems. Test-first strategies are known...

  15. Model Checking and Model-based Testing in the Railway Domain

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    This chapter describes some approaches and emerging trends for verification and model-based testing of railway control systems. We describe state-of-the-art methods and associated tools for verifying interlocking systems and their configuration data, using bounded model checking and k...... with good test strength are explained. Interlocking systems represent just one class of many others, where concrete system instances are created from generic representations, using configuration data for determining the behaviour of the instances. We explain how the systematic transition from generic...... to concrete instances in the development path is complemented by associated transitions in the verification and testing paths....

  16. Divergence-based tests for model diagnostic

    Czech Academy of Sciences Publication Activity Database

    Hobza, Tomáš; Esteban, M. D.; Morales, D.; Marhuenda, Y.

    2008-01-01

    Roč. 78, č. 13 (2008), s. 1702-1710 ISSN 0167-7152 R&D Projects: GA MŠk 1M0572 Grant - others:Instituto Nacional de Estadistica (ES) MTM2006-05693 Institutional research plan: CEZ:AV0Z10750506 Keywords : goodness of fit * divergence statistics * GLM * model checking * bootstrap Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.445, year: 2008 http://library.utia.cas.cz/separaty/2008/SI/hobza-divergence-based%20tests%20for%20model%20diagnostic.pdf

  17. Model-Based GUI Testing Using Uppaal at Novo Nordisk

    DEFF Research Database (Denmark)

    H. Hjort, Ulrik; Rasmussen, Jacob Illum; Larsen, Kim Guldstrand

    2009-01-01

    This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML Statemachine model and generates...

  18. Model-Based GUI Testing Using Uppaal at Novo Nordisk

    Science.gov (United States)

    Hjort, Ulrik H.; Illum, Jacob; Larsen, Kim G.; Petersen, Michael A.; Skou, Arne

    This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML Statemachine model and generates a test suite satisfying some testing criterion, such as edge or state coverage, and converts the individual test cases into a scripting language that can be automatically executed against the target. The tool has significantly reduced the time required for test construction and generation, and reduced the number of test scripts while increasing the coverage.

  19. Model-based testing in powertrain development; Modellgestuetzte Erprobungsmethodik in der Antriebsstrangentwicklung

    Energy Technology Data Exchange (ETDEWEB)

    Albers, A.; Schyr, C. [Inst. fuer Produktentwicklung der Univ. Karlsruhe (T.H.) (Germany)

    2005-07-01

    The paper describes a new methodical approach for model-based testing of powertrain components in vehicle development. The presented methodology is based on a holistic model environment which covers the major dynamic effects of the vehicle in an early development phase and refines the models during the testing phase on the different test bed configurations. This allows realistic loading of the mechanical components and their electronic control units, in parallel with a simulation-based analysis of design and application variants in the mechanics and software and their influence on the complete vehicle. In the first application example, the development of a pre-adjustable transmission for passenger cars is presented. In the second example, the testing concept for tracked vehicles with hydrostatic drivetrains is described. (orig.)

  20. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  1. Automatic Model-Based Generation of Parameterized Test Cases Using Data Abstraction

    NARCIS (Netherlands)

    Calamé, Jens R.; Ioustinova, Natalia; Romijn, J.M.T.; Smith, G.; van de Pol, Jan Cornelis

    2007-01-01

    Developing test suites is a costly and error-prone process. Model-based test generation tools facilitate this process by automatically generating test cases from system models. The applicability of these tools, however, depends on the size of the target systems. Here, we propose an approach to

  2. Empirical Modeling of Lithium-ion Batteries Based on Electrochemical Impedance Spectroscopy Tests

    International Nuclear Information System (INIS)

    Samadani, Ehsan; Farhad, Siamak; Scott, William; Mastali, Mehrdad; Gimenez, Leonardo E.; Fowler, Michael; Fraser, Roydon A.

    2015-01-01

    Highlights: • Two commercial lithium-ion batteries are studied through HPPC and EIS tests. • An equivalent circuit model is developed for a range of operating conditions. • This model improves on current battery empirical models for vehicle applications. • This model is proved to be efficient in terms of predicting HPPC test resistances.

    Abstract: An empirical model for commercial lithium-ion batteries is developed based on electrochemical impedance spectroscopy (EIS) tests. An equivalent circuit is established according to EIS test observations at various battery states of charge and temperatures. A Laplace-transfer time-based model is developed based on the circuit, which can predict the battery operating output potential difference in battery electric and plug-in hybrid vehicles at various operating conditions. This model demonstrates up to 6% improvement compared to simple resistance and Thevenin models and is suitable for modeling and on-board controller purposes. Results also show that this model can be used to predict the battery internal resistance obtained from hybrid pulse power characterization (HPPC) tests to within 20 percent, making it suitable for low- to medium-fidelity powertrain design purposes. In total, this simple battery model can be employed as a real-time model in electrified vehicle battery management systems.
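As context for the comparison reported above, here is a minimal sketch of the first-order Thevenin baseline model (series resistance plus one RC pair) that the paper improves on; all parameter values are illustrative, not the EIS-fitted values.

```python
# Minimal sketch of a first-order Thevenin equivalent circuit: open-circuit
# voltage, series resistance R0, and one RC pair. Parameters are illustrative.
import numpy as np

def thevenin_voltage(i_load, dt, ocv=3.7, r0=0.02, r1=0.015, c1=2000.0):
    """Terminal voltage for a current profile i_load (A, discharge positive)."""
    v_rc = 0.0                     # voltage across the RC pair
    v_out = []
    for i in i_load:
        # RC state update: dv/dt = -v/(R1*C1) + i/C1 (forward Euler)
        v_rc += dt * (-v_rc / (r1 * c1) + i / c1)
        v_out.append(ocv - r0 * i - v_rc)
    return np.array(v_out)

current = np.r_[np.zeros(10), 20 * np.ones(100), np.zeros(100)]  # 20 A pulse
print(thevenin_voltage(current, dt=1.0)[:5])
```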

  3. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    Science.gov (United States)

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero cell counts. Some of them are "true zeros", indicating that the drug-adverse event pairs cannot occur; these are distinguished from the other zero counts, which are modeled zero counts and simply indicate that the drug-adverse event pairs have not occurred yet or have not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters of the zero-inflated Poisson model based likelihood ratio test are obtained using the expectation-maximization algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similarly to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
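A minimal sketch of the zero-inflated Poisson likelihood and a likelihood ratio statistic for a single count series; this toy version maximizes the likelihood numerically rather than with the paper's EM algorithm, and the null rate lam0 is an assumed input rather than the paper's relative-reporting-rate null.

```python
# Minimal sketch: ZIP(pi, lam) log-likelihood and an LR statistic for
# H0: lam = lam0. Direct numerical maximization stands in for the EM
# algorithm used in the paper; data and lam0 are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def zip_loglik(params, y):
    """Log-likelihood of counts y under a zero-inflated Poisson model."""
    pi, lam = params
    ll_zero = np.log(pi + (1 - pi) * np.exp(-lam))               # structural + Poisson zeros
    ll_pos = np.log(1 - pi) - lam + y * np.log(lam) - gammaln(y + 1)
    return np.where(y == 0, ll_zero, ll_pos).sum()

def lrt_statistic(y, lam0):
    """2 * (unrestricted - restricted) maximized log-likelihood."""
    fit = minimize(lambda p: -zip_loglik(p, y), x0=[0.3, y.mean() + 0.1],
                   bounds=[(1e-6, 1 - 1e-6), (1e-6, None)])
    restricted = minimize(lambda p: -zip_loglik([p[0], lam0], y), x0=[0.3],
                          bounds=[(1e-6, 1 - 1e-6)])
    return 2 * (-fit.fun + restricted.fun)

rng = np.random.default_rng(0)
y = rng.poisson(1.5, 500) * (rng.random(500) > 0.4)   # zero-inflated sample
print(lrt_statistic(y, lam0=1.0))
```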

  4. Towards model-based testing of electronic funds transfer systems

    NARCIS (Netherlands)

    Asaadi, H.R.; Khosravi, R.; Mousavi, M.R.; Noroozi, N.; Arbab, F.; Sirjani, M.

    2012-01-01

    We report on our first experience with applying model-based testing techniques to an operational Electronic Funds Transfer (EFT) switch. The goal is to test the conformance of the EFT switch to the standard flows described by the ISO 8583 standard. To this end, we first make a formalization of the

  5. Towards model-based testing of electronic funds transfer systems

    NARCIS (Netherlands)

    Asaadi, H.R.; Khosravi, R.; Mousavi, M.R.; Noroozi, N.

    2010-01-01

    We report on our first experience with applying model-based testing techniques to an operational Electronic Funds Transfer (EFT) switch. The goal is to test the conformance of the EFT switch to the standard flows described by the ISO 8583 standard. To this end, we first make a formalization of the

  6. Model-Based Testing of a Reactive System with Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon

    2006-01-01

    In this paper, a reactive and nondeterministic system is tested. This is done by applying a generic model that has been specified as a configurable Coloured Petri Net. In this way, model-based testing is possible for a wide class of reactive systems at the level of discrete events. Concurrently...

  7. Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.

    Science.gov (United States)

    Johnson, Shane D; Groff, Elizabeth R

    2014-07-01

    The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs, not without its own issues, may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification.

  8. Tree-Based Global Model Tests for Polytomous Rasch Models

    Science.gov (United States)

    Komboz, Basil; Strobl, Carolin; Zeileis, Achim

    2018-01-01

    Psychometric measurement models are only valid if measurement invariance holds between test takers of different groups. Global model tests, such as the well-established likelihood ratio (LR) test, are sensitive to violations of measurement invariance, such as differential item functioning and differential step functioning. However, these…

  9. A model based security testing method for protocol implementation.

    Science.gov (United States)

    Fu, Yu Long; Xin, Xiao Long

    2014-01-01

    The security of protocol implementations is important and hard to verify. Since penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them to generate suitable test cases to verify the security of a protocol implementation.

  10. Using a data base management system for modelling SSME test history data

    Science.gov (United States)

    Abernethy, K.

    1985-01-01

    The usefulness of a data base management system (DBMS) for modelling historical test data for the complete series of static test firings for the Space Shuttle Main Engine (SSME) was assessed. From an analysis of user data base query requirements, it became clear that a relational DBMS which included a relationally complete query language would permit a model satisfying the query requirements. Representative models and sample queries are discussed. A list of environment-particular evaluation criteria for the desired DBMS was constructed; these criteria include requirements in the areas of user-interface complexity, program independence, flexibility, modifiability, and output capability. The evaluation process included the construction of several prototype data bases for user assessment. The systems studied, representing the three major DBMS conceptual models, were: MIRADS, a hierarchical system; DMS-1100, a CODASYL-based network system; ORACLE, a relational system; and DATATRIEVE, a relational-type system.

  11. Experimental Study of Dowel Bar Alternatives Based on Similarity Model Test

    Directory of Open Access Journals (Sweden)

    Chichun Hu

    2017-01-01

    In this study, a small-scale accelerated loading test based on similarity theory and the Accelerated Pavement Analyzer was developed to evaluate dowel bars with different materials and cross-sections. A jointed concrete specimen consisting of one dowel was designed as the scaled model for the test, and each specimen was subjected to 864 thousand loading cycles. Deflections between jointed slabs were measured with dial indicators, and strains of the dowel bars were monitored with strain gauges. The load transfer efficiency, differential deflection, and dowel-concrete bearing stress for each case were calculated from these measurements. The test results indicated that the effect of the dowel modulus on load transfer efficiency can be characterized with the similarity model test developed in the study. Moreover, the round steel dowel was found to perform similarly to the larger FRP dowel, and elliptical dowels can be preferentially considered in practice.
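The joint performance measures named in this record have standard definitions; here is a small sketch with illustrative deflection readings (mm), not values from the study.

```python
# Minimal sketch of the joint performance measures computed from the dial
# indicator readings; deflection values are illustrative (mm).
def load_transfer_efficiency(defl_unloaded, defl_loaded):
    """LTE (%) = deflection of the unloaded slab / deflection of the loaded slab * 100."""
    return 100.0 * defl_unloaded / defl_loaded

def differential_deflection(defl_loaded, defl_unloaded):
    """Difference in deflection across the joint (mm)."""
    return defl_loaded - defl_unloaded

print(load_transfer_efficiency(0.42, 0.50))   # e.g. 84% joint efficiency
print(differential_deflection(0.50, 0.42))    # e.g. 0.08 mm
```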

  12. Monte Carlo tests of the Rasch model based on scalability coefficients

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Kreiner, Svend

    2010-01-01

    For item responses fitting the Rasch model, the assumptions underlying the Mokken model of double monotonicity are met. This makes non-parametric item response theory a natural starting-point for Rasch item analysis. This paper studies scalability coefficients based on Loevinger's H coefficient that summarize the number of Guttman errors in the data matrix. These coefficients are shown to yield efficient tests of the Rasch model using p-values computed using Markov chain Monte Carlo methods. The power of the tests of unequal item discrimination, and their ability to distinguish between local dependence......
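The general Monte Carlo scheme behind such tests can be sketched as follows; `sampler` and `statistic` are placeholders standing in for MCMC sampling of data matrices under the Rasch model and the H-type scalability coefficients, which are not reproduced here.

```python
# Minimal sketch of a Monte Carlo p-value for a model-fit statistic.
# `sampler` and `statistic` are placeholders; the toy usage below tests
# whether row-score variance exceeds that of independent coin flips.
import numpy as np

def monte_carlo_pvalue(observed, sampler, statistic, n_sim=999, seed=0):
    rng = np.random.default_rng(seed)
    t_obs = statistic(observed)
    t_sim = np.array([statistic(sampler(rng)) for _ in range(n_sim)])
    # add-one correction keeps the p-value valid for finite n_sim
    return (1 + np.sum(t_sim >= t_obs)) / (n_sim + 1)

data = (np.random.default_rng(1).random((200, 10)) < 0.6).astype(int)
p = monte_carlo_pvalue(
    data,
    sampler=lambda rng: (rng.random((200, 10)) < 0.6).astype(int),
    statistic=lambda m: m.sum(axis=1).var(),
)
print(p)
```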

  13. Research on Generating Method of Embedded Software Test Document Based on Dynamic Model

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper presents a dynamic model-based test document generation method for embedded software that automatically generates two documents: the test requirements specification and the configuration item test documentation. The method implements dynamic test requirements in dynamic models, so that dynamic test demand tracking can be generated easily. It automatically produces standardized test requirements and test documentation, mitigates inconsistency and incompleteness in document content, and improves efficiency.

  14. OOK power model based dynamic error testing for smart electricity meter

    International Nuclear Information System (INIS)

    Wang, Xuewei; Chen, Jingxia; Jia, Xiaolu; Zhu, Meng; Yuan, Ruiming; Jiang, Zhenyu

    2017-01-01

    This paper formulates the dynamic error testing problem for a smart meter, with consideration and investigation of both the testing signal and the dynamic error testing method. To solve the dynamic error testing problems, the paper establishes an on-off-keying (OOK) testing dynamic current model and an OOK testing dynamic load energy (TDLE) model. Then two types of TDLE sequences and three modes of OOK testing dynamic power are proposed. In addition, a novel algorithm, which helps to solve the problem of dynamic electric energy measurement’s traceability, is derived for dynamic errors. Based on the above research, OOK TDLE sequence generation equipment is developed and a dynamic error testing system is constructed. Using the testing system, five kinds of meters were tested in the three dynamic power modes. The test results show that the dynamic error is closely related to dynamic power mode and the measurement uncertainty is 0.38%. (paper)

  15. OOK power model based dynamic error testing for smart electricity meter

    Science.gov (United States)

    Wang, Xuewei; Chen, Jingxia; Yuan, Ruiming; Jia, Xiaolu; Zhu, Meng; Jiang, Zhenyu

    2017-02-01

    This paper formulates the dynamic error testing problem for a smart meter, with consideration and investigation of both the testing signal and the dynamic error testing method. To solve the dynamic error testing problems, the paper establishes an on-off-keying (OOK) testing dynamic current model and an OOK testing dynamic load energy (TDLE) model. Then two types of TDLE sequences and three modes of OOK testing dynamic power are proposed. In addition, a novel algorithm, which helps to solve the problem of dynamic electric energy measurement’s traceability, is derived for dynamic errors. Based on the above research, OOK TDLE sequence generation equipment is developed and a dynamic error testing system is constructed. Using the testing system, five kinds of meters were tested in the three dynamic power modes. The test results show that the dynamic error is closely related to dynamic power mode and the measurement uncertainty is 0.38%.

  16. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
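For a gridded forecast, the likelihood scoring described here reduces to summing log Poisson probabilities of the observed bin counts under the forecast rates; a minimal sketch with illustrative numbers follows.

```python
# Minimal sketch of likelihood scoring for a binned earthquake forecast:
# each bin has a forecast rate, and the joint Poisson log-likelihood of the
# observed catalog counts scores the model. Rates and counts are illustrative.
import numpy as np
from scipy.special import gammaln

def poisson_log_likelihood(rates, counts):
    """Sum of log Poisson probabilities of observed bin counts given forecast rates."""
    rates = np.asarray(rates, dtype=float)
    counts = np.asarray(counts, dtype=float)
    return np.sum(counts * np.log(rates) - rates - gammaln(counts + 1))

forecast = np.array([0.1, 0.5, 0.02, 1.3])   # expected events per bin
observed = np.array([0, 1, 0, 2])            # events observed in each bin
print(poisson_log_likelihood(forecast, observed))
```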

  17. Model-based testing with UML applied to a roaming algorithm for bluetooth devices.

    Science.gov (United States)

    Dai, Zhen Ru; Grabowski, Jens; Neukirchen, Helmut; Pals, Holger

    2004-11-01

    In late 2001, the Object Management Group issued a Request for Proposal to develop a testing profile for UML 2.0. In June 2003, the work on the UML 2.0 Testing Profile was finally adopted by the OMG. Since March 2004, it has been an official standard of the OMG. The UML 2.0 Testing Profile provides support for UML-based model-driven testing. This paper introduces a methodology for using the testing profile to modify and extend an existing UML design model for testing purposes. The application of the methodology is explained by applying it to an existing UML model of a Bluetooth device.

  18. Top-Down and Bottom-Up Approach for Model-Based Testing of Product Lines

    Directory of Open Access Journals (Sweden)

    Stephan Weißleder

    2013-03-01

    Systems tend to become more and more complex. This has a direct impact on system engineering processes. Two of the most important phases in these processes are requirements engineering and quality assurance. Two significant complexity drivers located in these phases are the growing number of product variants that have to be integrated into requirements engineering and the ever-growing effort for manual test design. There are modeling techniques to deal with both complexity drivers, such as feature modeling and model-based test design. Their combination, however, has seldom been the focus of investigation. In this paper, we present two approaches to combine feature modeling and model-based testing as an efficient quality assurance technique for product lines. We present the corresponding difficulties and approaches to overcome them. All explanations are supported by an example of an online shop product line.

  19. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors

    Directory of Open Access Journals (Sweden)

    Spiros Pagiatakis

    2009-10-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models, are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using the Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at −40 °C, −20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.

  20. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors.

    Science.gov (United States)

    El-Diasty, Mohammed; Pagiatakis, Spiros

    2009-01-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at -40 °C, -20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.
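A first-order Gauss-Markov process, the building block of the AR-based GM error models discussed in these two records, can be sketched as follows; the correlation time and noise level are illustrative, and in the paper they vary with temperature.

```python
# Minimal sketch of a first-order Gauss-Markov process for a sensor bias
# drift; tau (correlation time) and sigma (stationary std) are illustrative.
import numpy as np

def gauss_markov(n, dt, tau, sigma, seed=0):
    """Simulate x[k+1] = exp(-dt/tau) * x[k] + w[k], with w scaled so the
    stationary standard deviation of the process is sigma."""
    rng = np.random.default_rng(seed)
    beta = np.exp(-dt / tau)
    q = sigma**2 * (1.0 - beta**2)   # driving-noise variance
    x = np.zeros(n)
    for k in range(n - 1):
        x[k + 1] = beta * x[k] + rng.normal(0.0, np.sqrt(q))
    return x

bias_drift = gauss_markov(n=3600, dt=1.0, tau=300.0, sigma=0.05)  # 1 h at 1 Hz
print(bias_drift[:5])
```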

  1. IRT-based test construction

    OpenAIRE

    van der Linden, Willem J.; Theunissen, T.J.J.M.; Boekkooi-Timminga, Ellen; Kelderman, Henk

    1987-01-01

    Four discussions of test construction based on item response theory (IRT) are presented. The first discussion, "Test Design as Model Building in Mathematical Programming" (T.J.J.M. Theunissen), presents test design as a decision process under certainty. A natural way of modeling this process leads to mathematical programming. General models of test construction are discussed, with information about algorithms and heuristics; ideas about the analysis and refinement of test constraints are also...

  2. Rank-based Tests of the Cointegrating Rank in Semiparametric Error Correction Models

    NARCIS (Netherlands)

    Hallin, M.; van den Akker, R.; Werker, B.J.M.

    2012-01-01

    Abstract: This paper introduces rank-based tests for the cointegrating rank in an Error Correction Model with i.i.d. elliptical innovations. The tests are asymptotically distribution-free, and their validity does not depend on the actual distribution of the innovations. This result holds despite the

  3. Modelling of XCO2 Surfaces Based on Flight Tests of TanSat Instruments

    Directory of Open Access Journals (Sweden)

    Li Li Zhang

    2016-11-01

    The TanSat carbon satellite is to be launched at the end of 2016. In order to verify the performance of its instruments, a flight test of TanSat instruments was conducted in Jilin Province in September 2015. The flight test area covered a total of about 11,000 km2, and the underlying surface cover included several lakes, forest land, grassland, wetland, farmland, a thermal power plant and numerous cities and villages. We modeled the column-averaged dry-air mole fraction of atmospheric carbon dioxide (XCO2) surface based on flight test data which measured the near- and short-wave infrared (NIR) reflected solar radiation in the absorption bands at around 760 and 1610 nm. However, it is difficult to directly analyze the spatial distribution of XCO2 in the flight area using the limited flight test data, and the approximate surface of XCO2 obtained by regression modeling is not very accurate either. We therefore used the high accuracy surface modeling (HASM) platform to fill the gaps where there is no information on XCO2 in the flight test area, taking the approximate surface of XCO2 as its driving field and the XCO2 observations retrieved from the flight test as its optimum control constraints. High accuracy surfaces of XCO2 were constructed with HASM based on the flight’s observations. The results showed that the mean XCO2 in the flight test area is about 400 ppm and that XCO2 over urban areas is much higher than in other places. Compared with OCO-2’s XCO2, the mean difference is 0.7 ppm and the standard deviation is 0.95 ppm. Therefore, the modelling of the XCO2 surface based on the flight test of the TanSat instruments fell within an expected and acceptable range.

  4. Do Test Design and Uses Influence Test Preparation? Testing a Model of Washback with Structural Equation Modeling

    Science.gov (United States)

    Xie, Qin; Andrews, Stephen

    2013-01-01

    This study introduces Expectancy-value motivation theory to explain the paths of influences from perceptions of test design and uses to test preparation as a special case of washback on learning. Based on this theory, two conceptual models were proposed and tested via Structural Equation Modeling. Data collection involved over 870 test takers of…

  5. Fuzzy delay model based fault simulator for crosstalk delay fault test ...

    Indian Academy of Sciences (India)

    In this paper, a fuzzy delay model based crosstalk delay fault simulator is proposed. As design .... To find the quality of non-robust tests, a fuzzy delay ...

  6. Field-based tests of geochemical modeling codes: New Zealand hydrothermal systems

    International Nuclear Information System (INIS)

    Bruton, C.J.; Glassley, W.E.; Bourcier, W.L.

    1993-12-01

    Hydrothermal systems in the Taupo Volcanic Zone, North Island, New Zealand are being used as field-based modeling exercises for the EQ3/6 geochemical modeling code package. Comparisons of the observed state and evolution of the hydrothermal systems with predictions of fluid-solid equilibria made using geochemical modeling codes will determine how the codes can be used to predict the chemical and mineralogical response of the environment to nuclear waste emplacement. Field-based exercises allow us to test the models on time scales unattainable in the laboratory. Preliminary predictions of mineral assemblages in equilibrium with fluids sampled from wells in the Wairakei and Kawerau geothermal field suggest that affinity-temperature diagrams must be used in conjunction with EQ6 to minimize the effect of uncertainties in thermodynamic and kinetic data on code predictions
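Affinity-temperature diagrams of the kind mentioned here rest on the chemical affinity of a mineral-fluid reaction; below is a minimal sketch, written for the precipitation direction so that positive affinity means the fluid is supersaturated. The Q and log10(K) values are illustrative, not outputs of EQ3/6.

```python
# Minimal sketch of chemical affinity for a mineral precipitation reaction:
# A = R*T*ln(Q/K), positive when the fluid is supersaturated with the mineral.
# The ion-activity product Q and equilibrium constant K are illustrative.
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

def affinity_to_precipitate(log10_Q, log10_K, T_kelvin):
    """Affinity in kJ/mol; A > 0 favors precipitation, A < 0 dissolution."""
    return R * T_kelvin * np.log(10.0) * (log10_Q - log10_K) / 1000.0

print(affinity_to_precipitate(log10_Q=-3.2, log10_K=-3.5, T_kelvin=523.15))  # ~250 degC fluid
```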

  7. A GMM-Based Test for Normal Disturbances of the Heckman Sample Selection Model

    Directory of Open Access Journals (Sweden)

    Michael Pfaffermayr

    2014-10-01

    The Heckman sample selection model relies on the assumption of normal and homoskedastic disturbances. However, before considering more general, alternative semiparametric models that do not need the normality assumption, it seems useful to test this assumption. Following Meijer and Wansbeek (2007), the present contribution derives a GMM-based pseudo-score LM test on whether the third and fourth moments of the disturbances of the outcome equation of the Heckman model conform to those implied by the truncated normal distribution. The test is easy to calculate and in Monte Carlo simulations it shows good performance for sample sizes of 1000 or larger.

  8. Class hierarchical test case generation algorithm based on expanded EMDPN model

    Institute of Scientific and Technical Information of China (English)

    LI Jun-yi; GONG Hong-fang; HU Ji-ping; ZOU Bei-ji; SUN Jia-guang

    2006-01-01

    A new model of event- and message-driven Petri nets (EMDPN), based on the characteristics of class interaction for message passing between two objects, was developed as an extension. Using the EMDPN interaction graph, a class hierarchical test-case generation algorithm with cooperated paths (copaths) was proposed, which can be used to solve the problems resulting from the class inheritance mechanism encountered in object-oriented software testing, such as oracles, message transfer errors, and unreachable statements. Finally, the testing sufficiency was analyzed with the ordered sequence testing criterion (OSC). The results indicate that the test cases generated by the newly proposed automatic copath generation algorithm satisfy the synchronization message sequence testing criteria; therefore the proposed copath generation algorithm achieves a good coverage rate.

  9. Comparing Science Virtual and Paper-Based Test to Measure Students’ Critical Thinking based on VAK Learning Style Model

    Science.gov (United States)

    Rosyidah, T. H.; Firman, H.; Rusyati, L.

    2017-02-01

    This research compared virtual and paper-based tests measuring students’ critical thinking based on the VAK (Visual-Auditory-Kinesthetic) learning style model. A quasi-experimental method with a one-group post-test-only design was applied in order to analyze the data. Forty eighth-grade students at a public junior high school in Bandung formed the sample. The quantitative data were obtained through 26 questions about living things and environmental sustainability, constructed based on the eight elements of critical thinking and provided in the form of virtual and paper-based tests. Based on analysis of the results, scores within the visual, auditory, and kinesthetic groups did not differ significantly between the virtual and paper-based tests. In addition, all results were supported by a questionnaire on students’ responses to the virtual test, which scored 3.47 on a scale of 4, meaning that students responded positively in all measured aspects: interest, impression, and expectation.

  10. Scopolamine provocation-based pharmacological MRI model for testing procognitive agents.

    Science.gov (United States)

    Hegedűs, Nikolett; Laszy, Judit; Gyertyán, István; Kocsis, Pál; Gajári, Dávid; Dávid, Szabolcs; Deli, Levente; Pozsgay, Zsófia; Tihanyi, Károly

    2015-04-01

    There is a huge unmet need to understand and treat pathological cognitive impairment. The development of disease-modifying cognitive enhancers is hindered by the lack of a correct pathomechanism and suitable animal models. Most animal models used to study cognition and pathology fulfil neither the predictive validity, face validity nor construct validity criteria, and outcome measures also differ greatly from those of human trials. Fortunately, some pharmacological agents such as scopolamine evoke similar effects on cognition and cerebral circulation in rodents and humans, and functional MRI enables us to compare cognitive agents directly in different species. In this paper we report the validation of a scopolamine-based rodent pharmacological MRI provocation model. The effects of agents deemed procognitive (donepezil, vinpocetine, piracetam, alpha-7 selective cholinergic compounds EVP-6124, PNU-120596) were compared on the blood-oxygen-level dependent responses and also linked to rodent cognitive models. These drugs revealed a significant effect on scopolamine-induced blood-oxygen-level dependent change, except for piracetam. In the water labyrinth test only PNU-120596 did not show a significant effect. This provocation model is suitable for testing procognitive compounds. These functional MR imaging experiments can be paralleled with human studies, which may help reduce the number of failed cognitive clinical trials. © The Author(s) 2015.

  11. TESTING GARCH-X TYPE MODELS

    DEFF Research Database (Denmark)

    Pedersen, Rasmus Søndergaard; Rahbek, Anders

    2017-01-01

    We present novel theory for testing for reduction of GARCH-X type models with an exogenous (X) covariate to standard GARCH type models. To deal with the problems of potential nuisance parameters on the boundary of the parameter space as well as lack of identification under the null, we exploit...... a noticeable property of specific zero-entries in the inverse information of the GARCH-X type models. Specifically, we consider sequential testing based on two likelihood ratio tests and as demonstrated the structure of the inverse information implies that the proposed test neither depends on whether...... the nuisance parameters lie on the boundary of the parameter space, nor on lack of identification. Our general results on GARCH-X type models are applied to Gaussian based GARCH-X models, GARCH-X models with Student's t-distributed innovations as well as the integer-valued GARCH-X (PAR-X) models....

  12. Testing homogeneity in Weibull-regression models.

    Science.gov (United States)

    Bolfarine, Heleno; Valença, Dione M

    2005-10-01

    In survival studies with families or geographical units it may be of interest to test whether such groups are homogeneous for given explanatory variables. In this paper we consider score-type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model presents survival times that, conditioned on the random effect, have an accelerated failure time representation. The test statistic requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model and in the uncensored situation a closed form is obtained for the test statistic. A simulation study is used to compare the power of the tests. The proposed tests are applied to real data sets with censored data.

  13. GENERATING TEST CASES FOR PLATFORM INDEPENDENT MODEL BY USING USE CASE MODEL

    OpenAIRE

    Hesham A. Hassan,; Zahraa. E. Yousif

    2010-01-01

    Model-based testing refers to testing and test case generation based on a model that describes the behavior of the system. Extensive use of models throughout all the phases of software development starting from the requirement engineering phase has led to increased importance of Model Based Testing. The OMG initiative MDA has revolutionized the way models would be used for software development. Ensuring that all user requirements are addressed in system design and the design is getting suffic...

  14. Field-based tests of geochemical modeling codes using New Zealand hydrothermal systems

    International Nuclear Information System (INIS)

    Bruton, C.J.; Glassley, W.E.; Bourcier, W.L.

    1994-06-01

    Hydrothermal systems in the Taupo Volcanic Zone, North Island, New Zealand are being used as field-based modeling exercises for the EQ3/6 geochemical modeling code package. Comparisons of the observed state and evolution of the hydrothermal systems with predictions of fluid-solid equilibria made using geochemical modeling codes will determine how the codes can be used to predict the chemical and mineralogical response of the environment to nuclear waste emplacement. Field-based exercises allow us to test the models on time scales unattainable in the laboratory. Preliminary predictions of mineral assemblages in equilibrium with fluids sampled from wells in the Wairakei and Kawerau geothermal field suggest that affinity-temperature diagrams must be used in conjunction with EQ6 to minimize the effect of uncertainties in thermodynamic and kinetic data on code predictions

  15. An Improved Test Selection Optimization Model Based on Fault Ambiguity Group Isolation and Chaotic Discrete PSO

    Directory of Open Access Journals (Sweden)

    Xiaofeng Lv

    2018-01-01

    Sensor data-based test selection optimization is the basis for designing test work, which ensures that the system is tested under the constraint of conventional indexes such as fault detection rate (FDR) and fault isolation rate (FIR). From the perspective of equipment maintenance support, ambiguity in fault isolation has a significant effect on the result of test selection. In this paper, an improved test selection optimization model is proposed that considers the ambiguity degree of fault isolation. In the new model, the fault-test dependency matrix is adopted to model the correlation between system faults and test groups. The objective function of the proposed model is to minimize the test cost under the constraints of FDR and FIR. An improved chaotic discrete particle swarm optimization (PSO) algorithm is adopted to solve the improved test selection optimization model. The new test selection optimization model is more consistent with real complicated engineering systems. The experimental results verify the effectiveness of the proposed method.
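The underlying selection problem is set-cover-like. As a plainly labeled substitution, the sketch below uses a simple greedy heuristic rather than the paper's chaotic discrete PSO, and it omits the FIR and ambiguity-group constraints; the dependency matrix and costs are synthetic.

```python
# Minimal sketch: rows of D are faults, columns are tests, D[i, j] = True if
# test j can detect fault i. A greedy heuristic (not the paper's chaotic
# discrete PSO) picks cheap tests until the FDR target is reached.
import numpy as np

def greedy_test_selection(D, costs, target_fdr=0.95):
    """Select tests maximizing newly detected faults per unit cost."""
    n_faults, n_tests = D.shape
    selected = []
    detected = np.zeros(n_faults, dtype=bool)
    remaining = set(range(n_tests))
    while detected.mean() < target_fdr and remaining:
        best = max(remaining,
                   key=lambda j: (D[:, j] & ~detected).sum() / costs[j])
        if not (D[:, best] & ~detected).any():
            break  # no remaining test adds coverage; target unreachable
        selected.append(best)
        detected |= D[:, best]
        remaining.remove(best)
    return selected, detected.mean()

rng = np.random.default_rng(0)
D = rng.random((30, 12)) < 0.25          # boolean fault-test dependency matrix
costs = rng.uniform(1.0, 5.0, 12)        # cost of each test
tests, coverage = greedy_test_selection(D, costs)
print(tests, round(float(coverage), 2))
```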

  16. Test-driven modeling of embedded systems

    DEFF Research Database (Denmark)

    Munck, Allan; Madsen, Jan

    2015-01-01

    To benefit maximally from model-based systems engineering (MBSE), trustworthy high-quality models are required. From the software disciplines it is known that test-driven development (TDD) can significantly increase the quality of the products. Using a test-driven approach with MBSE may have...... a similar positive effect on the quality of the system models and the resulting products and may therefore be desirable. To define a test-driven model-based systems engineering (TD-MBSE) approach, we must define this approach for numerous sub-disciplines such as modeling of requirements, use cases...... suggest that our method provides a sound foundation for rapid development of high quality system models....

  17. Opening the black box—Development, testing and documentation of a mechanistically rich agent-based model

    DEFF Research Database (Denmark)

    Topping, Chris J.; Høye, Toke; Olesen, Carsten Riis

    2010-01-01

    Although increasingly widely used in biology, complex adaptive simulation models such as agent-based models have been criticised for being difficult to communicate and test. This study demonstrates the application of pattern-oriented model testing, and a novel documentation procedure to present...... accessible description of the processes included in the model. Application of the model to a comprehensive historical data set supported the hypothesis that interference competition is the primary population regulating factor in the absence of mammal predators in the brown hare, and that the effect works...

  18. Models everywhere. How a fully integrated model-based test environment can enable progress in the future

    Energy Technology Data Exchange (ETDEWEB)

    Ben Gaid, Mongi; Lebas, Romain; Fremovici, Morgan; Font, Gregory; Le Solliec, Gunael [IFP Energies nouvelles, Rueil-Malmaison (France); Albrecht, Antoine [D2T Powertrain Engineering, Rueil-Malmaison (France)

    2011-07-01

    The aim of this paper is to demonstrate how advanced modelling approaches coupled with powerful tools allow a complete and coherent test environment suite to be set up. Based on a real study focused on the development of a Euro 6 hybrid powertrain with a Euro 5 turbocharged diesel engine, the authors present how a diesel engine simulator, including an in-cylinder phenomenological approach to predict the raw emissions, can be coupled with a DOC and DPF after-treatment system and embedded in the complete hybrid powertrain to be used in various test environments: coupled with the control software in a multi-model, multi-core simulation platform with test automation features, allowing the simulation speed to be faster than real time; exported to a real-time hardware-in-the-loop platform with the ECU and hardware actuators; and embedded at the experimental engine test bed to perform driving cycles such as NEDC or FTP cycles with the hybrid powertrain management. Thanks to this complete and versatile test platform suite, xMOD/Morphee, all the key issues of a full hybrid powertrain can be addressed efficiently and at low cost compared to experimental powertrain prototypes: consumption minimisation, energy optimisation, thermal exhaust management, NOx/soot trade-offs, NO/NO2 ratios. Having a good balance between the versatility and compliance of model-oriented test platforms such as those presented in this paper is the best way to take maximum benefit from the models developed at each stage of powertrain development. (orig.)

  19. A Method for Modeling the Virtual Instrument Automatic Test System Based on the Petri Net

    Institute of Scientific and Technical Information of China (English)

    MA Min; CHEN Guang-ju

    2005-01-01

    Virtual instruments play an important role in automatic test systems. This paper introduces the composition of a virtual instrument automatic test system and takes as an example a VXIbus-based test software platform developed by the CAT lab of UESTC. A method to model this system based on the Petri net is then proposed. Through this method, we can analyze the test task scheduling to prevent deadlock or resource conflicts. Finally, the paper analyzes the feasibility of this method.
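
    For illustration, the kind of deadlock analysis such a Petri net enables can be sketched in a few lines of Python (a toy place/transition net with invented places and transitions, not the paper's actual model of the VXIbus platform):

        from typing import Dict, Tuple

        Arcs = Dict[str, int]  # place name -> token count

        def enabled(marking: Arcs, inputs: Arcs) -> bool:
            # A transition is enabled when all input places hold enough tokens.
            return all(marking.get(p, 0) >= n for p, n in inputs.items())

        def fire(marking: Arcs, inputs: Arcs, outputs: Arcs) -> Arcs:
            m = dict(marking)
            for p, n in inputs.items():
                m[p] -= n
            for p, n in outputs.items():
                m[p] = m.get(p, 0) + n
            return m

        # Toy net: two test tasks compete for one shared instrument resource.
        transitions: Dict[str, Tuple[Arcs, Arcs]] = {
            "grab":    ({"queue": 1, "instr": 1}, {"busy": 1}),
            "release": ({"busy": 1},              {"instr": 1, "done": 1}),
        }
        marking: Arcs = {"queue": 2, "instr": 1, "busy": 0, "done": 0}

        # Fire transitions until all tasks finish; a marking with no enabled
        # transition before completion signals deadlock / resource conflict.
        while marking["done"] < 2:
            for name, (ins, outs) in transitions.items():
                if enabled(marking, ins):
                    marking = fire(marking, ins, outs)
                    break
            else:
                print("deadlock at marking:", marking)
                break
        else:
            print("all tasks completed:", marking)

    A real analysis would enumerate the full reachability graph rather than fire greedily, but the enabling/firing rules above are exactly what makes resource conflicts detectable as markings with no enabled transition.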

  20. Simulation-based Testing of Control Software

    Energy Technology Data Exchange (ETDEWEB)

    Ozmen, Ozgur [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nutaro, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sanyal, Jibonananda [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Olama, Mohammed M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-10

    It is impossible to adequately test complex software by examining its operation in a physical prototype of the monitored system. Adequate test coverage can require millions of test cases, and the cost of equipment prototypes combined with the real-time constraints of testing with them makes it infeasible to sample more than a small number of these tests. Model-based testing seeks to avoid this problem by allowing for large numbers of relatively inexpensive virtual prototypes that operate in simulation time at a speed limited only by the available computing resources. In this report, we describe how a computer system emulator can be used as part of a model-based testing environment; specifically, we show that a complete software stack - including operating system and application software - can be deployed within a simulated environment, and that these simulations can proceed as fast as possible. To illustrate this approach to model-based testing, we describe how it is being used to test several building control systems that act to coordinate air conditioning loads for the purpose of reducing peak demand. These tests involve the use of ADEVS (A Discrete Event System Simulator) and QEMU (Quick Emulator) to host the operational software within the simulation, and a building model developed with the MODELICA programming language using the Buildings Library and packaged as an FMU (Functional Mock-up Unit) that serves as the virtual test environment.

  1. Genetic screening and testing in an episode-based payment model: preserving patient autonomy.

    Science.gov (United States)

    Sutherland, Sharon; Farrell, Ruth M; Lockwood, Charles

    2014-11-01

    The State of Ohio is implementing an episode-based payment model for perinatal care. All costs of care will be tabulated for each live birth and assigned to the delivering provider, creating a three-tiered model for reimbursement. Providers will be reimbursed as usual for care that is average in cost and quality, with rewards or penalties instituted for those outside the expected range in either domain. There are few exclusions, and all methods of genetic screening and diagnostic testing are included in the episode cost calculation as proposed. Prenatal ultrasonography, genetic screening, and diagnostic testing are critical components of the delivery of high-quality, evidence-based prenatal care. These tests provide pregnant women with key information about the pregnancy, which, in turn, allows them to work closely with their health care provider to determine optimal prenatal care. The concepts of informed consent and decision-making, cornerstones of the ethical practice of medicine, are founded on the principles of autonomy and respect for persons. These principles recognize that patients' rights to make choices and take actions are based on their personal beliefs and values. Given the personal nature of such decisions, it is critical that patients have unbarred access to prenatal genetic tests if they elect to use them as part of their prenatal care. The proposed restructuring of reimbursement creates a clear conflict between patient autonomy and physician financial incentives.

  2. In silico model-based inference: a contemporary approach for hypothesis testing in network biology.

    Science.gov (United States)

    Klinke, David J

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. © 2014 American Institute of Chemical Engineers.
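
    As a compact reminder of what the suggested Bayesian flavour of model-based inference computes (standard formulas in our notation, not the article's), the posterior probability of a postulated model M_k given data D and the Bayes factor comparing two models are

        P(M_k \mid D) = \frac{P(D \mid M_k)\,P(M_k)}{\sum_j P(D \mid M_j)\,P(M_j)},
        \qquad
        B_{12} = \frac{P(D \mid M_1)}{P(D \mid M_2)}

    where the marginal likelihood P(D | M_k) = \int P(D \mid \theta, M_k)\,\pi(\theta \mid M_k)\,d\theta integrates the likelihood over the prior on the model parameters - the computationally heavy step that motivates the appeal to high performance computing.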

  3. How "Does" the Comforting Process Work? An Empirical Test of an Appraisal-Based Model of Comforting

    Science.gov (United States)

    Jones, Susanne M.; Wirtz, John G.

    2006-01-01

    Burleson and Goldsmith's (1998) comforting model suggests an appraisal-based mechanism through which comforting messages can bring about a positive change in emotional states. This study is a first empirical test of three causal linkages implied by the appraisal-based comforting model. Participants (N=258) talked about an upsetting event with a…

  4. Developing and testing transferability and feasibility of a model for educators using simulation-based learning - A European collaboration

    DEFF Research Database (Denmark)

    Bøje, Rikke Buus; Bland, Andrew; Sutton, Andrew

    2017-01-01

    … of the study were to develop a model to educate the educators who deliver simulation-based learning and to test the extent to which this model could be transferred to education providers in different national settings. METHODS: The model, its transferability and feasibility, was tested across three European…

  5. Creep Tests and Modeling Based on Continuum Damage Mechanics for T91 and T92 Steels

    Science.gov (United States)

    Pan, J. P.; Tu, S. H.; Zhu, X. W.; Tan, L. J.; Hu, B.; Wang, Q.

    2017-12-01

    9-11%Cr ferritic steels play an important role in high-temperature and high-pressure boilers of advanced power plants. In this paper, a continuum damage mechanics (CDM)-based creep model was proposed to study the creep behavior of T91 and T92 steels at high temperatures. Long-time creep tests were performed for both steels under different conditions. The creep rupture data and creep curves obtained from creep tests were captured well by theoretical calculation based on the CDM model over a long creep time. It is shown that the developed model is able to predict creep data for the two ferritic steels accurately up to tens of thousands of hours.
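
    For orientation, CDM-based creep models of this kind are commonly written in the classical Kachanov-Rabotnov coupled form (a generic statement of the approach; the paper's exact equations and material constants may differ):

        \dot{\varepsilon} = \frac{A\,\sigma^{n}}{(1-\omega)^{n}},
        \qquad
        \dot{\omega} = \frac{B\,\sigma^{\chi}}{(1-\omega)^{\phi}}

    Here \omega \in [0,1) is the damage variable; integrating the damage equation at constant stress gives the rupture time t_r = [(1+\phi)\,B\,\sigma^{\chi}]^{-1}, and substituting \omega(t) back into the strain-rate equation yields the tertiary creep curve that is fitted to the long-time test data.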

  6. Industrial-Strength Model-Based Testing - State of the Art and Current Challenges

    Directory of Open Access Journals (Sweden)

    Jan Peleska

    2013-03-01

    Full Text Available As of today, model-based testing (MBT) is considered a leading-edge technology in industry. We sketch the different MBT variants that - according to our experience - are currently applied in practice, with special emphasis on the avionic, railway and automotive domains. The key factors for successful industrial-scale application of MBT are described, both from a scientific and a managerial point of view. With respect to the former view, we describe the techniques for automated test case, test data and test procedure generation for concurrent reactive real-time systems which are considered the most important enablers for MBT in practice. With respect to the latter view, our experience with introducing MBT approaches in testing teams is sketched. Finally, the most challenging open scientific problems whose solutions are bound to improve the acceptance and effectiveness of MBT in industry are discussed.

  7. Evaluation of liquefaction potential of soil based on standard penetration test using multi-gene genetic programming model

    Science.gov (United States)

    Muduli, Pradyut; Das, Sarat

    2014-06-01

    This paper discusses the evaluation of liquefaction potential of soil based on a standard penetration test (SPT) dataset using an evolutionary artificial intelligence technique, multi-gene genetic programming (MGGP). The liquefaction classification accuracy (94.19%) of the developed liquefaction index (LI) model is found to be better than that of the available artificial neural network (ANN) model (88.37%) and on par with the available support vector machine (SVM) model (94.19%) on the basis of the testing data. Further, an empirical equation is presented using MGGP to approximate the unknown limit state function representing the cyclic resistance ratio (CRR) of soil based on the developed LI model. Using an independent database of 227 cases, the overall rates of successful prediction of liquefaction and non-liquefaction occurrence are found to be 87, 86, and 84% for the developed MGGP-based model, the available ANN model and the statistical model, respectively, on the basis of the calculated factor of safety (Fs) against liquefaction occurrence.
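
    For context, SPT-based liquefaction evaluation conventionally defines the factor of safety quoted above as the ratio (standard definitions, not reproduced from the paper):

        F_s = \frac{\mathrm{CRR}}{\mathrm{CSR}},
        \qquad
        \mathrm{CSR} = 0.65\,\frac{a_{\max}}{g}\,\frac{\sigma_{v}}{\sigma'_{v}}\,r_d

    with liquefaction predicted when F_s < 1; the MGGP model supplies the CRR (limit state) side of this ratio.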

  8. Base Station Performance Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  9. Conditional Monte Carlo randomization tests for regression models.

    Science.gov (United States)

    Parhat, Parwen; Rosenberger, William F; Diao, Guoqing

    2014-08-15

    We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
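
    A minimal sketch of the Monte Carlo re-randomization idea for the simplest case of complete randomization (illustrative Python; the paper additionally covers permuted block designs, biased coin designs, and conditional tests):

        import numpy as np

        def mc_randomization_test(residuals, treatment, n_mc=10_000, seed=0):
            """Monte Carlo p-value for a two-arm comparison of model residuals."""
            rng = np.random.default_rng(seed)

            def stat(assign):
                return residuals[assign == 1].mean() - residuals[assign == 0].mean()

            observed = stat(treatment)
            # Re-generate randomization sequences under the design actually used
            # (here: complete randomization, i.e. a random permutation of labels).
            hits = sum(
                abs(stat(rng.permutation(treatment))) >= abs(observed)
                for _ in range(n_mc)
            )
            return (hits + 1) / (n_mc + 1)

    A design-based test differs from this only in how the re-randomized sequences are drawn, e.g. permuting within blocks for a permuted block design.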

  10. Testing a ground-based canopy model using the wind river canopy crane

    Science.gov (United States)

    Robert Van Pelt; Malcolm P. North

    1999-01-01

    A ground-based canopy model that estimates the volume of occupied space in forest canopies was tested using the Wind River Canopy Crane. A total of 126 trees in a 0.25 ha area were measured from the ground and directly from a gondola suspended from the crane. The trees were located in a low elevation, old-growth forest in the southern Washington Cascades. The ground-...

  11. Testlet-Based Multidimensional Adaptive Testing.

    Science.gov (United States)

    Frey, Andreas; Seitz, Nicki-Nils; Brandt, Steffen

    2016-01-01

    Multidimensional adaptive testing (MAT) is a highly efficient method for the simultaneous measurement of several latent traits. Currently, no psychometrically sound approach is available for the use of MAT in testlet-based tests. Testlets are sets of items sharing a common stimulus such as a graph or a text. They are frequently used in large operational testing programs like TOEFL, PISA, PIRLS, or NAEP. To make MAT accessible for such testing programs, we present a novel combination of MAT with a multidimensional generalization of the random effects testlet model (MAT-MTIRT). MAT-MTIRT compared to non-adaptive testing is examined for several combinations of testlet effect variances (0.0, 0.5, 1.0, and 1.5) and testlet sizes (3, 6, and 9 items) with a simulation study considering three ability dimensions with simple loading structure. MAT-MTIRT outperformed non-adaptive testing regarding the measurement precision of the ability estimates. Further, the measurement precision decreased when testlet effect variances and testlet sizes increased. The suggested combination of the MTIRT model therefore provides a solution to the substantial problems of testlet-based tests while keeping the length of the test within an acceptable range.

  12. Testlet-based Multidimensional Adaptive Testing

    Directory of Open Access Journals (Sweden)

    Andreas Frey

    2016-11-01

    Full Text Available Multidimensional adaptive testing (MAT) is a highly efficient method for the simultaneous measurement of several latent traits. Currently, no psychometrically sound approach is available for the use of MAT in testlet-based tests. Testlets are sets of items sharing a common stimulus such as a graph or a text. They are frequently used in large operational testing programs like TOEFL, PISA, PIRLS, or NAEP. To make MAT accessible for such testing programs, we present a novel combination of MAT with a multidimensional generalization of the random effects testlet model (MAT-MTIRT). MAT-MTIRT compared to non-adaptive testing is examined for several combinations of testlet effect variances (0.0, 0.5, 1.0, 1.5) and testlet sizes (3 items, 6 items, 9 items) with a simulation study considering three ability dimensions with simple loading structure. MAT-MTIRT outperformed non-adaptive testing regarding the measurement precision of the ability estimates. Further, the measurement precision decreased when testlet effect variances and testlet sizes increased. The suggested combination of the MTIRT model therefore provides a solution to the substantial problems of testlet-based tests while keeping the length of the test within an acceptable range.
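
    Both records describe the same testlet structure; in one common logistic formulation (our notation, details may differ from the article), the probability of a correct response of person i to item j in testlet d(j) is

        P(X_{ij}=1 \mid \boldsymbol{\theta}_i) =
          \mathrm{logit}^{-1}\!\big(\mathbf{a}_j^{\top}\boldsymbol{\theta}_i
            + \gamma_{i\,d(j)} - b_j\big),
        \qquad
        \gamma_{i\,d(j)} \sim N\big(0,\ \sigma^2_{d(j)}\big)

    where the random testlet effect \gamma absorbs the local dependence among items sharing a stimulus, and \sigma^2_{d(j)} are exactly the testlet effect variances (0.0-1.5) varied in the simulation.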

  13. Testing of money multiplier model for Pakistan: does monetary base carry any information?

    Directory of Open Access Journals (Sweden)

    Muhammad Arshad Khan

    2010-02-01

    Full Text Available This paper tests the constancy and stationarity of the mechanistic version of the money multiplier model for Pakistan using monthly data over the period 1972M1-2009M2. We split the data into pre-liberalization (1972M1-1990M12) and post-liberalization (1991M1-2009M2) periods to examine the impact of financial sector reforms. We first examine the constancy and stationarity of the money multiplier; the results suggest the money multiplier remains non-stationary for the entire sample period and sub-periods. We then test for cointegration between money supply and monetary base and find evidence of cointegration between the two variables for the entire period and both sub-periods. The coefficient restrictions are satisfied only for the post-liberalization period. Two-way long-run causality between money supply and monetary base is found for the entire period and the post-liberalization period. For the post-liberalization period, evidence of short-run causality running from monetary base to money supply is also identified. On the whole, the results suggest that the money multiplier model can serve as a framework for conducting short-run monetary policy in Pakistan. However, the monetary authority may consider the co-movements between money supply and reserve money when conducting monetary policy.
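
    The identity behind the test and the restriction typically examined can be stated compactly (our sketch of the standard setup, not quoted from the paper):

        M_t = m_t B_t
        \;\Longrightarrow\;
        \ln M_t = \alpha + \beta \ln B_t + \varepsilon_t

    A stable multiplier requires money supply M and monetary base B to be cointegrated with the coefficient restriction \beta = 1 and stationary \varepsilon_t, in which case \alpha estimates the log of the multiplier itself.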

  14. A practical model-based statistical approach for generating functional test cases: application in the automotive industry

    OpenAIRE

    Awédikian , Roy; Yannou , Bernard

    2012-01-01

    With the growing complexity of industrial software applications, industry is looking for efficient and practical methods to validate the software. This paper develops a model-based statistical testing approach that automatically generates online and offline test cases for embedded software. It discusses an integrated framework that combines solutions for three major software testing research questions: (i) how to select test inputs; (ii) how to predict the expected...

  15. NET model coil test possibilities

    International Nuclear Information System (INIS)

    Erb, J.; Gruenhagen, A.; Herz, W.; Jentzsch, K.; Komarek, P.; Lotz, E.; Malang, S.; Maurer, W.; Noether, G.; Ulbricht, A.; Vogt, A.; Zahn, G.; Horvath, I.; Kwasnitza, K.; Marinucci, C.; Pasztor, G.; Sborchia, C.; Weymuth, P.; Peters, A.; Roeterdink, A.

    1987-11-01

    A single full size coil for NET/INTOR represents an investment of the order of 40 MUC (Million Unit Costs). Before such an amount of money - or even more for the 16 TF coils - is invested, as many risks as possible must be eliminated by a comprehensive development programme. In the course of such a programme, a coil technology verification test should finally prove the feasibility of NET/INTOR TF coils. This study report deals almost exclusively with such a verification test by model coil testing. These coils will be built from two Nb3Sn conductors based on two concepts already under development and investigation. Two possible coil arrangements are discussed: a cluster facility, where two model coils made from the two Nb3Sn TF conductors are used together with the already tested LCT coils producing a background field; and a solenoid arrangement, where in addition to the two TF model coils another model coil made from a PF conductor for the central PF coils of NET/INTOR is used instead of LCT background coils. Technical advantages and disadvantages are worked out in order to compare and judge both facilities. Cost estimates and time schedules broaden the base for a decision about the realisation of such a facility. (orig.)

  16. Model-Based Prediction of Pulsed Eddy Current Testing Signals from Stratified Conductive Structures

    International Nuclear Information System (INIS)

    Zhang, Jian Hai; Song, Sung Jin; Kim, Woong Ji; Kim, Hak Joon; Chung, Jong Duk

    2011-01-01

    Excitation and propagation of the electromagnetic field of a cylindrical coil above an arbitrary number of conductive plates in pulsed eddy current testing (PECT) are very complex problems due to the complicated physical properties involved. In this paper, analytical modeling of PECT is established by Fourier series based on the truncated region eigenfunction expansion (TREE) method for a single air-cored coil above stratified conductive structures (SCS) to investigate their integrity. From the presented expression of PECT, the coil impedance due to the SCS is calculated analytically using the generalized reflection coefficient in series form. Then multilayered structures manufactured from non-ferromagnetic (STS301L) and ferromagnetic (SS400) materials are investigated with the developed PECT model. The good predictions of the analytical PECT model not only contribute to the development of an efficient solver but can also be applied to optimize the conditions of the experimental setup in PECT

  17. GOODNESS-OF-FIT TEST FOR THE ACCELERATED FAILURE TIME MODEL BASED ON MARTINGALE RESIDUALS

    Czech Academy of Sciences Publication Activity Database

    Novák, Petr

    2013-01-01

    Roč. 49, č. 1 (2013), s. 40-59 ISSN 0023-5954 R&D Projects: GA MŠk(CZ) 1M06047 Grant - others:GA MŠk(CZ) SVV 261315/2011 Keywords : accelerated failure time model * survival analysis * goodness-of-fit Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.563, year: 2013 http://library.utia.cas.cz/separaty/2013/SI/novak-goodness-of-fit test for the aft model based on martingale residuals.pdf

  18. T-UPPAAL: Online Model-based Testing of Real-Time Systems

    DEFF Research Database (Denmark)

    Mikucionis, Marius; Larsen, Kim Guldstrand; Nielsen, Brian

    2004-01-01

    The goal of testing is to gain confidence in a physical computer based system by means of executing it. More than one third of typical project resources is spent on testing embedded and real-time systems, but still it remains ad-hoc, based on heuristics, and error-prone. Therefore systematic...

  19. Testing of technology readiness index model based on exploratory factor analysis approach

    Science.gov (United States)

    Ariani, AF; Napitupulu, D.; Jati, RK; Kadar, JA; Syafrullah, M.

    2018-04-01

    SMEs' readiness to use ICT will determine their adoption of ICT in the future. This study aims to evaluate the technology readiness model in order to apply the technology to SMEs. The model is tested to find out whether the TRI model is relevant for measuring ICT adoption, especially for SMEs in Indonesia. The research method used in this paper is a survey of a group of SMEs in South Tangerang. The survey measures the readiness to adopt ICT based on four variables: Optimism, Innovativeness, Discomfort, and Insecurity. Each variable contains several indicators to make sure the variable is measured thoroughly. The data collected through the survey are analysed using the factor analysis method with the help of SPSS software. The result of this study shows that the TRI model gives more descendants on some indicators and variables. This result may be caused by the fact that SME owners' knowledge is not homogeneous, either about the technology they use or about the type of their business.

  20. Space Launch System Base Heating Test: Environments and Base Flow Physics

    Science.gov (United States)

    Mehta, Manish; Knox, Kyle S.; Seaford, C. Mark; Dufrene, Aaron T.

    2016-01-01

    The NASA Space Launch System (SLS) vehicle is composed of four RS-25 liquid oxygen-hydrogen rocket engines in the core-stage and two 5-segment solid rocket boosters, and as a result six hot supersonic plumes interact within the aft section of the vehicle during flight. Due to the complex nature of rocket plume-induced flows within the launch vehicle base during ascent and a new vehicle configuration, sub-scale wind tunnel testing is required to reduce SLS base convective environment uncertainty and design risk levels. This hot-fire test program was conducted at the CUBRC Large Energy National Shock (LENS) II short-duration test facility to simulate flight from altitudes of 50 kft to 210 kft. The test program is a challenging and innovative effort that has not been attempted in 40+ years for a NASA vehicle. This presentation discusses the various trends of base convective heat flux and pressure as a function of altitude at various locations within the core-stage and booster base regions of the two-percent SLS wind tunnel model. In-depth understanding of the base flow physics is presented using the test data, infrared high-speed imaging and theory. The normalized test design environments are compared to various NASA semi-empirical numerical models to determine exceedance and conservatism of the flight-scaled test-derived base design environments. A brief discussion of the thermal impact to the launch vehicle base components is also presented.

  1. Testing the ontogenetic base for the transient model of inflorescence development.

    Science.gov (United States)

    Bull-Hereñu, Kester; Claßen-Bockhoff, Regine

    2013-11-01

    Current research in plant science has concentrated on revealing ontogenetic processes of key attributes in plant evolution. One recently discussed model is the 'transient model' successful in explaining some types of inflorescence architectures based on two main principles: the decline of the so called 'vegetativeness' (veg) factor and the transient nature of apical meristems in developing inflorescences. This study examines whether both principles find a concrete ontogenetic correlate in inflorescence development. To test the ontogenetic base of veg decline and the transient character of apical meristems the ontogeny of meristematic size in developing inflorescences was investigated under scanning electron microscopy. Early and late inflorescence meristems were measured and compared during inflorescence development in 13 eudicot species from 11 families. The initial size of the inflorescence meristem in closed inflorescences correlates with the number of nodes in the mature inflorescence. Conjunct compound inflorescences (panicles) show a constant decrease of meristematic size from early to late inflorescence meristems, while disjunct compound inflorescences present an enlargement by merging from early inflorescence meristems to late inflorescence meristems, implying a qualitative change of the apical meristems during ontogeny. Partial confirmation was found for the transient model for inflorescence architecture in the ontogeny: the initial size of the apical meristem in closed inflorescences is consistent with the postulated veg decline mechanism regulating the size of the inflorescence. However, the observed biphasic kinetics of the development of the apical meristem in compound racemes offers the primary explanation for their disjunct morphology, contrary to the putative exclusive transient mechanism in lateral axes as expected by the model.

  2. Model-based framework for multi-axial real-time hybrid simulation testing

    Science.gov (United States)

    Fermandois, Gaston A.; Spencer, Billie F.

    2017-10-01

    Real-time hybrid simulation is an efficient and cost-effective dynamic testing technique for performance evaluation of structural systems subjected to earthquake loading with rate-dependent behavior. A loading assembly with multiple actuators is required to impose realistic boundary conditions on physical specimens. However, such a testing system is expected to exhibit significant dynamic coupling of the actuators and suffer from time lags that are associated with the dynamics of the servo-hydraulic system, as well as control-structure interaction (CSI). One approach to reducing experimental errors considers a multi-input, multi-output (MIMO) controller design, yielding accurate reference tracking and noise rejection. In this paper, a framework for multi-axial real-time hybrid simulation (maRTHS) testing is presented. The methodology employs a real-time feedback-feedforward controller for multiple actuators commanded in Cartesian coordinates. Kinematic transformations between actuator space and Cartesian space are derived for all six degrees of freedom of the moving platform. Then, a frequency domain identification technique is used to develop an accurate MIMO transfer function of the system. Further, a Cartesian-domain model-based feedforward-feedback controller is implemented for time lag compensation and to increase the robustness of the reference tracking for given model uncertainty. The framework is implemented using the 1/5th-scale Load and Boundary Condition Box (LBCB) located at the University of Illinois at Urbana-Champaign. To demonstrate the efficacy of the proposed methodology, a single-story frame subjected to earthquake loading is tested. One of the columns in the frame is represented physically in the laboratory as a cantilevered steel column. For real-time execution, the numerical substructure, kinematic transformations, and controllers are implemented on a digital signal processor. Results show excellent performance of the maRTHS framework when six…

  3. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses.

    Science.gov (United States)

    Kuiper, Rebecca M; Nederhoff, Tim; Klugkist, Irene

    2015-05-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is investigated (a classical, exploration-based set of hypotheses containing equality constraints on the means, or a theory-based limited set of hypotheses with equality and/or order restrictions). A simulation study is conducted to examine the performance of these techniques. We demonstrate that, if one has specific, a priori specified hypotheses, confirmation (i.e., investigating theory-based hypotheses) has advantages over exploration (i.e., examining all possible equality-constrained hypotheses). Furthermore, examining reasonable order-restricted hypotheses has more power to detect the true effect/non-null hypothesis than evaluating only equality restrictions. Additionally, when investigating more than one theory-based hypothesis, model selection is preferred over hypothesis testing. Because of the first two results, we further examine the techniques that are able to evaluate order restrictions in a confirmatory fashion by examining their performance when the homogeneity of variance assumption is violated. Results show that the techniques are robust to heterogeneity when the sample sizes are equal. When the sample sizes are unequal, the performance is affected by heterogeneity. The size and direction of the deviations from the baseline, where there is no heterogeneity, depend on the effect size (of the means) and on the trend in the group variances with respect to the ordering of the group sizes. Importantly, the deviations are less pronounced when the group variances and sizes exhibit the same trend (e.g., are both increasing with group number). © 2014 The British Psychological Society.

  4. A test of the California competency-based differentiated role model.

    Science.gov (United States)

    Keating, Sarah B; Rutledge, Dana N; Sargent, Arlene; Walker, Polly

    2003-01-01

    To address the incongruence between the expectations of nursing service and education in California, the Education Industry Interface Task Force of the California Strategic Planning Committee for Nursing developed descriptions to assist employers and educators in clearly differentiating practice and educational competencies. The completion of the Competency-Based Role Differentiation Model resulted in the need to test the model for its utility in the service setting, in education, and for career planning for nurses. Three alpha demonstration sites were selected based on representative geographical regions of California. The sites were composed of tri-partnerships consisting of a medical center, an associate degree in nursing program, and a baccalaureate nursing program. Observers rated senior students and new graduates in medical-surgical units on their behaviors in teacher and leadership care provider and care coordinator roles. The alpha demonstration study results were as expected. That is, senior students practice predominantly at a novice level in teacher and management/leadership care provider functions and new graduates practice predominantly at the competent level. New graduates are more likely to take on novice and competent care coordinator roles. The CBRDM may be useful for practice and education settings to evaluate student and nurse performance, to define role expectations, and to identify the preparation necessary for the roles. It is useful for all of nursing as it continues to define its levels of practice and their relationship to on-the-job performance, curriculum development, and career planning.

  5. Complete Model-Based Equivalence Class Testing for the ETCS Ceiling Speed Monitor

    DEFF Research Database (Denmark)

    Braunstein, Cécile; Haxthausen, Anne Elisabeth; Huang, Wen-ling

    2014-01-01

    In this paper we present a new test model written in SysML and an associated blackbox test suite for the Ceiling Speed Monitor (CSM) of the European Train Control System (ETCS). The model is publicly available and intended to serve as a novel benchmark for investigating new testing theories...

  6. A spheroid-based 3-D culture model for pancreatic cancer drug testing, using the acid phosphatase assay

    International Nuclear Information System (INIS)

    Wen, Z.; Liao, Q.; Hu, Y.; You, L.; Zhou, L.; Zhao, Y.

    2013-01-01

    Current therapy for pancreatic cancer is multimodal, involving surgery and chemotherapy. However, development of pancreatic cancer therapies requires a thorough evaluation of drug efficacy in vitro before animal testing and subsequent clinical trials. Compared to two-dimensional monolayer cell culture, three-dimensional (3-D) models more closely mimic native tissues, since the tumor microenvironment established in 3-D models often plays a significant role in cancer progression and cellular responses to the drugs. Accumulating evidence has highlighted the benefits of 3-D in vitro models of various cancers. In the present study, we have developed a spheroid-based, 3-D culture of pancreatic cancer cell lines MIAPaCa-2 and PANC-1 for pancreatic drug testing, using the acid phosphatase assay. Drug efficacy testing showed that spheroids had much higher drug resistance than monolayers. This model, which is reproducible, easy to handle and rapid, is the preferred choice for filling the gap between monolayer cell cultures and in vivo models in the process of drug development and testing for pancreatic cancer

  7. A spheroid-based 3-D culture model for pancreatic cancer drug testing, using the acid phosphatase assay

    Directory of Open Access Journals (Sweden)

    Z. Wen

    2013-08-01

    Full Text Available Current therapy for pancreatic cancer is multimodal, involving surgery and chemotherapy. However, development of pancreatic cancer therapies requires a thorough evaluation of drug efficacy in vitro before animal testing and subsequent clinical trials. Compared to two-dimensional monolayer cell culture, three-dimensional (3-D) models more closely mimic native tissues, since the tumor microenvironment established in 3-D models often plays a significant role in cancer progression and cellular responses to the drugs. Accumulating evidence has highlighted the benefits of 3-D in vitro models of various cancers. In the present study, we have developed a spheroid-based, 3-D culture of pancreatic cancer cell lines MIAPaCa-2 and PANC-1 for pancreatic drug testing, using the acid phosphatase assay. Drug efficacy testing showed that spheroids had much higher drug resistance than monolayers. This model, which is reproducible, easy to handle and rapid, is the preferred choice for filling the gap between monolayer cell cultures and in vivo models in the process of drug development and testing for pancreatic cancer.

  8. Evidence used in model-based economic evaluations for evaluating pharmacogenetic and pharmacogenomic tests: a systematic review protocol.

    Science.gov (United States)

    Peters, Jaime L; Cooper, Chris; Buchanan, James

    2015-11-11

    Decision models can be used to conduct economic evaluations of new pharmacogenetic and pharmacogenomic tests to ensure they offer value for money to healthcare systems. These models require a great deal of evidence, yet research suggests the evidence used is diverse and of uncertain quality. By conducting a systematic review, we aim to investigate the test-related evidence used to inform decision models developed for the economic evaluation of genetic tests. We will search electronic databases including MEDLINE, EMBASE and NHS EEDs to identify model-based economic evaluations of pharmacogenetic and pharmacogenomic tests. The search will not be limited by language or date. Title and abstract screening will be conducted independently by 2 reviewers, with screening of full texts and data extraction conducted by 1 reviewer, and checked by another. Characteristics of the decision problem, the decision model and the test evidence used to inform the model will be extracted. Specifically, we will identify the reported evidence sources for the test-related evidence used, describe the study design and how the evidence was identified. A checklist developed specifically for decision analytic models will be used to critically appraise the models described in these studies. Variations in the test evidence used in the decision models will be explored across the included studies, and we will identify gaps in the evidence in terms of both quantity and quality. The findings of this work will be disseminated via a peer-reviewed journal publication and at national and international conferences. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  9. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are reported from testing the material resistance to (non-ductile) fracture. The testing included the base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed with and without surface defects (15, 30 and 45 mm deep). During the cyclic tests, the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  10. Evidence synthesis to inform model-based cost-effectiveness evaluations of diagnostic tests: a methodological review of health technology assessments

    Directory of Open Access Journals (Sweden)

    Bethany Shinkins

    2017-04-01

    Full Text Available Abstract: Background: Evaluations of diagnostic tests are challenging because of the indirect nature of their impact on patient outcomes. Model-based health economic evaluations of tests allow different types of evidence from various sources to be incorporated and enable cost-effectiveness estimates to be made beyond the duration of available study data. To parameterize a health-economic model fully, all the ways a test impacts on patient health must be quantified, including but not limited to diagnostic test accuracy. Methods: We assessed all UK NIHR HTA reports published May 2009-July 2015. Reports were included if they evaluated a diagnostic test, included a model-based health economic evaluation and included a systematic review and meta-analysis of test accuracy. From each eligible report we extracted information on the following topics: (1) what evidence aside from test accuracy was searched for and synthesised, (2) which methods were used to synthesise test accuracy evidence and how the results informed the economic model, (3) how/whether threshold effects were explored, (4) how the potential dependency between multiple tests in a pathway was accounted for, and (5) for evaluations of tests targeted at the primary care setting, how evidence from differing healthcare settings was incorporated. Results: The bivariate or HSROC model was implemented in 20/22 reports that met all inclusion criteria. Test accuracy data for health economic modelling were obtained from meta-analyses completely in four reports, partially in fourteen reports and not at all in four reports. Only 2/7 reports that used a quantitative test gave clear threshold recommendations. All 22 reports explored the effect of uncertainty in accuracy parameters but most of those that used multiple tests did not allow for dependence between test results. 7/22 tests were potentially suitable for primary care but the majority found limited evidence on test accuracy in primary care settings
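
    For reference, the bivariate random-effects model named in the results pools logit-transformed sensitivity and specificity jointly across studies (standard formulation, not quoted from the report):

        \begin{pmatrix} \mathrm{logit}\,Se_i \\ \mathrm{logit}\,Sp_i \end{pmatrix}
        \sim N\!\left(
          \begin{pmatrix} \mu_{Se} \\ \mu_{Sp} \end{pmatrix},
          \begin{pmatrix} \sigma^2_{Se} & \rho\,\sigma_{Se}\sigma_{Sp} \\
                          \rho\,\sigma_{Se}\sigma_{Sp} & \sigma^2_{Sp} \end{pmatrix}
        \right)

    with within-study binomial likelihoods for the observed true/false positive counts; the HSROC model is a reparameterization of the same structure.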

  11. Computer Based Test for Entrance Selection at Politeknik Negeri Bengkalis

    Directory of Open Access Journals (Sweden)

    Agus Tedyyana

    2017-11-01

    Full Text Available Abstract: Selection of new student candidates can be done with a Computer Based Test (CBT) application. The methods used include data collection techniques, system analysis, design modelling, implementation and testing. This study produces a CBT application in which the questions presented from the question bank go through a randomization process using the Fisher-Yates Shuffle method, so that the same question is never presented twice. To secure the question data while the application is connected to the network, a message-encoding technique is required so that each question passes through data encryption and decryption before being displayed; the RSA cryptography algorithm is used for this purpose. The software was designed using the waterfall model, the database using entity relationship diagrams, and the interface using hypertext markup language (HTML), Cascading Style Sheets (CSS) and jQuery; the application was implemented as a web application using the PHP programming language and a MySQL database. The network architecture used by the Computer Based Test application is a client-server model on a Local Area Network (LAN). Keywords: Computer Based Test, Fisher-Yates Shuffle, Cryptography, Local Area Network
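
    A minimal sketch of the Fisher-Yates shuffle used for drawing questions without repetition (shown in Python rather than the application's PHP; names are illustrative):

        import random

        def fisher_yates_shuffle(items, rng=random):
            """Return a new list holding a uniformly random permutation of items."""
            a = list(items)
            for i in range(len(a) - 1, 0, -1):
                j = rng.randint(0, i)      # choose from the not-yet-settled prefix
                a[i], a[j] = a[j], a[i]    # swap into the settled suffix
            return a

        # Serving the first n items of a shuffled bank guarantees no repeats.
        question_bank = [f"Q{k}" for k in range(1, 51)]
        exam = fisher_yates_shuffle(question_bank)[:10]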

  12. Error-Rate Estimation Based on Multi-Signal Flow Graph Model and Accelerated Radiation Tests.

    Directory of Open Access Journals (Sweden)

    Wei He

    Full Text Available A method for evaluating the single-event effect soft-error vulnerability of space instruments before launch has been an active research topic in recent years. In this paper, a multi-signal flow graph model is introduced to analyze the fault diagnosis and mean time to failure (MTTF) of space instruments. A model for the system functional error rate (SFER) is proposed. In addition, an experimental method and an accelerated radiation testing system for a signal processing platform based on a field programmable gate array (FPGA) are presented. Based on experimental results for different ions (O, Si, Cl, Ti) under the HI-13 Tandem Accelerator, the SFER of the signal processing platform is approximately 10^-3 errors per particle/cm^2, while the MTTF is approximately 110.7 h.
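
    Reading SFER as a per-fluence error rate, the two reported figures relate through the ambient particle flux \phi in the usual way (our interpretation of the units, not a formula stated in the record):

        \lambda = \mathrm{SFER}\cdot\phi \quad [\text{errors/s}],
        \qquad
        \mathrm{MTTF} = 1/\lambda

    On this reading, an MTTF of 110.7 h with SFER \approx 10^{-3} errors per particle/cm^2 would correspond to an assumed flux of roughly 2.5 \times 10^{-3} particles/(cm^2 s).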

  13. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée [UMR 1037 INSERM/UPS, CRCT, 133 Route de Narbonne, 31062 Toulouse (France); McKay, Erin [St George Hospital, Gray Street, Kogarah, New South Wales 2217 (Australia); Ferrer, Ludovic [ICO René Gauducheau, Boulevard Jacques Monod, St Herblain 44805 (France); Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila [European Institute of Oncology, Via Ripamonti 435, Milano 20141 (Italy); Bardiès, Manuel [UMR 1037 INSERM/UPS, CRCT, 133 Route de Narbonne, Toulouse 31062 (France)

    2015-12-15

    Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software allows handling the whole pipeline from virtual patient generation to the resulting planar and SPECT images and dosimetry calculations. The originality of the approach lies in the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores the user's imaging requirements and automatically generates command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. The resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for Octreoscan™ therapeutic treatment was implemented using the 4D XCAT model. Whole-body “step and shoot” acquisitions at different times post-injection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry…
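
    The "simulate each compartment once, then weight by kinetics" step can be pictured with a short sketch (hypothetical names; the real platform orchestrates GATE simulations rather than NumPy arrays):

        import numpy as np

        def aggregate_image(unit_projections: dict, activities: dict) -> np.ndarray:
            """Combine per-compartment projections simulated once for unit activity.

            unit_projections: compartment name -> 2-D projection array
            activities: compartment name -> activity at the acquisition time,
                        taken from the pharmacokinetic model
            """
            first = next(iter(unit_projections.values()))
            image = np.zeros_like(first, dtype=float)
            for name, proj in unit_projections.items():
                image += activities[name] * proj  # linear weighting, one run per compartment
            return image

    Because image formation is linear in activity, each compartment's Monte Carlo run is reusable across all time-points, which is what keeps the computation times reasonable.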

  14. Fault Modeling and Testing for Analog Circuits in Complex Space Based on Supply Current and Output Voltage

    Directory of Open Access Journals (Sweden)

    Hongzhi Hu

    2015-01-01

    Full Text Available This paper deals with the modeling of faults for analog circuits. A two-dimensional (2D) fault model is first proposed based on a collaborative analysis of supply current and output voltage. This model is a family of circle loci on the complex plane, and it greatly simplifies the algorithms for test point selection and potential fault simulations, which are primary difficulties in fault diagnosis of analog circuits. Furthermore, in order to reduce the difficulty of fault location, an improved fault model in three-dimensional (3D) complex space is proposed, which achieves a far better fault detection ratio (FDR) against measurement error and parametric tolerance. To address the problem of fault masking in both 2D and 3D fault models, this paper proposes an effective design for testability (DFT) method. By adding redundant bypassing components in the circuit under test (CUT), this method achieves an excellent fault isolation ratio (FIR) in ambiguity group isolation. The efficacy of the proposed model and testing method is validated through experimental results provided in this paper.

  15. Comparison of rate theory based modeling calculations with the surveillance test results of Korean light water reactors

    International Nuclear Information System (INIS)

    Lee, Gyeong Geun; Lee, Yong Bok; Kim, Min Chul; Kwon, Junh Yun

    2012-01-01

    Neutron irradiation of reactor pressure vessel (RPV) steels causes a decrease in fracture toughness and an increase in yield strength while in service. It is generally accepted that the growth of point defect clusters (PDC) and copper rich precipitates (CRP) affects the radiation hardening of RPV steels. A number of models have been proposed to account for the embrittlement of RPV steels. The rate theory based modeling mathematically describes the evolution of the radiation-induced microstructures of ferritic steels under neutron irradiation. In this work, we compare the rate theory based modeling calculations with the surveillance test results of Korean Light Water Reactors (LWRs)

  16. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models in a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
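
    The spreadsheet-to-IP-XACT step can be sketched with a short script (Python purely for illustration; the element names below are a drastic simplification of the real IP-XACT schema, which requires namespaces, memory maps and field definitions):

        import csv, io
        import xml.etree.ElementTree as ET

        # Illustrative register rows as they might appear in the spreadsheet template.
        SPREADSHEET = """name,offset,size,access,reset
        CTRL,0x00,32,read-write,0x00000000
        STATUS,0x04,32,read-only,0x00000001
        """

        root = ET.Element("addressBlock")
        for row in csv.DictReader(io.StringIO(SPREADSHEET)):
            reg = ET.SubElement(root, "register")
            for tag in ("name", "offset", "size", "access", "reset"):
                ET.SubElement(reg, tag).text = row[tag].strip()

        print(ET.tostring(root, encoding="unicode"))

    A commercial register-model generator would then consume the resulting IP-XACT description to emit the UVM register model classes.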

  17. A person fit test for IRT models for polytomous items

    NARCIS (Netherlands)

    Glas, Cornelis A.W.; Dagohoy, A.V.

    2007-01-01

    A person fit test based on the Lagrange multiplier test is presented for three item response theory models for polytomous items: the generalized partial credit model, the sequential model, and the graded response model. The test can also be used in the framework of multidimensional ability
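
    In its generic form, a Lagrange multiplier test statistic of this kind is (a textbook statement; the article's specific statistic is tailored to the three polytomous models named above):

        \mathrm{LM} = \mathbf{h}(\hat{\boldsymbol{\eta}})^{\top}\,
          \mathbf{I}(\hat{\boldsymbol{\eta}})^{-1}\,
          \mathbf{h}(\hat{\boldsymbol{\eta}})
          \ \sim\ \chi^2_{q} \quad \text{under } H_0

    where \mathbf{h} is the score vector of the q constrained (person-misfit) parameters evaluated at the null estimates and \mathbf{I} the corresponding block of the information matrix, so only the null model needs to be fitted.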

  18. Conformance test development with the Java modeling language

    DEFF Research Database (Denmark)

    Søndergaard, Hans; Korsholm, Stephan E.; Ravn, Anders P.

    2017-01-01

    In order to claim conformance with a Java Specification Request, a Java implementation has to pass all tests in an associated Technology Compatibility Kit (TCK). This paper presents a model-based development of a TCK test suite and a test execution tool for the draft Safety-Critical Java (SCJ) profile specification. The Java Modeling Language (JML) is used to model conformance constraints for the profile. JML annotations define contracts for classes and interfaces. The annotations are translated by a tool into runtime assertion checks. Hereby the design and elaboration of the concrete test cases...

  19. Agent based models for testing city evacuation strategies under a flood event as strategy to reduce flood risk

    Science.gov (United States)

    Medina, Neiler; Sanchez, Arlex; Nokolic, Igor; Vojinovic, Zoran

    2016-04-01

    This research explores the use of Agent Based Models (ABM) and their potential to test large-scale evacuation strategies in coastal cities at risk from flood events due to extreme hydro-meteorological events, with the final purpose of disaster risk reduction by decreasing humans' exposure to the hazard. The first part of the paper covers the theory used to build the models, such as complex adaptive systems (CAS) and the principles and uses of ABM in this field, and outlines the pros and cons of using ABM to test city evacuation strategies at medium and large scale. The second part of the paper focuses on the central theory used to build the ABM, specifically the psychological and behavioral model as well as the framework used in this research, the PECS reference model. The last part of this section covers the main attributes or characteristics of human beings used to describe the agents. The third part of the paper shows the methodology used to build and implement the ABM using Repast Simphony as an open-source agent-based modelling and simulation platform. Preliminary results for a first implementation in a region of the island of Sint Maarten, a Dutch Caribbean island, are presented and discussed in the fourth section of the paper. The results obtained so far are promising for further development of the model and its implementation and testing in a full-scale city

  20. Testing the compounding structure of the CP-INARCH model

    OpenAIRE

    Weiß, Christian H.; Gonçalves, Esmeralda; Lopes, Nazaré Mendes

    2017-01-01

    A statistical test to distinguish between a Poisson INARCH model and a Compound Poisson INARCH model is proposed, based on the form of the probability generating function of the compounding distribution of the conditional law of the model. For first-order autoregression, the normality of the test statistics’ asymptotic distribution is established, either in the case where the model parameters are specified, or when such parameters are consistently estimated. As the test statistics’ law involv...
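
    To fix ideas, in the first-order case the conditional law has the compound-Poisson form below; the Poisson INARCH model is the special case in which the compounding pgf is the identity (our summary of the model class, with notation that may differ from the paper):

        E\big[u^{X_t} \mid \mathcal{F}_{t-1}\big]
          = \exp\!\big(\lambda_t\,[h(u) - 1]\big),
        \qquad
        \lambda_t = \beta_0 + \alpha_1 X_{t-1}

    Here h is the probability generating function of the compounding distribution, so testing H_0: h(u) = u against compound alternatives is what distinguishes the Poisson INARCH from the CP-INARCH model.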

  1. Evaluation of the base/subgrade soil under repeated loading : phase I--laboratory testing and numerical modeling of geogrid reinforced bases in flexible pavement.

    Science.gov (United States)

    2009-10-01

    This report documents the results of a study that was conducted to characterize the behavior of geogrid reinforced base course materials. The research was conducted through experimental testing and numerical modeling programs. The experimental...

  2. A new fit-for-purpose model testing framework: Decision Crash Tests

    Science.gov (United States)

    Tolson, Bryan; Craig, James

    2016-04-01

    Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have identified that a good standard framework for model testing called the Klemes Crash Tests (KCTs), which are the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) rename as KCTs, have yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCT and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing if the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests or DCTs. Key DCT elements are i) the model purpose (i.e., decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model building case study on the Grand River in Ontario, Canada. A hypothetical binary decision scenario is analysed (upgrade or not upgrade the existing flood control structure) under two different sets of model building

  3. Environmental risk assessment of selected organic chemicals based on TOC test and QSAR estimation models.

    Science.gov (United States)

    Chi, Yulang; Zhang, Huanteng; Huang, Qiansheng; Lin, Yi; Ye, Guozhu; Zhu, Huimin; Dong, Sijun

    2018-02-01

Environmental risks of organic chemicals are largely determined by their persistence, bioaccumulation, and toxicity (PBT) and by their physicochemical properties. Major regulations in different countries and regions identify chemicals according to their bioconcentration factor (BCF) and octanol-water partition coefficient (Kow), which frequently displays a substantial correlation with the sediment sorption coefficient (Koc). Half-life or degradability is crucial for the persistence evaluation of chemicals. Quantitative structure activity relationship (QSAR) estimation models are indispensable for predicting environmental fate and health effects in the absence of field- or laboratory-based data. In this study, 39 chemicals of high concern were chosen for half-life testing based on total organic carbon (TOC) degradation, and two widely accepted and highly used QSAR estimation models (i.e., EPI Suite and PBT Profiler) were adopted for environmental risk evaluation. The experimental results, the estimated data, and the two model-based results were compared on the basis of water solubility, Kow, Koc, BCF and half-life. Environmental risk assessment of the selected compounds was achieved by combining experimental data and estimation models. It was concluded that both EPI Suite and PBT Profiler were fairly accurate in estimating the physicochemical properties and degradation half-lives for water, soil, and sediment. However, the experimental and estimated half-lives were still not fully consistent. This suggests deficiencies in the prediction models and the necessity of combining experimental data and predicted results for the evaluation of the environmental fate and risks of pollutants.
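
Where the record relies on TOC-based half-life testing, the underlying computation is a first-order decay fit. A minimal sketch, with invented data rather than the study's measurements:

```python
import numpy as np

# First-order half-life estimate from TOC degradation data (illustrative
# numbers, not the study's measurements): fit ln(C/C0) = -k t by least
# squares, then t_half = ln(2) / k.
t = np.array([0.0, 2.0, 5.0, 10.0, 20.0])        # time [days] (hypothetical)
toc = np.array([100.0, 85.0, 66.0, 45.0, 20.0])  # TOC remaining (hypothetical)

k = -np.polyfit(t, np.log(toc / toc[0]), 1)[0]   # decay constant [1/day]
print(f"k = {k:.3f} 1/day, half-life = {np.log(2) / k:.1f} days")
```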

  4. Development of dynamic Bayesian models for web application test management

    Science.gov (United States)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, mathematical models and methods of dynamic Bayesian networks provide a high coverage of stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets for multiple time slices. This approach gives an opportunity to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine in one management area individual units and testing components with different functionalities and a direct influence on each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
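
As a rough illustration of the filtering recursion a dynamic Bayesian network performs across time slices, the sketch below tracks a single hidden "module defective?" state from a sequence of test verdicts; all probabilities are hypothetical, not taken from the paper.

```python
import numpy as np

# Two-slice DBN sketch with one hidden state ("module defective?") and a
# noisy test verdict per slice. The CPTs are hypothetical; the loop is
# the standard forward (filtering) recursion: predict with the
# transition model, then update with the observed evidence.
T = np.array([[0.95, 0.05],     # P(x_t | x_{t-1}), rows = previous state
              [0.10, 0.90]])
O = np.array([[0.90, 0.10],     # P(verdict | x_t), cols = pass(0)/fail(1)
              [0.30, 0.70]])
belief = np.array([0.8, 0.2])   # prior [P(ok), P(defective)] at slice 0

for verdict in [1, 1, 0]:       # observed verdicts: fail, fail, pass
    belief = belief @ T                 # predict one slice ahead
    belief = belief * O[:, verdict]     # weight by evidence likelihood
    belief = belief / belief.sum()      # renormalise
    print(f"P(defective) = {belief[1]:.3f}")
```

The paper's models chain many such components per slice; the predict-update pattern per time slice is the same.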

  5. Modelling and Testing of Friction in Forging

    DEFF Research Database (Denmark)

    Bay, Niels

    2007-01-01

Knowledge about friction is still limited in forging. The theoretical models presently applied for process analysis are not satisfactory compared to the advanced and detailed studies made possible by plastic FEM analyses, and more refined models have to be based on experimental testing...

  6. Identification of walking human model using agent-based modelling

    Science.gov (United States)

    Shahabpoor, Erfan; Pavic, Aleksandar; Racic, Vitomir

    2018-03-01

The interaction of walking people with large vibrating structures, such as footbridges and floors, in the vertical direction is an important yet challenging phenomenon to describe mathematically. Several different models have been proposed in the literature to simulate the interaction of stationary people with vibrating structures. However, research on moving (walking) human models, explicitly identified for vibration serviceability assessment of civil structures, is still sparse. In this study, the results of a comprehensive set of FRF-based modal tests were used, in which over a hundred test subjects walked in different group sizes and walking patterns on a test structure. An agent-based model was used to simulate discrete traffic-structure interactions. The occupied-structure modal parameters found in tests were used to identify the parameters of the walking individual's single-degree-of-freedom (SDOF) mass-spring-damper model using a 'reverse engineering' methodology. The analysis of the results suggested that a normal distribution with mean μ = 2.85 Hz and standard deviation σ = 0.34 Hz can describe the human SDOF model natural frequency. Similarly, a normal distribution with μ = 0.295 and σ = 0.047 can describe the human model damping ratio. Compared to previous studies, the agent-based modelling methodology proposed in this paper offers significant flexibility in simulating multi-pedestrian walking traffic, external forces and different mechanisms of human-structure and human-environment interaction at the same time.
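
The identified distributions make it straightforward to generate agent populations for such simulations. A minimal sketch using the paper's reported values (the body-mass distribution is an added assumption of this sketch):

```python
import numpy as np

# Draw SDOF walking-human parameters from the distributions identified in
# the study: f ~ N(2.85, 0.34) Hz and damping ratio ~ N(0.295, 0.047).
# The body masses are a placeholder assumption, not from the paper.
rng = np.random.default_rng(0)
n = 100                                # pedestrians in the simulated crowd
f = rng.normal(2.85, 0.34, n)          # natural frequency [Hz]
zeta = rng.normal(0.295, 0.047, n)     # damping ratio [-]
m = rng.normal(75.0, 10.0, n)          # body mass [kg] (assumption)

omega = 2 * np.pi * f                  # circular frequency [rad/s]
k = m * omega**2                       # SDOF spring stiffness [N/m]
c = 2 * zeta * m * omega               # SDOF damping coefficient [Ns/m]
print(k[:3], c[:3])
```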

  7. Modelling, Simulation and Testing of a Reconfigurable Cable-Based Parallel Manipulator as Motion Aiding System

    Directory of Open Access Journals (Sweden)

    Gianni Castelli

    2010-01-01

This paper presents results on the modelling, simulation and experimental tests of a cable-based parallel manipulator to be used as an aiding or guiding system for people with motion disabilities. There is a high level of motivation for people with a motion disability or the elderly to perform basic daily-living activities independently. Therefore, it is of great interest to design and implement safe and reliable motion assisting and guiding devices that are able to help end-users. In general, a robot for a medical application should be able to interact with a patient in safety conditions, i.e. it must not damage people or surroundings; it must be designed to guarantee high accuracy and low acceleration during the operation. Furthermore, it should not be too bulky and it should exert limited wrenches after close interaction with people. It can be advisable to have a portable system which can be easily brought into and assembled in a hospital or a domestic environment. Cable-based robotic structures can fulfil those requirements because of their main characteristics that make them light and intrinsically safe. In this paper, a reconfigurable four-cable-based parallel manipulator has been proposed as a motion assisting and guiding device to help people to accomplish a number of tasks, such as an aiding or guiding system to move the upper and lower limbs or the whole body. Modelling and simulation are presented in the ADAMS environment. Moreover, experimental tests are reported as based on an available laboratory prototype.

  8. Numerical model of the nanoindentation test based on the digital material representation of the Ti/TiN multilayers

    Directory of Open Access Journals (Sweden)

    Perzyński Konrad

    2015-06-01

The developed numerical model of a local nanoindentation test, based on the digital material representation (DMR) concept, is presented in the paper. First, an efficient algorithm describing the pulsed laser deposition (PLD) process was proposed to realistically recreate the specific morphology of a nanolayered material in an explicit manner. The nanolayered Ti/TiN composite was selected for the investigation. Details of the developed cellular automata model of the PLD process are presented and discussed. Then, the Ti/TiN DMR was incorporated into the finite element software and a numerical model of the nanoindentation test was established. Finally, examples of the obtained results, highlighting the capabilities of the proposed approach, are presented.

  9. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design software quality through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing to properly understand or design the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software's overall testability.

  10. Model based design introduction: modeling game controllers to microprocessor architectures

    Science.gov (United States)

    Jungwirth, Patrick; Badawy, Abdel-Hameed

    2017-04-01

We present an introduction to model based design. Model based design is a visual representation, generally a block diagram, used to model and incrementally develop a complex system. Model based design is a commonly used design methodology for digital signal processing, control systems, and embedded systems. Model based design's philosophy is to solve a problem one step at a time; the approach can be compared to a series of steps that converge to a solution. A block diagram simulation tool allows a design to be simulated with real world measurement data. For example, if an analog control system is being upgraded to a digital control system, the analog sensor input signals can be recorded. The digital control algorithm can be simulated with the real world sensor data, and the output from the simulated digital control system can then be compared to the old analog based control system. Model based design can be compared to Agile software development. The Agile software development goal is to develop working software in incremental steps, with progress measured in completed and tested code units. Progress in model based design is measured in completed and tested blocks. We present a concept for a video game controller and then use model based design to iterate the design towards a working system. We also describe a model based design effort to develop an OS Friendly Microprocessor Architecture based on the RISC-V.

  11. Tests of gravity with future space-based experiments

    Science.gov (United States)

    Sakstein, Jeremy

    2018-03-01

    Future space-based tests of relativistic gravitation—laser ranging to Phobos, accelerometers in orbit, and optical networks surrounding Earth—will constrain the theory of gravity with unprecedented precision by testing the inverse-square law, the strong and weak equivalence principles, and the deflection and time delay of light by massive bodies. In this paper, we estimate the bounds that could be obtained on alternative gravity theories that use screening mechanisms to suppress deviations from general relativity in the Solar System: chameleon, symmetron, and Galileon models. We find that space-based tests of the parametrized post-Newtonian parameter γ will constrain chameleon and symmetron theories to new levels, and that tests of the inverse-square law using laser ranging to Phobos will provide the most stringent constraints on Galileon theories to date. We end by discussing the potential for constraining these theories using upcoming tests of the weak equivalence principle, and conclude that further theoretical modeling is required in order to fully utilize the data.

  12. Development and design of a late-model fitness test instrument based on LabView

    Science.gov (United States)

    Xie, Ying; Wu, Feiqing

    2010-12-01

Undergraduates are pioneers of China's modernization program and undertake the historic mission of rejuvenating our nation in the 21st century, so their physical fitness is vital. A smart fitness test system can help them understand their fitness and health conditions, so that they can choose more suitable approaches and make practical exercise plans according to their own situation. Following future trends, a late-model fitness test instrument based on LabView has been designed to remedy the defects of today's instruments. The system hardware consists of five types of sensors with their peripheral circuits, an NI USB-6251 acquisition card and a computer, while the system software, built on LabView, includes modules for user registration, data acquisition, data processing and display, and data storage. The system, featuring modularization and an open structure, can be revised according to actual needs. Test results have verified the system's stability and reliability.

  13. A testing procedure for wind turbine generators based on the power grid statistical model

    DEFF Research Database (Denmark)

    Farajzadehbibalan, Saber; Ramezani, Mohammad Hossein; Nielsen, Peter

    2017-01-01

    In this study, a comprehensive test procedure is developed to test wind turbine generators with a hardware-in-loop setup. The procedure employs the statistical model of the power grid considering the restrictions of the test facility and system dynamics. Given the model in the latent space...

  14. Mixed Portmanteau Test for Diagnostic Checking of Time Series Models

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2014-01-01

Model criticism is an important stage of model building, and goodness-of-fit tests provide a set of tools for diagnostic checking of the fitted model. Several tests have been suggested in the literature for diagnostic checking. These tests use autocorrelation or partial autocorrelation in the residuals to assess the adequacy of the fitted model. The main idea underlying these portmanteau tests is to identify whether there is any dependence structure which is yet unexplained by the fitted model. In this paper, we suggest mixed portmanteau tests based on the autocorrelation and partial autocorrelation functions of the residuals. We derive the asymptotic distribution of the mixed test and study its size and power using Monte Carlo simulations.
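
As an illustration of the idea, the sketch below computes a Ljung-Box-style weighted sum over both the residual ACF and PACF. The equal-weight combination and the Ljung-Box weights are assumptions of this sketch, not the paper's exact mixed statistic or its derived distribution.

```python
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

# Sketch of a mixed portmanteau statistic: a Ljung-Box-type weighted sum
# over the residual autocorrelations plus the analogous sum over the
# residual partial autocorrelations (combination scheme is an assumption).
def mixed_portmanteau(resid, m=10):
    n = len(resid)
    r = acf(resid, nlags=m, fft=True)[1:]   # autocorrelations, lags 1..m
    p = pacf(resid, nlags=m)[1:]            # partial autocorrelations, lags 1..m
    w = n * (n + 2) / (n - np.arange(1, m + 1))
    return np.sum(w * r**2) + np.sum(w * p**2)

rng = np.random.default_rng(1)
print(mixed_portmanteau(rng.standard_normal(500)))  # white noise -> small value
```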

  15. MATLAB-SIMULINK BASED INFORMATION SUPPORT FOR DIGITAL OVERCURRENT PROTECTION TEST SETS

    Directory of Open Access Journals (Sweden)

    I. V. Novash

    2017-01-01

The implementation of information support for PC-based and hardware-software based test sets for digital overcurrent protection devices and their models, using the MatLab-Simulink environment, is considered. It is demonstrated that the mathematical modelling of a part of the power system, viz. the generalized electric power object, can be based on rigid or flexible models. Rigid models, implemented on the basis of a mathematical description of the electrical and magnetic circuits of a power system, can serve as a reference against which simulation results obtained with another simulation system are compared. It is proposed to implement flexible models of the generalized electric power object in the MatLab-Simulink environment, which includes the SimPowerSystems component library targeted at power system modelling. The features of the parameter calculation for the SimPowerSystems library blocks from which the power system model is formed are considered. Models of wye-connected current transformers, as well as the digital overcurrent protection missing from the component library, were composed out of standard Simulink blocks. A comparison of simulation results for one and the same generalized electric power object implemented in various PC-based software packages was undertaken. The divergence of simulation results did not exceed 3%; this allows us to recommend the MatLab-Simulink environment for creating information support for hardware-software based test sets for digital overcurrent protection devices. The structure of a hardware-software based set for digital overcurrent protection device testing using the Omicron CMC 356 has been suggested. A time-to-trip comparison between the real digital protection device МР 801 and a model whose parameters exactly match those of the prototype device was carried out using identical test inputs. The results of the tests

  16. Q-Matrix Optimization Based on the Linear Logistic Test Model.

    Science.gov (United States)

    Ma, Lin; Green, Kelly E

This study explored optimization of item-attribute matrices with the linear logistic test model (Fischer, 1973), with optimal models explaining more variance in item difficulty due to identified item attributes. Data were 8th-grade mathematics test item responses from two TIMSS 2007 booklets. The study investigated three categories of attributes (content, cognitive process, and comprehensive cognitive process) at two grain levels (larger, smaller) and also compared results with random attribute matrices. The proposed attributes accounted for most of the variance in item difficulty for the two assessment booklets (81% and 65%). The content attributes alone explained little of the variance (13% to 31%), while the comprehensive cognitive process attributes explained much more variance than either the content or the cognitive process attributes. The two grain levels explained similar amounts of variance. However, the attributes did not predict the item difficulties of the two assessment booklets equally.
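
The LLTM link between item difficulty and item attributes can be illustrated as a linear regression of difficulties on the Q-matrix, with R² playing the role of variance explained. A minimal sketch with invented difficulties and a two-attribute Q-matrix (not the TIMSS data):

```python
import numpy as np

# LLTM-style check of a Q-matrix: regress item difficulties on item
# attributes and report R^2, the share of difficulty variance the
# attributes explain. Difficulties and Q-matrix are made up.
b = np.array([-1.2, -0.4, 0.1, 0.8, 1.5, 0.3])   # item difficulties
Q = np.array([[1, 0], [1, 0], [0, 1],            # item x attribute matrix
              [0, 1], [1, 1], [1, 0]], dtype=float)

X = np.column_stack([np.ones(len(b)), Q])        # add an intercept column
eta, *_ = np.linalg.lstsq(X, b, rcond=None)      # estimated attribute weights
resid = b - X @ eta
r2 = 1 - resid @ resid / np.sum((b - b.mean())**2)
print(f"R^2 = {r2:.2f}, attribute weights = {eta[1:]}")
```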

  17. A Dutch test with the NewProd-model

    NARCIS (Netherlands)

    Bronnenberg, J.J.A.M.; van Engelen, M.L.

    1988-01-01

    The paper contains a report of a test of Cooper's NewProd model for predicting success and failure of product development projects. Based on Canadian data, the model has been shown to make predictions which are 84% correct. Having reservations on the reliability and validity of the model on

  18. Sample Size Determination for Rasch Model Tests

    Science.gov (United States)

    Draxler, Clemens

    2010-01-01

    This paper is concerned with supplementing statistical tests for the Rasch model so that additionally to the probability of the error of the first kind (Type I probability) the probability of the error of the second kind (Type II probability) can be controlled at a predetermined level by basing the test on the appropriate number of observations.…

  19. Predictors of Willingness to Read in English: Testing a Model Based on Possible Selves and Self-Confidence

    Science.gov (United States)

    Khajavy, Gholam Hassan; Ghonsooly, Behzad

    2017-01-01

    The aim of the present study is twofold. First, it tests a model of willingness to read (WTR) based on L2 motivation and communication confidence (communication anxiety and perceived communicative competence). Second, it applies the recent theory of L2 motivation proposed by Dörnyei [2005. "The Psychology of Language Learner: Individual…

  20. Geometrical error calibration in reflective surface testing based on reverse Hartmann test

    Science.gov (United States)

    Gong, Zhidong; Wang, Daodang; Xu, Ping; Wang, Chao; Liang, Rongguang; Kong, Ming; Zhao, Jun; Mo, Linhai; Mo, Shuhui

    2017-08-01

In fringe-illumination deflectometry based on the reverse-Hartmann-test configuration, ray tracing of the modeled testing system is performed to reconstruct the test surface error. Careful calibration of the system geometry is required to achieve high testing accuracy. To realize high-precision surface testing with the reverse Hartmann test, a computer-aided geometrical error calibration method is proposed. The aberrations corresponding to various geometrical errors are studied. With the aberration weights for the various geometrical errors, computer-aided optimization of the system geometry with iterative ray tracing is carried out to calibrate the geometrical error, and accuracy on the order of subnanometers is achieved.

  1. User Context Aware Base Station Power Flow Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  2. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  3. A Bootstrap Cointegration Rank Test for Panels of VAR Models

    DEFF Research Database (Denmark)

    Callot, Laurent

    functions of the individual Cointegrated VARs (CVAR) models. A bootstrap based procedure is used to compute empirical distributions of the trace test statistics for these individual models. From these empirical distributions two panel trace test statistics are constructed. The satisfying small sample...

  4. Design of New Test Function Model Based on Multi-objective Optimization Method

    Directory of Open Access Journals (Sweden)

    Zhaoxia Shang

    2017-01-01

The space partitioning method, as a new algorithm, has been applied increasingly to investment portfolio planning and decision-making. However, few test functions currently exist for this algorithm, which has greatly restrained its further development and application. An innovative test function model is designed in this paper and is used to test the algorithm. It is shown that, for evaluating the space partitioning method in certain applications, this test function has a fairly obvious advantage.

  5. Species delineation using Bayesian model-based assignment tests: a case study using Chinese toad-headed agamas (genus Phrynocephalus

    Directory of Open Access Journals (Sweden)

    Fu Jinzhong

    2010-06-01

Background: Species are fundamental units in biology, yet much debate exists surrounding how we should delineate species in nature. Species discovery now requires the use of separate, corroborating datasets to quantify independently evolving lineages and test species criteria. However, the complexity of the speciation process has ushered in a need to infuse studies with new tools capable of aiding in species delineation. We suggest that model-based assignment tests are one such tool. This method circumvents constraints of traditional population genetic analyses and provides a novel means of describing cryptic and complex diversity in natural systems. Using toad-headed agamas of the Phrynocephalus vlangalii complex as a case study, we apply model-based assignment tests to microsatellite DNA data to test whether P. putjatia, a controversial species that closely resembles P. vlangalii morphologically, represents a valid species. Mitochondrial DNA and geographic data are also included to corroborate the assignment test results. Results: Assignment tests revealed two distinct nuclear DNA clusters, with 95% (230/243) of the individuals assigned to one of the clusters with > 90% probability. The nuclear genomes of the two clusters remained distinct in sympatry, particularly at three syntopic sites, suggesting the existence of reproductive isolation between the identified clusters. In addition, a mitochondrial ND2 gene tree revealed two deeply diverged clades, which were largely congruent with the two nuclear DNA clusters, with a few exceptions. Historical mitochondrial introgression events between the two groups might explain the disagreement between the mitochondrial and nuclear DNA data. The nuclear DNA clusters and mitochondrial clades corresponded nicely to the hypothesized distributions of P. vlangalii and P. putjatia. Conclusions: These results demonstrate that assignment tests based on microsatellite DNA data can be powerful tools

  6. Enhancing SAT-Based Test Pattern Generation

    Institute of Scientific and Technical Information of China (English)

    LIU Xin; XIONG You-lun

    2005-01-01

This paper presents modeling tools based on Boolean satisfiability (SAT) to solve test generation problems for combinational circuits. It adds a layer that maintains circuit-related information and value justification relations on top of a generic SAT algorithm, and it dovetails binary decision diagrams (BDDs) and SAT techniques to improve the efficiency of automatic test pattern generation (ATPG). More specifically, it first exploits inexpensive reconvergent fanout analysis of the circuit to gather information on local signal correlation by using BDD learning, then uses the learned information to restrict and focus the overall search space of SAT-based ATPG. The learning technique is effective and lightweight. The experimental results demonstrate the effectiveness of the approach.
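
As a toy illustration of SAT-based test generation (independent of the paper's BDD-learning layer), the sketch below encodes a single AND gate with a stuck-at-0 fault and asks a SAT solver for a distinguishing input pattern, using the PySAT library:

```python
from pysat.solvers import Glucose3

# SAT-based ATPG sketch for a tiny circuit: y = a AND b, target fault
# "y stuck-at-0". Encode the good gate, force the faulty copy to 0, and
# require the two outputs to differ (a one-gate "miter").
a, b, y, yf = 1, 2, 3, 4
clauses = [
    [-y, a], [-y, b], [-a, -b, y],   # y <-> (a AND b), Tseitin encoding
    [-yf],                           # faulty copy: output stuck at 0
    [y, yf], [-y, -yf],              # miter: good and faulty outputs differ
]
with Glucose3(bootstrap_with=clauses) as solver:
    if solver.solve():
        model = solver.get_model()   # list of signed literals
        pattern = {"a": a in model, "b": b in model}
        print("test pattern:", pattern)   # expect a=True, b=True
```

Real ATPG engines build the same kind of miter over the whole fault-affected cone; the learning described in the abstract prunes the resulting search space.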

  7. MATT: Multi Agents Testing Tool Based Nets within Nets

    Directory of Open Access Journals (Sweden)

    Sara Kerraoui

    2016-12-01

As part of this effort, we propose a model-based testing approach for multi-agent systems built on the Reference net model, and develop a tool that aims to provide a uniform and automated approach. The feasibility and the advantages of the proposed approach are shown through a short case study.

  8. A Review of Models for Computer-Based Testing. Research Report 2011-12

    Science.gov (United States)

    Luecht, Richard M.; Sireci, Stephen G.

    2011-01-01

    Over the past four decades, there has been incremental growth in computer-based testing (CBT) as a viable alternative to paper-and-pencil testing. However, the transition to CBT is neither easy nor inexpensive. As Drasgow, Luecht, and Bennett (2006) noted, many design engineering, test development, operations/logistics, and psychometric changes…

  9. Modelling inorganic and organic biocide leaching from CBA-amine (Copper–Boron–Azole) treated wood based on characterisation leaching tests

    Energy Technology Data Exchange (ETDEWEB)

Lupsea, Maria [University of Toulouse, INSA, UPS, INP, LISBP, 135 Avenue de Rangueil, F-31077 Toulouse (France); INRA, UMR 792, F-31400 Toulouse (France); CNRS, UMR 5504, F-31400 Toulouse (France); Paris-Est University, CSTB — Scientific and Technical Centre for the Building Industry, DEE/Environment and Life Cycle Engineering Team, 24 Rue Joseph Fourier, F-38400 Saint Martin d'Hères (France); Tiruta-Barna, Ligia, E-mail: ligia.barna@insa-toulouse.fr [University of Toulouse, INSA, UPS, INP, LISBP, 135 Avenue de Rangueil, F-31077 Toulouse (France); INRA, UMR 792, F-31400 Toulouse (France); CNRS, UMR 5504, F-31400 Toulouse (France); Schiopu, Nicoleta [Paris-Est University, CSTB — Scientific and Technical Centre for the Building Industry, DEE/Environment and Life Cycle Engineering Team, 24 Rue Joseph Fourier, F-38400 Saint Martin d'Hères (France); Schoknecht, Ute [BAM — Federal Institute for Materials Research and Testing, Division 4.1, Unter den Eichen 87, 12205 Berlin (Germany)

    2013-09-01

Numerical simulation of the leaching behaviour of treated wood is the most pertinent and least expensive method for predicting biocide release into water. Few studies based on mechanistic leaching models have been carried out so far. In this work, a coupled chemistry-mass transport model is developed for simulating the leaching behaviour of inorganic (Cu, B) and organic (tebuconazole) biocides from CBA-amine treated wood. The model is based on experimental investigations (lab-scale leaching tests coupled with chemical and structural analysis). It considers the biocides' interactions with wood solid components and with extractives (literature-confirmed reactions), as well as transport mechanisms (diffusion, convection) in different compartments. Simulation results helped identify the main fixation mechanisms: (i) direct complexation of Cu by wood phenolic and carboxylic sites (and not via a monoethanolamine complex) on lignin and hemicellulose, with a strong dependence on the nature of the extractives; (ii) pH-dependent binding of tebuconazole on polarized -OH moieties of the wood. The role of monoethanolamine is to provide a pore-solution pH of about 7.5, at which copper solubility is found to be lowest. The capability of the developed model to simulate the chemical and transport behaviour is the main result of this study. Moreover, it proved that characterization leaching tests (pH dependence and dynamic tests), combined with appropriate analytical methods, are useful experimental tools. Due to its flexibility for representing and simulating various leaching conditions, the chemical-transport model developed could be used to further simulate the leaching behaviour of CBA treated wood at larger scales. - Highlights: • Biocide and extractives leaching from ammonia-CBA treated wood were modelled. • The chemical-transport model identifies the main fixation/solubilisation mechanisms. • The model describes well the results of equilibrium and dynamic leaching

  10. Testing Social-driven Forces on the Evolution of Sahelian Rural Systems: A Combined Agent-based Modeling and Anthropological Approach

    OpenAIRE

    Saqalli , Mehdi; Gérard , B.; Bielders , C.; Defourny , Pierre

    2010-01-01

This article presents the results of a methodology combining extensive fieldwork, the formalization of field-based individual rules and norms into an agent-based model, and the implementation of scenarios analyzing the effects of social and agro-ecological constraints on rural farmers through the study of three different sites in the Nigerien Sahel. Two family transition processes are tested here, following field observations and literature-based hypotheses: family organiz...

  11. The impact of design-based modeling instruction on seventh graders' spatial abilities and model-based argumentation

    Science.gov (United States)

    McConnell, William J.

    Due to the call of current science education reform for the integration of engineering practices within science classrooms, design-based instruction is receiving much attention in science education literature. Although some aspect of modeling is often included in well-known design-based instructional methods, it is not always a primary focus. The purpose of this study was to better understand how design-based instruction with an emphasis on scientific modeling might impact students' spatial abilities and their model-based argumentation abilities. In the following mixed-method multiple case study, seven seventh grade students attending a secular private school in the Mid-Atlantic region of the United States underwent an instructional intervention involving design-based instruction, modeling and argumentation. Through the course of a lesson involving students in exploring the interrelatedness of the environment and an animal's form and function, students created and used multiple forms of expressed models to assist them in model-based scientific argument. Pre/post data were collected through the use of The Purdue Spatial Visualization Test: Rotation, the Mental Rotation Test and interviews. Other data included a spatial activities survey, student artifacts in the form of models, notes, exit tickets, and video recordings of students throughout the intervention. Spatial abilities tests were analyzed using descriptive statistics while students' arguments were analyzed using the Instrument for the Analysis of Scientific Curricular Arguments and a behavior protocol. Models were analyzed using content analysis and interviews and all other data were coded and analyzed for emergent themes. Findings in the area of spatial abilities included increases in spatial reasoning for six out of seven participants, and an immense difference in the spatial challenges encountered by students when using CAD software instead of paper drawings to create models. Students perceived 3D printed

  12. Physical modelling and testing in environmental geotechnics

    International Nuclear Information System (INIS)

    Garnier, J.; Thorel, L.; Haza, E.

    2000-01-01

The preservation of the natural environment has become a major concern, which nowadays affects a wide range of professionals, from local community administrators to natural resource managers (water, wildlife, flora, etc.) and, in the end, to the consumers that we all are. Although totally ignored some fifty years ago, environmental geotechnics has become an emergent area of study and research which borders on the traditional domains with which geo-technicians are confronted (soil and rock mechanics, engineering geology, natural and anthropogenic risk management). Dedicated to experimental approaches (in-situ investigations and tests, laboratory tests, small-scale model testing), the Symposium fits in with the geotechnical domains of environment and transport of soil pollutants. These proceedings report progress in the development of measurement techniques and in studies of pollutant transport in saturated and unsaturated soils, aimed at improving our understanding of such phenomena within multiphase environments. Experimental investigations of decontamination and isolation methods for polluted soils are discussed. The intention is to assess the impact of in-situ and laboratory tests, as well as small-scale model testing, on engineering practice. One paper is analysed in the INIS database for its specific interest to the nuclear industry. The other ones, concerning energy, are analysed in the ETDE database

  13. Social inequality and HIV-testing: Comparing home- and clinic-based testing in rural Malawi

    Directory of Open Access Journals (Sweden)

    Alexander A. Weinreb

    2009-10-01

The plan to increase HIV testing is a cornerstone of the international health strategy against the HIV/AIDS epidemic, particularly in sub-Saharan Africa. This paper highlights a problematic aspect of that plan: the reliance on clinic- rather than home-based testing. First, drawing on DHS data from across Africa, we demonstrate the substantial differences in socio-demographic and economic profiles between those who report having ever had an HIV test, and those who report never having had one. Then, using data from a random household survey in rural Malawi, we show that substituting home-based for clinic-based testing may eliminate this source of inequality between those tested and those not tested. This result, which is stable across modeling frameworks, has important implications for accurately and equitably addressing the counseling and treatment programs that comprise the international health strategy against AIDS, and that promise to shape the future trajectory of the epidemic in Africa and beyond.

  14. On selection of optimal stochastic model for accelerated life testing

    International Nuclear Information System (INIS)

    Volf, P.; Timková, J.

    2014-01-01

This paper deals with the problem of proper lifetime model selection in the context of statistical reliability analysis. Namely, we consider regression models describing the dependence of failure intensities on a covariate, for instance a stressor. Testing the model fit is standardly based on the so-called martingale residuals. Their analysis has already been studied by many authors. Nevertheless, the Bayes approach to the problem, in spite of its advantages, is still developing. We shall present the Bayes procedure of estimation in several semi-parametric regression models of failure intensity. Then, our main concern is the Bayes construction of residual processes and goodness-of-fit tests based on them. The method is illustrated with both artificial and real-data examples. - Highlights: • Statistical survival and reliability analysis and Bayes approach. • Bayes semi-parametric regression modeling in Cox's and AFT models. • Bayes version of martingale residuals and goodness-of-fit test

  15. Testing and Modeling of Mechanical Characteristics of Resistance Welding Machines

    DEFF Research Database (Denmark)

    Wu, Pei; Zhang, Wenqi; Bay, Niels

    2003-01-01

The dynamic mechanical response of a resistance welding machine is very important to the weld quality in resistance welding, especially in projection welding when collapse or deformation of the work piece occurs. It is mainly governed by the mechanical parameters of the machine. In this paper, a mathematical model for characterizing the dynamic mechanical responses of the machine and a special test set-up called the breaking test set-up are developed. Based on the model and the test results, the mechanical parameters of the machine are determined, including the equivalent mass, damping coefficient, and stiffness for both upper and lower electrode systems. This has laid a foundation for modeling the welding process and selecting the welding parameters considering the machine factors. The method is straightforward and easy to apply in industry, since the whole procedure is based on tests with no requirements...

  16. Stimulating Scientific Reasoning with Drawing-Based Modeling

    Science.gov (United States)

    Heijnes, Dewi; van Joolingen, Wouter; Leenaars, Frank

    2018-01-01

    We investigate the way students' reasoning about evolution can be supported by drawing-based modeling. We modified the drawing-based modeling tool SimSketch to allow for modeling evolutionary processes. In three iterations of development and testing, students in lower secondary education worked on creating an evolutionary model. After each…

  17. Bayes Factor Covariance Testing in Item Response Models.

    Science.gov (United States)

    Fox, Jean-Paul; Mulder, Joris; Sinharay, Sandip

    2017-12-01

    Two marginal one-parameter item response theory models are introduced, by integrating out the latent variable or random item parameter. It is shown that both marginal response models are multivariate (probit) models with a compound symmetry covariance structure. Several common hypotheses concerning the underlying covariance structure are evaluated using (fractional) Bayes factor tests. The support for a unidimensional factor (i.e., assumption of local independence) and differential item functioning are evaluated by testing the covariance components. The posterior distribution of common covariance components is obtained in closed form by transforming latent responses with an orthogonal (Helmert) matrix. This posterior distribution is defined as a shifted-inverse-gamma, thereby introducing a default prior and a balanced prior distribution. Based on that, an MCMC algorithm is described to estimate all model parameters and to compute (fractional) Bayes factor tests. Simulation studies are used to show that the (fractional) Bayes factor tests have good properties for testing the underlying covariance structure of binary response data. The method is illustrated with two real data studies.

  18. Fuzzy delay model based fault simulator for crosstalk delay fault test ...

    Indian Academy of Sciences (India)

In this paper, a fuzzy delay model based crosstalk delay fault simulator is proposed. As design trends move towards nanometer technologies, a growing number of new parameters affects the delay of a component. Fuzzy delay models are ideal for modelling the uncertainty found in the design and manufacturing steps.
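
To make the idea concrete: a triangular fuzzy number (min, mode, max) can represent an uncertain gate delay, and delays add component-wise along a path. The sketch below, with invented values rather than the paper's models, widens a path delay by a crosstalk term:

```python
# Triangular fuzzy delays (min, mode, max): delays add component-wise
# along a path, and a crosstalk-affected net can be modelled by adding a
# widening term. All values are illustrative, not from the paper.
def tfn_add(x, y):
    """Component-wise sum of two triangular fuzzy numbers."""
    return tuple(xi + yi for xi, yi in zip(x, y))

gate1 = (0.8, 1.0, 1.3)   # ns, nominal gate delay (assumed)
gate2 = (0.5, 0.7, 1.0)   # ns (assumed)
xtalk = (0.0, 0.2, 0.6)   # ns, extra delay under capacitive coupling (assumed)

path = tfn_add(tfn_add(gate1, gate2), xtalk)
print(f"path delay spans {path[0]:.1f} to {path[2]:.1f} ns, mode {path[1]:.1f} ns")
```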

  19. Testing a Dual Process Model of Gender-Based Violence: A Laboratory Examination.

    Science.gov (United States)

    Berke, Danielle S; Zeichner, Amos

    2016-01-01

The dire impact of gender-based violence on society compels development of models comprehensive enough to capture the diversity of its forms. Research has established hostile sexism (HS) as a robust predictor of gender-based violence. However, to date, research has yet to link men's benevolent sexism (BS) to physical aggression toward women, despite correlations between BS and HS and between BS and victim blaming. One model, the opposing process model of benevolent sexism (Sibley & Perry, 2010), suggests that, for men, BS acts indirectly through HS to predict acceptance of hierarchy-enhancing social policy as an expression of a preference for in-group dominance (i.e., social dominance orientation [SDO]). The extent to which this model applies to gender-based violence remains untested. Therefore, in this study, 168 undergraduate men in a U.S. university participated in a competitive reaction time task, during which they had the option to shock an ostensible female opponent as a measure of gender-based violence. Results of multiple-mediation path analyses indicated dual pathways potentiating gender-based violence and highlight SDO as a particularly potent mechanism of this violence. Findings are discussed in terms of group dynamics and norm-based violence prevention.

  20. Loglinear Rasch model tests

    NARCIS (Netherlands)

    Kelderman, Hendrikus

    1984-01-01

    Existing statistical tests for the fit of the Rasch model have been criticized, because they are only sensitive to specific violations of its assumptions. Contingency table methods using loglinear models have been used to test various psychometric models. In this paper, the assumptions of the Rasch

  1. Space Launch System Base Heating Test: Experimental Operations & Results

    Science.gov (United States)

    Dufrene, Aaron; Mehta, Manish; MacLean, Matthew; Seaford, Mark; Holden, Michael

    2016-01-01

    NASA's Space Launch System (SLS) uses four clustered liquid rocket engines along with two solid rocket boosters. The interaction between all six rocket exhaust plumes will produce a complex and severe thermal environment in the base of the vehicle. This work focuses on a recent 2% scale, hot-fire SLS base heating test. These base heating tests are short-duration tests executed with chamber pressures near the full-scale values with gaseous hydrogen/oxygen engines and RSRMV analogous solid propellant motors. The LENS II shock tunnel/Ludwieg tube tunnel was used at or near flight duplicated conditions up to Mach 5. Model development was based on the Space Shuttle base heating tests with several improvements including doubling of the maximum chamber pressures and duplication of freestream conditions. Test methodology and conditions are presented, and base heating results from 76 runs are reported in non-dimensional form. Regions of high heating are identified and comparisons of various configuration and conditions are highlighted. Base pressure and radiometer results are also reported.

  2. Animal models of toxicology testing: the role of pigs.

    Science.gov (United States)

    Helke, Kristi L; Swindle, Marvin Michael

    2013-02-01

    In regulatory toxicological testing, both a rodent and non-rodent species are required. Historically, dogs and non-human primates (NHP) have been the species of choice of the non-rodent portion of testing. The pig is an appropriate option for these tests based on metabolic pathways utilized in xenobiotic biotransformation. This review focuses on the Phase I and Phase II biotransformation pathways in humans and pigs and highlights the similarities and differences of these models. This is a growing field and references are sparse. Numerous breeds of pigs are discussed along with specific breed differences in these enzymes that are known. While much available data are presented, it is grossly incomplete and sometimes contradictory based on methods used. There is no ideal species to use in toxicology. The use of dogs and NHP in xenobiotic testing continues to be the norm. Pigs present a viable and perhaps more reliable model of non-rodent testing.

  3. Direct-to-consumer advertising of predictive genetic tests: a health belief model based examination of consumer response.

    Science.gov (United States)

    Rollins, Brent L; Ramakrishnan, Shravanan; Perri, Matthew

    2014-01-01

    Direct-to-consumer (DTC) advertising of predictive genetic tests (PGTs) has added a new dimension to health advertising. This study used an online survey based on the health belief model framework to examine and more fully understand consumers' responses and behavioral intentions in response to a PGT DTC advertisement. Overall, consumers reported moderate intentions to talk with their doctor and seek more information about PGTs after advertisement exposure, though consumers did not seem ready to take the advertised test or engage in active information search. Those who perceived greater threat from the disease, however, had significantly greater behavioral intentions and information search behavior.

  4. Development of an evaluation method for fracture mechanical tests on small samples based on a cohesive zone model

    International Nuclear Information System (INIS)

    Mahler, Michael

    2016-01-01

The safety and reliability of fourth-generation nuclear power plants is an important issue. It depends on a reliable assessment of the components, for which, among other things, fracture mechanical material properties are required. The irradiation present in the power plants significantly affects the material properties, which therefore need to be determined on irradiated material. Often only small amounts of irradiated material are available for characterization. In that case it is not possible to manufacture the sufficiently large specimens necessary for fracture mechanical testing in agreement with the standard, and small specimens must be used. From this follows the idea of this study, in which the fracture toughness is predicted with the developed method based on tests of small specimens. For this purpose, the fracture process including crack growth is described with a continuum mechanical approach using the finite element method and the cohesive zone model. The experiments on small specimens are used for parameter identification of the cohesive zone model. The two parameters of the cohesive zone model are determined by tensile tests on notched specimens (cohesive stress) and by parameter fitting to the fracture behavior of small specimens (cohesive energy). To account for the different triaxialities of the specimens, the cohesive stress is used as a function of triaxiality. After parameter identification, a large specimen can be simulated with the cohesive zone parameters derived from small specimens. The predicted fracture toughness of this large specimen fulfills the size requirements of the standard (ASTM E1820 or ASTM E399), in contrast to the small specimen. This method can be used for ductile and brittle material behavior and was validated in this work. In summary, this method offers the possibility to determine the fracture toughness indirectly based on small specimen testing. Its main advantage is the low required specimen volume. Thereby massively

  5. Pile Model Tests Using Strain Gauge Technology

    Science.gov (United States)

    Krasiński, Adam; Kusio, Tomasz

    2015-09-01

    Ordinary pile bearing capacity tests are usually carried out to determine the relationship between load and displacement of pile head. The measurement system required in such tests consists of force transducer and three or four displacement gauges. The whole system is installed at the pile head above the ground level. This approach, however, does not give us complete information about the pile-soil interaction. We can only determine the total bearing capacity of the pile, without the knowledge of its distribution into the shaft and base resistances. Much more information can be obtained by carrying out a test of instrumented pile equipped with a system for measuring the distribution of axial force along its core. In the case of pile model tests the use of such measurement is difficult due to small scale of the model. To find a suitable solution for axial force measurement, which could be applied to small scale model piles, we had to take into account the following requirements: - a linear and stable relationship between measured and physical values, - the force measurement accuracy of about 0.1 kN, - the range of measured forces up to 30 kN, - resistance of measuring gauges against aggressive counteraction of concrete mortar and against moisture, - insensitivity to pile bending, - economical factor. These requirements can be fulfilled by strain gauge sensors if an appropriate methodology is used for test preparation (Hoffmann [1]). In this paper, we focus on some aspects of the application of strain gauge sensors for model pile tests. The efficiency of the method is proved on the examples of static load tests carried out on SDP model piles acting as single piles and in a group.
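
The conversion from gauge readings to the pile resistance distribution follows directly from F = E·A·ε at each instrumented level; the drop in axial force between levels is the shaft resistance mobilised there. A minimal sketch with illustrative numbers (not the SDP test data):

```python
import numpy as np

# From measured strains to the axial-force distribution along a model
# pile: F = E * A * strain at each gauge level; the force shed between
# consecutive levels is the shaft resistance mobilised there, and the
# force at the deepest level approximates the base resistance.
E = 30e9                  # Young's modulus of the pile core [Pa] (assumption)
A = 0.008                 # pile cross-section area [m^2] (assumption)
depth = np.array([0.0, 0.5, 1.0, 1.5])            # gauge levels [m]
strain = np.array([250e-6, 180e-6, 110e-6, 40e-6])  # measured strains [-]

F = E * A * strain        # axial force at each level [N]
shaft = -np.diff(F)       # force transferred to the shaft per segment [N]
base = F[-1]              # residual force reaching the base [N]
print(F, shaft, base)
```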

  6. Pedestrian simulation model based on principles of bounded rationality: results of validation tests

    NARCIS (Netherlands)

    Zhu, W.; Timmermans, H.J.P.; Lo, H.P.; Leung, Stephen C.H.; Tan, Susanna M.L.

    2009-01-01

    Over the years, different modelling approaches to simulating pedestrian movement have been suggested. The majority of pedestrian decision models are based on the concept of utility maximization. To explore alternatives, we developed the heterogeneous heuristic model (HHM), based on principles of

  7. Home-Based Risk of Falling Assessment Test Using a Closed-Loop Balance Model.

    Science.gov (United States)

    Ayena, Johannes C; Zaibi, Helmi; Otis, Martin J-D; Menelas, Bob-Antoine J

    2016-12-01

The aim of this study is to improve and facilitate the methods used to assess the risk of falling at home among older people through the computation of a risk of falling in real time during daily activities. To support real-time computation of the risk of falling, a closed-loop balance model is proposed and compared with the One-Leg Standing Test (OLST). This balance model allows studying the postural response of a person subjected to an unpredictable perturbation. Twenty-nine volunteers participated in this study evaluating the effectiveness of the proposed system, including seventeen elderly participants (ten healthy elderly, 68.4 ± 5.5 years; seven Parkinson's disease (PD) subjects, 66.28 ± 8.9 years) and twelve healthy young adults (28.27 ± 3.74 years). Our work suggests that there is a relationship between the OLST score and the risk of falling based on center of pressure measurements from four low-cost force sensors located inside an instrumented insole, which could be predicted using our suggested closed-loop balance model. For long-term monitoring at home, this system could be included in a medical electronic record and could be useful as a diagnostic aid tool.

  8. Test models for improving filtering with model errors through stochastic parameter estimation

    International Nuclear Information System (INIS)

    Gershgorin, B.; Harlim, J.; Majda, A.J.

    2010-01-01

    The filtering skill for turbulent signals from nature is often limited by model errors created by utilizing an imperfect model for filtering. Updating the parameters in the imperfect model through stochastic parameter estimation is one way to increase filtering skill and model performance. Here a suite of stringent test models for filtering with stochastic parameter estimation is developed based on the Stochastic Parameterization Extended Kalman Filter (SPEKF). These new SPEKF-algorithms systematically correct both multiplicative and additive biases and involve exact formulas for propagating the mean and covariance including the parameters in the test model. A comprehensive study is presented of robust parameter regimes for increasing filtering skill through stochastic parameter estimation for turbulent signals as the observation time and observation noise are varied and even when the forcing is incorrectly specified. The results here provide useful guidelines for filtering turbulent signals in more complex systems with significant model errors.
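
In the spirit of such filters (though not the paper's exact SPEKF formulas, which propagate the augmented mean and covariance exactly), the sketch below estimates an unknown dynamics parameter by augmenting the state vector of an extended Kalman filter:

```python
import numpy as np

# Stochastic parameter estimation sketch: augment the state with an
# unknown multiplier a in x_{t+1} = a * x_t + noise and filter [x, a]
# jointly with an EKF. This is a generic illustration, not SPEKF itself.
rng = np.random.default_rng(2)
a_true, q, r = 0.9, 0.05, 0.1
x = 0.0
z = np.array([0.0, 0.5])              # filter state [x, a]; a starts wrong
P = np.eye(2)
Q = np.diag([q, 1e-4])                # small noise on a keeps it adaptable

for _ in range(300):
    x = a_true * x + rng.normal(0, q**0.5)    # truth
    y = x + rng.normal(0, r**0.5)             # noisy observation of x
    F = np.array([[z[1], z[0]], [0.0, 1.0]])  # Jacobian of (a*x, a) at z
    z = np.array([z[1] * z[0], z[1]])         # predict step
    P = F @ P @ F.T + Q
    S = P[0, 0] + r                           # innovation variance (H = [1, 0])
    K = P[:, 0] / S                           # Kalman gain
    z = z + K * (y - z[0])                    # update step
    P = P - np.outer(K, P[0, :])

print(f"estimated a = {z[1]:.3f} (true {a_true})")
```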

  9. A valuation-Based Test of Market Timing

    NARCIS (Netherlands)

    Koeter-Kant, J.; Elliott, W.B.; Warr, R.S.

    2007-01-01

    We implement an earnings-based fundamental valuation model to test the impact of market timing on the firm's method of funding the financing deficit. We argue that our valuation metric provides a superior measure of equity misvaluation because it avoids multiple interpretation problems faced by the

  10. Allele-sharing models: LOD scores and accurate linkage tests.

    Science.gov (United States)

    Kong, A; Cox, N J

    1997-11-01

    Starting with a test statistic for linkage analysis based on allele sharing, we propose an associated one-parameter model. Under general missing-data patterns, this model allows exact calculation of likelihood ratios and LOD scores and has been implemented by a simple modification of existing software. Most important, accurate linkage tests can be performed. Using an example, we show that some previously suggested approaches to handling less than perfectly informative data can be unacceptably conservative. Situations in which this model may not perform well are discussed, and an alternative model that requires additional computations is suggested.
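
The one-parameter likelihood-ratio idea can be sketched as maximising the product of (1 + delta·Z_i) over family NPL scores Z_i and reporting log10 of the ratio at the maximum; the scores below are invented, and the bound on delta is a simplification of this sketch rather than the paper's treatment:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# One-parameter allele-sharing model in the spirit of Kong & Cox: the
# likelihood ratio is prod_i (1 + delta * Z_i) over family NPL scores,
# and the LOD is log10 of the ratio at the MLE of delta.
def kong_cox_lod(Z):
    # Bound delta so 1 + delta * Z stays positive for every family.
    d_hi = 1.0 if Z.min() >= 0 else min(1.0, -0.999 / Z.min())
    neg_loglik = lambda d: -np.sum(np.log1p(d * Z))
    res = minimize_scalar(neg_loglik, bounds=(0.0, d_hi), method="bounded")
    return -res.fun / np.log(10.0), res.x

Z = np.random.default_rng(3).normal(0.3, 1.0, 40)   # invented NPL scores
lod, delta = kong_cox_lod(Z)
print(f"LOD = {lod:.2f} at delta = {delta:.3f}")
```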

  11. Radiation Belt Test Model

    Science.gov (United States)

    Freeman, John W.

    2000-10-01

    Rice University has developed a dynamic model of the Earth's radiation belts based on real-time data driven boundary conditions and full adiabaticity. The Radiation Belt Test Model (RBTM) successfully replicates the major features of storm-time behavior of energetic electrons: sudden commencement induced main phase dropout and recovery phase enhancement. It is the only known model to accomplish the latter. The RBTM shows the extent to which new energetic electrons introduced to the magnetosphere near the geostationary orbit drift inward due to relaxation of the magnetic field. It also shows the effects of substorm related rapid motion of magnetotail field lines for which the 3rd adiabatic invariant is violated. The radial extent of this violation is seen to be sharply delineated to a region outside of 5Re, although this distance is determined by the Hilmer-Voigt magnetic field model used by the RBTM. The RBTM appears to provide an excellent platform on which to build parameterized refinements to compensate for unknown acceleration processes inside 5Re where adiabaticity is seen to hold. Moreover, built within the framework of the MSFM, it offers the prospect of an operational forecast model for MeV electrons.

  12. Testing and Modeling of Machine Properties in Resistance Welding

    DEFF Research Database (Denmark)

    Wu, Pei

The objective of this work has been to test and model the machine properties, including the mechanical properties and the electrical properties, in resistance welding. The results are used to simulate the welding process more accurately. The state of the art in testing and modeling machine properties in resistance welding has been described based on a comprehensive literature study. The present thesis is subdivided into two parts: Part I: Mechanical properties of resistance welding machines. Part II: Electrical properties of resistance welding machines. In part I, the electrode force in the squeeze... as real projection welding tests, is easy to realize in industry, since tests may be performed in situ. In part II, an approach for characterizing the electrical properties of AC resistance welding machines is presented, involving testing and mathematical modelling of the weld current, the firing angle...

  13. Elastoplastic cup model for cement-based materials

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2010-03-01

Based on experimental data obtained from triaxial tests and a hydrostatic test, a cup model was formulated. Two plastic mechanisms, deviatoric shearing and pore collapse, are taken into account. The model also considers the influence of confining pressure. In this paper, the calibration of the model is detailed and numerical simulations of the main mechanical behavior of cement paste over a large range of stress are described, showing good agreement with experimental results. The case study shows that this cup model has extensive applicability for cement-based materials and other quasi-brittle, high-porosity materials in complex stress states.

  14. Feasibility and effectiveness of two community-based HIV testing models in rural Swaziland.

    Science.gov (United States)

    Parker, Lucy Anne; Jobanputra, Kiran; Rusike, Lorraine; Mazibuko, Sikhathele; Okello, Velephi; Kerschberger, Bernhard; Jouquet, Guillaume; Cyr, Joanne; Teck, Roger

    2015-07-01

To evaluate the feasibility (population reached, costs) and effectiveness (positivity rates, linkage to care) of two strategies of community-based HIV testing and counselling (HTC) in rural Swaziland. The strategies used were mobile HTC (MHTC) and home-based HTC (HBHTC). Information on age, sex, previous testing and HIV results was obtained from routine HTC records. A consecutive series of individuals testing HIV-positive were followed up for 6 months from the test date to assess linkage to care. A total of 9060 people were tested: 2034 through MHTC and 7026 through HBHTC. A higher proportion of children and adolescents (<20 years) were tested through HBHTC than MHTC (57% vs. 17%; P < 0.001). MHTC reached a higher proportion of adult men than HBHTC (42% vs. 39%; P = 0.015). Of 398 HIV-positive individuals, only 135 (34%) were enrolled in HIV care within 6 months. Of 42 individuals eligible for antiretroviral therapy, 22 (52%) started treatment within 6 months. Linkage to care was lowest among people who had tested previously and those aged 20-40 years. HBHTC was 50% cheaper (US$11 per person tested; $797 per individual enrolled in HIV care) than MHTC ($24 and $1698, respectively). In this high HIV prevalence setting, a community-based testing programme achieved high uptake of testing and appears to be an effective and affordable way to encourage large numbers of people to learn their HIV status (particularly underserved populations such as men and young people). However, for community HTC to impact mortality and incidence, strategies need to be implemented to ensure that people testing HIV-positive in the community are linked to HIV care.
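The reported unit costs imply a simple cost-per-outcome calculation. A minimal sketch follows; the total cost and enrollee count in the example are back-calculations from the abstract's per-unit figures, not reported data:

```python
def cost_per_outcome(total_cost: float, n_outcomes: int) -> float:
    """Programme cost divided by the number of desired outcomes
    (e.g. persons tested, or persons enrolled in HIV care)."""
    return total_cost / n_outcomes

# Hypothetical illustration consistent with the reported ratios:
# if HBHTC tested 7026 people at $11 each, total cost ~ $77,286;
# at $797 per person enrolled this implies roughly 97 enrollees.
hbhtc_total = 11 * 7026
print(round(hbhtc_total / 797))  # ~97 enrollees (back-calculated, not reported)
```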

  15. Testing a Web-Based, Trained-Peer Model to Build Capacity for Evidence-Based Practices in Community Mental Health Systems.

    Science.gov (United States)

    German, Ramaris E; Adler, Abby; Frankel, Sarah A; Stirman, Shannon Wiltsey; Pinedo, Paola; Evans, Arthur C; Beck, Aaron T; Creed, Torrey A

    2018-03-01

    Use of expert-led workshops plus consultation has been established as an effective strategy for training community mental health (CMH) clinicians in evidence-based practices (EBPs). Because of high rates of staff turnover, this strategy inadequately addresses the need to maintain capacity to deliver EBPs. This study examined knowledge, competency, and retention outcomes of a two-phase model developed to build capacity for an EBP in CMH programs. In the first phase, an initial training cohort in each CMH program participated in in-person workshops followed by expert-led consultation (in-person, expert-led [IPEL] phase) (N=214 clinicians). After this cohort completed training, new staff members participated in Web-based training (in place of in-person workshops), followed by peer-led consultation with the initial cohort (Web-based, trained-peer [WBTP] phase) (N=148). Tests of noninferiority assessed whether WBTP was not inferior to IPEL at increasing clinician cognitive-behavioral therapy (CBT) competency, as measured by the Cognitive Therapy Rating Scale. WBTP was not inferior to IPEL at developing clinician competency. Hierarchical linear models showed no significant differences in CBT knowledge acquisition between the two phases. Survival analyses indicated that WBTP trainees were less likely than IPEL trainees to complete training. In terms of time required from experts, WBTP required 8% of the resources of IPEL. After an initial investment to build in-house CBT expertise, CMH programs were able to use a WBTP model to broaden their own capacity for high-fidelity CBT. IPEL followed by WBTP offers an effective alternative to build EBP capacity in CMH programs, rather than reliance on external experts.

  16. USB environment measurements based on full-scale static engine ground tests

    Science.gov (United States)

    Sussman, M. B.; Harkonen, D. L.; Reed, J. B.

    1976-01-01

    Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle, and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data, and to establish a basis for future flight test comparisons.

  17. Using Virtual ATE Model to Migrate Test Programs

    Institute of Scientific and Technical Information of China (English)

    王晓明; 杨乔林

    1995-01-01

Because of the high development cost of IC (Integrated Circuit) test programs, recycling existing test programs from one kind of ATE (Automatic Test Equipment) to another, or generating them directly from CAD simulation modules, is increasingly valuable. In this paper, a new approach to migrating test programs is presented. A virtual ATE model based on the object-oriented paradigm is developed; it runs Test C++ (an intermediate test control language) programs and TeIF (Test Intermediate Format, an intermediate pattern), migrates test programs among three kinds of ATE (Ando DIC8032, Schlumberger S15 and GenRad 1732), and generates test patterns automatically from two kinds of CAD (Daisy and Panda).

  18. Physics-based modeling of live wildland fuel ignition experiments in the Forced Ignition and Flame Spread Test apparatus

    Science.gov (United States)

    C. Anand; B. Shotorban; S. Mahalingam; S. McAllister; D. R. Weise

    2017-01-01

    A computational study was performed to improve our understanding of the ignition of live fuel in the forced ignition and flame spread test apparatus, a setup where the impact of the heating mode is investigated by subjecting the fuel to forced convection and radiation. An improvement was first made in the physics-based model WFDS where the fuel is treated as fixed...

  19. Performance Analysis of Spotify® for Android with Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Ana Rosario Espada

    2017-01-01

This paper presents the foundations and the real use of a tool to automatically detect anomalies in Internet traffic produced by mobile applications. In particular, our MVE tool is focused on analyzing the impact that user interactions have on the traffic produced and received by the smartphones. To make the analysis exhaustive with regard to the potential user behaviors, we follow a model-based approach to automatically generate test cases to be executed on the smartphones. In addition, we make use of a specification language to define traffic patterns to be compared with the actual traffic in the device. MVE also includes monitoring and verification support to detect executions that do not fit the patterns. In these cases, the developer will obtain detailed information on the user actions that produce the anomaly in order to improve the application. To validate the approach, the paper presents an experimental study with the well-known Spotify app for Android, in which we detected some interesting behaviors. For instance, some HTTP connections do not end successfully due to timeout errors from the remote Spotify service.

  20. Semantics-based Automated Web Testing

    Directory of Open Access Journals (Sweden)

    Hai-Feng Guo

    2015-08-01

We present TAO, a software testing tool performing automated test and oracle generation based on a semantic approach. TAO entangles grammar-based test generation with automated semantics evaluation using a denotational semantics framework. We show how TAO can be incorporated with the Selenium automation tool for automated web testing, and how TAO can be further extended to support automated delta debugging, where a failing web test script can be systematically reduced based on grammar-directed strategies. A real-life parking website is used throughout the paper to demonstrate the effectiveness of our semantics-based web testing approach.
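As a concrete illustration of grammar-based test generation (a minimal sketch in the spirit of, but not taken from, TAO; the grammar and function names are invented for illustration):

```python
import random

# Expand a start symbol by randomly choosing productions until only
# terminals remain; each expansion is a candidate test input.
GRAMMAR = {
    "<expr>": [["<expr>", "+", "<term>"], ["<term>"]],
    "<term>": [["0"], ["1"], ["(", "<expr>", ")"]],
}

def generate(symbol: str, depth: int = 0, max_depth: int = 6) -> str:
    if symbol not in GRAMMAR:          # terminal symbol
        return symbol
    # Beyond max_depth, bias toward the shortest production to terminate.
    options = GRAMMAR[symbol]
    rule = min(options, key=len) if depth >= max_depth else random.choice(options)
    return "".join(generate(s, depth + 1, max_depth) for s in rule)

print(generate("<expr>"))  # e.g. "(1+0)+1"
```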

  1. Free-Suspension Residual Flexibility Testing of Space Station Pathfinder: Comparison to Fixed-Base Results

    Science.gov (United States)

    Tinker, Michael L.

    1998-01-01

Application of the free-suspension residual flexibility modal test method to the International Space Station Pathfinder structure is described. The Pathfinder, a large structure of the general size and weight of Space Station module elements, was also tested in a large fixed-base fixture to simulate Shuttle Orbiter payload constraints. After correlation of the Pathfinder finite element model to residual flexibility test data, the model was coupled to a fixture model, and constrained modes and frequencies were compared to fixed-base test modes. The residual flexibility model compared very favorably to results of the fixed-base test. This is the first known direct comparison of free-suspension residual flexibility and fixed-base test results for a large structure. The model correlation approach used by the author for residual flexibility data is presented. Frequency response functions (FRF) for the regions of the structure that interface with the environment (a test fixture or another structure) are shown to be the primary tools for model correlation that distinguish or characterize the residual flexibility approach. A number of critical issues related to use of the structure interface FRF for correlating the model are then identified and discussed, including (1) the requirement of prominent stiffness lines, (2) overcoming problems with measurement noise which makes the antiresonances or minima in the functions difficult to identify, and (3) the use of interface stiffness and lumped mass perturbations to bring the analytical responses into agreement with test data. It is shown that good comparison of analytical-to-experimental FRF is the key to obtaining good agreement of the residual flexibility values.

  2. Gradient-based model calibration with proxy-model assistance

    Science.gov (United States)

    Burrows, Wesley; Doherty, John

    2016-02-01

    Use of a proxy model in gradient-based calibration and uncertainty analysis of a complex groundwater model with large run times and problematic numerical behaviour is described. The methodology is general, and can be used with models of all types. The proxy model is based on a series of analytical functions that link all model outputs used in the calibration process to all parameters requiring estimation. In enforcing history-matching constraints during the calibration and post-calibration uncertainty analysis processes, the proxy model is run for the purposes of populating the Jacobian matrix, while the original model is run when testing parameter upgrades; the latter process is readily parallelized. Use of a proxy model in this fashion dramatically reduces the computational burden of complex model calibration and uncertainty analysis. At the same time, the effect of model numerical misbehaviour on calculation of local gradients is mitigated, this allowing access to the benefits of gradient-based analysis where lack of integrity in finite-difference derivatives calculation would otherwise have impeded such access. Construction of a proxy model, and its subsequent use in calibration of a complex model, and in analysing the uncertainties of predictions made by that model, is implemented in the PEST suite.
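The division of labour the abstract describes can be sketched compactly: a cheap proxy fills the finite-difference Jacobian, and the expensive model is run only to accept or reject each parameter upgrade. A minimal Gauss-Newton sketch (a stand-in for PEST's Levenberg-Marquardt machinery; all function names are hypothetical):

```python
import numpy as np

def jacobian_fd(f, p, h=1e-4):
    """Finite-difference Jacobian of f at parameter vector p,
    computed with the *cheap proxy* model."""
    f0 = f(p)
    J = np.empty((f0.size, p.size))
    for j in range(p.size):
        dp = np.zeros_like(p); dp[j] = h
        J[:, j] = (f(p + dp) - f0) / h
    return J

def calibrate(proxy, model, obs, p, iters=10):
    """Gauss-Newton loop: the proxy populates the Jacobian, the
    expensive model is only run to test each parameter upgrade."""
    best = np.sum((model(p) - obs) ** 2)        # one expensive run
    for _ in range(iters):
        J = jacobian_fd(proxy, p)               # many cheap runs
        r = obs - proxy(p)
        step, *_ = np.linalg.lstsq(J, r, rcond=None)
        trial = p + step
        phi = np.sum((model(trial) - obs) ** 2)  # one expensive run
        if phi >= best:
            break
        p, best = trial, phi
    return p
```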

  3. Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century

    Science.gov (United States)

    Bejar, Isaac I.; Graf, E. Aurora

    2010-01-01

    The duplex design by Bock and Mislevy for school-based testing is revisited and evaluated as a potential platform in test-based accountability assessments today. We conclude that the model could be useful in meeting the many competing demands of today's test-based accountability assessments, although many research questions will need to be…

  4. Mechanism-based population modelling for assessment of L-cell function based on total GLP-1 response following an oral glucose tolerance test

    DEFF Research Database (Denmark)

    Møller, Jonas B.; Jusko, William J.; Gao, Wei

    2011-01-01

GLP-1 is an insulinotropic hormone that synergistically with glucose gives rise to an increased insulin response. Its secretion is increased following a meal and it is thus of interest to describe the secretion of this hormone following an oral glucose tolerance test (OGTT). The aim of this study was to build a mechanism-based population model that describes the time course of total GLP-1 and provides indices for capability of secretion in each subject. The goal was thus to model the secretion of GLP-1, and not its effect on insulin production. Single 75 g doses of glucose were administered orally ... The individual estimates of absorption rate constants were used in the model for GLP-1 secretion. Estimation of parameters was performed using the FOCE method with interaction implemented in NONMEM VI. The final transit/indirect-response model obtained for GLP-1 production following an OGTT included two ...
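A generic transit-compartment absorption chain driving an indirect-response (stimulated production) equation, of the kind the abstract describes, can be written as a small ODE system. This sketch uses invented placeholder parameters, not the published NONMEM estimates:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic transit/indirect-response sketch (illustrative parameters):
# glucose passes through three transit compartments (rate ktr) and
# stimulates GLP-1 production above its baseline turnover kin/kout.
ktr, kin, kout, smax = 0.05, 10.0, 0.1, 3.0  # placeholder values

def rhs(t, y):
    a1, a2, a3, glp1 = y
    stim = 1.0 + smax * a3 / (a3 + 50.0)       # stimulation by gut glucose
    return [-ktr * a1,
            ktr * (a1 - a2),
            ktr * (a2 - a3),
            kin * stim - kout * glp1]

y0 = [75_000.0, 0.0, 0.0, kin / kout]          # 75 g oral glucose dose (mg)
sol = solve_ivp(rhs, (0.0, 240.0), y0, dense_output=True)
print(sol.y[3, -1])                            # total GLP-1 at t = 240 min
```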

  5. Fiber Bragg Grating-Based Performance Monitoring of Piles Fiber in a Geotechnical Centrifugal Model Test

    Directory of Open Access Journals (Sweden)

    Xiaolin Weng

    2014-01-01

In centrifugal tests, conventional sensors can hardly capture the performance of reinforcement in small-scale models. However, recent advances in fiber optic sensing technologies enable accurate and reliable monitoring of strain and temperature in laboratory geotechnical tests. This paper outlines a centrifugal model test, performed using a 60 g-ton geocentrifuge, to investigate the performance of pipe piles used to reinforce the loess foundation below a widened embankment. Prior to the test, quasi-distributed fiber Bragg grating (FBG) strain sensors were attached to the surface of the pipe piles to measure the lateral friction resistance in real time. The driving of the pipe piles was simulated via the centrifuge actuator. During testing, the variations of skin friction distribution along the pipe piles were measured automatically using an optical fiber interrogator. This paper presents a detailed analysis of the monitoring results. We verify the reliability of the fiber optic sensors in monitoring the model piles without affecting the integrity of the centrifugal model, and show that lateral friction resistance developed in stages as the pipe piles were pressed in and that it sometimes may become negative.

  6. Model-Free Autotuning Testing on a Model of a Three-Tank Cascade

    Directory of Open Access Journals (Sweden)

    Stanislav VRÁNA

    2009-06-01

A newly developed model-free autotuning method based on frequency response analysis has been tested on a laboratory set-up that represents a physical model of a three-tank cascade. This laboratory model was chosen for the following reasons: (a) the laboratory model was ready for computer control; (b) simultaneously, computer simulation could be effectively utilized, because a mathematical description of the cascade based on quite exactly valid relations was available; (c) the set-up provided the necessary degree of nonlinearity and changeable properties. The improvement of the laboratory set-up instrumentation presented here was necessary because the results obtained from the first experimental identification did not correspond to the results provided by the simulation. The data were evidently imprecise, because the available sensors and the conditions for process settling were inadequate.

  7. USING OF BYOD MODEL FOR TESTING OF EDUCATIONAL ACHIEVEMENTS ON THE BASIS OF GOOGLE SEARCH SERVICES

    Directory of Open Access Journals (Sweden)

    Tetiana Bondarenko

    2016-04-01

This article proposes a technology for testing educational achievement with learners' own mobile devices, based on the BYOD model. The proposed technology builds on Google cloud services and provides comprehensive support for a testing system: creating the appropriate forms, storing the results in cloud storage, processing test results and managing the testing system through Google Calendar. A number of cloud-based software products that allow the BYOD model to be used for testing educational achievement are described, and their strengths and weaknesses are identified. The article also describes the stages of the process of testing students' academic achievement with Google services under the BYOD model. The proposed approach extends the space and time of testing, makes the test procedure more flexible and systematic, and adds elements of a computer game to the testing procedure. The BYOD model opens up broad prospects for implementing ICT in all forms of the learning process, and particularly in testing educational achievement, in view of the limited computing resources in education.

  8. Validation of a Wave-Body Interaction Model by Experimental Tests

    DEFF Research Database (Denmark)

    Ferri, Francesco; Kramer, Morten; Pecher, Arthur

    2013-01-01

    Within the wave energy field, numerical simulation has recently acquired a worldwide consent as being a useful tool, besides physical model testing. The main goal of this work is the validation of a numerical model by experimental results. The numerical model is based on a linear wave-body intera...

  9. Physical modelling and testing in environmental geotechnics

    Energy Technology Data Exchange (ETDEWEB)

    Garnier, J.; Thorel, L.; Haza, E. [Laboratoire Central des Ponts et Chaussees a Nantes, 44 - Nantes (France)

    2000-07-01

    The preservation of natural environment has become a major concern, which affects nowadays a wide range of professionals from local communities administrators to natural resources managers (water, wildlife, flora, etc) and, in the end, to the consumers that we all are. Although totally ignored some fifty years ago, environmental geotechnics has become an emergent area of study and research which borders on the traditional domains, with which the geo-technicians are confronted (soil and rock mechanics, engineering geology, natural and anthropogenic risk management). Dedicated to experimental approaches (in-situ investigations and tests, laboratory tests, small-scale model testing), the Symposium fits in with the geotechnical domains of environment and transport of soil pollutants. These proceedings report some progress of developments in measurement techniques and studies of transport of pollutants in saturated and unsaturated soils in order to improve our understanding of such phenomena within multiphase environments. Experimental investigations on decontamination and isolation methods for polluted soils are discussed. The intention is to assess the impact of in-situ and laboratory tests, as well as small-scale model testing, on engineering practice. One paper has been analyzed in INIS data base for its specific interest in nuclear industry.

  10. Sensitivity testing practice on pre-processing parameters in hard and soft coupled modeling

    Directory of Open Access Journals (Sweden)

    Z. Ignaszak

    2010-01-01

This paper addresses the practical applicability of coupled modeling using hard and soft model types, and the need for a database adapted to such models. Database test results for a cylindrical 30 mm diameter casting made of AlSi7Mg alloy are presented. The simulation tests used the Calcosoft system with the CAFE (Cellular Automaton Finite Element) module. This module, which belongs to the 'multiphysics' models, enables structure prediction of the complete casting, distinguishing the columnar and equiaxed crystal zones of the primary phase. Sensitivity tests of the coupled model to changes in particular parameter values were made. On this basis, the influences on the position of the CET (columnar-to-equiaxed transition) zone were determined. An example of virtual structure validation against the real structure, based on CET zone location and grain size, is shown.

  11. Local and omnibus goodness-of-fit tests in classical measurement error models

    KAUST Repository

    Ma, Yanyuan

    2010-09-14

We consider functional measurement error models, i.e. models where covariates are measured with error and yet no distributional assumptions are made about the mismeasured variable. We propose and study a score-type local test and an orthogonal series-based, omnibus goodness-of-fit test in this context, where no likelihood function is available or calculated; i.e., all the tests are proposed in the semiparametric model framework. We demonstrate that our tests have optimality properties and computational advantages that are similar to those of the classical score tests in the parametric model framework. The test procedures are applicable to several semiparametric extensions of measurement error models, including when the measurement error distribution is estimated non-parametrically as well as for generalized partially linear models. The performance of the local score-type and omnibus goodness-of-fit tests is demonstrated through simulation studies and analysis of a nutrition data set.

  12. Shelf-Life Prediction of Extra Virgin Olive Oils Using an Empirical Model Based on Standard Quality Tests

    Directory of Open Access Journals (Sweden)

    Claudia Guillaume

    2016-01-01

Extra virgin olive oil shelf-life could be defined as the length of time under normal storage conditions within which no off-flavours or defects develop and quality parameters such as peroxide value and specific absorbance remain within the accepted limits for this commercial category. Prediction of shelf-life is a desirable goal in the food industry. Yet even though shelf-life should be one of the most important quality markers for extra virgin olive oil, it is not recognised as a legal parameter in most regulations and standards around the world. The empirical formula evaluated in the present study is based on common quality tests with known and predictable result changes over time, each capturing a different aspect of extra virgin olive oil with a meaningful influence on its shelf-life. The basic quality tests considered in the formula are Rancimat® induction time (IND); 1,2-diacylglycerols (DAGs); pyropheophytin a (PPP); and free fatty acids (FFA). This paper reports research into the actual shelf-life of commercially packaged extra virgin olive oils versus the predicted shelf-life of those oils, determined by analysing the expected deterioration curves for the basic quality tests detailed above. Based on the proposed model, shelf-life is predicted as the lowest predicted shelf-life from any of those tests.

  13. Refined Diebold-Mariano Test Methods for the Evaluation of Wind Power Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hao Chen

    2014-07-01

The scientific evaluation of forecast accuracy for wind power forecasting models is an important issue in the domain of wind power forecasting. Traditional forecast evaluation criteria, such as Mean Squared Error (MSE) and Mean Absolute Error (MAE), have limitations in application to some degree. In this paper, a modern evaluation criterion, the Diebold-Mariano (DM) test, is introduced. The DM test can discriminate significant differences in forecasting accuracy between different models based on a quantitative analysis scheme. Furthermore, an augmented DM test with a rolling-windows approach is proposed to give a stricter forecasting evaluation. By extending the loss function to an asymmetric structure, an asymmetric DM test is also proposed. A case study indicates that evaluation criteria based on the DM test can relieve the influence of random sample disturbance. Moreover, the proposed augmented DM test can provide more evidence when the cost of changing models is high, and the proposed asymmetric DM test can incorporate an asymmetry factor and provide a practical evaluation of wind power forecasting models. It is concluded that the two refined DM tests can serve as a reference for the comprehensive evaluation of wind power forecasting models.
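For reference, the classical DM statistic compares two forecast error series through their loss differential. A minimal sketch under squared-error loss (this is the standard test, not the paper's augmented or asymmetric refinements):

```python
import numpy as np

def dm_test(e1, e2, h=1):
    """Diebold-Mariano statistic for equal predictive accuracy under
    squared-error loss; the long-run variance uses a rectangular
    (h-1)-lag HAC correction, as is standard for h-step forecasts."""
    d = np.asarray(e1) ** 2 - np.asarray(e2) ** 2   # loss differential
    T = d.size
    dbar = d.mean()
    gamma = [np.sum((d[k:] - dbar) * (d[:T - k] - dbar)) / T for k in range(h)]
    lrv = gamma[0] + 2.0 * sum(gamma[1:])
    return dbar / np.sqrt(lrv / T)

# Under H0 the statistic is asymptotically N(0, 1); |DM| > 1.96 rejects
# equal accuracy at the 5% level.
rng = np.random.default_rng(0)
print(dm_test(rng.normal(0, 1.0, 500), rng.normal(0, 1.1, 500)))
```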

  14. Agent-based modelling of cholera diffusion

    NARCIS (Netherlands)

    Augustijn-Beckers, Petronella; Doldersum, Tom; Useya, Juliana; Augustijn, Dionysius C.M.

    2016-01-01

This paper introduces a spatially explicit agent-based simulation model for micro-scale cholera diffusion. The model simulates both an environmental reservoir of naturally occurring V. cholerae bacteria and hyperinfectious V. cholerae. The objective of the research is to test if runoff from open refuse

  15. Using Evidence Based Practice in LIS Education: Results of a Test of a Communities of Practice Model

    Directory of Open Access Journals (Sweden)

    Joyce Yukawa

    2010-03-01

Objective - This study investigated the use of a communities of practice (CoP) model for blended learning in library and information science (LIS) graduate courses. The purposes were to: (1) test the model's efficacy in supporting student growth related to core LIS concepts, practices, professional identity, and leadership skills, and (2) develop methods for formative and summative assessment using the model. Methods - Using design-based research principles to guide the formative and summative assessments, pre-, mid-, and post-course questionnaires were constructed to test the model and administered to students in three LIS courses taught by the author. Participation was voluntary and anonymous. A total of 34 students completed the three courses; the response rate for the questionnaires ranged from 47% to 95%. The pre-course questionnaire addressed attitudes toward technology and the use of technology for learning. The mid-course questionnaire addressed strengths and weaknesses of the course and suggestions for improvement. The post-course questionnaire addressed what students valued about their learning and any changes in attitude toward technology for learning. Data were analyzed on three levels. Micro-level analysis addressed technological factors related to usability and participant skills and attitudes. Meso-level analysis addressed social and pedagogical factors influencing community learning. Macro-level analysis addressed CoP learning outcomes, namely, knowledge of core concepts and practices, and the development of professional identity and leadership skills. Results - The students can be characterized as adult learners who were neither early nor late adopters of technology. At the micro-level, responses indicate that the online tools met high standards of usability and effectively supported online communication and learning. Moreover, the increase in positive attitudes toward the use of technology for learning at

  16. Some tests for parameter constancy in cointegrated VAR-models

    DEFF Research Database (Denmark)

    Hansen, Henrik; Johansen, Søren

    1999-01-01

Some methods for the evaluation of parameter constancy in vector autoregressive (VAR) models are discussed. Two different ways of re-estimating the VAR model are proposed: one in which all parameters are estimated recursively based upon the likelihood function for the first observations, and another in which the cointegrating relations are estimated recursively from a likelihood function in which the short-run parameters have been concentrated out. We suggest graphical procedures based on recursively estimated eigenvalues to evaluate the constancy of the long-run parameters in the model; these can be applied to test the constancy of the long-run parameters in the cointegrated VAR model. All results are illustrated using a model for the term structure of interest rates on US Treasury securities.

  17. Pescara benchmark: overview of modelling, testing and identification

    International Nuclear Information System (INIS)

    Bellino, A; Garibaldi, L; Marchesiello, S; Brancaleoni, F; Gabriele, S; Spina, D; Bregant, L; Carminelli, A; Catania, G; Sorrentino, S; Di Evangelista, A; Valente, C; Zuccarino, L

    2011-01-01

The 'Pescara benchmark' is part of the national research project 'BriViDi' (BRIdge VIbrations and DIagnosis) supported by the Italian Ministero dell'Universita e Ricerca. The project is aimed at developing an integrated methodology for the structural health evaluation of railway r/c and p/c bridges. The methodology should provide applicability in operating conditions, easy data acquisition through common industrial instrumentation, and robustness and reliability against structural and environmental uncertainties. The Pescara benchmark consisted of lab tests to obtain a consistent and large experimental database, and of subsequent data processing. Special tests were devised to simulate train transit effects under actual field conditions. Prestressed concrete beams of current industrial production, both sound and damaged at various corrosion severity levels, were tested. The results were collected both in a deterministic setting and in a form suitable for dealing with experimental uncertainties. Damage identification was split into two approaches: with or without a reference model. In the first case, finite element models were used in conjunction with non-conventional updating techniques. In the second case, specialized output-only identification techniques capable of dealing with time-variant and possibly nonlinear systems were developed. The lab tests allowed validation of the above approaches and of the performance of classical modal-based damage indicators.

  18. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    2007-01-01

This work presents the construction of a model for a PV panel using the single-diode five-parameter model, based exclusively on datasheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell ... Based on these equations, a PV panel model, which is able to predict the panel behavior in different temperature and irradiance conditions, is built and tested.
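The single-diode five-parameter model leads to an implicit I-V equation that must be solved numerically. A minimal sketch follows; the parameter values are illustrative placeholders, not datasheet values from the paper:

```python
import numpy as np
from scipy.optimize import brentq

# Single-diode five-parameter model: the panel current I at voltage V
# solves the implicit equation
#   I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh
# with photocurrent Iph, saturation current I0, series resistance Rs,
# shunt resistance Rsh, ideality factor n, and Ns cells in series.
Iph, I0, Rs, Rsh, n, Ns = 8.2, 1e-9, 0.3, 300.0, 1.3, 60  # illustrative
Vt = 0.02585                       # thermal voltage at ~25 C

def panel_current(V):
    f = lambda I: (Iph - I0 * (np.exp((V + I * Rs) / (n * Ns * Vt)) - 1)
                   - (V + I * Rs) / Rsh - I)
    return brentq(f, 0.0, Iph)     # current lies between 0 and Iph

for V in (0.0, 20.0, 30.0):
    print(V, round(panel_current(V), 3))
```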

  19. Profile control simulations and experiments on TCV: a controller test environment and results using a model-based predictive controller

    Science.gov (United States)

    Maljaars, E.; Felici, F.; Blanken, T. C.; Galperti, C.; Sauter, O.; de Baar, M. R.; Carpanese, F.; Goodman, T. P.; Kim, D.; Kim, S. H.; Kong, M.; Mavkov, B.; Merle, A.; Moret, J. M.; Nouailletas, R.; Scheffer, M.; Teplukhina, A. A.; Vu, N. M. T.; The EUROfusion MST1-team; The TCV-team

    2017-12-01

The successful performance of a model predictive profile controller is demonstrated in simulations and experiments on the TCV tokamak, employing a profile controller test environment. Stable high-performance tokamak operation in hybrid and advanced plasma scenarios requires control over the safety factor profile (q-profile) and kinetic plasma parameters such as the plasma beta. This demands the establishment of reliable profile control routines in presently operational tokamaks. We present a model predictive profile controller that controls the q-profile and plasma beta using power requests to two clusters of gyrotrons and the plasma current request. The performance of the controller is analyzed in both simulations and TCV L-mode discharges, where successful tracking of the estimated inverse q-profile as well as plasma beta is demonstrated under uncertain plasma conditions and in the presence of disturbances. The controller exploits knowledge of the time-varying actuator limits in the actuator input calculation itself, such that fast transitions between targets are achieved without overshoot. A software environment is employed to prepare and test this and three other profile controllers in parallel in simulations and experiments on TCV. This set of tools includes the rapid plasma transport simulator RAPTOR and various algorithms to reconstruct the plasma equilibrium and plasma profiles by merging the available measurements with model-based predictions. In this work the estimated q-profile is based solely on RAPTOR model predictions due to the absence of internal current density measurements in TCV. These results encourage further exploitation of model predictive profile control in experiments on TCV and other (future) tokamaks.

  20. Analytical Model of Coil Spring Damper Based on the Loading Test

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Sung Gook; Park, Woong Ki [INNOSE TECH Co. LTD, Incheon (Korea, Republic of); Furuya, Osamu [Tokyo City University, Tokyo (Japan); Kurabayashi, Hiroshi [Vibro-System, Tokyo (Japan)

    2016-05-15

One way of solving such problems is to enhance and develop an improved damping element for use in base-isolation and response control systems. Cost reduction of dampers for large-scale structures is another important task for upgrading total response control abilities in the near future. This study examined a response control device that uses the elastoplastic hysteresis damping of a metal material. The proposed damper is designed as a coil spring element to achieve uniform stress in the metal and to reduce low-cycle fatigue at large deformation, upgrading its repetitive strength during earthquake motions. By using SS400 general structural rolled steel, the cost of the damping element is effectively reduced. The analytical model of the elasto-plastic coil spring damper (CSD) is introduced, and its basic mechanical properties are evaluated experimentally and analytically. The paper describes the design method of the elasto-plastic coil spring damper, the basic mechanical properties evaluated from loading tests, and the analytical model of the damper. It was confirmed that the damping force and mechanical characteristics of the elasto-plastic coil spring damper almost satisfy the design specifications.

  1. Black hole based tests of general relativity

    International Nuclear Information System (INIS)

    Yagi, Kent; Stein, Leo C

    2016-01-01

General relativity has passed all solar system experiments and neutron star based tests, such as binary pulsar observations, with flying colors. A more exotic arena for testing general relativity is in systems that contain one or more black holes. Black holes are the most compact objects in the Universe, providing probes of the strongest-possible gravitational fields. We are motivated to study strong-field gravity since many theories give large deviations from general relativity only at large field strengths, while recovering the weak-field behavior. In this article, we review how one can probe general relativity and various alternative theories of gravity by using electromagnetic waves from a black hole with an accretion disk, and gravitational waves from black hole binaries. We first review model-independent ways of testing gravity with electromagnetic/gravitational waves from a black hole system. We then focus on selected examples of theories that extend general relativity in rather simple ways. Some important characteristics of general relativity include (but are not limited to) (i) only tensor gravitational degrees of freedom, (ii) the graviton is massless, (iii) no quadratic or higher curvatures in the action, and (iv) the theory is four-dimensional. Altering a characteristic leads to a different extension of general relativity: (i) scalar–tensor theories, (ii) massive gravity theories, (iii) quadratic gravity, and (iv) theories with large extra dimensions. Within each theory, we describe black hole solutions, their properties, and current and projected constraints on each theory using black hole based tests of gravity. We close this review by listing some of the open problems in model-independent tests and within each specific theory.

  2. Inverse hydrochemical models of aqueous extracts tests

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, L.; Samper, J.; Montenegro, L.

    2008-10-10

Aqueous extract testing is a laboratory technique commonly used to measure the amount of soluble salts in a soil sample after adding a known mass of distilled water. Measured aqueous extract data have to be re-interpreted in order to infer the porewater chemical composition of the sample, because porewater chemistry changes significantly due to dilution and the chemical reactions which take place during extraction. Here we present an inverse hydrochemical model to estimate porewater chemical composition from measured water content, aqueous extract, and mineralogical data. The model accounts for acid-base, redox, aqueous complexation, mineral dissolution/precipitation, gas dissolution/ex-solution, cation exchange and surface complexation reactions, all of which are assumed to take place at local equilibrium. It has been solved with INVERSE-CORE2D and tested with bentonite samples taken from the FEBEX (Full-scale Engineered Barrier EXperiment) in situ test. The inverse model reproduces most of the measured aqueous data except bicarbonate and provides an effective, flexible and comprehensive method to estimate the porewater chemical composition of clays. The main uncertainties are related to kinetic calcite dissolution and variations in CO2(g) pressure.

  3. Tests for detecting overdispersion in models with measurement error in covariates.

    Science.gov (United States)

    Yang, Yingsi; Wong, Man Yu

    2015-11-30

Measurement error in covariates can affect the accuracy in count data modeling and analysis. In overdispersion identification, the true mean-variance relationship can be obscured under the influence of measurement error in covariates. In this paper, we propose three tests for detecting overdispersion when covariates are measured with error: a modified score test and two score tests based on the proposed approximate likelihood and quasi-likelihood, respectively. The proposed approximate likelihood is derived under the classical measurement error model, and the resulting approximate maximum likelihood estimator is shown to have superior efficiency. Simulation results also show that the score test based on approximate likelihood outperforms the test based on quasi-likelihood and other alternatives in terms of empirical power. By analyzing a real dataset containing the health-related quality-of-life measurements of a particular group of patients, we demonstrate the importance of the proposed methods by showing that the analyses with and without measurement error correction yield significantly different results.
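For context, the classical (measurement-error-free) score test that these proposals modify can be written in a few lines. This is the Dean-Lawless form against the alternative var = mu(1 + alpha*mu), not the paper's corrected tests:

```python
import numpy as np
import statsmodels.api as sm

def overdispersion_score_test(y, X):
    """Score test of the Poisson null (var = mu) against
    var = mu(1 + alpha*mu); asymptotically N(0, 1) under the null."""
    mu = sm.GLM(y, X, family=sm.families.Poisson()).fit().mu
    return np.sum((y - mu) ** 2 - y) / np.sqrt(2.0 * np.sum(mu ** 2))

# Simulated overdispersed counts: negative binomial with mean mu.
rng = np.random.default_rng(1)
X = sm.add_constant(rng.normal(size=(500, 1)))
mu = np.exp(0.5 + 0.3 * X[:, 1])
y_nb = rng.negative_binomial(5, 5 / (5 + mu))
print(overdispersion_score_test(y_nb, X))   # large positive value -> reject
```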

  4. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses

    NARCIS (Netherlands)

    Kuiper, Rebecca M.; Nederhoff, Tim; Klugkist, Irene

    2015-01-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is

  5. An Empirical Test of a Model of Resistance to Persuasion.

    Science.gov (United States)

    And Others; Burgoon, Michael

    1978-01-01

    Tests a model of resistance to persuasion based upon variables not considered by earlier congruity and inoculation models. Supports the prediction that the kind of critical response set induced and the target of the criticism are mediators of resistance to persuasion. (JMF)

  6. A general diagnostic model applied to language testing data.

    Science.gov (United States)

    von Davier, Matthias

    2008-11-01

    Probabilistic models with one or more latent variables are designed to report on a corresponding number of skills or cognitive attributes. Multidimensional skill profiles offer additional information beyond what a single test score can provide, if the reported skills can be identified and distinguished reliably. Many recent approaches to skill profile models are limited to dichotomous data and have made use of computationally intensive estimation methods such as Markov chain Monte Carlo, since standard maximum likelihood (ML) estimation techniques were deemed infeasible. This paper presents a general diagnostic model (GDM) that can be estimated with standard ML techniques and applies to polytomous response variables as well as to skills with two or more proficiency levels. The paper uses one member of a larger class of diagnostic models, a compensatory diagnostic model for dichotomous and partial credit data. Many well-known models, such as univariate and multivariate versions of the Rasch model and the two-parameter logistic item response theory model, the generalized partial credit model, as well as a variety of skill profile models, are special cases of this GDM. In addition to an introduction to this model, the paper presents a parameter recovery study using simulated data and an application to real data from the field test for TOEFL Internet-based testing.
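Among the special cases the GDM subsumes, the two-parameter logistic (2PL) model has the simplest closed form. A minimal sketch of its item response function (the standard textbook formula, not the paper's estimation machinery):

```python
import numpy as np

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic IRT model, one of the special cases the
    GDM subsumes: P(correct | ability theta) for an item with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# A Rasch item is the a = 1 special case.
for theta in (-1.0, 0.0, 1.0):
    print(theta, round(p_correct_2pl(theta, a=1.0, b=0.0), 3))
```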

  7. Development and tests of a mouse voxel model for MCNPX based on Digimouse images

    Energy Technology Data Exchange (ETDEWEB)

    Melo M, B.; Ferreira F, C. [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Pte. Antonio Carlos No. 6627, Belo Horizonte 31270-901, Minas Gerais (Brazil); Garcia de A, I.; Machado T, B.; Passos Ribeiro de C, T., E-mail: bmm@cdtn.br [Universidade Federal de Minas Gerais, Departamento de Engenharia Nuclear, Pte. Antonio Carlos 6627, Belo Horizonte 31270-901, Minas Gerais (Brazil)

    2015-10-15

Mice have been widely used in experimental protocols involving ionizing radiation. Biological effects (BE) induced by radiation can compromise study results. Good estimates of mouse whole-body and organ absorbed doses could provide valuable information to researchers. The aim of this study was to create and test a new voxel phantom for mouse dosimetry from 'Digimouse' project images. Micro-CT images from the Digimouse project were used in this work. Corel PHOTO-PAINT software was used in the segmentation process. The three-dimensional (3-D) model assembly and its voxel size manipulation were performed with ImageJ. SISCODES was used to adapt the model to run in the MCNPX Monte Carlo code. The resulting model was called DMBRA. The volume and mass of the segmented organs were compared with data available in the literature. For the preliminary tests the heart was considered the source organ. Photons of diverse energies were simulated and SAF values obtained through the F6:p and +F6 MCNPX tallies. The results were compared with reference data. 3-D pictures of absorbed dose patterns and relative error distributions were generated by an in-house C++ program and visualized through the Amide software. The organ masses of DMBRA correlated well with two models that were based on the same set of images. However some organs, such as the eyes, adrenals, skeleton and brain, showed large discrepancies. Segmentation of an identical image set by different persons and/or methods can result in significant organ mass variations. We believe that the main causes of these differences were: (i) operator-dependent subjectivity in the definition of organ limits during the segmentation process; and (ii) distinct voxel dimensions between the evaluated models. A lack of reference data for mouse model construction and dosimetry was detected. Comparison with other models originating from different mouse strains also demonstrated that anatomical and size variability can be significant. Use of the +F6 tally for mouse

  8. Development and tests of a mouse voxel model for MCNPX based on Digimouse images

    International Nuclear Information System (INIS)

    Melo M, B.; Ferreira F, C.; Garcia de A, I.; Machado T, B.; Passos Ribeiro de C, T.

    2015-10-01

Mice have been widely used in experimental protocols involving ionizing radiation. Biological effects (BE) induced by radiation can compromise study results. Good estimates of mouse whole-body and organ absorbed doses could provide valuable information to researchers. The aim of this study was to create and test a new voxel phantom for mouse dosimetry from 'Digimouse' project images. Micro-CT images from the Digimouse project were used in this work. Corel PHOTO-PAINT software was used in the segmentation process. The three-dimensional (3-D) model assembly and its voxel size manipulation were performed with ImageJ. SISCODES was used to adapt the model to run in the MCNPX Monte Carlo code. The resulting model was called DMBRA. The volume and mass of the segmented organs were compared with data available in the literature. For the preliminary tests the heart was considered the source organ. Photons of diverse energies were simulated and SAF values obtained through the F6:p and +F6 MCNPX tallies. The results were compared with reference data. 3-D pictures of absorbed dose patterns and relative error distributions were generated by an in-house C++ program and visualized through the Amide software. The organ masses of DMBRA correlated well with two models that were based on the same set of images. However some organs, such as the eyes, adrenals, skeleton and brain, showed large discrepancies. Segmentation of an identical image set by different persons and/or methods can result in significant organ mass variations. We believe that the main causes of these differences were: (i) operator-dependent subjectivity in the definition of organ limits during the segmentation process; and (ii) distinct voxel dimensions between the evaluated models. A lack of reference data for mouse model construction and dosimetry was detected. Comparison with other models originating from different mouse strains also demonstrated that anatomical and size variability can be significant. Use of the +F6 tally for mouse phantoms

  9. Testing the Granger noncausality hypothesis in stationary nonlinear models of unknown functional form

    DEFF Research Database (Denmark)

    Péguin-Feissolle, Anne; Strikholm, Birgit; Teräsvirta, Timo

    In this paper we propose a general method for testing the Granger noncausality hypothesis in stationary nonlinear models of unknown functional form. These tests are based on a Taylor expansion of the nonlinear model around a given point in the sample space. We study the performance of our tests b...
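The classical linear Granger noncausality test is the special case these Taylor-expansion tests generalize. A minimal baseline sketch using statsmodels (the data here are simulated for illustration):

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Baseline *linear* Granger noncausality test (the classical special
# case; the paper's Taylor-expansion tests extend this to unknown
# nonlinear models). Column order: [effect, candidate cause].
rng = np.random.default_rng(2)
x = rng.normal(size=502)
y = np.zeros(502)
for t in range(2, 502):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()
data = np.column_stack([y[2:], x[2:]])
res = grangercausalitytests(data, maxlag=2, verbose=False)
print(res[1][0]["ssr_ftest"])   # (F, p, df_denom, df_num) at lag 1
```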

  10. Impact of Participatory Health Research: A Test of the Community-Based Participatory Research Conceptual Model

    Directory of Open Access Journals (Sweden)

    John G. Oetzel

    2018-01-01

Objectives. A key challenge in evaluating the impact of community-based participatory research (CBPR) is identifying which mechanisms and pathways are critical for health equity outcomes. Our purpose is to provide an empirical test of the CBPR conceptual model to address this challenge. Methods. A three-stage quantitative survey was completed: (1) 294 US CBPR projects with US federal funding were identified; (2) 200 principal investigators completed a questionnaire about project-level details; and (3) 450 community or academic partners and principal investigators completed a questionnaire about perceived contextual, process, and outcome variables. Seven in-depth qualitative case studies were conducted to explore elements of the model not captured in the survey; one is presented due to space limitations. Results. We demonstrated support for multiple mechanisms illustrated by the conceptual model using a latent structural equation model. Significant pathways were identified, showing the positive association of context with partnership structures and dynamics. Partnership structures and dynamics showed similar associations with partnership synergy and community involvement in research; both of these had positive associations with intermediate community changes and distal health outcomes. The case study complemented and extended understanding of the mechanisms by which partnerships can improve community conditions. Conclusions. The CBPR conceptual model is well suited to explain key relational and structural pathways for impact on health equity outcomes.

  11. Testing the Community-Based Learning Collaborative (CBLC) implementation model: a study protocol.

    Science.gov (United States)

    Hanson, Rochelle F; Schoenwald, Sonja; Saunders, Benjamin E; Chapman, Jason; Palinkas, Lawrence A; Moreland, Angela D; Dopp, Alex

    2016-01-01

    High rates of youth exposure to violence, either through direct victimization or witnessing, result in significant health/mental health consequences and high associated lifetime costs. Evidence-based treatments (EBTs), such as Trauma-Focused Cognitive Behavioral Therapy (TF-CBT), can prevent and/or reduce these negative effects, yet these treatments are not standard practice for therapists working with children identified by child welfare or mental health systems as needing services. While research indicates that collaboration among child welfare and mental health services sectors improves availability and sustainment of EBTs for children, few implementation strategies designed specifically to promote and sustain inter-professional collaboration (IC) and inter-organizational relationships (IOR) have undergone empirical investigation. A potential candidate for evaluation is the Community-Based Learning Collaborative (CBLC) implementation model, an adaptation of the Learning Collaborative which includes strategies designed to develop and strengthen inter-professional relationships between brokers and providers of mental health services to promote IC and IOR and achieve sustained implementation of EBTs for children within a community. This non-experimental, mixed methods study involves two phases: (1) analysis of existing prospective quantitative and qualitative quality improvement and project evaluation data collected pre and post, weekly, and monthly from 998 participants in one of seven CBLCs conducted as part of a statewide initiative; and (2) Phase 2 collection of new quantitative and qualitative (key informant interviews) data during the funded study period to evaluate changes in relations among IC, IOR, social networks and the penetration and sustainment of TF-CBT in targeted communities. Recruitment for Phase 2 is from the pool of 998 CBLC participants to achieve a targeted enrollment of n = 150. Study aims include: (1) Use existing quality improvement

  12. Test of Antifibrotic Drugs in a Cellular Model of Fibrosis Based on Muscle-Derived Fibroblasts from Duchenne Muscular Dystrophy Patients.

    Science.gov (United States)

    Zanotti, Simona; Mora, Marina

    2018-01-01

An in vitro model of muscle fibrosis, based on primary human fibroblasts isolated from muscle biopsies of patients affected by Duchenne muscular dystrophy (DMD) and cultivated in monolayer and 3D conditions, is used to test the potential antifibrotic activity of pirfenidone (PFD). This in vitro model may also be useful for evaluating the toxicity and efficacy of other candidate molecules for the treatment of fibrosis. Drug toxicity is evaluated using a colorimetric assay based on the conversion of tetrazolium salt (MTT) to insoluble formazan, while the effect of the drug on cell proliferation is measured with the bromodeoxyuridine incorporation assay. The efficacy of the drug is evaluated in fibroblast monolayers by quantifying the synthesis and deposition of intracellular collagen with a spectrophotometric picrosirius red-based assay, and by quantifying cell migration using a 'scratch' assay. The efficacy of PFD as an antifibrotic drug is also evaluated in a 3D fibroblast model by measuring the diameters and number of nodules.

  13. Dynamic material characterization by combining ballistic testing and an engineering model

    NARCIS (Netherlands)

    Carton, E.P.; Roebroeks, G.H.J.J.; Wal, R. van der

    2013-01-01

At TNO several energy-based engineering models have been created for the various failure mechanisms occurring in ballistic testing of materials, such as ductile hole growth, denting, and plugging. Such models are also under development for ceramic and fiber-based materials (fabrics). As the models are

  14. Reliable Prediction of Insulin Resistance by a School-Based Fitness Test in Middle-School Children

    Directory of Open Access Journals (Sweden)

David B. Allen

    2009-09-01

Objectives. (1) Determine the predictive value of a school-based test of cardiovascular fitness (CVF) for insulin resistance (IR); (2) compare a "school-based" prediction of IR to a "laboratory-based" prediction, using various measures of fitness and body composition. Methods. Middle school children performed the Progressive Aerobic Cardiovascular Endurance Run (PACER), a school-based CVF test, and underwent evaluation of maximal oxygen consumption (VO2 max) by treadmill testing, body composition (percent body fat and BMI z score), and IR (derived homeostasis model assessment index). Results. PACER showed a strong correlation with VO2 max/kg (r = 0.83) and with the derived IR index. Multivariate regression analysis revealed that a school-based model (using PACER and BMI z score) predicted IR similarly to a laboratory-based model (using VO2 max/kg of lean body mass and percent body fat). Conclusions. The PACER is a valid school-based test of CVF, is predictive of IR, and has a similar relationship to IR when compared to complex laboratory-based testing. Simple school-based measures of childhood fitness (PACER) and fatness (BMI z score) could be used to identify childhood risk for IR and evaluate interventions.
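The homeostasis model assessment (HOMA) index referenced above is a simple closed-form calculation. A minimal sketch of the standard formula (not specific to this study's data):

```python
def homa_ir(glucose_mmol_l: float, insulin_uU_ml: float) -> float:
    """Standard HOMA index of insulin resistance:
    fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5."""
    return glucose_mmol_l * insulin_uU_ml / 22.5

# Example: glucose 5.0 mmol/L and insulin 10 uU/mL give HOMA-IR ~ 2.2.
print(round(homa_ir(5.0, 10.0), 2))
```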

  15. TR-EDB: Test Reactor Embrittlement Data Base, Version 1

    Energy Technology Data Exchange (ETDEWEB)

    Stallmann, F.W.; Wang, J.A.; Kam, F.B.K. [Oak Ridge National Lab., TN (United States)

    1994-01-01

    The Test Reactor Embrittlement Data Base (TR-EDB) is a collection of results from irradiation in materials test reactors. It complements the Power Reactor Embrittlement Data Base (PR-EDB), whose data are restricted to the results from the analysis of surveillance capsules in commercial power reactors. The rationale behind their restriction was the assumption that the results of test reactor experiments may not be applicable to power reactors and could, therefore, be challenged if such data were included. For this very reason the embrittlement predictions in the Reg. Guide 1.99, Rev. 2, were based exclusively on power reactor data. However, test reactor experiments are able to cover a much wider range of materials and irradiation conditions that are needed to explore more fully a variety of models for the prediction of irradiation embrittlement. These data are also needed for the study of effects of annealing for life extension of reactor pressure vessels that are difficult to obtain from surveillance capsule results.

  16. TR-EDB: Test Reactor Embrittlement Data Base, Version 1

    International Nuclear Information System (INIS)

    Stallmann, F.W.; Wang, J.A.; Kam, F.B.K.

    1994-01-01

    The Test Reactor Embrittlement Data Base (TR-EDB) is a collection of results from irradiations in materials test reactors. It complements the Power Reactor Embrittlement Data Base (PR-EDB), whose data are restricted to results from the analysis of surveillance capsules in commercial power reactors. The rationale behind this restriction was the assumption that the results of test reactor experiments may not be applicable to power reactors and could, therefore, be challenged if such data were included. For this very reason the embrittlement predictions in Reg. Guide 1.99, Rev. 2, were based exclusively on power reactor data. However, test reactor experiments are able to cover the much wider range of materials and irradiation conditions needed to explore more fully a variety of models for the prediction of irradiation embrittlement. These data are also needed for studying the effects of annealing for life extension of reactor pressure vessels, which are difficult to obtain from surveillance capsule results

  17. The Test Reactor Embrittlement Data Base (TR-EDB)

    International Nuclear Information System (INIS)

    Stallmann, F.W.; Kam, F.B.K.; Wang, J.A.

    1993-01-01

    The Test Reactor Embrittlement Data Base (TR-EDB) is part of an ongoing program to collect test data from materials irradiations to aid in the research and evaluation of the embrittlement prediction models that are used to assure the safety of pressure vessels in power reactors. This program is funded by the US Nuclear Regulatory Commission (NRC) and has resulted in the publication of the Power Reactor Embrittlement Data Base (PR-EDB), whose second version is currently being released. The TR-EDB is a compatible collection of data from experiments in materials test reactors. These data contain information that is not obtainable from surveillance results, especially about the effects of annealing after irradiation. Other information that is only available from test reactors is the influence of fluence rates and irradiation temperatures on radiation embrittlement. The first version of the TR-EDB will be released in the fall of 1993 and contains published results from laboratories in many countries. Data collection will continue and further updates will be published

  18. Modeling and experimental tests of a copper thermosyphon

    Directory of Open Access Journals (Sweden)

    Paulo Henrique Dias dos Santos

    2017-02-01

    Full Text Available Electrical energy, solar energy, and/or direct combustion of a fuel are the most common thermal sources for home water heating. In recent years, the use of solar energy has become popular because it is a renewable and economical energy source. Among solar collectors, those assisted by thermosyphons are more efficient and can therefore enhance the heat transfer to water. A thermosyphon is basically a sealed tube filled with a working fluid and normally has three regions: the evaporator, the adiabatic section, and the condenser. The great advantage of this device is that the thermal resistance to heat transfer between its regions is very small, so the temperature difference across it is small. This article models a thermosyphon using correlations based on its operating limits. The model is intended as a design tool for compact solar collectors assisted by thermosyphons. Based on the results obtained with the mathematical model, a copper thermosyphon with deionized water as the working fluid was built and experimentally tested. The tests were carried out for a heat load varying from 30 to 60 W in the vertical position. The theoretical and experimental results were compared to verify the mathematical model.

  19. Inverse problems in the design, modeling and testing of engineering systems

    Science.gov (United States)

    Alifanov, Oleg M.

    1991-01-01

    Formulations, classification, areas of application, and approaches to solving different inverse problems are considered for the design of structures, modeling, and experimental data processing. Problems in the practical implementation of theoretical-experimental methods based on solving inverse problems are analyzed, in order to identify mathematical models of physical processes, to aid in input data preparation for design parameter optimization, to help in design parameter optimization itself, and to model experiments, large-scale tests, and real tests of engineering systems.

  20. Cohesive Zone Model Based Numerical Analysis of Steel-Concrete Composite Structure Push-Out Tests

    Directory of Open Access Journals (Sweden)

    J. P. Lin

    2014-01-01

    Full Text Available Push-out tests are widely used to determine the shear bearing capacity and shear stiffness of shear connectors in steel-concrete composite structures. The finite element method is an efficient alternative to push-out testing. This paper focused on a simulation analysis of the interface between concrete slabs and steel girder flanges, as well as the interface between the shear connectors and the surrounding concrete. A cohesive zone model was used to simulate the tangential sliding and normal separation of the interfaces. A zero-thickness cohesive element was implemented via the user-defined element subroutine UEL in the software ABAQUS, and a multiple broken line mode was used to define the constitutive relations of the cohesive zone. A three-dimensional numerical analysis model was established for push-out testing to analyze the load-displacement curves of the push-out test process, the interface relative displacement, and the interface stress distribution. This method was found to accurately calculate the shear capacity and shear stiffness of shear connectors. The numerical results showed that the multiple broken line mode cohesive zone model could describe the nonlinear mechanical behavior of the steel-concrete interface and that a discontinuous-deformation numerical simulation could be implemented.

  1. Machine Learning Approach for Software Reliability Growth Modeling with Infinite Testing Effort Function

    Directory of Open Access Journals (Sweden)

    Subburaj Ramasamy

    2017-01-01

    Full Text Available Reliability is one of the quantifiable software quality attributes. Software Reliability Growth Models (SRGMs) are used to assess the reliability achieved at different times of testing. Traditional time-based SRGMs may not be accurate enough in situations where test effort varies with time. To overcome this lacuna, test effort is used instead of time in SRGMs. In the past, finite test effort functions were proposed, which may not be realistic since, at infinite testing time, test effort should also be infinite. Hence in this paper, we propose an infinite test effort function in conjunction with a classical Nonhomogeneous Poisson Process (NHPP) model. We use an Artificial Neural Network (ANN) for training the proposed model with software failure data. It is possible to obtain many sets of weights for the same model that describe the past failure data equally well; we use a machine learning approach to select the set of weights that describes both the past and the future data well. We compare the performance of the proposed model with an existing model using practical software failure data sets. The proposed log-power TEF-based SRGM describes all types of failure data equally well, improves the accuracy of parameter estimation over existing TEFs, and can be used for software release time determination as well.
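
    The paper's exact test effort function (TEF) and ANN training procedure are not given in the abstract; the sketch below assumes one plausible unbounded log-power form, W(t) = (ln(1 + t))^beta, inside the classical NHPP mean value function m(t) = a(1 - exp(-b W(t))), fitted by nonlinear least squares on invented failure counts:

```python
# Minimal sketch (not the paper's exact formulation): an NHPP SRGM whose
# mean value function is driven by an unbounded log-power test effort
# function, fitted by least squares.
import numpy as np
from scipy.optimize import curve_fit

def mean_value(t, a, b, beta):
    w = np.log1p(t) ** beta            # cumulative test effort, unbounded in t
    return a * (1.0 - np.exp(-b * w))  # expected cumulative failures

# Invented failure-count data: (week, cumulative failures observed)
t_obs = np.arange(1, 21, dtype=float)
m_obs = np.array([5, 11, 16, 21, 25, 28, 31, 34, 36, 38,
                  40, 41, 43, 44, 45, 46, 47, 47, 48, 48], dtype=float)

popt, _ = curve_fit(mean_value, t_obs, m_obs, p0=[60, 0.3, 1.5], maxfev=20000)
a, b, beta = popt
print(f"a = {a:.1f} total faults, b = {b:.3f}, beta = {beta:.2f}")
```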

  2. Reactor noise diagnostics based on multivariate autoregressive modeling: Application to LOFT [Loss-of-Fluid-Test] reactor process noise

    International Nuclear Information System (INIS)

    Gloeckler, O.; Upadhyaya, B.R.

    1987-01-01

    Multivariate noise analysis of power reactor operating signals is useful for plant diagnostics, for isolating process and sensor anomalies, and for automated plant monitoring. In order to develop a reliable procedure, the previously established techniques for empirical modeling of fluctuation signals in power reactors have been improved. Application of the complete algorithm to operational data from the Loss-of-Fluid-Test (LOFT) Reactor showed that earlier conjectures (based on physical modeling) regarding the perturbation sources in a Pressurized Water Reactor (PWR) affecting coolant temperature and neutron power fluctuations can be systematically explained. This advanced methodology has important implications for plant diagnostics and for system or sensor anomaly isolation. 6 refs., 24 figs

  3. Testing and Inference in Nonlinear Cointegrating Vector Error Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbæk, Anders

    In this paper, we consider a general class of vector error correction models which allow for asymmetric and non-linear error correction. We provide asymptotic results for (quasi-)maximum likelihood (QML) based estimators and tests. General hypothesis testing is considered, where testing...... of non-stationary non-linear time series models. Thus the paper provides a full asymptotic theory for estimators as well as standard and non-standard test statistics. The derived asymptotic results prove to be new compared to results found elsewhere in the literature due to the impact of the estimated...... symmetric non-linear error correction considered. A simulation study shows that the finite sample properties of the bootstrapped tests are satisfactory with good size and power properties for reasonable sample sizes....

  4. Testing and Inference in Nonlinear Cointegrating Vector Error Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbek, Anders

    In this paper, we consider a general class of vector error correction models which allow for asymmetric and non-linear error correction. We provide asymptotic results for (quasi-)maximum likelihood (QML) based estimators and tests. General hypothesis testing is considered, where testing...... of non-stationary non-linear time series models. Thus the paper provides a full asymptotic theory for estimators as well as standard and non-standard test statistics. The derived asymptotic results prove to be new compared to results found elsewhere in the literature due to the impact of the estimated...... symmetric non-linear error correction are considered. A simulation study shows that the finite sample properties of the bootstrapped tests are satisfactory with good size and power properties for reasonable sample sizes....

  5. A Generic Danish Distribution Grid Model for Smart Grid Technology Testing

    DEFF Research Database (Denmark)

    Cha, Seung-Tae; Wu, Qiuwei; Østergaard, Jacob

    2012-01-01

    This paper describes the development of a generic Danish distribution grid model for smart grid technology testing based on the Bornholm power system. The frequency dependent network equivalent (FDNE) method has been used in order to accurately preserve the desired properties and characteristics...... by comparing the transient response of the original Bornholm power system model and the developed generic model under significant fault conditions. The results clearly show that the equivalent generic distribution grid model retains the dynamic characteristics of the original system, and can be used...... as a generic Smart Grid benchmark model for testing purposes.

  6. Creating a simulation model of software testing using Simulink package

    Directory of Open Access Journals (Sweden)

    V. M. Dubovoi

    2016-12-01

    Full Text Available Determining a model of software testing that allows prediction of both the whole process and its specific stages is a pressing problem for the IT industry. The article focuses on solving this problem. The aim of the article is to predict the duration and improve the quality of software testing. Analysis of the software testing process shows that it can be classed among branched cyclic technological processes, because it is cyclical with decision-making on control operations. The investigation builds on the authors' previous work and a software testing process method based on a Markov model. The proposed method enables prediction for each software module, which leads to better decision-making for each controlled suboperation of the process. A Simulink simulation model demonstrates the implementation and verifies the results of the proposed technique. The results of the research have been put into practice in the IT industry.
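
    As a rough illustration of the kind of Markov-model prediction described (the article's actual stage structure and transition probabilities are not given in the abstract), the sketch below treats testing as an absorbing Markov chain and derives the expected number of steps to release from the fundamental matrix:

```python
# Illustrative sketch with an assumed stage structure, not the article's
# model: an absorbing Markov chain over testing stages. Expected steps to
# completion follow from the fundamental matrix N = (I - Q)^-1.
import numpy as np

# Transient states: design tests, execute, analyze failures, fix & retest.
# Absorbing state: release. Transition probabilities are invented; each
# row's missing mass goes to the absorbing "release" state.
Q = np.array([
    [0.0, 1.0, 0.0, 0.0],   # design  -> execute
    [0.0, 0.0, 0.6, 0.0],   # execute -> analyze (0.4 -> release)
    [0.0, 0.0, 0.0, 1.0],   # analyze -> fix & retest
    [0.0, 0.7, 0.0, 0.0],   # fix     -> execute again (0.3 -> release)
])
N = np.linalg.inv(np.eye(4) - Q)     # fundamental matrix
expected_steps = N.sum(axis=1)       # expected visits before absorption
print("Expected steps to release from each stage:", expected_steps.round(2))
```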

  7. 46 CFR 154.449 - Model test.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Model test. 154.449 Section 154.449 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CERTAIN BULK DANGEROUS CARGOES SAFETY STANDARDS FOR SELF... § 154.449 Model test. The following analyzed data of a model test of structural elements for independent...

  8. Vibration tests and analyses of the reactor building model on a small scale

    International Nuclear Information System (INIS)

    Tsuchiya, Hideo; Tanaka, Mitsuru; Ogihara, Yukio; Moriyama, Ken-ichi; Nakayama, Masaaki

    1985-01-01

    The purpose of this paper is to describe the vibration tests and the simulation analyses of a small-scale reactor building model. The model vibration tests were performed to investigate the vibrational characteristics of the combined superstructure and to verify the computer code based on Dr. H. Tajimi's Thin Layered Element Theory, using a uniaxial shaking table (60 cm x 60 cm). The specimens consist of a ground model, three structural models (prestressed concrete containment vessel, inner concrete structure, and enclosure building), a combined structural model, and a combined structure-soil interaction model. These models are made of silicone rubber and have a scale of 1:600. Harmonic step-by-step excitation of 40 gals was performed to investigate the vibrational characteristics of each structural model. The responses of the specimens to harmonic excitation were measured by optical displacement meters and analyzed by a real-time spectrum analyzer. The resonance and phase lag curves of the specimens relative to the shaking table were obtained. In the tests of the combined structure-soil interaction model, three predominant frequencies were observed in the resonance curves. These values were in good agreement with the analytical transfer function curves from the computer code. From the vibration tests and the simulation analyses, the silicone-rubber model test is useful for the fundamental study of structural problems. The computer code based on the Thin Layered Element Theory can simulate the test results well. (Kobozono, M.)

  9. A wave model test bed study for wave energy resource characterization

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Zhaoqing; Neary, Vincent S.; Wang, Taiping; Gunawan, Budi; Dallman, Annie R.; Wu, Wei-Cheng

    2017-12-01

    This paper presents a test bed study conducted to evaluate best practices in wave modeling to characterize energy resources. The model test bed off the central Oregon Coast was selected because of the high wave energy and available measured data at the site. Two third-generation spectral wave models, SWAN and WWIII, were evaluated. A four-level nested-grid approach—from global to test bed scale—was employed. Model skills were assessed using a set of model performance metrics based on comparing six simulated wave resource parameters to observations from a wave buoy inside the test bed. Both WWIII and SWAN performed well at the test bed site and exhibited similar modeling skills. The ST4 package with WWIII, which represents better physics for wave growth and dissipation, out-performed ST2 physics and improved wave power density and significant wave height predictions. However, ST4 physics tended to overpredict the wave energy period. The newly developed ST6 physics did not improve the overall model skill for predicting the six wave resource parameters. Sensitivity analysis using different wave frequencies and direction resolutions indicated the model results were not sensitive to spectral resolutions at the test bed site, likely due to the absence of complex bathymetric and geometric features.

  10. Automatic Generation of Test Cases from UML Models

    Directory of Open Access Journals (Sweden)

    Constanza Pérez

    2018-04-01

    Full Text Available [Context] The growing demand for high-quality software has driven industry to adopt processes that enable compliance with these standards, but at increased development cost. One strategy to reduce this cost is to incorporate quality evaluations from the early stages of software development. A technique that facilitates this evaluation is model-based testing, which allows test cases to be generated in early phases using the system's conceptual models as input. [Objective] In this paper, we introduce TCGen, a tool that enables the automatic generation of abstract test cases starting from UML conceptual models. [Method] The design and implementation of TCGen, a technique that applies different testing criteria to class diagrams and state transition diagrams to generate test cases, is presented as a model-based testing approach. To do that, TCGen uses UML models, which are widely used in industry, and a set of algorithms that recognize the concepts in the models in order to generate abstract test cases. [Results] An exploratory experimental evaluation has been performed to compare the TCGen tool with traditional testing. [Conclusions] Even though the exploratory evaluation shows promising results, more empirical evaluations are necessary in order to generalize the results.

  11. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking of legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers....

  12. Model-based testing for space-time interaction using point processes: An application to psychiatric hospital admissions in an urban area.

    Science.gov (United States)

    Meyer, Sebastian; Warnke, Ingeborg; Rössler, Wulf; Held, Leonhard

    2016-05-01

    Spatio-temporal interaction is inherent to cases of infectious diseases and occurrences of earthquakes, whereas the spread of other events, such as cancer or crime, is less evident. Statistical significance tests of space-time clustering usually assess the correlation between the spatial and temporal (transformed) distances of the events. Although appealing through simplicity, these classical tests do not adjust for the underlying population nor can they account for a distance decay of interaction. We propose to use the framework of an endemic-epidemic point process model to jointly estimate a background event rate explained by seasonal and areal characteristics, as well as a superposed epidemic component representing the hypothesis of interest. We illustrate this new model-based test for space-time interaction by analysing psychiatric inpatient admissions in Zurich, Switzerland (2007-2012). Several socio-economic factors were found to be associated with the admission rate, but there was no evidence of general clustering of the cases. Copyright © 2016 Elsevier Ltd. All rights reserved.
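
    For contrast with the model-based approach, the classical style of test the abstract criticizes can be sketched as a Knox-type permutation test, counting event pairs close in both space and time; the thresholds and data below are invented:

```python
# Sketch of a classical Knox-type space-time interaction test, i.e. the
# kind of test the abstract contrasts with the model-based approach.
import numpy as np

rng = np.random.default_rng(0)
n = 200
xy = rng.uniform(0, 10, size=(n, 2))      # event locations (km), invented
t = rng.uniform(0, 365, size=n)           # event times (days), invented

def knox_statistic(xy, t, d_max=1.0, t_max=14.0):
    """Count event pairs that are close in both space and time."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    dt = np.abs(t[:, None] - t[None, :])
    close = (d < d_max) & (dt < t_max)
    return np.triu(close, k=1).sum()      # each pair counted once

observed = knox_statistic(xy, t)
# Monte Carlo null: shuffle times to break any space-time linkage.
null = [knox_statistic(xy, rng.permutation(t)) for _ in range(999)]
p = (1 + sum(s >= observed for s in null)) / 1000
print(f"Knox statistic = {observed}, permutation p = {p:.3f}")
```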

  13. Stationarity test with a direct test for heteroskedasticity in exchange rate forecasting models

    Science.gov (United States)

    Khin, Aye Aye; Chau, Wong Hong; Seong, Lim Chee; Bin, Raymond Ling Leh; Teng, Kevin Low Lock

    2017-05-01

    The global economy has slowed in recent years, a trend manifested in greater exchange rate volatility on international commodity markets. This study analyzes some prominent exchange rate forecasting models for Malaysian commodity trading: univariate ARIMA, ARCH, and GARCH models, in conjunction with a stationarity test and direct testing of heteroskedasticity in the residual diagnostics. All forecasting models utilized monthly data from 1990 to 2015. Given a total of 312 observations, the data were used to forecast both short-term and long-term exchange rates. The forecasting power statistics suggested that the forecasting performance of the ARIMA(1,1,1) model is more efficient than that of the ARCH(1) and GARCH(1,1) models. For the ex-post forecast, the exchange rate increased from RM 3.50 per USD in January 2015 to RM 4.47 per USD in December 2015 based on the baseline data. For the short-term ex-ante forecast, the analysis results indicate a decrease in the exchange rate in June 2016 (RM 4.27 per USD) compared with December 2015. A more appropriate exchange rate forecasting method is vital to aid decision-making and planning for sustainable commodity production in the world economy.
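
    A minimal sketch of the ARIMA(1,1,1) fit and six-month ex-ante forecast described above, using statsmodels on a simulated monthly series (the study's actual MYR/USD data are not reproduced):

```python
# Hedged sketch: fit ARIMA(1,1,1) to a monthly exchange-rate series and
# produce a 6-month ex-ante forecast. The series here is simulated.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
# Simulated random-walk-like monthly rate, 1990-01 .. 2015-12 (312 obs.)
idx = pd.date_range("1990-01-01", periods=312, freq="MS")
rate = pd.Series(3.0 + np.cumsum(rng.normal(0, 0.02, 312)), index=idx)

model = ARIMA(rate, order=(1, 1, 1)).fit()
print(model.summary().tables[1])          # AR/MA coefficient estimates
print(model.forecast(steps=6))            # 6-month ex-ante forecast
```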

  14. Realistic evaluation of tester exposure based on Florida testing experience

    International Nuclear Information System (INIS)

    Schreiber, R.A.

    1990-01-01

    This paper reports on a radon decay product exposure model for Florida Certified Radon Measurement Technicians, formulated based on the guidance of 10CFR20. The model was used to estimate the exposure of 44 Florida measurement technicians from January through November of 1989. Comparing estimated testing and home exposure shows that 100% of the technicians observed received more exposure at home than during testing activities. Exposure during normal office hours also exceeded testing exposure for 86% of the technicians observed. Health and safety exposure data for radon measurement technicians do not follow the standard concepts of occupational radiation exposure normally accepted in 10CFR20

  15. 46 CFR 154.431 - Model test.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Model test. 154.431 Section 154.431 Shipping COAST GUARD... Model test. (a) The primary and secondary barrier of a membrane tank, including the corners and joints...(c). (b) Analyzed data of a model test for the primary and secondary barrier of the membrane tank...

  16. Laboratory test of an APS-based sun sensor prototype

    Science.gov (United States)

    Rufino, Giancarlo; Perrotta, Alessandro; Grassi, Michele

    2017-11-01

    This paper deals with the design and prototype development of an Active Pixel Sensor-based miniature sun sensor and a laboratory facility for its indoor test and calibration. The miniature sun sensor is described and the laboratory test facility is presented in detail. The major focus of the paper is on tests and calibration of the sensor. Two different calibration functions have been adopted. They are based, respectively, on a geometrical model, which required least-squares estimation of the system's physical parameters, and on neural networks. Calibration results are presented for both solutions, showing that accuracy on the order of 0.01° has been achieved. The neural calibration functions attained better performance thanks to their intrinsic auto-adaptive structure.

  17. EPR-based material modelling of soils

    Science.gov (United States)

    Faramarzi, Asaad; Alani, Amir M.

    2013-04-01

    In the past few decades, as a result of rapid developments in computational software and hardware, alternative computer-aided pattern recognition approaches have been introduced for modelling many engineering problems, including constitutive modelling of materials. The main idea behind pattern recognition systems is that they learn adaptively from experience and extract various discriminants, each appropriate for its purpose. In this work an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR). EPR is a recently developed hybrid data mining technique that searches for structured mathematical equations (representing the behaviour of a system) using a genetic algorithm and the least squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some of the well-known conventional material models, and it is shown that EPR-based models can provide a better prediction of the behaviour of soils. The main benefits of EPR-based material models are that they provide a unified approach to constitutive modelling of all materials (i.e., all aspects of material behaviour can be implemented within the unified environment of an EPR model) and that they do not require any arbitrary choice of constitutive (mathematical) model. In EPR-based material models there are no material parameters to be identified. Because the models are trained directly on experimental data, they are the shortest route from experimental research (data) to numerical modelling. Another advantage of EPR-based constitutive models is that, as more experimental data become available, the quality of the EPR prediction can be improved by learning from the additional data, so the EPR model becomes more effective and robust. The developed EPR-based material models can be incorporated in finite element (FE) analysis.

  18. Quantitative Assessment of Optical Coherence Tomography Imaging Performance with Phantom-Based Test Methods And Computational Modeling

    Science.gov (United States)

    Agrawal, Anant

    Optical coherence tomography (OCT) is a powerful medical imaging modality that uniquely produces high-resolution cross-sectional images of tissue using low-energy light. Its clinical applications and technological capabilities have grown substantially since its invention about twenty years ago, but only limited efforts have been made to develop tools for assessing the performance of OCT devices with respect to the quality and content of acquired images. Such tools are important to ensure that information derived from OCT signals and images is accurate and consistent, in order to support further technology development, promote standardization, and benefit public health. The research in this dissertation investigates new physical and computational models which can provide unique insights into specific performance characteristics of OCT devices. Physical models, known as phantoms, are fabricated and evaluated in the interest of establishing standardized test methods to measure several important quantities relevant to image quality. (1) Spatial resolution is measured with a nanoparticle-embedded phantom and model eye which together yield the point spread function under conditions where OCT is commonly used. (2) A multi-layered phantom is constructed to measure the contrast transfer function along the axis of light propagation, relevant for cross-sectional imaging capabilities. (3) Existing and new methods to determine device sensitivity are examined and compared, to better understand the detection limits of OCT. A novel computational model based on the finite-difference time-domain (FDTD) method, which simulates the physics of light behavior at the sub-microscopic level within complex, heterogeneous media, is developed to probe device and tissue characteristics influencing the information content of an OCT image. This model is first tested in simple geometric configurations to understand its accuracy and limitations, then a highly realistic representation of a biological cell, the retinal

  19. Embracing model-based designs for dose-finding trials.

    Science.gov (United States)

    Love, Sharon B; Brown, Sarah; Weir, Christopher J; Harbron, Chris; Yap, Christina; Gaschler-Markefski, Birgit; Matcham, James; Caffrey, Louise; McKevitt, Christopher; Clive, Sally; Craddock, Charlie; Spicer, James; Cornelius, Victoria

    2017-07-25

    Dose-finding trials are essential to drug development as they establish recommended doses for later-phase testing. We aim to motivate wider use of model-based designs for dose finding, such as the continual reassessment method (CRM). We carried out a literature review of dose-finding designs and conducted a survey to identify perceived barriers to their implementation. We describe the benefits of model-based designs (flexibility, superior operating characteristics, extended scope), their current uptake, and existing resources. The most prominent barriers to implementation of a model-based design were lack of suitable training, chief investigators' preference for algorithm-based designs (e.g., 3+3), and limited resources for study design before funding. We use a real-world example to illustrate how these barriers can be overcome. There is overwhelming evidence for the benefits of CRM. Many leading pharmaceutical companies routinely implement model-based designs. Our analysis identified barriers for academic statisticians and clinical academics in mirroring the progress industry has made in trial design. Unified support from funders, regulators, and journal editors could result in more accurate doses for later-phase testing, and increase the efficiency and success of clinical drug development. We give recommendations for increasing the uptake of model-based designs for dose-finding trials in academia.
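
    For readers unfamiliar with the CRM, the sketch below shows one common formulation (not tied to any specific trial in the paper): a one-parameter power model p_i(a) = skeleton_i^exp(a) with a normal prior on a, updated on invented dose-limiting toxicity (DLT) data:

```python
# Minimal sketch of a continual reassessment method (CRM) update. The
# skeleton, target, prior scale, and trial data are all invented.
import numpy as np

skeleton = np.array([0.05, 0.10, 0.20, 0.35, 0.50])  # prior DLT guesses
target = 0.25
# Observed data so far: (dose index, number treated, number of DLTs)
data = [(0, 3, 0), (1, 3, 0), (2, 3, 1)]

a_grid = np.linspace(-3, 3, 601)
prior = np.exp(-a_grid**2 / (2 * 1.34**2))   # normal prior; assumed scale

def likelihood(a):
    p = skeleton[:, None] ** np.exp(a)       # shape (doses, grid points)
    l = np.ones_like(a)
    for dose, n, y in data:                  # binomial likelihood per cohort
        l *= p[dose] ** y * (1 - p[dose]) ** (n - y)
    return l

post = prior * likelihood(a_grid)
post /= np.trapz(post, a_grid)               # normalize posterior over a
p_hat = np.trapz(skeleton[:, None] ** np.exp(a_grid) * post, a_grid, axis=1)
next_dose = int(np.argmin(np.abs(p_hat - target)))
print("Posterior DLT estimates:", p_hat.round(3), "-> next dose:", next_dose)
```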

  20. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  1. Meaning-Based Scoring: A Systemic Functional Linguistics Model for Automated Test Tasks

    Science.gov (United States)

    Gleason, Jesse

    2014-01-01

    Communicative approaches to language teaching that emphasize the importance of speaking (e.g., task-based language teaching) require innovative and evidence-based means of assessing oral language. Nonetheless, research has yet to produce an adequate assessment model for oral language (Chun 2006; Downey et al. 2008). Limited by automatic speech…

  2. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment
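
    A deliberately simplified sketch of the likelihood-ratio validation metric for normally distributed measurement error; the prediction, data, and decision threshold below are invented, and the paper derives its threshold from decision costs rather than fixing it by hand:

```python
# Illustrative sketch, simplified from the paper's method: a likelihood
# ratio comparing "model valid" (observations centered on the prediction)
# against an alternative centered on the data's own mean.
import numpy as np
from scipy.stats import norm

y_pred = 100.0          # model-predicted response (invented)
sigma = 5.0             # assumed measurement/model uncertainty
y_obs = np.array([103.1, 98.4, 104.9, 101.2])   # invented test data

l0 = norm.pdf(y_obs, loc=y_pred, scale=sigma).prod()       # H0 likelihood
l1 = norm.pdf(y_obs, loc=y_obs.mean(), scale=sigma).prod() # H1 likelihood
ratio = l0 / l1
threshold = 0.3         # illustrative; the paper derives it from costs
print(f"Likelihood ratio = {ratio:.3f} ->",
      "accept model" if ratio > threshold else "reject model")
```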

  3. Numerical modelling of concentrated leak erosion during Hole Erosion Tests

    OpenAIRE

    Mercier, F.; Bonelli, S.; Golay, F.; Anselmet, F.; Philippe, P.; Borghi, R.

    2015-01-01

    This study focuses on the numerical modelling of concentrated leak erosion of a cohesive soil by a turbulent flow in axisymmetrical geometry, with application to the Hole Erosion Test (HET). The numerical model is based on adaptive remeshing of the water/soil interface to ensure accurate description of the mechanical phenomena occurring near the soil/water interface. The erosion law governing the interface motion is based on two erosion parameters: the critical shear stress and the erosion co...

  4. Statistical Tests for Mixed Linear Models

    CERN Document Server

    Khuri, André I; Sinha, Bimal K

    2011-01-01

    An advanced discussion of linear models with mixed or random effects. In recent years a breakthrough has occurred in our ability to draw inferences from exact and optimum tests of variance component models, generating much research activity that relies on linear models with mixed and random effects. This volume covers the most important research of the past decade as well as the latest developments in hypothesis testing. It compiles all currently available results in the area of exact and optimum tests for variance component models and offers the only comprehensive treatment for these models a

  5. A 'Turing' Test for Landscape Evolution Models

    Science.gov (United States)

    Parsons, A. J.; Wise, S. M.; Wainwright, J.; Swift, D. A.

    2008-12-01

    Resolving the interactions among tectonics, climate and surface processes at long timescales has benefited from the development of computer models of landscape evolution. However, testing these Landscape Evolution Models (LEMs) has been piecemeal and partial. We argue that a more systematic approach is required. What is needed is a test that will establish how 'realistic' an LEM is and thus the extent to which its predictions may be trusted. We propose a test based upon the Turing Test of artificial intelligence as a way forward. In 1950 Alan Turing posed the question of whether a machine could think. Rather than attempt to address the question directly he proposed a test in which an interrogator asked questions of a person and a machine, with no means of telling which was which. If the machine's answer could not be distinguished from those of the human, the machine could be said to demonstrate artificial intelligence. By analogy, if an LEM cannot be distinguished from a real landscape it can be deemed to be realistic. The Turing test of intelligence is a test of the way in which a computer behaves. The analogy in the case of an LEM is that it should show realistic behaviour in terms of form and process, both at a given moment in time (punctual) and in the way both form and process evolve over time (dynamic). For some of these behaviours, tests already exist. For example there are numerous morphometric tests of punctual form and measurements of punctual process. The test discussed in this paper provides new ways of assessing dynamic behaviour of an LEM over realistically long timescales. However challenges remain in developing an appropriate suite of challenging tests, in applying these tests to current LEMs and in developing LEMs that pass them.

  6. The Linear Logistic Test Model (LLTM as the methodological foundation of item generating rules for a new verbal reasoning test

    Directory of Open Access Journals (Sweden)

    HERBERT POINSTINGL

    2009-06-01

    Full Text Available Based on the demand for new verbal reasoning tests to enrich the psychological test inventory, a pilot version of a new test was analysed: the 'Family Relation Reasoning Test' (FRRT; Poinstingl, Kubinger, Skoda & Schechtner, forthcoming), in which several basic cognitive operations (logical rules) have been embedded. Given family relationships of varying complexity embedded in short stories, testees had to logically infer the correct relationship between two individuals within a family. Using empirical data, the linear logistic test model (LLTM; Fischer, 1972), a special case of the Rasch model, was used to test the construct validity of the test: the hypothetically assumed basic cognitive operations had to explain the Rasch model's item difficulty parameters. After being encoded in the LLTM's matrix of weights ((q_ij)), none of these operations were corroborated by means of Andersen's Likelihood Ratio Test.
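
    For reference, the LLTM's standard form (as generally stated in the Rasch-modeling literature, following Fischer) constrains the Rasch item difficulties to weighted sums of basic-operation parameters:

```latex
% Rasch model with LLTM-constrained item difficulties (standard form):
% P(X_{vi}=1) is the probability that person v solves item i.
\begin{align}
  P(X_{vi} = 1 \mid \theta_v, \beta_i)
      &= \frac{\exp(\theta_v - \beta_i)}{1 + \exp(\theta_v - \beta_i)}, \\
  \beta_i &= \sum_{j=1}^{m} q_{ij}\, \eta_j + c,
\end{align}
% q_{ij}: hypothesized weight of cognitive operation j in item i,
% \eta_j: difficulty contribution of operation j, c: normalization constant.
```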

  7. Does rational selection of training and test sets improve the outcome of QSAR modeling?

    Science.gov (United States)

    Martin, Todd M; Harten, Paul; Young, Douglas M; Muratov, Eugene N; Golbraikh, Alexander; Zhu, Hao; Tropsha, Alexander

    2012-10-22

    Prior to using a quantitative structure activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external data set, the best way to validate the predictive ability of a model is to perform statistical external validation, in which the overall data set is divided into training and test sets. Commonly, this splitting is performed using random division. Rational splitting methods can divide data sets into training and test sets in an intelligent fashion. The purpose of this study was to determine whether rational division methods lead to more predictive models compared to random division. A special data splitting procedure was used to facilitate the comparison between random and rational division methods. For each toxicity end point, the overall data set was divided into a modeling set (80% of the overall set) and an external evaluation set (20% of the overall set) using random division. The modeling set was then subdivided into a training set (80% of the modeling set) and a test set (20% of the modeling set) using rational division methods and using random division. The Kennard-Stone, minimal test set dissimilarity, and sphere exclusion algorithms were used as the rational division methods. The hierarchical clustering, random forest, and k-nearest neighbor (kNN) methods were used to develop QSAR models based on the training sets. For kNN QSAR, multiple training and test sets were generated, and multiple QSAR models were built. The results of this study indicate that models based on rational division methods generate better statistical results for the test sets than models based on random division, but the predictive power of the two types of models is comparable.
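
    Of the rational division methods named above, the Kennard-Stone algorithm is the most easily sketched: it greedily selects training samples by a maximin rule on descriptor-space distances. A minimal Python version on an invented descriptor matrix:

```python
# Sketch of the Kennard-Stone algorithm: greedily pick the most mutually
# distant samples for the training set (maximin rule, Euclidean distance).
import numpy as np

def kennard_stone(X, n_train):
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Start with the two most distant samples.
    selected = list(np.unravel_index(np.argmax(d), d.shape))
    while len(selected) < n_train:
        remaining = [i for i in range(len(X)) if i not in selected]
        # Each candidate's distance to its nearest already-selected point...
        min_d = d[np.ix_(remaining, selected)].min(axis=1)
        # ...and pick the candidate that maximizes it.
        selected.append(remaining[int(np.argmax(min_d))])
    return selected

rng = np.random.default_rng(42)
X = rng.normal(size=(50, 5))                   # 50 compounds, 5 descriptors
train_idx = kennard_stone(X, n_train=40)       # 80/20 split
test_idx = [i for i in range(50) if i not in train_idx]
print(len(train_idx), "training /", len(test_idx), "test samples")
```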

  8. Convex-based void filling method for CAD-based Monte Carlo geometry modeling

    International Nuclear Information System (INIS)

    Yu, Shengpeng; Cheng, Mengyun; Song, Jing; Long, Pengcheng; Hu, Liqin

    2015-01-01

    Highlights: • We present a new void filling method named CVF for CAD-based MC geometry modeling. • We describe convex-based void description and quality-based space subdivision. • The results show the improvements provided by CVF in both modeling and MC calculation efficiency. - Abstract: CAD-based automatic geometry modeling tools have been widely applied to generate Monte Carlo (MC) calculation geometry for complex systems according to CAD models. Automatic void filling is one of the main functions of CAD-based MC geometry modeling tools, because the void space between parts in CAD models is traditionally not modeled, while MC codes such as MCNP need all of the problem space to be described. A dedicated void filling method, named Convex-based Void Filling (CVF), is proposed in this study for efficient void filling and concise void descriptions. The method subdivides the problem space into disjoint regions using Quality-based Subdivision (QS) and describes the void space in each region with complementary descriptions of the convex volumes intersecting that region. It has been implemented in SuperMC/MCAM, the Multiple-Physics Coupling Analysis Modeling Program, and tested on the International Thermonuclear Experimental Reactor (ITER) Alite model. The results showed that the new method reduced both automatic modeling time and MC calculation time

  9. Testing the Standard Model

    CERN Document Server

    Riles, K

    1998-01-01

    The Large Electron-Positron (LEP) collider near Geneva, more than any other instrument, has rigorously tested the predictions of the Standard Model of elementary particles. LEP measurements have probed the theory from many different directions and, so far, the Standard Model has prevailed. The rigour of these tests has allowed LEP physicists to determine unequivocally the number of fundamental 'generations' of elementary particles. These tests also allowed physicists to ascertain the mass of the top quark in advance of its discovery. Recent increases in the accelerator's energy allow new measurements to be undertaken, measurements that may uncover directly or indirectly the long-sought Higgs particle, believed to impart mass to all other particles.

  10. Space Launch System Base Heating Test: Sub-Scale Rocket Engine/Motor Design, Development and Performance Analysis

    Science.gov (United States)

    Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan; Kirchner, Robert; Engel, Carl D.

    2014-01-01

    The Space Launch System (SLS) base heating test is broken down into two test programs: (1) Pathfinder and (2) Main Test. The Pathfinder Test Program focuses on the design, development, hot-fire test and performance analyses of the 2% sub-scale SLS core-stage and booster element propulsion systems. The core-stage propulsion system is composed of four gaseous oxygen/hydrogen RS-25D model engines and the booster element is composed of two aluminum-based model solid rocket motors (SRMs). The first section of the paper discusses the motivation and test facility specifications for the test program. The second section briefly investigates the internal flow path of the design. The third section briefly shows the performance of the model RS-25D engines and SRMs for the conducted short duration hot-fire tests. Good agreement is observed based on design prediction analysis and test data. This program is a challenging research and development effort that has not been attempted in 40+ years for a NASA vehicle.

  11. Dopamine selectively remediates 'model-based' reward learning: a computational approach.

    Science.gov (United States)

    Sharp, Madeleine E; Foerde, Karin; Daw, Nathaniel D; Shohamy, Daphna

    2016-02-01

    Patients with loss of dopamine due to Parkinson's disease are impaired at learning from reward. However, it remains unknown precisely which aspect of learning is impaired. In particular, learning from reward, or reinforcement learning, can be driven by two distinct computational processes. One involves habitual stamping-in of stimulus-response associations, hypothesized to arise computationally from 'model-free' learning. The other, 'model-based' learning, involves learning a model of the world that is believed to support goal-directed behaviour. Much work has pointed to a role for dopamine in model-free learning. But recent work suggests model-based learning may also involve dopamine modulation, raising the possibility that model-based learning may contribute to the learning impairment in Parkinson's disease. To directly test this, we used a two-step reward-learning task which dissociates model-free versus model-based learning. We evaluated learning in patients with Parkinson's disease tested ON versus OFF their dopamine replacement medication and in healthy controls. Surprisingly, we found no effect of disease or medication on model-free learning. Instead, we found that patients tested OFF medication showed a marked impairment in model-based learning, and that this impairment was remediated by dopaminergic medication. Moreover, model-based learning was positively correlated with a separate measure of working memory performance, raising the possibility of common neural substrates. Our results suggest that some learning deficits in Parkinson's disease may be related to an inability to pursue reward based on complete representations of the environment. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Stochastic Differential Equation-Based Flexible Software Reliability Growth Model

    Directory of Open Access Journals (Sweden)

    P. K. Kapur

    2009-01-01

    Full Text Available Several software reliability growth models (SRGMs) have been developed by software developers for tracking and measuring the growth of reliability. As a software system grows large and the number of faults detected during the testing phase becomes large, the change in the number of faults detected and removed through each debugging becomes small compared with the initial fault content at the beginning of the testing phase. In such a situation, we can model the software fault detection process as a stochastic process with a continuous state space. In this paper, we propose a new software reliability growth model based on an Itô-type stochastic differential equation. We consider an SDE-based generalized Erlang model with a logistic error detection function. The model is estimated and validated on real-life data sets cited in the literature to show its flexibility. The proposed model, integrating the concept of a stochastic differential equation, performs comparatively better than the existing NHPP-based models.
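
    A hedged sketch of the general idea (not the paper's exact generalized Erlang formulation): a fault-detection SDE of the form dN = b(a - N)dt + sigma(a - N)dW, simulated with the Euler-Maruyama scheme on invented parameters:

```python
# Hedged sketch of an Ito-type SRGM: the fault detection rate is perturbed
# by white noise, and the cumulative detected-fault path is simulated with
# Euler-Maruyama. Parameter values are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
a, b, sigma = 100.0, 0.05, 0.02      # total faults, detection rate, noise
T, steps = 100.0, 10_000
dt = T / steps

N = np.zeros(steps + 1)              # cumulative detected faults
for k in range(steps):
    drift = b * (a - N[k]) * dt
    diffusion = sigma * (a - N[k]) * np.sqrt(dt) * rng.normal()
    N[k + 1] = N[k] + drift + diffusion

print(f"Detected after t = {T}: {N[-1]:.1f} of {a} faults")
```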

  13. Polynomial model inversion control: numerical tests and applications

    OpenAIRE

    Novara, Carlo

    2015-01-01

    A novel control design approach for general nonlinear systems is described in this paper. The approach is based on the identification of a polynomial model of the system to control and on the on-line inversion of this model. Extensive simulations are carried out to test the numerical efficiency of the approach. Numerical examples of applicative interest are presented, concerned with control of the Duffing oscillator, control of a robot manipulator and insulin regulation in a type 1 diabetic p...

  14. Multi-loop PWR modeling and hardware-in-the-loop testing using ACSL

    International Nuclear Information System (INIS)

    Thomas, V.M.; Heibel, M.D.; Catullo, W.J.

    1989-01-01

    Westinghouse has developed an Advanced Digital Feedwater Control System (ADFCS) aimed at reducing feedwater-related reactor trips through improved control performance for pressurized water reactor (PWR) power plants. To support control system setpoint studies and functional design efforts for the ADFCS, an ACSL-based model of the nuclear steam supply system (NSSS) of a Westinghouse PWR was generated. Use of this plant model has been extended from system design to system testing through integration of the model into a Hardware-in-Loop test environment for the ADFCS. This integration includes appropriate interfacing between a Gould SEL 32/87 computer, on which the plant model executes in real time, and the Westinghouse Distributed Processing Family (WDPF) test hardware. A development program has been undertaken to expand the existing ACSL model to explicitly model multiple plant loops, steam generators, and the corresponding feedwater systems. Furthermore, the program expands the ADFCS Hardware-in-Loop testing to include the multi-loop plant model. This paper provides an overview of the testing approach utilized for the ADFCS, with focus on the role of Hardware-in-Loop testing. Background on the plant model, methodology, and test environment is also provided. Finally, an overview is presented of the program to expand the model and the associated Hardware-in-Loop test environment to handle multiple loops

  15. Modeling and simulation for microelectronic packaging assembly manufacturing, reliability and testing

    CERN Document Server

    Liu, Sheng

    2011-01-01

    Although there is increasing need for modeling and simulation in the IC package design phase, most assembly processes and various reliability tests are still based on the time-consuming "test and try out" method to obtain the best solution. Modeling and simulation readily enable virtual Design of Experiments (DoE) to achieve the optimal solution. This has greatly reduced cost and production time, especially for new product development. Using modeling and simulation will become increasingly necessary for future advances in 3D package development. In this book, Liu and Liu allow people

  16. Simulation of thermohydraulic phenomena and model test for FBR

    International Nuclear Information System (INIS)

    Satoh, Kazuziro

    1994-01-01

    This paper summarizes the major thermohydraulic phenomena of FBRs and the conventional approaches to their model testing, and introduces recent findings regarding measurement technology and computational science. In the future commercial stage of FBRs, design optimization will become important to further improve economy and safety. It will be indispensable to apply computational science to plant design and safety evaluation. Most of the model tests will be replaced by simulation analyses based on computational science. Measurement technology using ultrasound and numerical simulation with super-parallel computing are considered the key technologies for realizing the design-by-analysis method. (author)

  17. Modeling the fluid/soil interface erosion in the Hole Erosion Test

    Directory of Open Access Journals (Sweden)

    Kissi B.

    2012-07-01

    Full Text Available Soil erosion is a complex phenomenon whose final stage produces insidious fluid leakage under hydraulic infrastructure, known as piping, which is the main cause of such structures' rupture. The Hole Erosion Test is commonly used to quantify the rate of piping erosion. In this work, the Hole Erosion Test is modelled using the Fluent software package. The aim is to predict the erosion rate of soil during the Hole Erosion Test. The renormalization group theory-based k-ε turbulence model equations are used. This modelling makes it possible to describe the effect of the clay concentration in the flowing water on erosion. Unlike the usual one-dimensional models, the proposed modelling shows that erosion is not uniform along the hole length. In particular, the concentration of clay is found to increase the erosion rate noticeably.

  18. Methods and Models of Market Risk Stress-Testing of the Portfolio of Financial Instruments

    Directory of Open Access Journals (Sweden)

    Alexander M. Karminsky

    2015-01-01

    Full Text Available Amid instability of financial markets and the macroeconomic situation, the need to improve banks' risk-management instruments arises. The new economic reality defines the need to search for more advanced approaches to estimating banks' vulnerability to exceptional but plausible events. Stress-testing belongs to such instruments. The paper reviews and compares models of market risk stress-testing for portfolios of different financial instruments. The topic is highly relevant today because stress-testing is becoming an integral part of anti-crisis risk management amid macroeconomic instability and the appearance of new risks, together with close interest in the problem of risk aggregation. The paper outlines the notion of stress-testing, covers the goals and functions of stress-tests, and states the main criteria for classifying market risk stress-tests. It also highlights particular aspects of scenario analysis. The novelty of the research lies in elaborating a programme of aggregated, complex, multifactor stress-testing of portfolio risk based on scenario analysis. The paper surveys modern Russian and foreign stress-testing models, both on a solo basis and complex, and presents the results of stress-testing and position revaluations for all three complex models: the Central Bank's methodology for stress-testing portfolio risk, a model relying on correlation analysis, and a copula model. The solo-basis stress-testing models differ for each financial instrument: a parametric stress-VaR model is applicable to stress-testing of shares; a model based on the 'Greeks' is used for options; and a regional factor model is used for Eurobonds. Finally, some theoretical recommendations on managing the market risk of a portfolio are given.
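
    As a small illustration of the solo-basis parametric stress-VaR idea mentioned above (figures and shocks invented; the paper's scenarios are not reproduced), one can revalue a portfolio's parametric VaR under stressed volatility and correlation inputs:

```python
# Simplified parametric stress-VaR sketch: shock volatility and correlation
# inputs and revalue the portfolio's value-at-risk. Figures are invented.
import numpy as np
from scipy.stats import norm

weights = np.array([0.6, 0.4])                 # two-asset portfolio
vols = np.array([0.15, 0.25])                  # baseline annual vols
corr = np.array([[1.0, 0.3], [0.3, 1.0]])

def var_99(weights, vols, corr, horizon_days=10):
    cov = np.outer(vols, vols) * corr * (horizon_days / 250)
    port_sigma = np.sqrt(weights @ cov @ weights)
    return norm.ppf(0.99) * port_sigma         # 99% parametric VaR

base = var_99(weights, vols, corr)
# Stress scenario: vols doubled, correlation pushed toward 1.
stressed = var_99(weights, 2 * vols, np.array([[1.0, 0.8], [0.8, 1.0]]))
print(f"Baseline VaR: {base:.3%}  Stressed VaR: {stressed:.3%}")
```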

  19. Reliable Prediction of Insulin Resistance by a School-Based Fitness Test in Middle-School Children

    Directory of Open Access Journals (Sweden)

    Todd Varness

    2009-01-01

    Full Text Available Objectives. (1) Determine the predictive value of a school-based test of cardiovascular fitness (CVF) for insulin resistance (IR); (2) compare a "school-based" prediction of IR to a "laboratory-based" prediction, using various measures of fitness and body composition. Methods. Middle school children (n = 82) performed the Progressive Aerobic Cardiovascular Endurance Run (PACER), a school-based CVF test, and underwent evaluation of maximal oxygen consumption by treadmill testing (VO2 max), body composition (percent body fat and BMI z score), and IR (derived homeostasis model assessment index [HOMA-IR]). Results. PACER showed a strong correlation with VO2 max/kg (rs = 0.83, P < .001) and with HOMA-IR (rs = −0.60, P < .001). Multivariate regression analysis revealed that a school-based model (using PACER and BMI z score) predicted IR similarly to a laboratory-based model (using VO2 max/kg of lean body mass and percent body fat). Conclusions. The PACER is a valid school-based test of CVF, is predictive of IR, and has a similar relationship to IR when compared to complex laboratory-based testing. Simple school-based measures of childhood fitness (PACER) and fatness (BMI z score) could be used to identify childhood risk for IR and evaluate interventions.

  20. Analysis of UPTF downcomer tests with the Cathare multi-dimensional model

    International Nuclear Information System (INIS)

    Dor, I.

    1993-01-01

    This paper presents the analysis and modelling - with the system code CATHARE - of UPTF downcomer refill tests simulating the refill phase of a large-break LOCA. The modelling approach in a system code is discussed. First, the reasons why available flooding correlations are difficult to use in a system code in this particular case are developed. Then the use of a 1-D modelling of the downcomer with specific closure relations for the annular geometry is examined. But UPTF 1:1 scale tests and CREARE reduced-scale tests point out some weaknesses of this modelling due to the particular multi-dimensional nature of the flow in the upper part of the downcomer. Thus a 2-D model is elaborated and implemented into the CATHARE version 1.3e code. The assessment of the model is based on UPTF 1:1 scale tests (saturated and subcooled conditions). Discretization and meshing influences are investigated. On the basis of the saturated tests, a new discretization is proposed for different terms of the momentum balance equations (interfacial friction, momentum transport terms), which results in a significant improvement. Sensitivity studies performed on the subcooled tests show that the water downflow predictions are improved by increasing the condensation in the downcomer. (author). 8 figs., 5 tabs., 9 refs., 2 appendix

  1. SPSS and SAS programming for the testing of mediation models.

    Science.gov (United States)

    Dudley, William N; Benuzillo, Jose G; Carrico, Mineh S

    2004-01-01

    Mediation modeling can explain the nature of the relation among three or more variables. In addition, it can be used to show how a variable mediates the relation between levels of an intervention and an outcome. The Sobel test, developed in 1990, provides a statistical method for determining whether a mediator carries the influence of an intervention to an outcome. Although interactive Web-based and stand-alone methods exist for computing the Sobel test, SPSS and SAS programs that automatically run the required regression analyses and computations increase the accessibility of mediation modeling to nursing researchers. The purpose of this article is to illustrate the utility of the Sobel test and to make this programming available to the Nursing Research audience in both SAS and SPSS. The history, logic, and technical aspects of mediation testing are introduced. The syntax files sobel.sps and sobel.sas, created to automate the computation of the regression analysis and test statistic, are available from the corresponding author. The reported programming allows users to complete mediation testing with their own data in a single-step fashion. A technical manual included with the programming provides instruction on program use and interpretation of the output. Mediation modeling is a useful tool for describing the relation between three or more variables. Programming and manuals for using this model are made available.
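
    As a rough illustration of what such programming computes, the sketch below evaluates the Sobel z statistic for an indirect effect from previously estimated regression coefficients: a (predictor to mediator) and b (mediator to outcome, controlling for the predictor). The coefficient values and the use of SciPy are illustrative assumptions, not the sobel.sps/sobel.sas implementation.

        import math
        from scipy.stats import norm

        def sobel_test(a, se_a, b, se_b):
            """Return the Sobel z statistic and two-tailed p-value for the
            indirect effect a*b."""
            z = (a * b) / math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
            p = 2.0 * (1.0 - norm.cdf(abs(z)))
            return z, p

        # Hypothetical coefficients and standard errors
        z, p = sobel_test(a=0.45, se_a=0.10, b=0.38, se_b=0.09)
        print(f"Sobel z = {z:.3f}, p = {p:.4f}")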

  2. Creating a Test Validated Structural Dynamic Finite Element Model of the Multi-Utility Technology Test Bed Aircraft

    Science.gov (United States)

    Pak, Chan-Gi; Truong, Samson S.

    2014-01-01

    Small modeling errors in the finite element model will eventually induce errors in the structural flexibility and mass, thus propagating into unpredictable errors in the unsteady aerodynamics and the control law design. One of the primary objectives of the Multi Utility Technology Test Bed, X-56A, aircraft is the flight demonstration of active flutter suppression; therefore, in this study, the primary and secondary modes for structural model tuning are identified based on the flutter analysis of the X-56A. A ground-vibration-test-validated structural dynamic finite element model of the X-56A is created in this study. The structural dynamic finite element model of the X-56A is improved using a model tuning tool. In this study, two different weight configurations of the X-56A have been improved in a single optimization run.

  3. Application of Z-Number Based Modeling in Psychological Research

    Directory of Open Access Journals (Sweden)

    Rafik Aliev

    2015-01-01

    Full Text Available Pilates exercises have been shown to have a beneficial impact on the physical, physiological, and mental characteristics of human beings. In this paper, a Z-number based fuzzy approach is applied for modeling the effect of Pilates exercises on motivation, attention, anxiety, and educational achievement. The measurement of psychological parameters is performed using internationally recognized instruments: the Academic Motivation Scale (AMS), the Test of Attention (D2 Test), and Spielberger’s Anxiety Test completed by students. The GPA of students was used as the measure of educational achievement. Application of Z-information modeling allows us to increase the precision and reliability of data processing results in the presence of uncertainty in the input data created from the completed questionnaires. The basic steps of Z-number based modeling with numerical solutions are presented.

  4. Test facility TIMO for testing the ITER model cryopump

    International Nuclear Information System (INIS)

    Haas, H.; Day, C.; Mack, A.; Methe, S.; Boissin, J.C.; Schummer, P.; Murdoch, D.K.

    2001-01-01

    Within the framework of the European Fusion Technology Programme, FZK is involved in the research and development process for a vacuum pump system of a future fusion reactor. As a result of these activities, the concept and the necessary requirements for the primary vacuum system of the ITER fusion reactor were defined. Continuing that development process, FZK has been preparing the test facility TIMO (Test facility for ITER Model pump) since 1996. This test facility provides all the infrastructure needed for testing a cryopump, for example a process gas supply including a metering system, a test vessel, the cryogenic supply for the different temperature levels, and a gas analysing system. For manufacturing the ITER model pump, an order was given to the company L'Air Liquide in the form of a NET contract. (author)

  5. Test facility TIMO for testing the ITER model cryopump

    International Nuclear Information System (INIS)

    Haas, H.; Day, C.; Mack, A.; Methe, S.; Boissin, J.C.; Schummer, P.; Murdoch, D.K.

    1999-01-01

    Within the framework of the European Fusion Technology Programme, FZK is involved in the research and development process for a vacuum pump system of a future fusion reactor. As a result of these activities, the concept and the necessary requirements for the primary vacuum system of the ITER fusion reactor were defined. Continuing that development process, FZK has been preparing the test facility TIMO (Test facility for ITER Model pump) since 1996. This test facility provides all the infrastructure needed for testing a cryopump, for example a process gas supply including a metering system, a test vessel, the cryogenic supply for the different temperature levels, and a gas analysing system. For manufacturing the ITER model pump, an order was given to the company L'Air Liquide in the form of a NET contract. (author)

  6. Agent-Based Modeling for Testing and Designing Novel Decentralized Command and Control System Paradigms

    National Research Council Canada - National Science Library

    Bonabeau, Eric; Hunt, Carl W; Gaudiano, Paolo

    2003-01-01

    Agent-based modeling (ABM) is a recent simulation modeling technique that consists of modeling a system from the bottom up, capturing the interactions taking place between the system's constituent units...

  7. HEV Test Bench Based on CAN Bus Sensor Communication

    Directory of Open Access Journals (Sweden)

    Shupeng ZHAO

    2014-02-01

    Full Text Available An HEV test bench based on the Controller Area Network (CAN) bus was studied and developed. The control system of the HEV power test bench uses CAN bus technology, and the application of CAN bus technology to control system development has opened up a new research direction for domestic automobile experimental platforms. The HEV power control system development work was completed, including the power master controller, the electric throttle controller, the driving simulation platform, the formulation of CAN 2.0B communication protocol procedures, the CAN communication monitoring system, and research on automatic code generation from MATLAB-based simulation models. The maximum absorbed power of the test bench is 90 kW, the top speed of the test bench is 6000 r/min, the CAN communication baud rate is 10-500 k, and the precision of the conventional electrical measurement section satisfies the requirements of HEV development. On the HEV test bench, the results of regenerative braking experiments were closer to the results obtained by outdoor road tests, and fuel consumption experiments show that HEV fuel consumption and the charge-discharge characteristics are in a linear relationship. The establishment of the test platform provides a physical simulation and test platform for the evaluation and development of hybrid electric vehicles and their power systems.

  8. 1:50 Scale Testing of Three Floating Wind Turbines at MARIN and Numerical Model Validation Against Test Data

    Energy Technology Data Exchange (ETDEWEB)

    Dagher, Habib [Univ. of Maine, Orno, ME (United States); Viselli, Anthony [Univ. of Maine, Orno, ME (United States); Goupee, Andrew [Univ. of Maine, Orno, ME (United States); Allen, Christopher [Univ. of Maine, Orno, ME (United States)

    2017-08-15

    The primary goal of the basin model test program discussed herein is to properly scale and accurately capture physical data of the rigid body motions, accelerations and loads for different floating wind turbine platform technologies. The intended use for this data is for performing comparisons with predictions from various aero-hydro-servo-elastic floating wind turbine simulators for calibration and validation. Of particular interest is validating the floating offshore wind turbine simulation capabilities of NREL’s FAST open-source simulation tool. Once the validation process is complete, coupled simulators such as FAST can be used with a much greater degree of confidence in design processes for commercial development of floating offshore wind turbines. The test program subsequently described in this report was performed at MARIN (Maritime Research Institute Netherlands) in Wageningen, the Netherlands. The models considered consisted of the horizontal axis, NREL 5 MW Reference Wind Turbine (Jonkman et al., 2009) with a flexible tower affixed atop three distinct platforms: a tension leg platform (TLP), a spar-buoy modeled after the OC3 Hywind (Jonkman, 2010) and a semi-submersible. The three generic platform designs were intended to cover the spectrum of currently investigated concepts, each based on proven floating offshore structure technology. The models were tested under Froude scale wind and wave loads. The high-quality wind environments, unique to these tests, were realized in the offshore basin via a novel wind machine which exhibits negligible swirl and low turbulence intensity in the flow field. Recorded data from the floating wind turbine models included rotor torque and position, tower top and base forces and moments, mooring line tensions, six-axis platform motions and accelerations at key locations on the nacelle, tower, and platform. A large number of tests were performed ranging from simple free-decay tests to complex operating conditions with

  9. Agent-Based Modeling in Systems Pharmacology.

    Science.gov (United States)

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.

  10. Construction and Testing of a 21 GHz Ceramic Based Power Extractor

    CERN Document Server

    Newsham, D; Carron, G; Döbert, Steffen; Gai, W; Konecny, R; Liu, W; Smirnov, A Yu; Thorndahl, L; Wilson, Ian H; Wuensch, Walter; Yu, D

    2003-01-01

    A ceramic based power extractor [1] operating at 21 GHz was built by DULY Research Inc. and tested at CTF2, the CERN Linear Collider (CLIC) Test Facility. The structure includes a ceramic extractor section, a 2-output-port, circular-to-rectangular waveguide coupler, and a 3-port rectangular waveguide combiner that provides for a single output waveguide. Results of cold tests and full beam tests are presented and compared with theoretical and numerical models.

  11. Using the Integrative Model of Behavioral Prediction to Understand College Students' STI Testing Beliefs, Intentions, and Behaviors.

    Science.gov (United States)

    Wombacher, Kevin; Dai, Minhao; Matig, Jacob J; Harrington, Nancy Grant

    2018-03-22

    To identify salient behavioral determinants related to STI testing among college students by testing a model based on the integrative model of behavioral prediction (IMBP). Participants were 265 undergraduate students from a large university in the Southeastern US. Formative and survey research was used to test an IMBP-based model that explores the relationships between determinants and STI testing intention and behavior. Results of path analyses supported a model in which attitudinal beliefs predicted intention and intention predicted behavior. Normative beliefs and behavioral control beliefs were not significant in the model; however, select individual normative and control beliefs were significantly correlated with intention and behavior. Attitudinal beliefs are the strongest predictor of STI testing intention and behavior. Future efforts to increase STI testing rates should identify and target salient attitudinal beliefs.

  12. GIS Modelling of Radionuclide Transport from the Semipalatinsk Test Site

    Science.gov (United States)

    Balakay, L.; Zakarin, E.; Mahura, A.; Baklanov, A.; Sorensen, J. H.

    2009-04-01

    In this study, the software complex GIS-project MigRad (Migration of Radionuclides) was developed, tested, and applied for the territory of the Semipalatinsk test site/polygon (Republic of Kazakhstan), where, since 1961, a total of 348 underground nuclear explosions were conducted. MigRad is oriented toward the integration of large volumes of heterogeneous information (mapping, ground-based, and satellite-based surveys) and also includes modelling, on this basis, of the local redistribution of radionuclides by precipitation and surface waters and of the long-range transport of radioactive aerosols. The existing thermal anomaly on the territory of the polygon was investigated in detail, and object-oriented analysis was applied to the studied area. Employing the RUNOFF model, the migration of radionuclides with surface waters was simulated. Employing the DERMA model, long-term atmospheric transport, dispersion, and deposition patterns for cesium were simulated from 3 selected locations (Balapan, Degelen, and Experimental Field). Employing geoinformation technology, the mapping of the high-temperature zones and the epicenters of radioactive aerosol transport for the territory of the test site was carried out, with post-processing and integration of the modelling results into the GIS environment. Contamination levels due to the former nuclear explosions for the population and environment of the territories surrounding the polygon in Kazakhstan as well as adjacent countries were analyzed and evaluated. MigRad was designed as an instrument for comprehensive analysis of the complex territorial processes influenced by the former nuclear explosions on the territory of the Semipalatinsk test site. It provides possibilities for detailed analyses of (i) the extensive cartographic material, remote sensing, and field measurement data collected in different-level databases; (ii) radionuclide migration with flows, using the accumulation and redistribution of soil particles; and (iii) thermal anomalies

  13. A rule-based software test data generator

    Science.gov (United States)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests are performed, showing that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.
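
    The sketch below is a hypothetical, minimal illustration of the rule-based idea (not the Ada prototype from the record): each rule maps a parameter specification to a handful of high-value candidates such as boundary values, and test cases are the cross product across parameters.

        import itertools

        # Each rule maps a parameter spec to a small set of candidate values
        # (boundaries, midpoints) instead of sampling uniformly at random.
        RULES = {
            "int_range": lambda lo, hi: [lo, lo + 1, (lo + hi) // 2, hi - 1, hi],
            "enum": lambda *opts: list(opts),
        }

        def rule_based_cases(spec):
            """spec: list of (rule_name, args) tuples, one per parameter.
            Returns the cross product of candidate values."""
            candidates = [RULES[name](*args) for name, args in spec]
            return list(itertools.product(*candidates))

        cases = rule_based_cases([("int_range", (0, 100)), ("enum", ("read", "write"))])
        print(len(cases), cases[:3])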

  14. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Wan, Jiang; Zabaras, Nicholas

    2014-01-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media
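
    As a hedged illustration of one building block named above, running conditional independence tests on observation data, the following sketch tests whether x is independent of y given Z via the Fisher z-transform of the partial correlation. This particular test and all variable names are assumptions for illustration, not necessarily the authors' choice.

        import numpy as np
        from scipy.stats import norm

        def partial_corr(x, y, Z):
            """Correlation of the residuals of x and y after regressing out Z."""
            design = np.ones((len(x), 1)) if Z is None else np.column_stack([np.ones(len(x)), Z])
            rx = x - design @ np.linalg.lstsq(design, x, rcond=None)[0]
            ry = y - design @ np.linalg.lstsq(design, y, rcond=None)[0]
            return np.corrcoef(rx, ry)[0, 1]

        def ci_test(x, y, Z=None, alpha=0.05):
            """True if the hypothesis 'x independent of y given Z' is not rejected."""
            n, k = len(x), (0 if Z is None else Z.shape[1])
            r = partial_corr(x, y, Z)
            z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - k - 3)
            p = 2.0 * (1.0 - norm.cdf(abs(z)))
            return p > alpha

        rng = np.random.default_rng(0)
        z0 = rng.standard_normal(500)
        x = z0 + 0.1 * rng.standard_normal(500)   # x and y depend only on z0
        y = z0 + 0.1 * rng.standard_normal(500)
        print(ci_test(x, y))                      # marginally dependent -> False
        print(ci_test(x, y, Z=z0[:, None]))       # independent given z0 -> True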

  15. Ares I Scale Model Acoustic Tests Instrumentation for Acoustic and Pressure Measurements

    Science.gov (United States)

    Vargas, Magda B.; Counter, Douglas D.

    2011-01-01

    The Ares I Scale Model Acoustic Test (ASMAT) was a development test performed at the Marshall Space Flight Center (MSFC) East Test Area (ETA) Test Stand 116. The test article included a 5% scale Ares I vehicle model and tower mounted on the Mobile Launcher. Acoustic and pressure data were measured by approximately 200 instruments located throughout the test article. There were four primary ASMAT instrument suites: ignition overpressure (IOP), lift-off acoustics (LOA), ground acoustics (GA), and spatial correlation (SC). Each instrumentation suite incorporated different sensor models which were selected based upon measurement requirements. These requirements included the type of measurement, exposure to the environment, instrumentation check-outs and data acquisition. The sensors were attached to the test article using different mounts and brackets dependent upon the location of the sensor. This presentation addresses the observed effect of the sensors and mounts on the acoustic and pressure measurements.

  16. Automated Functional Testing based on the Navigation of Web Applications

    Directory of Open Access Journals (Sweden)

    Boni García

    2011-08-01

    Full Text Available Web applications are becoming more and more complex. Testing such applications is an intricate, hard, and time-consuming activity. Therefore, testing is often poorly performed or skipped by practitioners. Test automation can help to avoid this situation. Hence, this paper presents a novel approach to perform automated software testing for web applications based on their navigation. On the one hand, web navigation is the process of traversing a web application using a browser. On the other hand, functional requirements are actions that an application must perform. Therefore, the evaluation of the correct navigation of web applications results in the assessment of the specified functional requirements. The proposed method performs the automation at four levels: test case generation, test data derivation, test case execution, and test case reporting. This method is driven by three kinds of inputs: (i) UML models; (ii) Selenium scripts; (iii) XML files. We have implemented our approach in an open-source testing framework named Automatic Testing Platform. The validation of this work has been carried out by means of a case study, in which the target is a real invoice management system developed using a model-driven approach.

  17. Verifying the functional ability of microstructured surfaces by model-based testing

    Science.gov (United States)

    Hartmann, Wito; Weckenmann, Albert

    2014-09-01

    Micro- and nanotechnology enables the use of new product features such as improved light absorption, self-cleaning or protection, which are based, on the one hand, on the size of functional nanostructures and, on the other hand, on material-specific properties. With the need to reliably measure progressively smaller geometric features, coordinate and surface-measuring instruments have been refined and now allow high-resolution topography and structure measurements down to the sub-nanometre range. Nevertheless, in many cases it is not possible to make a clear statement about the functional ability of the workpiece or its topography, because conventional concepts of dimensioning and tolerancing are solely geometry oriented and standardized surface parameters are not sufficient to consider interaction with non-geometric parameters, which are dominant for functions such as sliding, wetting, sealing and optical reflection. To verify the functional ability of microstructured surfaces, a method was developed based on a parameterized mathematical-physical model of the function. From this model, function-related properties can be identified and geometric parameters can be derived, which may be different for the manufacturing and verification processes. With this method it is possible to optimize the definition of the shape of the workpiece regarding the intended function by applying theoretical and experimental knowledge, as well as modelling and simulation. Advantages of this approach will be discussed and demonstrated by the example of a microstructured inking roll.

  18. Model-Based Structural Health Monitoring of Fatigue Damage Test-Bed Specimens

    Science.gov (United States)

    2011-11-15

    Hull welds or notches along component edges are good initial candidates for the hypothetical damage initiation areas. The branching process adds... to it off-center. The base plate and the stiffener plate are rigidly welded by a tungsten inert gas (TIG) weld. Three different crack paths... As shown in Figure 9(a), an 18 in. long stiffener plate has been welded to each of the tested plates with 0.625 in. long discrete TIG welds at 5 locations.

  19. Multi-objective Search-based Mobile Testing

    OpenAIRE

    Mao, K.

    2017-01-01

    Despite the tremendous popularity of mobile applications, mobile testing still relies heavily on manual testing. This thesis presents mobile test automation approaches based on multi-objective search. We introduce three approaches: Sapienz (for native Android app testing), Octopuz (for hybrid/web JavaScript app testing) and Polariz (for using crowdsourcing to support search-based mobile testing). These three approaches represent the primary scientific and technical contributions of the thesis...

  20. Development of a fault test experimental facility model using Matlab

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Iraci Martinez; Moraes, Davi Almeida, E-mail: martinez@ipen.br, E-mail: dmoraes@dk8.com.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The Fault Test Experimental Facility was developed to simulate a PWR nuclear power plant and is instrumented with temperature, level and pressure sensors. The Fault Test Experimental Facility can be operated to generate normal and fault data; the faults can be introduced initially small, with their magnitude increasing gradually. This work presents the Fault Test Experimental Facility model developed using the Matlab GUIDE (Graphical User Interface Development Environment) toolbox, which consists of a set of functions designed to create interfaces in an easy and fast way. The system model is based on the mass and energy inventory balance equations. Physical as well as operational aspects are taken into consideration. The interface layout looks like a process flowchart and the user can set the input variables. Besides the normal operation conditions, there is the possibility to choose a faulty variable from a list. The program also allows the user to set the noise level for the input variables. Using the model, data were generated for different operational conditions, both under normal and fault conditions, with different noise levels added to the input variables. Data generated by the model will be compared with Fault Test Experimental Facility data. The Fault Test Experimental Facility theoretical model results will be used for the development of a Monitoring and Fault Detection System. (author)

  1. Development of a fault test experimental facility model using Matlab

    International Nuclear Information System (INIS)

    Pereira, Iraci Martinez; Moraes, Davi Almeida

    2015-01-01

    The Fault Test Experimental Facility was developed to simulate a PWR nuclear power plant and is instrumented with temperature, level and pressure sensors. The Fault Test Experimental Facility can be operated to generate normal and fault data; the faults can be introduced initially small, with their magnitude increasing gradually. This work presents the Fault Test Experimental Facility model developed using the Matlab GUIDE (Graphical User Interface Development Environment) toolbox, which consists of a set of functions designed to create interfaces in an easy and fast way. The system model is based on the mass and energy inventory balance equations. Physical as well as operational aspects are taken into consideration. The interface layout looks like a process flowchart and the user can set the input variables. Besides the normal operation conditions, there is the possibility to choose a faulty variable from a list. The program also allows the user to set the noise level for the input variables. Using the model, data were generated for different operational conditions, both under normal and fault conditions, with different noise levels added to the input variables. Data generated by the model will be compared with Fault Test Experimental Facility data. The Fault Test Experimental Facility theoretical model results will be used for the development of a Monitoring and Fault Detection System. (author)

  2. Empirical component model to predict the overall performance of heating coils: Calibrations and tests based on manufacturer catalogue data

    International Nuclear Information System (INIS)

    Ruivo, Celestino R.; Angrisani, Giovanni

    2015-01-01

    Highlights: • An empirical model for predicting the performance of heating coils is presented. • Low and high heating capacity cases are used for calibration. • Versions based on several effectiveness correlations are tested. • Catalogue data are considered in approach testing. • The approach is a suitable component model to be used in dynamic simulation tools. - Abstract: A simplified methodology for predicting the overall behaviour of heating coils is presented in this paper. The coil performance is predicted by the ε-NTU method. Usually manufacturers do not provide information about the overall thermal resistance or the geometric details that are required either for the device selection or to apply known empirical correlations for the estimation of the involved thermal resistances. In the present work, heating capacity tables from the manufacturer catalogue are used to calibrate simplified approaches based on the classical theory of heat exchangers, namely the effectiveness method. Only two reference operating cases are required to calibrate each approach. The validity of the simplified approaches is investigated for a relatively high number of operating cases, listed in the technical catalogue of a manufacturer. Four types of coils of three sizes of air handling units are considered. A comparison is conducted between the heating coil capacities provided by the methodology and the values given by the manufacturer catalogue. The results show that several of the proposed approaches are suitable component models to be integrated in dynamic simulation tools of air conditioning systems such as TRNSYS or EnergyPlus
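
    For orientation, a minimal sketch of such an effectiveness-based component model follows: the overall conductance UA is calibrated from a single catalogue operating point and then reused to predict capacity at other conditions. The crossflow (both streams unmixed) effectiveness correlation, the flow rates, and the catalogue value are all illustrative assumptions, not the paper's calibration data.

        import math
        from scipy.optimize import brentq

        def effectiveness_crossflow(ntu, cr):
            """Effectiveness of a crossflow exchanger, both streams unmixed."""
            if cr == 0:
                return 1.0 - math.exp(-ntu)
            return 1.0 - math.exp((ntu**0.22 / cr) * (math.exp(-cr * ntu**0.78) - 1.0))

        def coil_capacity(UA, m_air, cp_air, m_w, cp_w, t_w_in, t_air_in):
            """Heating capacity (W) predicted by the eps-NTU method."""
            C_air, C_w = m_air * cp_air, m_w * cp_w
            C_min, C_max = min(C_air, C_w), max(C_air, C_w)
            eps = effectiveness_crossflow(UA / C_min, C_min / C_max)
            return eps * C_min * (t_w_in - t_air_in)

        # Calibrate UA from one hypothetical catalogue point: 25 kW at 80 degC
        # water in, 20 degC air in, 2.0 kg/s air, 0.8 kg/s water.
        Q_cat = 25_000.0
        UA = brentq(lambda ua: coil_capacity(ua, 2.0, 1006, 0.8, 4186, 80, 20) - Q_cat,
                    1.0, 1e5)
        print("calibrated UA =", round(UA, 1), "W/K")
        # Reuse the calibrated UA at a colder air inlet condition.
        print("Q(15 degC air in) =", round(coil_capacity(UA, 2.0, 1006, 0.8, 4186, 80, 15)), "W")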

  3. Estimation of an Examinee's Ability in the Web-Based Computerized Adaptive Testing Program IRT-CAT

    Directory of Open Access Journals (Sweden)

    Yoon-Hwan Lee

    2006-11-01

    Full Text Available We developed a program to estimate an examinee's ability in order to provide freely available access to a web-based computerized adaptive testing (CAT) program. We used PHP and JavaScript as the programming languages, PostgreSQL as the database management system on an Apache web server, and Linux as the operating system. A system which allows for user input, searching within inputted items, and test creation was constructed. We performed ability estimation on each test based on a Rasch model and 2- or 3-parameter logistic models. Our system provides an algorithm for web-based CAT, replacing previous personal computer-based ones, and makes it possible to estimate an examinee's ability immediately at the end of the test.
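
    A bare-bones sketch of the kind of ability estimation described, maximum likelihood under the Rasch model via Newton-Raphson with known item difficulties, is shown below. The response pattern and difficulties are made up, and a production system such as the one described would implement this in PHP rather than Python.

        import numpy as np

        def rasch_ability(responses, difficulties, theta=0.0, tol=1e-6, max_iter=50):
            """responses: 0/1 array; difficulties: known item difficulties."""
            u = np.asarray(responses, float)
            b = np.asarray(difficulties, float)
            for _ in range(max_iter):
                p = 1.0 / (1.0 + np.exp(-(theta - b)))   # P(correct | theta)
                grad = np.sum(u - p)                     # d logL / d theta
                info = np.sum(p * (1.0 - p))             # Fisher information
                step = grad / info
                theta += step
                if abs(step) < tol:
                    break
            return theta, 1.0 / np.sqrt(info)            # estimate and its SE

        theta, se = rasch_ability([1, 1, 0, 1, 0], [-1.0, -0.5, 0.0, 0.5, 1.0])
        print(f"theta = {theta:.3f} +/- {se:.3f}")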

  4. Cyber-Physical Energy Systems Modeling, Test Specification, and Co-Simulation Based Testing

    DEFF Research Database (Denmark)

    van der Meer, A. A.; Palensky, P.; Heussen, Kai

    2017-01-01

    The gradual deployment of intelligent and coordinated devices in the electrical power system needs careful investigation of the interactions between the various domains involved. Especially due to the coupling between ICT and power systems, a holistic approach for testing and validating is required. Taking existing (quasi-)standardised smart grid system and test specification methods as a starting point, we are developing a holistic testing and validation approach that allows a very flexible way of assessing the system level aspects by various types of experiments (including virtual, real, and mixed lab settings). This paper describes the formal holistic test case specification method and applies it to a particular co-simulation experimental setup. The various building blocks of such a simulation (i.e., FMI, mosaik, domain-specific simulation federates) are covered in more detail.

  5. Results of steel containment vessel model test

    International Nuclear Information System (INIS)

    Luk, V.K.; Ludwigsen, J.S.; Hessheimer, M.F.; Komine, Kuniaki; Matsumoto, Tomoyuki; Costello, J.F.

    1998-05-01

    A series of static overpressurization tests of scale models of nuclear containment structures is being conducted by Sandia National Laboratories for the Nuclear Power Engineering Corporation of Japan and the US Nuclear Regulatory Commission. Two tests are being conducted: (1) a test of a model of a steel containment vessel (SCV) and (2) a test of a model of a prestressed concrete containment vessel (PCCV). This paper summarizes the conduct of the high pressure pneumatic test of the SCV model and the results of that test. Results of this test are summarized and are compared with pretest predictions performed by the sponsoring organizations and others who participated in a blind pretest prediction effort. Questions raised by this comparison are identified and plans for posttest analysis are discussed

  6. MARS-LMR modeling for the post-test analysis of Phenix End-of-Life natural circulation

    International Nuclear Information System (INIS)

    Jeong, Hae Yong; Ha, Kwi Seok; Chang, Won Pyo; Lee, Kwi Lim

    2011-01-01

    For the successful design and analysis of a Sodium-cooled Fast Reactor (SFR), a reliable and well-proven system analysis code is required. To achieve this purpose, KAERI is enhancing the modeling capability of the MARS code by adding SFR-specific models such as a pressure drop model, a heat transfer model, and a reactivity feedback model. This version of MARS-LMR will be used as a basic tool in the design and analysis of future SFR systems in Korea. Before wide application of the MARS-LMR code, it is required to verify and validate the code models through analyses of appropriate experimental data or analytical results. The end-of-life test of the Phenix reactor performed by the CEA provided a unique opportunity to obtain reliable test data, which are very valuable for the validation and verification of an SFR system analysis code. KAERI joined the international program on the analysis of the Phenix end-of-life natural circulation test coordinated by the IAEA in 2008. The main natural circulation test was completed in 2009. Before the test, KAERI performed a pre-test analysis based on the design conditions provided by the CEA. Then a blind post-test analysis was performed based on the test conditions measured during the test, before the CEA provided the final test results. Finally, the final post-test analysis was performed recently to predict the test results as accurately as possible. This paper introduces the modeling approach of MARS-LMR used in the final post-test analysis and summarizes the major results of the analysis

  7. Adversarial life testing: A Bayesian negotiation model

    International Nuclear Information System (INIS)

    Rufo, M.J.; Martín, J.; Pérez, C.J.

    2014-01-01

    Life testing is a procedure intended for facilitating the process of making decisions in the context of industrial reliability. On the other hand, negotiation is a process of making joint decisions that has one of its main foundations in decision theory. A Bayesian sequential model of negotiation in the context of adversarial life testing is proposed. This model considers a general setting for which a manufacturer offers a product batch to a consumer. It is assumed that the reliability of the product is measured in terms of its lifetime. Furthermore, both the manufacturer and the consumer have to use their own information with respect to the quality of the product. Under these assumptions, two situations can be analyzed. For both of them, the main aim is to accept or reject the product batch based on the product reliability. This topic is related to a reliability demonstration problem. The procedure is applied to a class of distributions that belong to the exponential family. Thus, a unified framework addressing the main topics in the considered Bayesian model is presented. An illustrative example shows that the proposed technique can be easily applied in practice

  8. Centrifuge modeling of one-step outflow tests for unsaturated parameter estimations

    Directory of Open Access Journals (Sweden)

    H. Nakajima

    2006-01-01

    Full Text Available Centrifuge modeling of one-step outflow tests was carried out using a 2-m radius geotechnical centrifuge, and the cumulative outflow and transient pore water pressure were measured during the tests at multiple gravity levels. Based on the scaling laws of centrifuge modeling, the measurements generally showed reasonable agreement with prototype data calculated from forward simulations with input parameters determined from standard laboratory tests. The parameter optimizations were examined for three different combinations of input data sets using the test measurements. Within the gravity levels examined in this study, up to 40g, the optimized unsaturated parameters compared well when accurate pore water pressure measurements were included along with cumulative outflow as input data. With its capability to implement a variety of instrumentations under well-controlled initial and boundary conditions and to shorten testing time, the centrifuge modeling technique is attractive as an alternative experimental method that provides more freedom to set inverse problem conditions for the parameter estimation.
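
    The scaling laws invoked here can be made concrete with a small sketch, using the standard geotechnical centrifuge relations for lengths and seepage times; the numbers are illustrative, not the study's data.

        # At N times Earth gravity, model lengths scale by 1/N relative to the
        # prototype and seepage/outflow times by 1/N**2, which is why a
        # centrifuge test is much faster than the prototype event it represents.
        def prototype_time(model_time_s, N):
            return model_time_s * N**2

        def prototype_length(model_length_m, N):
            return model_length_m * N

        N = 40  # gravity level, matching the 40g upper bound used in the tests
        print("1 h in the 40g model ~", prototype_time(3600, N) / 3600.0, "prototype hours")
        print("0.5 m model column  ~", prototype_length(0.5, N), "m prototype")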

  9. Centrifuge modeling of one-step outflow tests for unsaturated parameter estimations

    Science.gov (United States)

    Nakajima, H.; Stadler, A. T.

    2006-10-01

    Centrifuge modeling of one-step outflow tests was carried out using a 2-m radius geotechnical centrifuge, and the cumulative outflow and transient pore water pressure were measured during the tests at multiple gravity levels. Based on the scaling laws of centrifuge modeling, the measurements generally showed reasonable agreement with prototype data calculated from forward simulations with input parameters determined from standard laboratory tests. The parameter optimizations were examined for three different combinations of input data sets using the test measurements. Within the gravity levels examined in this study, up to 40g, the optimized unsaturated parameters compared well when accurate pore water pressure measurements were included along with cumulative outflow as input data. With its capability to implement a variety of instrumentations under well-controlled initial and boundary conditions and to shorten testing time, the centrifuge modeling technique is attractive as an alternative experimental method that provides more freedom to set inverse problem conditions for the parameter estimation.

  10. Modelling of the physical behaviour of water saturated clay barriers. Laboratory tests, material models and finite element application

    International Nuclear Information System (INIS)

    Boergesson, L.; Johannesson, L.E.; Sanden, T.; Hernelind, J.

    1995-09-01

    This report deals with laboratory testing and modelling of the thermo-hydro-mechanical (THM) properties of water saturated bentonite based buffer materials. A number of different laboratory tests have been performed and the results are accounted for. These test results have led to a tentative material model, consisting of several sub-models, which is described in the report. The tentative model has partly been adapted to the material models available in the finite element code ABAQUS and partly been implemented and incorporated in the code. The model that can be used for ABAQUS calculations agrees with the tentative model with a few exceptions. The model has been used in a number of verification calculations, simulating different laboratory tests, and the results have been compared with actual measurements. These calculations show that the model generally can be used for THM calculations of the behaviour of water saturated buffer materials, but also that there is still a lack of some understanding. It is concluded that the available model is relevant for the required predictions of the THM behaviour but that a further improvement of the model is desirable

  11. Delay model and performance testing for FPGA carry chain TDC

    International Nuclear Information System (INIS)

    Kang Xiaowen; Liu Yaqiang; Cui Junjian Yang Zhangcan; Jin Yongjie

    2011-01-01

    Time-of-flight (TOF) information would improve the performance of PET (positron emission tomography), and TDC design is a key technique. A carry chain TDC delay model is proposed. By changing the significant delay parameters of the model, the paper compares the resulting differences in TDC performance, and finally realizes a time-to-digital converter (TDC) based on the carry chain method using an FPGA EP2C20Q240C8N, with a 69 ps LSB and a maximum error below 2 LSB. Such a result meets the TOF demand. A coaxial cable measuring method for TDC testing is also proposed, which requires no high-precision test equipment. (authors)

  12. Not just the norm: exemplar-based models also predict face aftereffects.

    Science.gov (United States)

    Ross, David A; Deroche, Mickael; Palmeri, Thomas J

    2014-02-01

    The face recognition literature has considered two competing accounts of how faces are represented within the visual system: Exemplar-based models assume that faces are represented via their similarity to exemplars of previously experienced faces, while norm-based models assume that faces are represented with respect to their deviation from an average face, or norm. Face identity aftereffects have been taken as compelling evidence in favor of a norm-based account over an exemplar-based account. After a relatively brief period of adaptation to an adaptor face, the perceived identity of a test face is shifted toward a face with attributes opposite to those of the adaptor, suggesting an explicit psychological representation of the norm. Surprisingly, despite near universal recognition that face identity aftereffects imply norm-based coding, there have been no published attempts to simulate the predictions of norm- and exemplar-based models in face adaptation paradigms. Here, we implemented and tested variations of norm and exemplar models. Contrary to common claims, our simulations revealed that both an exemplar-based model and a version of a two-pool norm-based model, but not a traditional norm-based model, predict face identity aftereffects following face adaptation.

  13. Analysis and model testing of Super Tiger Type B packaging in accident environments

    International Nuclear Information System (INIS)

    Yoshimura, H.R.; Romesberg, L.E.; May, R.A.; Joseph, B.J.

    1980-01-01

    Based on previous scale model test results with more rigid systems and the subsystem tests on drums, it is believed that the scaled models realistically replicate full scale system behavior. Future work will be performed to obtain improved stiffness data on the Type A containers. These data will be incorporated into the finite element model, and improved correlation with the test results is expected. Review of the scale model transport system test results indicated that the method of attachment of the Super Tiger to the trailer was the primary cause for detachment of the outer door during the one-eighth scale grade-crossing test. Although the container seal on the scale model of Super Tiger was not adequately modeled to provide a leak-tight seal, loss of the existing seal in a full scale test can be inferred from the results of the one-quarter scale model grade-crossing test. In each test, approximately two-thirds of the model drums were estimated to have deformed sufficiently to predict loss of drum head closure seal, with several partially losing their contents within the overpack. In no case were drums ejected from the overpack, nor was there evidence of material loss in excess of the amount assumed in the WIPP EIS from any of the Super Tiger models tested. 9 figures

  14. Detailed field test of yaw-based wake steering

    DEFF Research Database (Denmark)

    Fleming, P.; Churchfield, M.; Scholbrock, A.

    2016-01-01

    This paper describes a detailed field-test campaign to investigate yaw-based wake steering. In yaw-based wake steering, an upstream turbine intentionally misaligns its yaw with respect to the inflow to deflect its wake away from a downstream turbine, with the goal of increasing total power production. In the first phase, a nacelle-mounted scanning lidar was used to verify wake deflection of a misaligned turbine and calibrate wake deflection models. In the second phase, these models were used within a yaw controller to achieve a desired wake deflection. This paper details the experimental design and setup. All data collected as part of this field experiment will be archived and made available to the public via the U.S. Department of Energy’s Atmosphere to Electrons Data Archive and Portal.

  15. Springer handbook of model-based science

    CERN Document Server

    Bertolotti, Tommaso

    2017-01-01

    The handbook offers the first comprehensive reference guide to the interdisciplinary field of model-based reasoning. It highlights the role of models as mediators between theory and experimentation, and as educational devices, as well as their relevance in testing hypotheses and explanatory functions. The Springer Handbook merges philosophical, cognitive and epistemological perspectives on models with the more practical needs related to the application of this tool across various disciplines and practices. The result is a unique, reliable source of information that guides readers toward an understanding of different aspects of model-based science, such as the theoretical and cognitive nature of models, as well as their practical and logical aspects. The inferential role of models in hypothetical reasoning, abduction and creativity once they are constructed, adopted, and manipulated for different scientific and technological purposes is also discussed. Written by a group of internationally renowned experts in ...

  16. MATLAB-implemented estimation procedure for model-based assessment of hepatic insulin degradation from standard intravenous glucose tolerance test data.

    Science.gov (United States)

    Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela

    2013-05-01

    The present study provides a novel MATLAB-based parameter estimation procedure for the individual assessment of the hepatic insulin degradation (HID) process from standard frequently-sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of Gauss-Newton's and Levenberg-Marquardt's algorithms, which assures the full convergence of the process and the containment of computational time. Reliability was tested by direct comparison with the application, in eighteen non-diabetic subjects, of the well-known kinetic analysis software package SAAM II, and by application to different data. Agreement between MATLAB and SAAM II was warranted by intraclass correlation coefficients ≥0.73, by no significant differences between corresponding mean parameter estimates and predictions of HID rate, and by consistent residual analysis. Moreover, the MATLAB optimization procedure resulted in a significant 51% reduction of CV% for the parameter worst-estimated by SAAM II, and in maintaining all model-parameter CV% at acceptable levels. The MATLAB-based procedure was suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
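
    The alternating optimization strategy can be sketched generically as below, on a toy exponential-decay fit rather than the paper's minimal model of insulin kinetics; the residual function, data, damping factor, and use of Python/NumPy instead of MATLAB are illustrative assumptions.

        import numpy as np

        def residuals(p, t, y):
            return y - p[0] * np.exp(-p[1] * t)

        def jacobian(p, t):
            # Jacobian of the residuals with respect to (p0, p1)
            e = np.exp(-p[1] * t)
            return np.column_stack([-e, p[0] * t * e])

        def alternating_fit(p, t, y, n_outer=30, lam=1e-2):
            for k in range(n_outer):
                r, J = residuals(p, t, y), jacobian(p, t)
                H = J.T @ J
                if k % 2 == 0:   # Gauss-Newton step
                    step = np.linalg.solve(H, -J.T @ r)
                else:            # Levenberg-Marquardt (damped) step
                    step = np.linalg.solve(H + lam * np.diag(np.diag(H)), -J.T @ r)
                p = p + step
            return p

        t = np.linspace(0, 5, 40)
        y = 2.0 * np.exp(-0.7 * t) + 0.01 * np.random.default_rng(0).standard_normal(40)
        print(alternating_fit(np.array([1.5, 0.5]), t, y))   # ~ [2.0, 0.7]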

  17. Springback study in aluminum alloys based on the Demeri Benchmark Test : influence of material model

    International Nuclear Information System (INIS)

    Greze, R.; Laurent, H.; Manach, P. Y.

    2007-01-01

    Springback is a serious problem in sheet metal forming. Its origin lies in the elastic recovery of materials after a deep drawing operation. Springback modifies the final shape of the part when it is removed from the die after forming. This study deals with springback in an Al5754-O aluminum alloy. An experimental test similar to the Demeri Benchmark Test has been developed. The experimentally measured springback is compared to springback predicted by simulation using the Abaqus software. Several material models are analyzed, all using isotropic hardening of the Voce type and plasticity criteria such as the von Mises and Hill48 yield criteria.

  18. The Challenge of Forecasting Metropolitan Growth: Urban Characteristics Based Models versus Regional Dummy Based Models

    OpenAIRE

    NA

    2005-01-01

    This paper presents a study of errors in forecasting the population of Metropolitan Statistical Areas and the Primary MSAs of Consolidated Metropolitan Statistical Areas and New England MAs. The forecasts are for the year 2000 and are based on a semi-structural model estimated by Mills and Lubelle using 1970 to 1990 census data on population, employment and relative real wages. This model allows the testing of regional effects on population and employment growth. The year 2000 forecasts are f...

  19. Vibration-Based Damage Diagnosis in a Laboratory Cable-Stayed Bridge Model via an RCP-ARX Model Based Method

    International Nuclear Information System (INIS)

    Michaelides, P G; Apostolellis, P G; Fassois, S D

    2011-01-01

    Vibration-based damage detection and identification in a laboratory cable-stayed bridge model is addressed under inherent, environmental, and experimental uncertainties. The problem is challenging as conventional stochastic methods face difficulties due to uncertainty underestimation. A novel method is formulated based on identified Random Coefficient Pooled ARX (RCP-ARX) representations of the dynamics and statistical hypothesis testing. The method benefits from the ability of RCP models in properly capturing uncertainty. Its effectiveness is demonstrated via a high number of experiments under a variety of damage scenarios.

  20. Vibration-Based Damage Diagnosis in a Laboratory Cable-Stayed Bridge Model via an RCP-ARX Model Based Method

    Energy Technology Data Exchange (ETDEWEB)

    Michaelides, P G; Apostolellis, P G; Fassois, S D, E-mail: mixail@mech.upatras.gr, E-mail: fassois@mech.upatras.gr [Laboratory for Stochastic Mechanical Systems and Automation (SMSA), Department of Mechanical and Aeronautical Engineering, University of Patras, GR 265 00 Patras (Greece)

    2011-07-19

    Vibration-based damage detection and identification in a laboratory cable-stayed bridge model is addressed under inherent, environmental, and experimental uncertainties. The problem is challenging as conventional stochastic methods face difficulties due to uncertainty underestimation. A novel method is formulated based on identified Random Coefficient Pooled ARX (RCP-ARX) representations of the dynamics and statistical hypothesis testing. The method benefits from the ability of RCP models in properly capturing uncertainty. Its effectiveness is demonstrated via a high number of experiments under a variety of damage scenarios.

  1. A non-parametric consistency test of the ΛCDM model with Planck CMB data

    Energy Technology Data Exchange (ETDEWEB)

    Aghamousa, Amir; Shafieloo, Arman [Korea Astronomy and Space Science Institute, Daejeon 305-348 (Korea, Republic of); Hamann, Jan, E-mail: amir@aghamousa.com, E-mail: jan.hamann@unsw.edu.au, E-mail: shafieloo@kasi.re.kr [School of Physics, The University of New South Wales, Sydney NSW 2052 (Australia)

    2017-09-01

    Non-parametric reconstruction methods, such as Gaussian process (GP) regression, provide a model-independent way of estimating an underlying function and its uncertainty from noisy data. We demonstrate how GP-reconstruction can be used as a consistency test between a given data set and a specific model by looking for structures in the residuals of the data with respect to the model's best-fit. Applying this formalism to the Planck temperature and polarisation power spectrum measurements, we test their global consistency with the predictions of the base ΛCDM model. Our results do not show any serious inconsistencies, lending further support to the interpretation of the base ΛCDM model as cosmology's gold standard.
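
    A schematic version of such a residual-based consistency test is sketched below using scikit-learn's GP regression on synthetic residuals; the kernel choice, the stand-in data, and the rough 2-sigma criterion are illustrative assumptions, and the authors' actual pipeline for the Planck spectra is far more involved.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(1)
        x = np.linspace(0, 10, 100)[:, None]
        best_fit = np.sin(x).ravel()        # stand-in for the model's best-fit curve
        data = best_fit + 0.1 * rng.standard_normal(100)

        residuals = data - best_fit         # data minus best-fit prediction
        gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.01),
                                      normalize_y=True).fit(x, residuals)
        mean, std = gp.predict(x, return_std=True)

        # If the model is adequate, the reconstructed residual function should
        # be consistent with zero everywhere (roughly |mean| < 2*std).
        print("max |mean|/std:", np.max(np.abs(mean) / std))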

  2. Plan on test to failure of a prestressed concrete containment vessel model

    International Nuclear Information System (INIS)

    Takumi, K.; Nonaka, A.; Umeki, K.; Nagata, K.; Soejima, M.; Yamaura, Y.; Costello, J.F.; Riesemann, W.A. von.; Parks, M.B.; Horschel, D.S.

    1992-01-01

    A summary of the plans to test a prestressed concrete containment vessel (PCCV) model to failure is provided in this paper. The test will be conducted as a part of a joint research program between the Nuclear Power Engineering Corporation (NUPEC), the United States Nuclear Regulatory Commission (NRC), and Sandia National Laboratories (SNL). The containment model will be a scaled representation of a PCCV for a pressurized water reactor (PWR). During the test, the model will be slowly pressurized internally until failure of the containment pressure boundary occurs. The objectives of the test are to measure the failure pressure, to observe the mode of failure, and to record the containment structural response up to failure. Pre- and posttest analyses will be conducted to forecast and evaluate the test results. Based on these results, a validated method for evaluating the structural behavior of an actual PWR PCCV will be developed. The concepts to design the PCCV model are also described in the paper

  3. Optimisation of test and maintenance based on probabilistic methods

    International Nuclear Information System (INIS)

    Cepin, M.

    2001-01-01

    This paper presents a method which, based on the models and results of probabilistic safety assessment, minimises nuclear power plant risk by optimising the arrangement of safety equipment outages. The test and maintenance activities of the safety equipment are arranged in time, so the classical static fault tree models are extended with time requirements to be capable of modelling real plant states. A house event matrix is used, which enables modelling of the equipment arrangements through discrete points of time. The result of the method is the determination of the configuration of equipment outages which results in the minimal risk, where minimal risk is represented by system unavailability. (authors)

  4. Path generation algorithm for UML graphic modeling of aerospace test software

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditionally, aerospace software testing engineers rely on their own work experience and on communication with software developers to complete the description of the software under test and to write test cases manually, which is time-consuming, inefficient, and prone to gaps. Using the high-reliability MBT tools developed by our company, one-time modeling can automatically generate test case documents, which is efficient and accurate. A UML model describing the process must accurately express the paths that need to be reached, but existing path generation algorithms are either too simple, unable to combine paths containing branches with paths containing loops, or too cumbersome, generating arrangements of paths that are meaningless and superfluous for aerospace software testing. Drawing on our experience with ten aerospace payloads, we developed a tailored path generation algorithm for UML graphic descriptions of aerospace software.
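
    To make the path generation idea concrete, here is a generic hedged sketch (not the paper's algorithm): a bounded depth-first traversal over an activity-style graph that enumerates branch paths while capping loop repetitions so the path set stays finite.

        def generate_paths(graph, start, end, max_visits=2):
            """graph: dict mapping node -> list of successor nodes.
            Enumerates start-to-end paths, revisiting a node at most
            max_visits times so loops contribute finitely many paths."""
            paths, stack = [], [(start, [start])]
            while stack:
                node, path = stack.pop()
                if node == end:
                    paths.append(path)
                    continue
                for nxt in graph.get(node, []):
                    if path.count(nxt) < max_visits:   # loop bound
                        stack.append((nxt, path + [nxt]))
            return paths

        # Branch at B; loop C -> B
        g = {"A": ["B"], "B": ["C", "D"], "C": ["B"], "D": []}
        for p in generate_paths(g, "A", "D"):
            print(" -> ".join(p))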

  5. Model-predictive control based on Takagi-Sugeno fuzzy model for electrical vehicles delayed model

    DEFF Research Database (Denmark)

    Khooban, Mohammad-Hassan; Vafamand, Navid; Niknam, Taher

    2017-01-01

    Electric vehicles (EVs) play a significant role in different applications, such as commuter vehicles and short distance transport applications. This study presents a new structure of model-predictive control based on the Takagi-Sugeno fuzzy model, linear matrix inequalities, and a non-quadratic Lyapunov function for the speed control of EVs including time-delay states and parameter uncertainty. Experimental data, using the Federal Test Procedure (FTP-75), is applied to test the performance and robustness of the suggested controller in the presence of time-varying parameters. Besides, a comparison is made between the results of the suggested robust strategy and those obtained from some of the most recent studies on the same topic, to assess the efficiency of the suggested controller. Finally, the experimental results based on a TMS320F28335 DSP are performed on a direct current motor.

  6. Regression-Based Norms for a Bi-factor Model for Scoring the Brief Test of Adult Cognition by Telephone (BTACT).

    Science.gov (United States)

    Gurnani, Ashita S; John, Samantha E; Gavett, Brandon E

    2015-05-01

    The current study developed regression-based normative adjustments for a bi-factor model of the Brief Test of Adult Cognition by Telephone (BTACT). Archival data from the Midlife Development in the United States-II Cognitive Project were used to develop eight separate linear regression models that predicted bi-factor BTACT scores, accounting for age, education, gender, and occupation, alone and in various combinations. All regression models provided statistically significant fit to the data. A three-predictor regression model fit best and accounted for 32.8% of the variance in the global bi-factor BTACT score. The fit of the regression models was not improved by gender. Eight different regression models are presented to allow the user flexibility in applying demographic corrections to the bi-factor BTACT scores. Occupation corrections, while not widely used, may provide useful demographic adjustments for adult populations or for those individuals who have attained an occupational status not commensurate with their expected educational attainment. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
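
    In practice, such norms are applied by subtracting the demographically predicted score from the observed score and dividing by the regression standard error of estimate. The sketch below illustrates this with made-up coefficients, not the published ones.

        def adjusted_z(observed, age, education, sex,
                       b0=0.0, b_age=-0.02, b_edu=0.08, b_sex=0.0, see=0.95):
            """Demographically adjusted z-score from a regression-based norm.
            sex coded 0/1; education in years; all coefficients hypothetical."""
            predicted = b0 + b_age * age + b_edu * education + b_sex * sex
            return (observed - predicted) / see

        print(round(adjusted_z(observed=0.3, age=62, education=16, sex=1), 2))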

  7. Testing a model of codependency for college students in Taiwan based on Bowen's concept of differentiation.

    Science.gov (United States)

    Chang, Shih-Hua

    2018-04-01

    The purpose of this study was to test a model of codependency based on Bowen's concept of differentiation for college students in Taiwan. The relations between family-of-origin dysfunction, differentiation of self, codependency traits and related symptoms including low self-esteem, relationship distress and psychological adjustment problems were examined. Data were collected from 567 college students from 2 large, urban universities in northern Taiwan. Results indicated a significantly negative relationship between levels of codependency and self-differentiation and that self-differentiation partially mediated the relationship between family-of-origin dysfunction and codependency. The implications of these findings for counselling Taiwanese college students who experience codependency traits and related symptoms as well as suggestions for future research are discussed. © 2016 International Union of Psychological Science.

  8. Numerical Model to Quantify the Influence of the Cellulosic Substrate on the Ignition Propensity Tests

    Directory of Open Access Journals (Sweden)

    Guindos Pablo

    2016-07-01

    Full Text Available A numerical model based on the finite element method has been constructed to simulate the ignition propensity (IP) tests. The objective of this mathematical model was to quantify the influence of different characteristics of the cellulosic substrate on the results of the IP-tests. The creation and validation of the model included the following steps: (i) formulation of the model based on experimental thermodynamic characteristics of the cellulosic substrate; (ii) calibration of the model according to cone calorimeter tests; (iii) validation of the model through mass loss and temperature profiling during IP-testing. Once the model was validated, the influence of each isolated parameter of the cellulosic substrate was quantified via a parametric study. The results revealed that the substrate heat capacity, the cigarette temperature and the pyrolysis activation energy are the most influential parameters for the thermodynamic response of the substrates, while other parameters such as the heat of the pyrolysis reaction and the density and roughness of the substrate showed little influence. The results also indicated that the thermodynamic mechanisms involved in the pyrolysis and combustion of the cellulosic substrate are complex and show low repeatability, which might impair the reliability of the IP-tests.
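
    To make the parametric-study idea concrete, the toy sweep below uses a lumped first-order thermal balance, not the paper's finite element model, to show how the substrate heat capacity alone shifts the temperature response; every number in it is an illustrative placeholder.

        # Illustrative parametric sweep, not the paper's FE model: a lumped
        # first-order thermal response of a substrate element heated by a
        # constant source, showing why heat capacity dominates the response.
        def final_temperature(c_p, rho=300.0, h=20.0, area=1e-4, volume=1e-7,
                              T_source=600.0, T_amb=25.0, t_end=60.0, dt=0.01):
            """Explicit Euler integration of m*c_p*dT/dt = h*A*(T_source - T)."""
            m = rho * volume
            T = T_amb
            for _ in range(int(t_end / dt)):
                T += dt * h * area * (T_source - T) / (m * c_p)
            return T

        for c_p in (800.0, 1200.0, 1600.0):   # J/(kg K), illustrative values
            print(c_p, round(final_temperature(c_p), 1))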

  9. Development of Vehicle Model Test for Road Loading Analysis of Sedan Model

    Science.gov (United States)

    Mohd Nor, M. K.; Noordin, A.; Ruzali, M. F. S.; Hussen, M. H.

    2016-11-01

    The Simple Structural Surfaces (SSS) method is offered as a means of organizing the process of rationalizing the basic vehicle body structure load paths. This simplified approach is highly beneficial in the design development of a modern passenger car structure, especially during the conceptual stage. In Malaysia, however, no real physical SSS model has been available to gain insight into and understanding of the function of each major subassembly in the whole vehicle structure. Based on this motivation, a physical SSS model of a sedan, with corresponding model vehicle tests in bending and torsion, is proposed in this work. The proposed approach is relatively easy to understand compared to the Finite Element Method (FEM). The results show that the proposed vehicle model test can demonstrate that satisfactory load paths give sufficient structural stiffness within the vehicle structure. It is clearly observed that the global bending stiffness reduces significantly when more panels are removed from a complete SSS model. The parcel shelf is identified as an important subassembly for sustaining bending load. The results also match the theoretical hypothesis: with an open section, the structure is weaker under torsion load than under bending load. The proposed approach can potentially be integrated with FEM to speed up the automotive vehicle design process.

  10. Performance Model for High-Power Lithium Titanate Oxide Batteries based on Extended Characterization Tests

    DEFF Research Database (Denmark)

    Stroe, Ana-Irina; Swierczynski, Maciej Jozef; Stroe, Daniel Ioan

    2015-01-01

    Lithium-ion (Li-ion) batteries are found nowadays not only in portable/consumer electronics but also in more power demanding applications, such as stationary renewable energy storage, automotive and back-up power supply, because of their superior characteristics in comparison to other energy storage technologies. Nevertheless, prior to being used in any of the aforementioned applications, a Li-ion battery cell must be intensively characterized and its behavior needs to be understood. This can be realized by performing extended laboratory characterization tests and developing Li-ion battery performance models. Furthermore, accurate performance models are necessary in order to analyze the behavior of the battery cell under different mission profiles, by simulation, thus avoiding time and cost demanding real life tests. This paper presents the development and the parametrization of a performance ...

  11. An Outcrop-based Detailed Geological Model to Test Automated Interpretation of Seismic Inversion Results

    NARCIS (Netherlands)

    Feng, R.; Sharma, S.; Luthi, S.M.; Gisolf, A.

    2015-01-01

    Previously, Tetyukhina et al. (2014) developed a geological and petrophysical model based on the Book Cliffs outcrops that contained eight lithotypes. For reservoir modelling purposes, this model is judged to be too coarse because in the same lithotype it contains reservoir and non-reservoir

  12. Estimating and quantifying the impact of using models for integration and testing

    NARCIS (Netherlands)

    Braspenning, N.C.W.M.; Boumen, R.; Mortel - Fronczak, van de J.M.; Rooda, J.E.

    2011-01-01

    Industrial trends show that the lead time and costs of integrating and testing high-tech multi-disciplinary systems are becoming critical factors for commercial success. In our research, we developed a method for early, model-based integration and testing to reduce this criticality. Although its

  13. Finite element simulation of nanoindentation tests using a macroscopic computational model

    International Nuclear Information System (INIS)

    Khelifa, Mourad; Fierro, Vanessa; Celzard, Alain

    2014-01-01

    The aim of this work was to develop a numerical procedure to simulate nanoindentation tests using a macroscopic computational model. Both theoretical and numerical aspects of the proposed methodology, based on the coupling of isotropic elasticity and anisotropic plasticity described with the quadratic criterion of Hill, are presented to model this behaviour. The anisotropic plastic behaviour accounts for mixed nonlinear hardening (isotropic and kinematic) under large plastic deformation. Nanoindentation tests were simulated to analyse the nonlinear mechanical behaviour of an aluminium alloy. The predicted results of the finite element (FE) modelling are in good agreement with the experimental data, thereby confirming the accuracy of the suggested FE method of analysis. The effects of some technological and mechanical parameters known to have an influence during nanoindentation tests were also investigated.
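
    For reference, the quadratic criterion of Hill mentioned above is, in its standard 1948 form (quoted from the general plasticity literature, not from this paper), with F, G, H, L, M, N the six anisotropy constants:

        f(\sigma) = F(\sigma_{22}-\sigma_{33})^2 + G(\sigma_{33}-\sigma_{11})^2
                  + H(\sigma_{11}-\sigma_{22})^2
                  + 2L\,\sigma_{23}^2 + 2M\,\sigma_{31}^2 + 2N\,\sigma_{12}^2 - 1 = 0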

  14. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases the accuracy at the same time. The test example is classified using a simpler and smaller model. The training examples in a particular cluster share a common vocabulary. At the time of clustering, we do not take into account the labels of the training examples. After the clusters have been created, the classifier is trained on each cluster, with reduced dimensionality and fewer examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups ...
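
    A minimal sketch of the cluster-then-classify idea, assuming scikit-learn and a downloadable 20 Newsgroups corpus: clustering ignores the labels, and a smaller classifier is then trained per cluster. The pipeline details (vectorizer, cluster count, classifier) are illustrative choices, not the authors' exact configuration.

        import numpy as np
        from sklearn.datasets import fetch_20newsgroups
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.cluster import KMeans
        from sklearn.naive_bayes import MultinomialNB

        train = fetch_20newsgroups(subset="train")
        vec = TfidfVectorizer(max_features=5000)
        X = vec.fit_transform(train.data)

        # Unsupervised step: labels are deliberately ignored while clustering.
        km = KMeans(n_clusters=10, n_init=10, random_state=0)
        clusters = km.fit_predict(X)

        # Supervised step: one smaller classifier per cluster.
        classifiers = {c: MultinomialNB().fit(X[clusters == c],
                                              train.target[clusters == c])
                       for c in np.unique(clusters)}

        def classify(text):
            x = vec.transform([text])
            c = km.predict(x)[0]          # route the test example to its cluster
            return train.target_names[classifiers[c].predict(x)[0]]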

  15. Does the cognitive reflection test measure cognitive reflection? A mathematical modeling approach.

    Science.gov (United States)

    Campitelli, Guillermo; Gerrans, Paul

    2014-04-01

    We used a mathematical modeling approach, based on a sample of 2,019 participants, to better understand what the cognitive reflection test (CRT; Frederick, Journal of Economic Perspectives, 19, 25-42, 2005) measures. This test, which is typically completed in less than 10 min, contains three problems and aims to measure the ability or disposition to resist reporting the response that first comes to mind. However, since the test contains three mathematically based problems, it is possible that the test only measures mathematical abilities, and not cognitive reflection. We found that the models that included an inhibition parameter (i.e., the probability of inhibiting an intuitive response), as well as a mathematical parameter (i.e., the probability of using an adequate mathematical procedure), fitted the data better than a model that only included a mathematical parameter. We also found that the inhibition parameter in males is best explained by both rational thinking ability and the disposition toward actively open-minded thinking, whereas in females this parameter was better explained by rational thinking only. With these findings, this study contributes to the understanding of the processes involved in solving the CRT, and will be particularly useful for researchers who are considering using this test in their research.
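
    The core of the winning model class can be paraphrased in a few lines: a correct answer requires both inhibiting the intuitive response and executing an adequate mathematical procedure. The toy rendering below, with invented parameter values rather than the fitted estimates, shows the score distribution such a two-parameter model implies.

        # A toy rendering of the two-parameter idea: answering a CRT item
        # correctly requires inhibiting the intuitive response (probability i)
        # and using an adequate mathematical procedure (probability m).
        # The values of i and m are illustrative, not the fitted estimates.
        from scipy import stats

        i, m = 0.6, 0.7          # inhibition and mathematical parameters
        p = i * m                # P(correct) on a single item under this model
        # Predicted distribution of total scores (0-3) over the three items:
        for k, prob in enumerate(stats.binom.pmf(range(4), 3, p)):
            print(k, round(float(prob), 3))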

  16. Design of a testing strategy using non-animal based test methods: lessons learnt from the ACuteTox project.

    Science.gov (United States)

    Kopp-Schneider, Annette; Prieto, Pilar; Kinsner-Ovaskainen, Agnieszka; Stanzel, Sven

    2013-06-01

    In the framework of toxicology, a testing strategy can be viewed as a series of steps which are taken to come to a final prediction about a characteristic of a compound under study. The testing strategy is performed either as a single-step procedure, usually called a test battery, using all information collected on different endpoints simultaneously, or as a tiered approach in which a decision tree is followed. Design of a testing strategy involves statistical considerations, such as the development of a statistical prediction model. During the EU FP6 ACuteTox project, several prediction models were proposed on the basis of statistical classification algorithms, which we illustrate here. The final choice of testing strategies was not based on statistical considerations alone; however, without thorough statistical evaluation a testing strategy cannot be identified. We present here a number of observations made from the statistical viewpoint which relate to the development of testing strategies. These points were derived from problems we had to deal with during the evaluation of this large research project. A central issue during the development of a prediction model is the danger of overfitting. Procedures are presented to deal with this challenge. Copyright © 2012 Elsevier Ltd. All rights reserved.
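
    Guarding against the overfitting danger mentioned above is usually done by estimating performance on held-out data rather than on the training data. A minimal sketch, with synthetic stand-ins for in vitro endpoint measurements:

        # Estimate classifier performance with cross-validation instead of
        # resubstitution accuracy; data are synthetic stand-ins for endpoints.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(120, 6))            # six hypothetical endpoints
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 120) > 0).astype(int)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("resubstitution:", clf.fit(X, y).score(X, y))            # optimistic
        print("5-fold CV     :", cross_val_score(clf, X, y, cv=5).mean())  # honest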

  17. Choosing wisely: a model-based analysis evaluating the trade-offs in cancer benefit and diagnostic referrals among alternative HPV testing strategies in Norway.

    Science.gov (United States)

    Burger, Emily A; Pedersen, Kine; Sy, Stephen; Kristiansen, Ivar Sønbø; Kim, Jane J

    2017-09-05

    Forthcoming cervical cancer screening strategies involving human papillomavirus (HPV) testing for women not vaccinated against HPV infections may increase colposcopy referral rates. We quantified health and resource trade-offs associated with alternative HPV-based algorithms to inform decision-makers when choosing between candidate algorithms. We used a mathematical simulation model of HPV-induced cervical carcinogenesis in Norway. We compared the current cytology-based strategy to alternative strategies that varied by the switching age to primary HPV testing (ages 25-34 years), the routine screening frequency (every 3-10 years), and management of HPV-positive, cytology-negative women. Model outcomes included reductions in lifetime cervical cancer risk, relative colposcopy rates, and colposcopy rates per cervical cancer prevented. The age of switching to primary HPV testing and the screening frequency had the largest impacts on cancer risk reductions, which ranged from 90.9% to 96.3% compared to no screening. In contrast, increasing the follow-up intensity of HPV-positive, cytology-negative women provided only minor improvements in cancer benefits, but generally required considerably higher rates of colposcopy referrals compared to current levels, resulting in less efficient cervical cancer prevention. We found that in order to maximise cancer benefits HPV-based screening among unvaccinated women should not be delayed: rather, policy makers should utilise the triage mechanism to control colposcopy referrals.

  18. Hydraulic Model Tests on Modified Wave Dragon

    DEFF Research Database (Denmark)

    Hald, Tue; Lynggaard, Jakob

    A floating model of the Wave Dragon (WD) was built in autumn 1998 by the Danish Maritime Institute in scale 1:50, see Sørensen and Friis-Madsen (1999) for reference. This model was subjected to a series of model tests and subsequent modifications at Aalborg University, and in the following ... are found in Hald and Lynggaard (2001). Model tests and reconstruction are carried out during the phase 3 project: ”Wave Dragon. Reconstruction of an existing model in scale 1:50 and sequential tests of changes to the model geometry and mass distribution parameters” sponsored by the Danish Energy Agency ...

  19. Model-Based Development of Control Systems for Forestry Cranes

    Directory of Open Access Journals (Sweden)

    Pedro La Hera

    2015-01-01

    Full Text Available Model-based methods are used in industry for prototyping concepts based on mathematical models. With our forest industry partners, we have established a model-based workflow for rapid development of motion control systems for forestry cranes. Applying this working method, we can verify control algorithms, both theoretically and practically. This paper is an example of this workflow and presents four topics related to the application of nonlinear control theory. The first topic presents the system of differential equations describing the motion dynamics. The second topic presents nonlinear control laws formulated according to sliding mode control theory. The third topic presents a procedure for model calibration and control tuning that are a prerequisite to realize experimental tests. The fourth topic presents the results of tests performed on an experimental crane specifically equipped for these tasks. Results of these studies show the advantages and disadvantages of these control algorithms, and they highlight their performance in terms of robustness and smoothness.

  20. Risk based test interval and maintenance optimisation - Application and uses

    International Nuclear Information System (INIS)

    Sparre, E.

    1999-10-01

    The project is part of an IAEA Co-ordinated Research Project (CRP) on 'Development of Methodologies for Optimisation of Surveillance Testing and Maintenance of Safety Related Equipment at NPPs'. The purpose of the project is to investigate the sensitivity of the results obtained when performing risk-based optimisation of the technical specifications. Previous projects have shown that complete LPSA models can be created and that these models allow optimisation of technical specifications. However, those optimisations did not include any in-depth check of the result sensitivity with regard to methods, model completeness, etc. Four different test intervals have been investigated in this study. Aside from the original, nominal optimisation, a set of sensitivity analyses has been performed and the results from these analyses have been compared to the original optimisation. The analyses indicate that the result of an optimisation is rather stable. However, it is not possible to draw any firm conclusions without performing a number of sensitivity analyses. Significant differences in the optimisation result were discovered when analysing an alternative configuration. Deterministic uncertainties also seem to affect the result of an optimisation to a large extent. The sensitivity to failure-data uncertainties is important to investigate in detail, since the methodology is based on the assumption that the unavailability of a component depends on the length of the test interval.
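
    The dependence mentioned in the last sentence is often captured by a textbook first-order approximation rather than the project's full LPSA model; a sketch of that approximation, and of the interval it implies as optimal, follows (all rates are illustrative):

        # Standard approximation (not the project's PSA model) for the mean
        # unavailability of a periodically tested standby component:
        #   U(T) ~= rho + lambda*T/2 + tau/T
        # rho: per-demand failure probability, lambda: standby failure rate,
        # T: test interval, tau: downtime per test.
        import numpy as np

        lam, rho, tau = 1e-5, 1e-3, 2.0      # 1/h, -, h (illustrative values)

        def unavailability(T):
            return rho + lam * T / 2.0 + tau / T

        T_grid = np.linspace(100, 20000, 2000)
        U = unavailability(T_grid)
        print(f"optimal interval ~ {T_grid[np.argmin(U)]:.0f} h, U = {U.min():.2e}")
        # Analytically, dU/dT = 0 gives T_opt = sqrt(2*tau/lambda):
        print((2 * tau / lam) ** 0.5)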

  1. Advances in the application of decision theory to test-based decision making

    NARCIS (Netherlands)

    van der Linden, Willem J.

    This paper reviews recent research in the Netherlands on the application of decision theory to test-based decision making about personnel selection and student placement. The review is based on an earlier model proposed for the classification of decision problems, and emphasizes an empirical

  2. Atmospheric resuspension of radionuclides. Model testing using Chernobyl data

    International Nuclear Information System (INIS)

    Garger, E.; Lev, T.; Talerko, N.; Galeriu, D.; Garland, J.; Hoffman, O.; Nair, S.; Thiessen, K.; Miller, C.; Mueller, H.; Kryshev, A.

    1996-10-01

    Resuspension can be an important secondary source of contamination after a release has stopped, as well as a source of contamination for people and areas not exposed to the original release. The inhalation of resuspended radionuclides contributes to the overall dose received by exposed individuals. Based on measurements collected after the Chernobyl accident, Scenario R was developed to provide an opportunity to test existing mathematical models of contamination resuspension. In particular, this scenario provided the opportunity to examine data and test models for atmospheric resuspension of radionuclides at several different locations from the release, to investigate resuspension processes on both local and regional scales, and to investigate the importance of seasonal variations of these processes. Participants in the test exercise were provided with information for three different types of locations: (1) within the 30-km zone, where local resuspension processes are expected to dominate; (2) a large urban location (Kiev) 120 km from the release site, where vehicular traffic is expected to be the dominant mechanism for resuspension; and (3) an agricultural area 40-60 km from the release site, where highly contaminated upwind 'hot spots' are expected to be important. Input information included characteristics of the ground contamination around specific sites, climatological data for the sites, characteristics of the terrain and topography, and locations of the sampling sites. Participants were requested to predict the average (quarterly and yearly) concentrations of 137Cs in air at specified locations due to resuspension of Chernobyl fallout; predictions for 90Sr and 239+240Pu were also requested for one location and time point. Predictions for specified resuspension factors and rates were also requested. Most participants used empirical models for the resuspension factor as a function of time K(t), as opposed to process-based models. While many of these
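
    A common empirical form for the resuspension factor K(t) referred to above combines a decaying early-phase term with a long-term residual; the coefficients below are generic placeholders, not values fitted in Scenario R.

        # Empirical resuspension-factor model of the K(t) type mentioned above;
        # K0, K1 and lam are illustrative placeholders, not fitted values.
        import numpy as np

        def resuspension_factor(t_days, K0=1e-6, K1=1e-9, lam=0.01):
            """K(t) in 1/m: fast-decaying early term plus a long-term residual."""
            return K0 * np.exp(-lam * t_days) + K1

        surface = 5e5      # Bq/m2 of 137Cs ground deposition (illustrative)
        for t in (10, 100, 1000):
            # Air concentration (Bq/m3) implied by K(t) times surface deposition.
            print(t, resuspension_factor(t) * surface)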

  3. Atmospheric resuspension of radionuclides. Model testing using Chernobyl data

    Energy Technology Data Exchange (ETDEWEB)

    Garger, E.; Lev, T.; Talerko, N. [Inst. of Radioecology UAAS, Kiev (Ukraine); Galeriu, D. [Institute of Atomic Physics, Bucharest (Romania); Garland, J. [Consultant (United Kingdom); Hoffman, O.; Nair, S.; Thiessen, K. [SENES, Oak Ridge, TN (United States); Miller, C. [Centre for Disease Control, Atlanta, GA (United States); Mueller, H. [GSF - Inst. fuer Strahlenschultz, Neuherberg (Germany); Kryshev, A. [Moscow State Univ. (Russian Federation)

    1996-10-01

    Resuspension can be an important secondary source of contamination after a release has stopped, as well as a source of contamination for people and areas not exposed to the original release. The inhalation of resuspended radionuclides contributes to the overall dose received by exposed individuals. Based on measurements collected after the Chernobyl accident, Scenario R was developed to provide an opportunity to test existing mathematical models of contamination resuspension. In particular, this scenario provided the opportunity to examine data and test models for atmospheric resuspension of radionuclides at several different locations from the release, to investigate resuspension processes on both local and regional scales, and to investigate the importance of seasonal variations of these processes. Participants in the test exercise were provided with information for three different types of locations: (1) within the 30-km zone, where local resuspension processes are expected to dominate; (2) a large urban location (Kiev) 120 km from the release site, where vehicular traffic is expected to be the dominant mechanism for resuspension; and (3) an agricultural area 40-60 km from the release site, where highly contaminated upwind 'hot spots' are expected to be important. Input information included characteristics of the ground contamination around specific sites, climatological data for the sites, characteristics of the terrain and topography, and locations of the sampling sites. Participants were requested to predict the average (quarterly and yearly) concentrations of 137Cs in air at specified locations due to resuspension of Chernobyl fallout; predictions for 90Sr and 239+240Pu were also requested for one location and time point. Predictions for specified resuspension factors and rates were also requested. Most participants used empirical models for the resuspension factor as a function of time K(t), as opposed to process-based models. While many of

  4. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  5. Comparative Analyses of Zebrafish Anxiety-Like Behavior Using Conflict-Based Novelty Tests.

    Science.gov (United States)

    Kysil, Elana V; Meshalkina, Darya A; Frick, Erin E; Echevarria, David J; Rosemberg, Denis B; Maximino, Caio; Lima, Monica Gomes; Abreu, Murilo S; Giacomini, Ana C; Barcellos, Leonardo J G; Song, Cai; Kalueff, Allan V

    2017-06-01

    Modeling of stress and anxiety in adult zebrafish (Danio rerio) is increasingly utilized in neuroscience research and central nervous system (CNS) drug discovery. Representing the most commonly used zebrafish anxiety models, the novel tank test (NTT) focuses on zebrafish diving in response to potentially threatening stimuli, whereas the light-dark test (LDT) is based on fish scototaxis (innate preference for dark vs. bright areas). Here, we systematically evaluate the utility of these two tests, combining meta-analyses of published literature with comparative in vivo behavioral and whole-body endocrine (cortisol) testing. Overall, the NTT and LDT behaviors demonstrate a generally good cross-test correlation in vivo, whereas meta-analyses of published literature show that both tests have similar sensitivity to zebrafish anxiety-like states. Finally, NTT evokes higher levels of cortisol, likely representing a more stressful procedure than LDT. Collectively, our study reappraises NTT and LDT for studying anxiety-like states in zebrafish, and emphasizes their developing utility for neurobehavioral research. These findings can help optimize drug screening procedures by choosing more appropriate models for testing anxiolytic or anxiogenic drugs.

  6. Identification of Anisotropic Criteria for Stratified Soil Based on Triaxial Tests Results

    Science.gov (United States)

    Tankiewicz, Matylda; Kawa, Marek

    2017-09-01

    The paper presents an identification methodology for anisotropic criteria based on triaxial test results. The material considered is varved clay, a sedimentary soil occurring in central Poland which is characterized by the so-called "layered microstructure". The strength data were obtained from standard triaxial tests and include the estimated peak strength for a wide range of orientations and confining pressures. Two models were chosen as potentially adequate for the description of the tested material, namely the Pariseau criterion and its conjunction with the Jaeger weakness plane. Material constants were obtained by fitting the models to the experimental results. The identification procedure is based on the least squares method: optimal parameter values are searched for between specified bounds by sequentially decreasing the distance between points and reducing the length of the searched range. Optimal parameters were obtained for both models considered. The comparison of theoretical and experimental results, as well as an assessment of the suitability of the selected criteria for the specified range of confining pressures, are presented.
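
    The fitting step can be sketched generically: measured peak strengths at several bedding orientations are matched by a parametric criterion via least squares. The cosine form and all numbers below are illustrative stand-ins, not the Pariseau criterion or the study's data.

        # Schematic identification step: fit strength-criterion parameters to
        # peak strengths measured at several bedding orientations.
        import numpy as np
        from scipy.optimize import least_squares

        beta = np.radians([0, 15, 30, 45, 60, 75, 90])         # orientations
        q_exp = np.array([410, 360, 300, 270, 290, 350, 400])  # kPa, fabricated

        def criterion(p, beta):
            # Generic orientation-dependent strength surface (stand-in form).
            a, b, c = p
            return a + b * np.cos(2 * beta) + c * np.cos(4 * beta)

        res = least_squares(lambda p: criterion(p, beta) - q_exp,
                            x0=[300.0, 50.0, 10.0])
        print(res.x)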

  7. Storm surge model based on variational data assimilation method

    Directory of Open Access Journals (Sweden)

    Shi-li Huang

    2010-06-01

    Full Text Available By combining computation and observation information, the variational data assimilation method can eliminate errors caused by the uncertainty of parameters in practical forecasting. It was applied to a storm surge model based on unstructured grids with high spatial resolution, with the aim of improving storm surge forecasting accuracy. By controlling the wind stress drag coefficient, the variation-based model was developed and validated through data assimilation tests on an actual storm surge induced by a typhoon. In the data assimilation tests, the model accurately identified the wind stress drag coefficient and obtained results close to the true state. The actual storm surge induced by Typhoon 0515 was then forecast by the developed model, and the results demonstrate its efficiency in practical application.
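
    A toy variational-assimilation loop, assuming a scalar surrogate in which the computed surge scales with Cd*U^2; it only illustrates identifying the wind stress drag coefficient by minimizing a quadratic cost, not the unstructured-grid model itself.

        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(2)
        U = np.array([12.0, 18.0, 25.0, 30.0])     # wind speeds, m/s
        Cd_true = 2.2e-3

        def surge_model(Cd, U):
            return 0.4 * Cd * U**2                 # crude proxy for the PDE model

        obs = surge_model(Cd_true, U) + rng.normal(0.0, 0.02, U.size)

        def cost(Cd):
            # Quadratic misfit between model output and observations.
            return np.sum((surge_model(Cd, U) - obs) ** 2)

        fit = minimize_scalar(cost, bounds=(1e-4, 1e-2), method="bounded")
        print(f"identified Cd = {fit.x:.2e} (true {Cd_true:.2e})")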

  8. THE EFFECT OF THE PROJECT BASED LEARNING MODEL ON STUDENTS' MATHEMATICAL CREATIVE THINKING ABILITY

    Directory of Open Access Journals (Sweden)

    Hesti Noviyana

    2017-09-01

    Full Text Available Abstract: This study concerns the Project Based Learning model and students' mathematical creative thinking ability. The purpose of the research was to determine the influence of the Project Based Learning model on the mathematical creative thinking ability of grade VIII students of SMP Negeri 3 Bandar Lampung in the even semester of the 2016/2017 school year. The research used an experimental method; the population comprised all 347 grade VIII students, from which two classes were sampled using the cluster random sampling technique: class VIII A as the experimental class (31 students) and class VIII C as the control class (30 students). To measure the students' mathematical creative thinking ability, the authors administered a test of 5 essay questions whose validity and reliability had been verified. Hypothesis testing used a t test and yielded t = 14.27, whereas the t distribution table at the 5% significance level gives a critical value of 2.00. Since 14.27 > 2.00, it can be concluded that the Project Based Learning model influences students' mathematical creative thinking ability. Keywords: Project Based Learning, mathematical creative thinking ability
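
    The reported procedure is an independent two-sample t test; a minimal reproduction of the computation, with fabricated scores rather than the study's data:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        experiment = rng.normal(78, 8, 31)   # experimental class, n = 31
        control = rng.normal(65, 8, 30)      # control class, n = 30

        t, p = stats.ttest_ind(experiment, control)
        print(f"t = {t:.2f}, p = {p:.4f}")   # compare t with t_table = 2.00 at 5%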

  9. Movable scour protection. Model test report

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, R.

    2002-07-01

    This report presents the results of a series of model tests on scour protection of marine structures. The objective of the model tests is to investigate the integrity of the scour protection during a general lowering of the surrounding seabed, for instance in connection with the movement of a sand bank or with general subsidence. The scour protection in the tests is made of stone material, in two different fractions: 4 mm and 40 mm. Tests with current, with waves, and with combined current and waves were carried out. The scour protection material was placed after an initial scour hole had evolved in the seabed around the structure. This design philosophy was selected because the scour hole often starts to develop immediately after the structure has been placed; it is therefore difficult to establish scour protection on the undisturbed seabed if the scour material is placed after the main structure. Furthermore, placing the scour material in the scour hole increases the stability of the material. Two types of structure were used for the tests, a Monopile and a Tripod foundation. Tests with protection mats around the Monopile model were also carried out. The following main conclusions emerged from the model tests with a flat bed (i.e. no general seabed lowering): 1. The maximum scour depth found in steady current on a sand bed was 1.6 times the cylinder diameter. 2. The minimum horizontal extension of the scour hole (upstream direction) was 2.8 times the cylinder diameter, corresponding to a slope of 30 degrees. 3. Concrete protection mats do not meet the criteria for a strongly erodible seabed; in the present tests virtually no reduction in the scour depth was obtained. The main problem is the interface to the cylinder: if there is a void between the mats and the cylinder, scour will develop. Even with protection mats that are tightly connected to the cylinder, scour is expected to develop as long as the mats allow for

  10. Stochastic models for strength of wind turbine blades using tests

    DEFF Research Database (Denmark)

    Toft, H.S.; Sørensen, John Dalsgaard

    2008-01-01

    The structural cost of wind turbine blades depends on the values of the partial safety factors which reflect the uncertainties in the design values, including statistical uncertainty from a limited number of tests. This paper presents a probabilistic model for the ultimate and fatigue strength of wind turbine blades, especially considering the influence of prior knowledge and test results, and shows how partial safety factors can be updated when additional full-scale tests are performed. This updating is performed by adopting a probabilistic design basis based on Bayesian statistical methods.
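
    The Bayesian updating alluded to above can be sketched with a conjugate normal model: a prior on the mean blade strength is combined with full-scale test results under an assumed, known test scatter. All numbers are illustrative.

        # Conjugate normal updating of mean strength with known test scatter.
        import numpy as np

        mu0, sd0 = 600.0, 60.0      # prior mean strength and its uncertainty, MPa
        sigma = 40.0                # assumed test-to-test scatter, MPa
        tests = np.array([640.0, 615.0, 655.0])   # fabricated full-scale results

        n = len(tests)
        post_var = 1.0 / (1.0 / sd0**2 + n / sigma**2)
        post_mean = post_var * (mu0 / sd0**2 + tests.sum() / sigma**2)
        print(f"posterior mean = {post_mean:.1f} MPa, sd = {post_var**0.5:.1f} MPa")
        # A smaller posterior sd means a smaller statistical-uncertainty
        # contribution to the partial safety factor.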

  11. Geochemical Testing And Model Development - Residual Tank Waste Test Plan

    International Nuclear Information System (INIS)

    Cantrell, K.J.; Connelly, M.P.

    2010-01-01

    This Test Plan describes the testing and chemical analyses for release-rate studies on tank residual samples collected following the retrieval of waste from the tank. This work will provide the data required to develop a contaminant release model for the tank residuals from both sludge and salt-cake single-shell tanks. The data are intended for use in long-term performance assessment and conceptual model development.

  12. Business model stress testing : A practical approach to test the robustness of a business model

    NARCIS (Netherlands)

    Haaker, T.I.; Bouwman, W.A.G.A.; Janssen, W; de Reuver, G.A.

    Business models and business model innovation are increasingly gaining attention in practice as well as in academic literature. However, the robustness of business models (BM) is seldom tested vis-à-vis the fast and unpredictable changes in digital technologies, regulation and markets. The

  13. Model-based integration and testing : bridging the gap between academic theory and industrial practice

    NARCIS (Netherlands)

    Braspenning, N.C.W.M.

    2008-01-01

    For manufacturers of high-tech multi-disciplinary systems such as semiconductor equipment, the effort required for integration and system testing is ever increasing, while customers demand a shorter time-to-market. This book describes how executable models can replace unavailable component

  14. Cernavoda NPP risk - Based test and maintenance planning - Methodology development

    International Nuclear Information System (INIS)

    Georgescu, G.; Popa, P.; Petrescu, A.; Naum, M.; Gutu, M.

    1997-01-01

    The Cernavoda Power Plant started commercial operation in November 1996. During operation of the nuclear power plant, several mandatory tests and maintenance activities are performed on stand-by safety system components to ensure their availability in case of an accident. The basic purpose of such activities is the early detection of any failure or degradation, and the timely correction of deterioration. Because of the large number of such activities, maintaining the emphasis on plant safety and allocating resources becomes difficult. The probabilistic model and methodology can be used effectively to obtain the risk significance of these activities so that resources are directed to the most important areas. The proposed Research Contract activity is strongly connected with other safety-related areas under development. Since the Cernavoda Probabilistic Safety Evaluation Level 1 PSA study (CPSE) has been performed and is now being revised to take into account as-built information, it is recommended to implement in the model the features necessary to support further PSA applications, especially those related to test and maintenance optimization. Methods need to be developed to apply the PSA model, including risk information together with other needed information, for test and maintenance optimization. In parallel with the CPSE study update, the software interface for the PSA model is under development (Risk Monitor Software class); methods and models need to be developed so that it can be used for qualified monitoring of the efficiency of the test and maintenance strategy. Similarly, the Data Collection System needs to be appropriate for an ongoing implementation of a risk-based test and maintenance strategy. (author). 4 refs, 1 fig

  15. Accuracy tests of the tessellated SLBM model

    International Nuclear Information System (INIS)

    Ramirez, A L; Myers, S C

    2007-01-01

    We have compared the Seismic Location Base Model (SLBM) tessellated model (version 2.0 Beta, posted July 3, 2007) with the GNEMRE Unified Model. The comparison is done on a layer/depth-by-layer/depth and layer/velocity-by-layer/velocity basis. The SLBM earth model is defined on a tessellation that spans the globe at a constant resolution of about 1 degree (Ballard, 2007). For the tests, we used the earth model in file ''unified( ) iasp.grid''. This model contains the top 8 layers of the Unified Model (UM) embedded in a global IASP91 grid. Our test queried the same set of nodes included in the UM model file. To query the model stored in memory, we used some of the functionality built into the SLBMInterface object: the method getInterpolatedPoint() returns the desired values for each layer at user-specified points. The values returned include the depth to the top of each layer, layer velocity, layer thickness and (for the upper-mantle layer) velocity gradient. The SLBM earth model has an extra middle-crust layer whose values are used when Pg/Lg phases are being calculated; this extra layer was not accessed by our tests. Figures 1 to 8 compare the layer depths, P velocities and P gradients in the UM and SLBM models. The figures show results for the three sediment layers, three crustal layers and the upper-mantle layer defined in the UM model. Each layer in the models (sediment1, sediment2, sediment3, upper crust, middle crust, lower crust and upper mantle) is shown on a separate figure. The upper-mantle P velocity and gradient distributions are shown in Figures 7 and 8. The left and center images in the top row of each figure are renderings of the depth to the top of the specified layer for the UM and SLBM models. When a layer has zero thickness, its depth is the same as that of the layer above. The right image in the top row is the difference in layer depth between the UM and SLBM renderings. The left and center images in the bottom row of the figures are

  16. Modeling motive activation in the Operant Motives Test

    DEFF Research Database (Denmark)

    Runge, J. Malte; Lang, Jonas W. B.; Engeser, Stefan

    2016-01-01

    The Operant Motive Test (OMT) is a picture-based procedure that asks respondents to generate imaginative verbal behavior that is later coded for the presence of affiliation, power, and achievement-related motive content by trained coders. The OMT uses a larger number of pictures and asks respondents to provide more brief answers than earlier and more traditional picture-based implicit motive measures and has therefore become a frequently used measurement instrument in both research and practice. This article focuses on the psychometric response mechanism in the OMT and builds on recent ... on the dynamic model were .52, .62, and .73 for the affiliation, achievement, and power motive in the OMT, respectively. The second contribution of this article is a tutorial and R code that allows researchers to directly apply the dynamic Thurstonian IRT model to their data. The future use of the OMT ...

  17. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information of historical data to estimate model errors is one of the science frontier research topics. In this study, we investigate such a problem using the classic Lorenz (1963 equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in historical data can be identified and extracted automatically by computer. Thereby, a new approach to estimating model errors based on EM is proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors; in fact, it combines statistics and dynamics to a certain extent.
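
    The twin-experiment setup described above can be sketched directly: the classic Lorenz (1963) system serves as the forecast model, and a perturbed copy stands in for reality to generate "observational data". The periodic perturbation below is an illustrative stand-in for the paper's evolutionary function.

        import numpy as np
        from scipy.integrate import solve_ivp

        def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0, eps=0.0):
            x, y, z = s
            # eps adds a small periodic model error so that "reality"
            # differs from the forecast model (illustrative choice).
            return [sigma * (y - x),
                    x * (rho + eps * np.sin(t) - z) - y,
                    x * y - beta * z]

        t_eval = np.linspace(0, 10, 1001)
        truth = solve_ivp(lorenz, (0, 10), [1.0, 1.0, 1.0], t_eval=t_eval,
                          args=(10.0, 28.0, 8.0 / 3.0, 0.5))       # "reality"
        model = solve_ivp(lorenz, (0, 10), [1.0, 1.0, 1.0], t_eval=t_eval)
        model_error = truth.y - model.y   # the signal EM tries to reconstruct
        print(np.abs(model_error).mean(axis=1))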

  18. BWR regional instability model and verification on ringhals-1 test

    International Nuclear Information System (INIS)

    Hotta, Akitoshi; Suzawa, Yojiro

    1996-01-01

    Regional instability is known as one type of coupled neutronic-thermohydraulic phenomenon in boiling water reactors (BWRs), in which the thermohydraulic density wave propagation mechanism is predominant. Historically, it has been simulated by three-dimensional time-domain codes, despite their significant computing time. On the other hand, there have been proposals to apply frequency-domain models to regional instability by considering the subcriticality of the higher neutronic mode. However, their application has remained limited to corewide instability, mainly because of the lack of more detailed methodological and empirical studies. In this study, the current version of the frequency-domain model was extended and verified against actual core regional instability measurement data. The mathematical model LAPUR, the well-known frequency-domain stability code, was reviewed from the standpoint of pure thermohydraulics and of neutronic-thermohydraulic interaction mechanisms. Based on ex-core loop test data, the original LAPUR mixed friction and local pressure-loss model was modified, taking into account the different dynamic behavior of these two pressure-loss mechanisms. The perturbation term of the two-phase friction multiplier, which is the sum of the derivative of void fraction and subcool enthalpy, was adjusted theoretically. The adequacy of the instability evaluation system was verified against the Ringhals unit 1 test data, which were supplied to participants of the Organization for Economic Cooperation and Development/Nuclear Energy Agency BWR Stability Benchmark Project.

  19. Study on Identification of Material Model Parameters from Compact Tension Test on Concrete Specimens

    Science.gov (United States)

    Hokes, Filip; Kral, Petr; Husek, Martin; Kala, Jiri

    2017-10-01

    Identification of concrete material model parameters using optimization is based on calculating the difference between experimentally measured and numerically obtained data. This difference can be measured via the root-mean-squared error, which is often used to determine the accuracy of a mathematical model in fields such as meteorology or demography. The quality of the identified parameters is, however, determined not only by the right choice of objective function but also by the source experimental data. One possibility is to use load-displacement curves from three-point bending tests performed on concrete specimens; this option highlights the significance of the modulus of elasticity, the tensile strength and the specific fracture energy. Another possibility is to use experimental data from the compact tension test. Clearly, the response in the second type of test also depends on the above-mentioned material parameters. The question is whether the parameters identified from the three-point bending test and from the compact tension test reach the same values. This article presents a numerical study of inverse identification of material model parameters from experimental data measured during compact tension tests. It also presents a modified sensitivity analysis that calculates the sensitivity of the material model parameters for different parts of the loading curve. The main goal of the article is to describe the process of inverse identification of parameters for a plasticity-based material model of concrete and to prepare data for a future comparison with material model parameter values identified from a different type of fracture test.

  20. Nonlinear system modeling based on bilinear Laguerre orthonormal bases.

    Science.gov (United States)

    Garna, Tarek; Bouzrara, Kais; Ragot, José; Messaoud, Hassani

    2013-05-01

    This paper proposes a new representation of the discrete bilinear model by developing its coefficients associated with the input, the output and the crossed product on three independent Laguerre orthonormal bases. Compared to the classical bilinear model, the resulting model, entitled the bilinear-Laguerre model, ensures a significant reduction in the number of parameters as well as a simple recursive representation. However, this reduction is still constrained by an optimal choice of the Laguerre pole characterizing each basis. To this end, we develop a pole optimization algorithm which constitutes an extension of that proposed by Tanguy et al. The bilinear-Laguerre model as well as the proposed pole optimization algorithm are illustrated and tested on numerical simulations and validated on a Continuous Stirred Tank Reactor (CSTR) system. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
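
    The building block referred to above is the discrete-time Laguerre orthonormal basis with pole a, generated by a first-order low-pass stage followed by repeated all-pass stages. A minimal sketch (the pole value and lengths are arbitrary choices, not the paper's optimized pole):

        # Discrete Laguerre basis via the standard cascade:
        #   L0(q) = sqrt(1-a^2)/(1 - a q^-1),
        #   Lk(q) = Lk-1(q) * (q^-1 - a)/(1 - a q^-1).
        import numpy as np
        from scipy.signal import lfilter

        def laguerre_basis(a, n_filters, n_samples):
            impulse = np.zeros(n_samples)
            impulse[0] = 1.0
            first = lfilter([np.sqrt(1 - a**2)], [1.0, -a], impulse)
            basis = [first]
            for _ in range(1, n_filters):
                basis.append(lfilter([-a, 1.0], [1.0, -a], basis[-1]))
            return np.array(basis)

        B = laguerre_basis(a=0.6, n_filters=4, n_samples=200)
        print(B @ B.T)   # approximately the identity: the basis is orthonormal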

  1. Usage models in reliability assessment of software-based systems

    Energy Technology Data Exchange (ETDEWEB)

    Haapanen, P.; Pulkkinen, U. [VTT Automation, Espoo (Finland); Korhonen, J. [VTT Electronics, Espoo (Finland)

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in the OHA-project report series will handle the diversity requirements in safety critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA-studies. In this report the issues related to the statistical testing and especially automated test case generation are considered. The goal is to find an efficient method for building usage models for the generation of statistically significant set of test cases and to gather practical experiences from this method by applying it in a case study. The scope of the study also includes the tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.).
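
    Usage models of this kind are typically Markov chains over user or plant stimuli, and a statistical test case is a random walk from an invocation state to a termination state. The tiny model below is invented purely for illustration:

        # Generate statistical test cases from a Markov-chain usage model.
        import random

        rng = random.Random(0)
        usage_model = {
            "start":  [("login", 1.0)],
            "login":  [("browse", 0.7), ("logout", 0.3)],
            "browse": [("browse", 0.5), ("select", 0.3), ("logout", 0.2)],
            "select": [("browse", 0.6), ("logout", 0.4)],
        }

        def generate_test_case():
            state, trace = "start", []
            while state != "logout":                 # termination state
                nxt, probs = zip(*usage_model[state])
                state = rng.choices(nxt, weights=probs)[0]
                trace.append(state)
            return trace

        for _ in range(3):
            print(generate_test_case())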

  2. Usage models in reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Pulkkinen, U.; Korhonen, J.

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in the OHA-project report series will handle the diversity requirements in safety critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA-studies. In this report the issues related to the statistical testing and especially automated test case generation are considered. The goal is to find an efficient method for building usage models for the generation of statistically significant set of test cases and to gather practical experiences from this method by applying it in a case study. The scope of the study also includes the tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.)

  3. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods lack completeness, operate at a single scale, and depend on human experience. A multiple-scale validation based on the SDG (Signed Directed Graph) and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it; complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with outputs of the simulation model at different scales. Finally, the effectiveness is proved by carrying out validation for a reactor model.

  4. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Full Text Available Background: Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods, the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods: We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not too simple to lose the information from the alternative hypothesis. The first step is to transform distributions of different test statistics (e.g., t, chi-square or F) to distributions of corresponding p-values. We then use a step function to approximate each of the p-value distributions by matching the mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results: The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions: The proposed model is easy to implement and preserves the information from the alternative hypothesis.
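
    The object being approximated, the p-value distribution under the alternative, is easy to obtain by simulation, which also gives the moments the step function must match and the power at a multiplicity-adjusted threshold. A sketch for a one-sample t test (the paper's moment-matching step-function construction itself is not reproduced here):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        m, n, effect = 10, 25, 0.6        # tests, sample size, effect size

        # p-values of a one-sample t test under the alternative hypothesis.
        samples = rng.normal(effect, 1.0, size=(20000, n))
        pvals = stats.ttest_1samp(samples, 0.0, axis=1).pvalue

        alpha_bonf = 0.05 / m             # Bonferroni-adjusted threshold
        print("mean/var of p:", pvals.mean(), pvals.var())
        print("power at alpha/m:", (pvals < alpha_bonf).mean())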

  5. Overload prevention in model supports for wind tunnel model testing

    Directory of Open Access Journals (Sweden)

    Anton IVANOVICI

    2015-09-01

    Full Text Available Preventing overloads in wind tunnel model supports is crucial to the integrity of the tested system. Results can only be interpreted as valid if the model support, conventionally called a sting, remains sufficiently rigid during testing. Modeling and preliminary calculation can only give an estimate of the sting's behavior under known forces and moments, but unpredictable, aerodynamically caused model behavior can sometimes produce large transient overloads that cannot be taken into account at the sting design phase. To ensure model integrity and data validity, an analog fast protection circuit was designed and tested. A post-factum analysis was carried out to optimize the overload detection, and a short discussion of aeroelastic phenomena is included to show why such a detector has to be very fast. The last refinement of the concept consists of a fast detector coupled with a slightly slower one to differentiate between transient overloads that decay in time and those that result from unwanted aeroelastic phenomena. The decision to stop or continue the test is therefore taken conservatively, preserving data and model integrity while allowing normal startup loads and transients to manifest.

  6. Testing process predictions of models of risky choice: a quantitative model comparison approach

    Science.gov (United States)

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
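
    For concreteness, the priority heuristic named above can be written out for two-outcome gambles in the gain domain: examine minimum gains, then the probabilities of the minimum gains, then maximum gains, stopping at the first reason whose difference exceeds the aspiration level (one tenth of the maximum gain, or 0.1 on the probability scale; the rounding to prominent numbers used by the original authors is omitted here).

        def priority_heuristic(g1, g2):
            """Each gamble: (min_gain, p_min, max_gain). Returns 0, 1 or 'guess'."""
            max_gain = max(g1[2], g2[2])
            if abs(g1[0] - g2[0]) >= 0.1 * max_gain:   # reason 1: minimum gains
                return 0 if g1[0] > g2[0] else 1
            if abs(g1[1] - g2[1]) >= 0.1:              # reason 2: P(minimum gain)
                return 0 if g1[1] < g2[1] else 1
            if g1[2] != g2[2]:                         # reason 3: maximum gains
                return 0 if g1[2] > g2[2] else 1
            return "guess"

        A = (0.0, 0.2, 4000.0)     # 4000 with p = .8, otherwise 0
        B = (3000.0, 1.0, 3000.0)  # 3000 for sure
        print(priority_heuristic(A, B))   # -> 1 (the sure gamble), via reason 1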

  7. Testing Process Predictions of Models of Risky Choice: A Quantitative Model Comparison Approach

    Directory of Open Access Journals (Sweden)

    Thorsten ePachur

    2013-09-01

    Full Text Available This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or nonlinear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter, Gigerenzer, & Hertwig, 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called similarity. In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies.

  8. Introducing Waqf Based Takaful Model in India

    Directory of Open Access Journals (Sweden)

    Syed Ahmed Salman

    2014-03-01

    Full Text Available Objective – Waqf is a unique feature of the socioeconomic system of Islam in a multi-religious and developing country like India. India is a country rich in waqf assets, and the history of waqf in India can be traced back 800 years. Most researchers suggest ways in which waqf can be used as a tool to mitigate the poverty of Muslims. India has the third largest Muslim population after Indonesia and Pakistan; however, the majority of Muslims belong to the low income group and are in need of help. It is believed that waqf can be utilized for the betterment of the Indian Muslim community. Among the available uses of waqf assets, the main objective of this paper is to introduce a waqf-based takaful model in India, and to highlight how this proposed model can be adopted. Methods – Library research is applied, since this paper relies on secondary data, by thoroughly reviewing the most relevant literature. Result – India, as a country rich in waqf assets, should fully utilize these resources to help Muslims through takaful. Conclusion – In this study, we have proposed a waqf-based takaful model for India combining the concepts of mudarabah and wakalah. We recommend this model based on the country's background and circumstances. Since we have not tested the viability of this model in India, future research should address this testing. Keywords: Waqf, Takaful, Poverty and India

  9. Benchmarking in pathology: development of an activity-based costing model.

    Science.gov (United States)

    Burnett, Leslie; Wilson, Roger; Pfeffer, Sally; Lowry, John

    2012-12-01

    Benchmarking in Pathology (BiP) allows pathology laboratories to determine the unit cost of all laboratory tests and procedures, and also provides organisational productivity indices allowing comparisons of performance with other BiP participants. We describe 14 years of progressive enhancement to a BiP program, including the implementation of 'avoidable costs' as the accounting basis for allocation of costs rather than previous approaches using 'total costs'. A hierarchical tree-structured activity-based costing model distributes 'avoidable costs' attributable to the pathology activities component of a pathology laboratory operation. The hierarchical tree model permits costs to be allocated across multiple laboratory sites and organisational structures. This has enabled benchmarking on a number of levels, including test profiles and non-testing related workload activities. The development of methods for dealing with variable cost inputs, allocation of indirect costs using imputation techniques, panels of tests, and blood-bank record keeping, have been successfully integrated into the costing model. A variety of laboratory management reports are produced, including the 'cost per test' of each pathology 'test' output. Benchmarking comparisons may be undertaken at any and all of the 'cost per test' and 'cost per Benchmarking Complexity Unit' level, 'discipline/department' (sub-specialty) level, or overall laboratory/site and organisational levels. We have completed development of a national BiP program. An activity-based costing methodology based on avoidable costs overcomes many problems of previous benchmarking studies based on total costs. The use of benchmarking complexity adjustment permits correction for varying test-mix and diagnostic complexity between laboratories. Use of iterative communication strategies with program participants can overcome many obstacles and lead to innovations.
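
    To make the allocation mechanics concrete, the following is a minimal Python sketch of a hierarchical, tree-structured distribution of 'avoidable costs' down to a cost per test, as the abstract describes. All node names, costs, weights and volumes are hypothetical, not BiP data.

      # Minimal sketch of hierarchical activity-based costing (hypothetical data).
      # Avoidable costs at each node are pushed down the tree in proportion to
      # activity weights; leaves are individual tests with annual volumes.

      from dataclasses import dataclass, field

      @dataclass
      class Node:
          name: str
          avoidable_cost: float = 0.0      # costs booked directly to this node
          weight: float = 1.0              # share of the parent's costs
          volume: int = 0                  # annual test volume (leaves only)
          children: list = field(default_factory=list)

      def allocate(node, inherited=0.0, results=None):
          """Recursively distribute avoidable costs down to leaf tests."""
          if results is None:
              results = {}
          total = node.avoidable_cost + inherited
          if not node.children:            # leaf: a single test output
              results[node.name] = total / max(node.volume, 1)
              return results
          weight_sum = sum(c.weight for c in node.children)
          for child in node.children:
              allocate(child, total * child.weight / weight_sum, results)
          return results

      lab = Node("laboratory", avoidable_cost=500_000.0, children=[
          Node("chemistry", 120_000.0, weight=2.0, children=[
              Node("glucose", 10_000.0, volume=80_000),
              Node("lipid_panel", 25_000.0, weight=1.5, volume=30_000),
          ]),
          Node("haematology", 90_000.0, weight=1.0, children=[
              Node("full_blood_count", 15_000.0, volume=60_000),
          ]),
      ])

      for test, cost in allocate(lab).items():
          print(f"cost per test, {test}: ${cost:.2f}")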

  10. Evaluation of induced seismicity forecast models in the Induced Seismicity Test Bench

    Science.gov (United States)

    Király, Eszter; Gischig, Valentin; Zechar, Jeremy; Doetsch, Joseph; Karvounis, Dimitrios; Wiemer, Stefan

    2016-04-01

    Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. Here, we propose an Induced Seismicity Test Bench to test and rank such models. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models that incorporate a different mix of physical understanding and stochastic representation of the induced sequences: Shapiro in Space (SiS) and Hydraulics and Seismics (HySei). SiS is based on three pillars: the seismicity rate is computed with help of the seismogenic index and a simple exponential decay of the seismicity; the magnitude distribution follows the Gutenberg-Richter relation; and seismicity is distributed in space based on smoothing seismicity during the learning period with 3D Gaussian kernels. The HySei model describes seismicity triggered by pressure diffusion with irreversible permeability enhancement. Our results show that neither model is fully superior to the other. HySei forecasts the seismicity rate well, but is only mediocre at forecasting the spatial distribution. On the other hand, SiS forecasts the spatial distribution well but not the seismicity rate. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in. Ensemble models that combine HySei's rate forecast with SiS's spatial forecast outperform each individual model.

  11. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    Science.gov (United States)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
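
    As a concrete baseline for NHPP-based SRMs, the sketch below fits the classic Goel-Okumoto model, m(t) = a·(1 − exp(−b·t)), to fault-detection times by maximum likelihood; the equilibrium-distribution variant proposed in the paper would replace this mean value function. The detection times and starting values are invented, and SciPy is assumed to be available.

      # Sketch: fit a Goel-Okumoto NHPP to fault detection times by maximum
      # likelihood. This is the classic baseline model, not the
      # equilibrium-distribution variant proposed in the paper.

      import numpy as np
      from scipy.optimize import minimize

      t = np.array([3., 8., 15., 27., 41., 60., 85., 120., 170., 240.])  # hours
      T = 250.0  # end of the observation window

      def neg_log_lik(params):
          a, b = np.exp(params)            # enforce positivity
          lam = a * b * np.exp(-b * t)     # intensity at each failure time
          m_T = a * (1.0 - np.exp(-b * T)) # expected faults detected by time T
          return -(np.sum(np.log(lam)) - m_T)

      res = minimize(neg_log_lik, x0=np.log([20.0, 0.01]), method="Nelder-Mead")
      a_hat, b_hat = np.exp(res.x)
      print(f"a = {a_hat:.1f} total faults, b = {b_hat:.4f} per hour")
      print(f"expected remaining faults: {a_hat * np.exp(-b_hat * T):.1f}")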

  12. Model-based thermal system design optimization for the James Webb Space Telescope

    Science.gov (United States)

    Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.

    2017-10-01

    Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy, and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases for the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasi-global optimal solutions were found, and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology is a crucial asset when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system are presented in detail.

  13. Accounting for estimated IQ in neuropsychological test performance with regression-based techniques.

    Science.gov (United States)

    Testa, S Marc; Winicki, Jessica M; Pearlson, Godfrey D; Gordon, Barry; Schretlen, David J

    2009-11-01

    Regression-based normative techniques account for variability in test performance associated with multiple predictor variables and generate expected scores based on algebraic equations. Using this approach, we show that estimated IQ, based on oral word reading, accounts for 1-9% of the variability beyond that explained by individual differences in age, sex, race, and years of education for most cognitive measures. These results confirm that adding estimated "premorbid" IQ to demographic predictors in multiple regression models can incrementally improve the accuracy with which regression-based norms (RBNs) benchmark expected neuropsychological test performance in healthy adults. It remains to be seen whether the incremental variance in test performance explained by estimated "premorbid" IQ translates to improved diagnostic accuracy in patient samples. We describe these methods, and illustrate the step-by-step application of RBNs with two cases. We also discuss the rationale, assumptions, and caveats of this approach. More broadly, we note that adjusting test scores for age and other characteristics might actually decrease the accuracy with which test performance predicts absolute criteria, such as the ability to drive or live independently.
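
    The RBN recipe itself is a short computation: fit a multiple regression of the raw score on demographics plus estimated IQ in a normative sample, then convert an observed score to a standardized residual. The sketch below uses simulated data and invented coefficients, not the published norms.

      # Sketch of a regression-based norm (RBN): predict an expected test score
      # from age, sex, education and estimated premorbid IQ, then express an
      # observed score as a standardized residual (z-score).

      import numpy as np

      rng = np.random.default_rng(0)
      n = 500
      age = rng.uniform(20, 80, n)
      male = rng.integers(0, 2, n)
      educ = rng.uniform(8, 20, n)
      iq = rng.normal(100, 15, n)
      score = 50 - 0.2 * age + 1.5 * educ + 0.3 * iq + rng.normal(0, 5, n)

      X = np.column_stack([np.ones(n), age, male, educ, iq])
      beta, *_ = np.linalg.lstsq(X, score, rcond=None)
      resid_sd = np.std(score - X @ beta, ddof=X.shape[1])

      def rbn_z(obs, age, male, educ, iq):
          """z-score of an observed score relative to its expected value."""
          expected = np.array([1.0, age, male, educ, iq]) @ beta
          return (obs - expected) / resid_sd

      print(f"z = {rbn_z(62.0, age=70, male=1, educ=12, iq=105):+.2f}")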

  14. BIOMOVS test scenario model comparison using BIOPATH

    International Nuclear Information System (INIS)

    Grogan, H.A.; Van Dorp, F.

    1986-07-01

    This report presents the results of the irrigation test scenario, presented in the BIOMOVS intercomparison study, calculated by the computer code BIOPATH. This scenario defines a constant release of Tc-99 and Np-237 into groundwater that is used for irrigation. The system of compartments used to model the biosphere is based upon an area in northern Switzerland and is essentially the same as that used in Projekt Gewaehr to assess the radiological impact of a high level waste repository. Two separate irrigation methods are considered, namely ditch and overhead irrigation. Their influence on the resultant activities calculated in the groundwater, soil and different food products, as a function of time, is evaluated. The sensitivity of the model to parameter variations is analysed, which allows a deeper understanding of the model chain. These results are assessed subjectively in a first effort to realistically quantify the uncertainty associated with each calculated activity. (author)

  15. Blast Testing and Modelling of Composite Structures

    DEFF Research Database (Denmark)

    Giversen, Søren

    The motivation for this work is based on a desire to find light weight alternatives to high strength steel as the material for armouring in military vehicles. With the use of high strength steel, an increase in the level of armouring has a significant impact on the vehicle weight, affecting for example the manoeuvrability and top speed negatively, which ultimately affects the safety of the personnel in the vehicle. Strong and light materials, such as fibre reinforced composites, could therefore act as substitutes for the high strength steel, and minimize the impact on the vehicle… In future work this set-up should be improved such that the modelled pressure can be validated. For tests performed with a 250 g charge load, comparisons with model data showed poor agreement. This was found to be due to improper design of the modelled laminate panels, where the layer interface delamination…

  16. Large scale injection test (LASGIT) modelling

    International Nuclear Information System (INIS)

    Arnedo, D.; Olivella, S.; Alonso, E.E.

    2010-01-01

    In addition, the horizontal interfaces between the bentonite blocks and the vertical interfaces corresponding to the host rock and canister wall contacts are considered different materials, but their properties are similar to those of the bentonite. This is done because the interfaces are believed to be a potential preferential path for gas migration through the buffer. The host rock and the canister are not included in the model due to their high stiffness with respect to the bentonite. A constitutive model that considers non-linear elasticity and visco-plasticity based on the BBM model is adopted for the bentonite and the interfaces. An embedded fracture permeability model, in which permeability and the retention curve depend on strains through a fracture aperture, is considered in the hydraulic problem. The following stages of the experiment are simulated: construction of the isolation barrier inside the deposition hole; hydration stage 1, in which the liquid pressure is increased at the Filter Mats and the canister Injection Filters up to 1.5 MPa in an initial stage and up to 2.35 MPa in a second stage in order to saturate the buffer; hydraulic test 1; and gas injection test 1. The simulation results will be compared to the experimental record of different variables: total stresses and liquid pressure at the rock wall, at the canister wall, and at some points within the bentonite buffer. It will be interesting to try to explain whether preferential paths develop and where. This is controlled by the swelling capacity of the buffer and its ability to seal the interfaces initially not closed between the clay and the wall and between the blocks. (authors)

  17. The use of scale models in impact testing

    International Nuclear Information System (INIS)

    Donelan, P.J.; Dowling, A.R.

    1985-01-01

    Theoretical analysis, component testing and model flask testing are employed to investigate the validity of scale models for demonstrating the behaviour of Magnox flasks under impact conditions. Model testing is shown to be a powerful and convenient tool provided adequate care is taken with detail design and manufacture of models and with experimental control. (author)

  18. Theoretical Models, Assessment Frameworks and Test Construction.

    Science.gov (United States)

    Chalhoub-Deville, Micheline

    1997-01-01

    Reviews the usefulness of proficiency models influencing second language testing. Findings indicate that several factors contribute to the lack of congruence between models and test construction and make a case for distinguishing between theoretical models. Underscores the significance of an empirical, contextualized and structured approach to the…

  19. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klaauw, B.; Koning, R.H.

    2003-01-01

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  1. Testing and modeling the dynamic response of foam materials for blast protection

    Science.gov (United States)

    Fitek, John H.

    The pressure wave released from an explosion can cause injury to the lungs. A personal armor system concept for blast lung injury protection consists of a polymer foam layer behind a rigid armor plate to be worn over the chest. This research develops a method for testing and modeling the dynamic response of foam materials to be used for down-selection of materials for this application. Constitutive equations for foam materials are incorporated into a lumped parameter model of the combined armor plate and foam system. Impact testing and shock tube testing are used to measure the foam model parameters and validate the model response to a pressure wave load. The plate and foam armor model is then coupled to a model of the human thorax. With a blast pressure wave input, the armor model is evaluated based on how it affects the injury-causing mechanism of chest wall motion. Results show that to reduce chest wall motion, the foam must compress at a relatively constant stress level, which requires a sufficient foam thickness.
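
    A minimal lumped-parameter version of the plate-plus-foam idea can be written down directly: a rigid plate driven by a decaying blast overpressure and resisted by the foam's plateau (crush) stress, which is the constant-stress compression behaviour the abstract identifies as desirable. All parameter values below are illustrative guesses, not measured properties.

      # Sketch of a lumped-parameter armor model: a rigid plate backed by a foam
      # layer idealized as a constant crush-stress element, loaded by an
      # exponentially decaying blast overpressure.

      import numpy as np
      from scipy.integrate import solve_ivp

      A = 0.09         # loaded plate area, m^2
      m = 1.8          # plate mass, kg
      p0 = 4.0e5       # peak reflected overpressure, Pa
      theta = 0.4e-3   # blast decay time, s
      sigma_c = 1.5e5  # foam plateau (crush) stress, Pa

      def rhs(t, y):
          x, v = y
          p_blast = p0 * np.exp(-t / theta)
          # The foam resists with its plateau stress only while being crushed.
          f_foam = sigma_c if v > 0 else 0.0
          return [v, (p_blast - f_foam) * A / m]

      sol = solve_ivp(rhs, (0.0, 5e-3), [0.0, 0.0], max_step=1e-5)
      print(f"peak plate velocity: {sol.y[1].max():.2f} m/s")
      print(f"foam crush at 5 ms: {sol.y[0][-1] * 1000:.2f} mm")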

  2. A model-based risk management framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune

    2002-08-15

    The ongoing research activity addresses these issues through two co-operative activities. The first is the IST funded research project CORAS, in which Institutt for energiteknikk takes part as responsible for the work package for Risk Analysis. The main objective of the CORAS project is to develop a framework to support risk assessment of security critical systems. The second, called the Halden Open Dependability Demonstrator (HODD), is established in cooperation between Oestfold University College, local companies and HRP. The objective of HODD is to provide an open-source test bed for testing, teaching and learning about risk analysis methods, risk analysis tools, and fault tolerance techniques. The Inverted Pendulum Control System (IPCON), whose main task is to keep a pendulum balanced and controlled, is the first system that has been established. In order to perform a risk assessment, one needs to know what a system does, or is intended to do. Furthermore, the risk assessment requires correct descriptions of the system, its context and all relevant features. A basic assumption is that a precise model of this knowledge, based on formal or semi-formal descriptions, such as UML, will facilitate a systematic risk assessment. It is also necessary to have a framework to integrate the different risk assessment methods. The experiences so far support this hypothesis. This report presents CORAS and the CORAS model-based risk management framework, including a preliminary guideline for model-based risk assessment. The CORAS framework for model-based risk analysis offers a structured and systematic approach to identifying and assessing security issues of ICT systems. From the initial assessment of IPCON, we also believe that the framework is applicable in a safety context. Further work on IPCON, as well as the experiences from the CORAS trials, will provide insight and feedback for further improvements. (Author)

  3. Cost-effectiveness of population based BRCA testing with varying Ashkenazi Jewish ancestry.

    Science.gov (United States)

    Manchanda, Ranjit; Patel, Shreeya; Antoniou, Antonis C; Levy-Lahad, Ephrat; Turnbull, Clare; Evans, D Gareth; Hopper, John L; Macinnis, Robert J; Menon, Usha; Jacobs, Ian; Legood, Rosa

    2017-11-01

    Population-based BRCA1/BRCA2 testing has been found to be cost-effective compared with family history-based testing in Ashkenazi-Jewish women who were >30 years old with 4 Ashkenazi-Jewish grandparents. However, individuals may have 1, 2, or 3 Ashkenazi-Jewish grandparents, and cost-effectiveness data are lacking at these lower BRCA prevalence estimates. We present an updated cost-effectiveness analysis of population BRCA1/BRCA2 testing for women with 1, 2, and 3 Ashkenazi-Jewish grandparents. Lifetime costs and effects of population and family history-based testing were compared with the use of a decision analysis model; 56% of BRCA carriers are missed by family history criteria alone. Analyses were conducted for United Kingdom and United States populations. Model parameters were obtained from the Genetic Cancer Prediction through Population Screening trial and published literature. Model parameters and BRCA population prevalence for individuals with 3, 2, or 1 Ashkenazi-Jewish grandparent were adjusted for the relative frequency of BRCA mutations in the Ashkenazi-Jewish and general populations. Incremental cost-effectiveness ratios were calculated for all Ashkenazi-Jewish grandparent scenarios. Costs, along with outcomes, were discounted at 3.5%. The time horizon of the analysis is lifetime, and the perspective is that of the payer. Probabilistic sensitivity analysis evaluated model uncertainty. Population testing for BRCA mutations is cost-saving in Ashkenazi-Jewish women with 2, 3, or 4 grandparents (22-33 days of life gained) in the United Kingdom and with 1, 2, 3, or 4 grandparents (12-26 days of life gained) in the United States, respectively. It is also extremely cost-effective in women in the United Kingdom with just 1 Ashkenazi-Jewish grandparent, with an incremental cost-effectiveness ratio of £863 per quality-adjusted life-year and 15 days of life gained. Results show that population testing remains cost-effective at the £20,000-30,000 per quality-adjusted life-year threshold.

  4. Modelling skylarks (Alauda arvensis) to predict impacts of changes in land management and policy: development and testing of an agent-based model

    DEFF Research Database (Denmark)

    Topping, Christopher John; Odderskær, Peter; Kahlert, Johnny Abildgaard

    2013-01-01

    of distribution and density, reproductive performance and seasonal changes in territory numbers. Data to support this were collected over a 13-year period and comprised detailed field observations of breeding birds and intensive surveys. The model was able to recreate the real world data patterns accurately; it was also able to simultaneously fit a number of other secondary system properties which were not formally a part of the testing procedure. The correspondence of model output to real world data and sensitivity analysis are presented and discussed, and the model's description is provided in ODdox format (a formal description inter-linked to the program code). Detailed and stringent tests of model performance were carried out, and a standardised model description and open access to the source code were provided to open development of the skylark model to others. Over and above documenting the utility…

  5. Computerized Classification Testing with the Rasch Model

    Science.gov (United States)

    Eggen, Theo J. H. M.

    2011-01-01

    If classification in a limited number of categories is the purpose of testing, computerized adaptive tests (CATs) with algorithms based on sequential statistical testing perform better than estimation-based CATs (e.g., Eggen & Straetmans, 2000). In these computerized classification tests (CCTs), the Sequential Probability Ratio Test (SPRT) (Wald,…
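
    The SPRT-based classification loop is easy to sketch under the Rasch model: accumulate the log-likelihood ratio between two ability values bracketing the cutoff and stop at Wald's bounds. A minimal sketch, with simulated item difficulties and responses rather than a real item bank:

      # Sketch of a computerized classification test: a Wald SPRT on the Rasch
      # model deciding between theta0 (below cutoff) and theta1 (above cutoff).

      import numpy as np

      rng = np.random.default_rng(7)
      theta_true = 0.4
      theta0, theta1 = -0.3, 0.3        # indifference region around the cutoff
      alpha = beta = 0.05               # nominal error rates
      A, B = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))

      def p_correct(theta, b):
          """Rasch probability of a correct response to an item of difficulty b."""
          return 1.0 / (1.0 + np.exp(-(theta - b)))

      llr, items = 0.0, 0
      while B < llr < A and items < 100:
          b = rng.normal(0.0, 1.0)                     # next item's difficulty
          x = rng.random() < p_correct(theta_true, b)  # simulated response
          p0, p1 = p_correct(theta0, b), p_correct(theta1, b)
          llr += np.log(p1 / p0) if x else np.log((1 - p1) / (1 - p0))
          items += 1

      decision = ("above cutoff" if llr >= A
                  else "below cutoff" if llr <= B else "undecided")
      print(f"{decision} after {items} items (LLR = {llr:.2f})")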

  6. Transmitted wavefront testing with large dynamic range based on computer-aided deflectometry

    Science.gov (United States)

    Wang, Daodang; Xu, Ping; Gong, Zhidong; Xie, Zhongmin; Liang, Rongguang; Xu, Xinke; Kong, Ming; Zhao, Jun

    2018-06-01

    The transmitted wavefront testing technique is demanded for the performance evaluation of transmission optics and transparent glass, in which the achievable dynamic range is a key issue. A computer-aided deflectometric testing method with fringe projection is proposed for the accurate testing of transmitted wavefronts with a large dynamic range. Ray tracing of the modeled testing system is carried out to achieve the virtual ‘null’ testing of transmitted wavefront aberrations. The ray aberration is obtained from the ray tracing result and measured slope, with which the test wavefront aberration can be reconstructed. To eliminate testing system modeling errors, a system geometry calibration based on computer-aided reverse optimization is applied to realize accurate testing. Both numerical simulation and experiments have been carried out to demonstrate the feasibility and high accuracy of the proposed testing method. The proposed testing method can achieve a large dynamic range compared with the interferometric method, providing a simple, low-cost and accurate way for the testing of transmitted wavefronts from various kinds of optics and a large amount of industrial transmission elements.

  7. Progress in sensor performance testing, modeling and range prediction using the TOD method: an overview

    Science.gov (United States)

    Bijl, Piet; Hogervorst, Maarten A.; Toet, Alexander

    2017-05-01

    The Triangle Orientation Discrimination (TOD) methodology includes i) a widely applicable, accurate end-to-end EO/IR sensor test, ii) an image-based sensor system model and iii) a Target Acquisition (TA) range model. The method has been extensively validated against TA field performance for a wide variety of well- and under-sampled imagers, systems with advanced image processing techniques such as dynamic super resolution and local adaptive contrast enhancement, and sensors showing smear or noise drift, for both static and dynamic test stimuli and as a function of target contrast. Recently, significant progress has been made in various directions. Dedicated visual and NIR test charts for lab and field testing are available and thermal test benches are on the market. Automated sensor testing using an objective synthetic human observer is within reach. Both an analytical and an image-based TOD model have recently been developed and are being implemented in the European Target Acquisition model ECOMOS and in the EOSTAR TDA. Further, the methodology is being applied for design optimization of high-end security camera systems. Finally, results from a recent perception study suggest that DRI ranges for real targets can be predicted by replacing the relevant distinctive target features by TOD test patterns of the same characteristic size and contrast, enabling a new TA modeling approach. This paper provides an overview.

  8. Towards Accurate Modelling of Galaxy Clustering on Small Scales: Testing the Standard ΛCDM + Halo Model

    Science.gov (United States)

    Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron K.; Scoccimarro, Roman; Piscionere, Jennifer A.; Wibking, Benjamin D.

    2018-04-01

    Interpreting the small-scale clustering of galaxies with halo models can elucidate the connection between galaxies and dark matter halos. Unfortunately, the modelling is typically not sufficiently accurate for ruling out models statistically. It is thus difficult to use the information encoded in small scales to test cosmological models or probe subtle features of the galaxy-halo connection. In this paper, we attempt to push halo modelling into the "accurate" regime with a fully numerical mock-based methodology and careful treatment of statistical and systematic errors. With our forward-modelling approach, we can incorporate clustering statistics beyond the traditional two-point statistics. We use this modelling methodology to test the standard ΛCDM + halo model against the clustering of SDSS DR7 galaxies. Specifically, we use the projected correlation function, group multiplicity function and galaxy number density as constraints. We find that while the model fits each statistic separately, it struggles to fit them simultaneously. Adding group statistics leads to a more stringent test of the model and significantly tighter constraints on model parameters. We explore the impact of varying the adopted halo definition and cosmological model and find that changing the cosmology makes a significant difference. The most successful model we tried (Planck cosmology with Mvir halos) matches the clustering of low luminosity galaxies, but exhibits a 2.3σ tension with the clustering of luminous galaxies, thus providing evidence that the "standard" halo model needs to be extended. This work opens the door to adding interesting freedom to the halo model and including additional clustering statistics as constraints.

  9. The research of the test-class method based on interface object in the software integration test of the large container inspection system

    International Nuclear Information System (INIS)

    Sun Shaohua; Chen Zhiqiang; Zhang Li; Gao Wenhuan; Kang Kejun

    2000-01-01

    Software testing is an important stage in the software process. There is mature theory, method and model for unit testing in practice, but for integration testing there is no regular method to adhere to. The author presents a new method, developed during the development of the large container inspection system, named the test-class method based on interface objects. In this method a set of basic test-classes, based on the concept of class in the object-oriented method, is established, and the method of combining the interface graph and the class set is used to describe the test process. Thus strict control and scientific management of the test process are achieved. The concept of a test database is introduced in this method, improving the traceability and repeatability of the test process.

  11. Monitor-Based Statistical Model Checking for Weighted Metric Temporal Logic

    DEFF Research Database (Denmark)

    Bulychev, Petr; David, Alexandre; Larsen, Kim Guldstrand

    2012-01-01

    We present a novel approach and implementation for analysing weighted timed automata (WTA) with respect to the weighted metric temporal logic (WMTL≤). Based on a stochastic semantics of WTAs, we apply statistical model checking (SMC) to estimate and test probabilities of satisfaction…
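
    The core SMC loop the abstract relies on can be illustrated independently of WTAs: sample random runs of a stochastic model and estimate the probability that a time-bounded property holds, with a confidence interval. The toy random-walk 'automaton' below is a stand-in of my own invention, not the tool described in the paper.

      # Sketch of statistical model checking: Monte Carlo estimation of the
      # probability that a time-bounded reachability property holds, with a
      # normal-approximation 95% confidence interval.

      import math
      import random

      def run_satisfies(deadline=10.0, target=5.0):
          """One stochastic run: does a drifting token reach target by deadline?"""
          t, x = 0.0, 0.0
          while t < deadline:
              t += random.expovariate(1.0)   # random sojourn time
              x += random.gauss(0.6, 1.0)    # random progress per transition
              if x >= target:
                  return True
          return False

      random.seed(42)
      n = 10_000
      hits = sum(run_satisfies() for _ in range(n))
      p = hits / n
      half = 1.96 * math.sqrt(p * (1 - p) / n)
      print(f"P(property) ~ {p:.3f} +/- {half:.3f} (95% CI)")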

  12. Towards building a neural network model for predicting pile static load test curves

    Directory of Open Access Journals (Sweden)

    Alzo’ubi A. K.

    2018-01-01

    Full Text Available In the United Arab Emirates, Continuous Flight Auger piles are the most widely used type of deep foundation. To test pile behaviour, the Static Load Test is routinely conducted in the field by increasing the dead load while monitoring the displacement. Although the test is reliable, it is expensive to conduct. This test is usually conducted in the UAE to verify the pile capacity and displacement as the load increases and decreases in two cycles. In this paper we utilize the Artificial Neural Network approach to build a model that can predict a complete Static Load Test. We show that by integrating the pile configuration, soil properties, and ground water table in one artificial neural network model, the Static Load Test can be predicted with confidence. We believe that, based on this approach, the model is able to predict the entire pile load test from start to end. The suggested approach is an excellent tool to reduce the cost associated with such expensive tests, or to predict a pile's performance ahead of the actual test.
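
    A minimal version of the proposed idea, assuming scikit-learn is available, is to train a small feed-forward network on load, pile, soil and water-table inputs and then sweep the load to trace a predicted load-settlement curve. The training data below are synthetic stand-ins for real static load tests, with an invented softening response.

      # Sketch: learn displacement as a function of applied load and pile/soil
      # descriptors, then trace out a full predicted load-settlement curve.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(1)
      n = 2000
      load = rng.uniform(0, 3000, n)        # applied load, kN
      diameter = rng.uniform(0.4, 1.0, n)   # pile diameter, m
      length = rng.uniform(10, 30, n)       # pile length, m
      spt = rng.uniform(5, 50, n)           # soil SPT blow count
      gwt = rng.uniform(1, 10, n)           # ground water depth, m
      # Synthetic "true" response: softening curve, stiffer for larger piles.
      cap = 800 * diameter * length * np.sqrt(spt) / 10
      disp = 25 * (load / cap) ** 1.8 + rng.normal(0, 0.3, n)

      X = np.column_stack([load, diameter, length, spt, gwt])
      model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                           random_state=0).fit(X, disp)

      # Predict a complete static load test curve for one hypothetical pile.
      loads = np.linspace(0, 2500, 11)
      curve = model.predict(np.column_stack([
          loads, np.full(11, 0.6), np.full(11, 20.0),
          np.full(11, 25.0), np.full(11, 4.0)]))
      for q, s in zip(loads, curve):
          print(f"{q:7.0f} kN -> {s:6.2f} mm")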

  13. The design and testing of a caring teaching model based on the theoretical framework of caring in the Chinese Context: a mixed-method study.

    Science.gov (United States)

    Guo, Yujie; Shen, Jie; Ye, Xuchun; Chen, Huali; Jiang, Anli

    2013-08-01

    This paper aims to report the design, and test the effectiveness, of an innovative caring teaching model based on the theoretical framework of caring in the Chinese context. Since the 1970s, caring has been a core value in nursing education. In a previous study, a theoretical framework of caring in the Chinese context was explored in a grounded theory study and considered beneficial for caring education. A caring teaching model was designed theoretically, and a one-group pre- and post-test quasi-experimental study was administered to test its effectiveness. From October 2009 to July 2010, a cohort of grade-2 undergraduate nursing students (n=64) in a Chinese medical school was recruited to participate in the study. Data were gathered through quantitative and qualitative methods to evaluate the effectiveness of the caring teaching model. The caring teaching model created an esthetic situation and experiential learning style for teaching caring that was integrated within the curricula. Quantitative data from the quasi-experimental study showed that the post-test scores of each item were higher than those on the pre-test (p<0.01). Thematic analysis of 1220 narratives from students' caring journals and reports of participant class observation revealed two main thematic categories, which reflected, from the students' points of view, the development of student caring character and the impact that the caring teaching model had in this regard. The model could be used as an integrated approach to teach caring in nursing curricula. It would also be beneficial for nursing administrators in cultivating caring nurse practitioners. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Animal models for testing anti-prion drugs.

    Science.gov (United States)

    Fernández-Borges, Natalia; Elezgarai, Saioa R; Eraña, Hasier; Castilla, Joaquín

    2013-01-01

    Prion diseases belong to a group of fatal infectious diseases with no effective therapies available. Throughout the last 35 years, fewer than 50 different drugs have been tested in different experimental animal models, without promising results. An important limitation when searching for new drugs is the existence of appropriate models of the disease. The three different possible origins of prion diseases require the existence of different animal models for testing anti-prion compounds. Wild type mice, over-expressing transgenic mice and other more sophisticated animal models have been used to evaluate a diversity of compounds, some of which were previously tested in different in vitro experimental models. The complexity of prion diseases will require more pre-screening studies, reliable sporadic (or spontaneous) animal models and accurate chemical modifications of the selected compounds before an effective therapy against human prion diseases is available. This review is intended to present the most relevant animal models that have been used in the search for new anti-prion therapies, and to describe some possible procedures for handling chemical compounds presumed to have anti-prion activity prior to testing them in animal models.

  15. Characterization of Orbital Debris via Hyper-Velocity Ground-Based Tests

    Science.gov (United States)

    Cowardin, H.

    2015-01-01

    Existing DoD and NASA satellite breakup models are based on a key laboratory-based test, the Satellite Orbital debris Characterization Impact Test (SOCIT), which has supported many applications and matched on-orbit events involving older satellite designs reasonably well over the years. In order to update and improve the breakup models and the NASA Size Estimation Model (SEM) for events involving more modern satellite designs, the NASA Orbital Debris Program Office has worked in collaboration with the University of Florida to replicate a hypervelocity impact using a satellite built with modern-day spacecraft materials and construction techniques. The spacecraft, called DebriSat, was intended to be representative of modern LEO satellites, and all major design decisions were reviewed and approved by subject matter experts at Aerospace Corporation. DebriSat is composed of 7 major subsystems, including the attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated with the same materials. All fragments down to 2 mm in size will be characterized by material, size, shape and bulk density, and the associated data will be stored in a database for multiple users to access. Laboratory radar and optical measurements will be performed on a subset of fragments to provide a better understanding of the data products from orbital debris acquired by ground-based radars and telescopes. The resulting data analysis from DebriSat will be used to update breakup models and develop the first optical SEM, in conjunction with updates to the current NASA SEM. The characterization of the fragmentation will be discussed in the subsequent presentation.

  16. Optimal closed-loop identification test design for internal model control

    NARCIS (Netherlands)

    Zhu, Y.; Bosch, van den P.P.J.

    2000-01-01

    In this work, optimal closed-loop test design for control is studied. Simple design formulas are derived based on the asymptotic theory of Ljung. The control scheme used is internal model control (IMC) and the design constraint is the power of the process output or that of the reference signal. The

  17. Optimal test intervals of standby components based on actual plant-specific data

    International Nuclear Information System (INIS)

    Jones, R.B.; Bickel, J.H.

    1987-01-01

    Based on standard reliability analysis techniques, both under-testing and over-testing affect the availability of standby components. If tests are performed too often, unavailability is increased since the equipment is being used excessively. Conversely, if testing is performed too infrequently, the likelihood of component unavailability is also increased due to the formation of rust, heat or radiation damage, dirt infiltration, etc. Thus, from a physical perspective, an optimal test interval should exist which minimizes unavailability. This paper illustrates the application of an unavailability model that calculates optimal testing intervals for components with a failure database. (orig./HSCH)
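
    The trade-off has a classic closed form. For a periodically tested standby component with failure rate λ and test downtime τ, a textbook unavailability model is U(T) ≈ λT/2 + τ/T, which is minimized at T* = √(2τ/λ). The worked example below uses illustrative numbers, not plant data, and is not the specific model of the paper.

      # Worked example of the test-interval trade-off: undetected random
      # failures (lambda*T/2) versus test downtime (tau/T).

      import math

      lam = 1.0e-5   # failure rate, per hour
      tau = 2.0      # test (downtime) duration, hours

      def unavailability(T):
          return lam * T / 2.0 + tau / T

      T_opt = math.sqrt(2.0 * tau / lam)   # analytic minimum, ~632 h here
      print(f"optimal test interval: {T_opt:,.0f} h (~{T_opt / 720:.1f} months)")
      for T in (168.0, T_opt, 4380.0):     # weekly, optimal, semi-annual
          print(f"T = {T:7.0f} h -> U = {unavailability(T):.2e}")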

  18. Testing for Stationarity and Nonlinearity of Daily Streamflow Time Series Based on Different Statistical Tests (Case Study: Upstream Basin Rivers of Zarrineh Roud Dam

    Directory of Open Access Journals (Sweden)

    Farshad Fathian

    2017-02-01

    Full Text Available Introduction: Time series models are one of the most important tools for investigating and modeling hydrological processes in order to solve problems related to water resources management. Many hydrological time series show nonstationary and nonlinear behaviors. One of the important hydrological modeling tasks is determining the existence of nonstationarity and the way through which stationarity can be achieved. On the other hand, streamflow processes are usually considered as nonlinear mechanisms, while in many studies linear time series models are used to model streamflow time series. However, it is not clear what kind of nonlinearity underlies streamflow processes and how intensive it is. Materials and Methods: Streamflow time series of 6 hydro-gauge stations located in the upstream basin rivers of the ZarrinehRoud dam (located in the southern part of the Urmia Lake basin) have been considered to investigate stationarity and nonlinearity. All data series used here start on January 1, 1997, and end on December 31, 2011. In this study, stationarity is tested by the ADF and KPSS tests and nonlinearity is tested by the BDS, Keenan and TLRT tests. The stationarity test is carried out with two methods. The first is the augmented Dickey-Fuller (ADF) unit root test, first proposed by Dickey and Fuller (1979) and modified by Said and Dickey (1984), which examines the presence of unit roots in time series. The second is the KPSS test, proposed by Kwiatkowski et al. (1992), which examines the stationarity around a deterministic trend (trend stationarity) and the stationarity around a fixed level (level stationarity). The BDS test (Brock et al., 1996) is a nonparametric method for testing the serial independence and nonlinear structure in time series based on the correlation integral of the series. The null hypothesis is that the time series sample comes from an independent identically distributed (i.i.d.) process. The alternative hypothesis
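
    Assuming statsmodels exposes adfuller, kpss and bds as in recent versions, the testing battery can be sketched on a synthetic daily series: ADF and KPSS for stationarity, then BDS on the residuals of a linear AR fit to look for remaining nonlinear structure. Keenan and TLRT have no standard statsmodels equivalent and are omitted; the series below is simulated, not gauge data.

      # Sketch of the stationarity/nonlinearity battery on a synthetic series.

      import numpy as np
      from statsmodels.tsa.stattools import adfuller, kpss, bds
      from statsmodels.tsa.ar_model import AutoReg

      rng = np.random.default_rng(3)
      n = 1500
      e = rng.normal(size=n)
      flow = np.empty(n)
      flow[0] = 0.0
      for t in range(1, n):                  # mildly nonlinear AR process
          flow[t] = 0.7 * flow[t - 1] + 0.2 * np.tanh(flow[t - 1]) + e[t]

      adf_stat, adf_p, *_ = adfuller(flow)
      kpss_stat, kpss_p, *_ = kpss(flow, regression="c", nlags="auto")
      print(f"ADF:  stat = {adf_stat:.2f}, p = {adf_p:.3f} (H0: unit root)")
      print(f"KPSS: stat = {kpss_stat:.2f}, p = {kpss_p:.3f} (H0: stationary)")

      resid = AutoReg(flow, lags=5).fit().resid
      bds_stat, bds_p = bds(resid, max_dim=3)
      print("BDS on AR(5) residuals (H0: i.i.d.), p-values:", np.round(bds_p, 4))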

  19. Strategy for a Rock Mechanics Site Descriptive Model. A test case based on data from the Aespoe HRL

    International Nuclear Information System (INIS)

    Hudson, John A

    2002-06-01

    In anticipation of the SKB Site Investigations for radioactive waste disposal, an approach has been developed for the Rock Mechanics Site Descriptive Model. This approach was tested by predicting the rock mechanics properties of a 600 m x 180 m x 120 m rock volume at the Aespoe Hard Rock Laboratory (HRL) using limited borehole data of the type typically obtained during a site investigation. These predicted properties were then compared with 'best estimate' properties obtained from a study of the test rock volume using additional information, mainly tunnel data. The exercise was known as the Test Case, and is the subject of this Report. Three modelling techniques were used to predict the rock properties: the 'empirical approach' - the rock properties were estimated using rock mass classification schemes and empirical correlation formulae; the 'theoretical approach' - the rock properties were estimated using numerical modelling techniques; and the 'stress approach' - the rock stress state was estimated using primary data and numerical modelling. These approaches are described separately and respectively. Following an explanation of the context for the Test Case within the strategy for developing the Rock Mechanics Site Descriptive Model, conditions at the Aespoe HRL are described in Chapter 2. The Test Case organization and the suite of nine Protocols used to ensure that the work was appropriately guided and co-ordinated are described in Chapter 3. The methods for predicting the rock properties and the rock stress, and comparisons with the 'best estimate' properties of the actual conditions, are presented in Chapters 4 and 5. Finally, the conclusions from this Test Case exercise are given in Chapter 6. General recommendations for the management of this type of Test Case are also included

  1. 1/3-scale model testing program

    International Nuclear Information System (INIS)

    Yoshimura, H.R.; Attaway, S.W.; Bronowski, D.R.; Uncapher, W.L.; Huerta, M.; Abbott, D.G.

    1989-01-01

    This paper describes the drop testing of a one-third scale model transport cask system. Two casks were supplied by Transnuclear, Inc. (TN) to demonstrate dual purpose shipping/storage casks. These casks will be used to ship spent fuel from DOE's West Valley demonstration project in New York to the Idaho National Engineering Laboratory (INEL) for a long term spent fuel dry storage demonstration. As part of the certification process, one-third scale model tests were performed to obtain experimental data. Two 9-m (30-ft) drop tests were conducted on a mass model of the cask body and scaled balsa and redwood filled impact limiters. In the first test, the cask system was tested in an end-on configuration. In the second test, the system was tested in a slap-down configuration where the axis of the cask was oriented at a 10 degree angle with the horizontal. Slap-down occurs for shallow angle drops where the primary impact at one end of the cask is followed by a secondary impact at the other end. The objectives of the testing program were to (1) obtain deceleration and displacement information for the cask and impact limiter system, (2) obtain dynamic force-displacement data for the impact limiters, (3) verify the integrity of the impact limiter retention system, and (4) examine the crush behavior of the limiters. This paper describes both test results in terms of measured deceleration, post-test deformation measurements, and the general structural response of the system

  2. Earthquake induced rock shear through a deposition hole - modelling of three scale tests for validation of models

    International Nuclear Information System (INIS)

    Boergesson, Lennart; Hernelind, Jan

    2012-01-01

    rate for each element. A similar model, based on tensile tests on the copper used in the scale tests, has been used for the copper. Two element models were used. In one of them (model A) the bentonite was divided into three parts with different densities according to the measurements made during dismantling and sampling. In the other one (model B) the same density, corresponding to the weighted mean value, was used for all bentonite in the test. The reason for using both these models was to investigate whether the simplification made in SR-Site, where only one density was modelled and thus no consideration was taken of the incomplete homogenisation that remains after water saturation and swelling, would affect the results significantly. The results show a remarkable agreement between modelled and measured results, in spite of the complexity of the models and the difficulty of measuring stresses and strains during the very fast tests. In addition, there was less than two per cent difference between the results of the simplified model with one density and the model with three densities. The modelling results of both models were found to agree well with the measurements, which validates the SR-Site modelling of the rock shear case. It should be emphasized that the calculations have been done without any changes or adaptations of material models or parameter values to test results. The overall conclusion is that the modelling technique, the element mesh and the material models used in these analyses are well fitted and useful for this type of modelling

  3. Determination of Geometrical REVs Based on Volumetric Fracture Intensity and Statistical Tests

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2018-05-01

    Full Text Available This paper presents a method to estimate a representative element volume (REV) of a fractured rock mass based on the volumetric fracture intensity P32 and statistical tests. A 150 m × 80 m × 50 m 3D fracture network model was generated based on field data collected at the Maji dam site by using the rectangular window sampling method. The volumetric fracture intensity P32 of each cube was calculated by varying the cube location in the generated 3D fracture network model and varying the cube side length from 1 to 20 m, and the distribution of the P32 values was described. The size effect and spatial effect of the fractured rock mass were studied; the P32 values from the same cube sizes and different locations were significantly different, and the fluctuation in P32 values clearly decreases as the cube side length increases. In this paper, a new method that comprehensively considers the anisotropy of rock masses, simplicity of calculation and differences between different methods was proposed to estimate the geometrical REV size. The geometrical REV size of the fractured rock mass was determined based on the volumetric fracture intensity P32 and two statistical test methods, namely, the likelihood ratio test and the Wald–Wolfowitz runs test. The results of the two statistical tests were substantially different; critical cube sizes of 13 m and 12 m were estimated by the Wald–Wolfowitz runs test and the likelihood ratio test, respectively. Because the different test methods emphasize different considerations and impact factors, a size accepted by both tests was adopted, and the larger cube size, 13 m, was selected as the geometrical REV size of the fractured rock mass at the Maji dam site in China.
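
    The Wald–Wolfowitz runs test used here is small enough to implement directly from its normal approximation: dichotomize the sequence of P32 values about the median and count runs. The two samples below are made up for illustration; a spatially drifting sequence (small cubes) is rejected as non-random, while homogeneous scatter (large cubes) is not.

      # Sketch of the Wald-Wolfowitz runs test applied to sequences of P32
      # values, implemented from its normal approximation.

      import math
      import numpy as np

      def runs_test(x):
          """Two-sided runs test on values dichotomized about the median."""
          med = np.median(x)
          signs = np.array([v > med for v in x if v != med])
          n1, n2 = signs.sum(), (~signs).sum()
          runs = 1 + np.count_nonzero(signs[1:] != signs[:-1])
          mu = 2 * n1 * n2 / (n1 + n2) + 1
          var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)
                 / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
          z = (runs - mu) / math.sqrt(var)
          p = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
          return z, p

      rng = np.random.default_rng(5)
      p32_drift = 2.5 + np.cumsum(rng.normal(0.0, 0.2, 40))  # correlated values
      p32_homog = rng.normal(2.5, 0.2, 40)                   # random scatter
      for label, sample in [("drifting", p32_drift), ("homogeneous", p32_homog)]:
          z, p = runs_test(sample)
          print(f"{label:12s}: z = {z:+.2f}, p = {p:.4f}")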

  4. BEAT: A Web-Based Boolean Expression Fault-Based Test Case Generation Tool

    Science.gov (United States)

    Chen, T. Y.; Grant, D. D.; Lau, M. F.; Ng, S. P.; Vasa, V. R.

    2006-01-01

    BEAT is a Web-based system that generates fault-based test cases from Boolean expressions. It is based on the integration of our several fault-based test case selection strategies. The generated test cases are considered to be fault-based, because they are aiming at the detection of particular faults. For example, when the Boolean expression is in…

  5. Towards a Development Environment for Model Based Test Design

    OpenAIRE

    Jing, Han

    2008-01-01

    Within the UP IP I&V organization there is a high focus on increasing the ability to predict product quality in a cost-efficient way. Test automation has therefore been an important enabler for us. The IP test design environment is continuously evolving, and the investigations will show which improvements are most important to implement in the short and long term. In Ericsson UP IP I&V, the test automation framework environments are used to complete some processes by automated methods, f...

  6. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
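
    The distinction the article draws can be shown in a few lines of pytest-style Python: one test checks a pure code contract, the other compares the simulation with an experimental reference value. The toy membrane model and the 'measured' number are invented placeholders, not OpenWorm code or data.

      # Sketch contrasting an ordinary unit test with a model validation test.

      import math

      def membrane_voltage(t_ms, v_rest=-60.0, v_peak=30.0, tau=2.0):
          """Toy action-potential decay model (stand-in for a real simulation)."""
          return v_rest + (v_peak - v_rest) * math.exp(-t_ms / tau)

      def test_unit_returns_resting_potential_at_infinity():
          # Unit test: a code-level contract that must hold exactly.
          assert abs(membrane_voltage(1e6) - (-60.0)) < 1e-6

      def test_validation_against_experiment():
          # Model validation test: agreement with a (hypothetical) measurement,
          # within an experimentally motivated tolerance.
          measured_v_at_2ms = -27.0
          assert abs(membrane_voltage(2.0) - measured_v_at_2ms) < 5.0

      if __name__ == "__main__":
          test_unit_returns_resting_potential_at_infinity()
          test_validation_against_experiment()
          print("all tests passed")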

  7. Evaluating an Automated Number Series Item Generator Using Linear Logistic Test Models

    Directory of Open Access Journals (Sweden)

    Bao Sheng Loe

    2018-04-01

    Full Text Available This study investigates the item properties of a newly developed Automatic Number Series Item Generator (ANSIG). The foundation of the ANSIG is based on five hypothesised cognitive operators. Thirteen item models were developed using the numGen R package and eleven were evaluated in this study. The 16-item ICAR (International Cognitive Ability Resource) short form ability test was used to evaluate construct validity. The Rasch Model and two Linear Logistic Test Models (LLTM) were employed to estimate and predict the item parameters. Results indicate that a single factor determines the performance on tests composed of items generated by the ANSIG. Under the LLTM approach, all the cognitive operators were significant predictors of item difficulty. Moderate to high correlations were evident between the number series items and the ICAR test scores, with high correlation found for the ICAR Letter-Numeric-Series type items, suggesting adequate nomothetic span. Extended cognitive research is, nevertheless, essential for the automatic generation of an item pool with predictable psychometric properties.
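
    In its common two-step approximation, the LLTM amounts to regressing Rasch item difficulties on a Q-matrix of hypothesised cognitive operators. The sketch below uses an invented Q-matrix and invented difficulties, not the ANSIG item bank or its five operators.

      # Sketch of the two-step LLTM approximation: item difficulty beta is
      # modeled as a weighted sum of the cognitive operators each item requires.

      import numpy as np

      # Q[i, k] = 1 if item i requires cognitive operator k (all hypothetical).
      Q = np.array([
          [1, 0, 0],   # item 1: arithmetic progression only
          [1, 1, 0],   # item 2: plus a coefficient pattern
          [0, 1, 1],   # item 3: coefficient plus interleaved series
          [1, 0, 1],
          [1, 1, 1],
          [0, 0, 1],
      ])
      beta = np.array([-1.2, -0.1, 1.4, 0.6, 1.9, 0.8])  # Rasch difficulties

      # Least-squares estimate of operator weights eta from beta ~ Q @ eta.
      X = np.column_stack([np.ones(len(beta)), Q])       # intercept + operators
      eta, *_ = np.linalg.lstsq(X, beta, rcond=None)
      print("intercept and operator weights:", np.round(eta, 2))
      print("reconstructed difficulties:    ", np.round(X @ eta, 2))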

  8. Trait-based representation of biological nitrification: Model development, testing, and predicted community composition

    Directory of Open Access Journals (Sweden)

    Nick eBouskill

    2012-10-01

    Full Text Available Trait-based microbial models show clear promise as tools to represent the diversity and activity of microorganisms across ecosystem gradients. These models parameterize specific traits that determine the relative fitness of an 'organism' in a given environment, and represent the complexity of biological systems across temporal and spatial scales. In this study we introduce a microbial community trait-based modeling framework (MicroTrait) focused on nitrification (MicroTrait-N) that represents the ammonia-oxidizing bacteria (AOB), ammonia-oxidizing archaea (AOA) and nitrite-oxidizing bacteria (NOB) using traits related to enzyme kinetics and physiological properties. We used this model to predict nitrifier diversity, ammonia (NH3) oxidation rates and nitrous oxide (N2O) production across pH, temperature and substrate gradients. Predicted nitrifier diversity was predominantly determined by temperature and substrate availability, the latter strongly influenced by pH. The model predicted that transient N2O production rates are maximized by a decoupling of the AOB and NOB communities, resulting in an accumulation and detoxification of nitrite to N2O by AOB. However, cumulative N2O production (over six-month simulations) is maximized in a system where the relationship between AOB and NOB is maintained. When the reactions uncouple, the AOB become unstable and biomass declines rapidly, resulting in decreased NH3 oxidation and N2O production. We evaluated this model against site-level chemical datasets from the interior of Alaska and accurately simulated NH3 oxidation rates and the relative ratio of AOA:AOB biomass. The predicted community structure and activity indicate (a) parameterization of a small number of traits may be sufficient to broadly characterize nitrifying community structure and (b) changing decadal trends in climate and edaphic conditions could impact nitrification rates in ways that are not captured by extant biogeochemical models.

  9. Modeling of Micro Deval abrasion loss based on some rock properties

    Science.gov (United States)

    Capik, Mehmet; Yilmaz, Ali Osman

    2017-10-01

    Aggregate is one of the most widely used construction materials. The quality of aggregate is determined using various testing methods; among these, the Micro Deval Abrasion Loss (MDAL) test is commonly used to determine the quality and abrasion resistance of aggregate. The main objective of this study is to develop models for the prediction of MDAL from rock properties, including uniaxial compressive strength, Brazilian tensile strength, point load index, Schmidt rebound hardness, apparent porosity, void ratio, Cerchar abrasivity index and the Bohme abrasion test. The MDAL is modeled using simple regression analysis and multiple linear regression analysis based on the rock properties. The study shows that the MDAL decreases with increasing uniaxial compressive strength, Brazilian tensile strength, point load index, Schmidt rebound hardness and Cerchar abrasivity index. It is also concluded that the MDAL increases with increasing apparent porosity, void ratio and Bohme abrasion loss. The modeling results show that the models based on the Bohme abrasion test and L-type Schmidt rebound hardness give the best forecasting performance for the MDAL. Further models, including uniaxial compressive strength, apparent porosity and Cerchar abrasivity index, are developed for rapid estimation of the MDAL of rocks. The developed models were verified by statistical tests. Additionally, it can be stated that the proposed models can be used for forecasting aggregate quality.
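
    The multiple-linear-regression modelling is straightforward to reproduce in outline: regress MDAL on a few rock properties and report the fitted coefficients and R². The data below are synthetic, with signs chosen only to mirror the trends reported in the abstract, not the study's measurements.

      # Sketch of multiple linear regression for MDAL on synthetic rock data
      # (UCS in MPa, Schmidt rebound hardness, apparent porosity in %).

      import numpy as np

      rng = np.random.default_rng(11)
      n = 40
      ucs = rng.uniform(40, 180, n)
      shr = rng.uniform(25, 65, n)
      porosity = rng.uniform(0.5, 8.0, n)
      mdal = 30 - 0.08 * ucs - 0.15 * shr + 1.2 * porosity + rng.normal(0, 1, n)

      X = np.column_stack([np.ones(n), ucs, shr, porosity])
      coef, *_ = np.linalg.lstsq(X, mdal, rcond=None)
      pred = X @ coef
      r2 = 1 - np.sum((mdal - pred) ** 2) / np.sum((mdal - mdal.mean()) ** 2)
      print("MDAL = {:.2f} {:+.3f}*UCS {:+.3f}*SHR {:+.3f}*porosity".format(*coef))
      print(f"R^2 = {r2:.3f}")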

  10. Model design for Large-Scale Seismic Test Program at Hualien, Taiwan

    International Nuclear Information System (INIS)

    Tang, H.T.; Graves, H.L.; Chen, P.C.

    1991-01-01

    The Large-Scale Seismic Test (LSST) Program at Hualien, Taiwan, is a follow-on to the soil-structure interaction (SSI) experiments at Lotung, Taiwan. The planned SSI studies will be performed at a stiff soil site in Hualien that historically has had slightly more destructive earthquakes than Lotung. The LSST is a joint effort among many interested parties. The Electric Power Research Institute (EPRI) and Taipower are the organizers of the program and have the lead in planning and managing it. Other organizations participating in the LSST program are the US Nuclear Regulatory Commission (NRC), the Central Research Institute of Electric Power Industry (CRIEPI), the Tokyo Electric Power Company (TEPCO), the Commissariat A L'Energie Atomique (CEA), Electricite de France (EdF) and Framatome. The LSST was initiated in January 1990 and is envisioned to be five years in duration. Based on the assumption of stiff soil, confirmed by soil boring and geophysical results, the test model was designed to provide data needed for SSI studies covering: free-field input, nonlinear soil response, non-rigid-body SSI, torsional response, kinematic interaction, spatial incoherency and other effects. Taipower had the lead in the design of the test model and received significant input from other LSST members. Questions raised by LSST members concerned embedment effects, model stiffness, base shear, and openings for equipment. This paper describes progress in site preparation, design and construction of the model, and development of an instrumentation plan.

  11. Dynamic analysis of ITER tokamak. Based on results of vibration test using scaled model

    International Nuclear Information System (INIS)

    Takeda, Nobukazu; Kakudate, Satoshi; Nakahira, Masataka

    2005-01-01

    The vibration experiments of the support structures with flexible plates for the ITER major components, such as the toroidal field coil (TF coil) and vacuum vessel (VV), were performed using small-sized flexible plates, aiming to obtain basic mechanical characteristics such as the dependence of stiffness on loading angle. The experimental results were compared with analytical ones in order to establish an adequate analytical model for the ITER support structure with flexible plates. As a result, the bolt connection of the flexible plates to the base plate strongly affected the stiffness of the flexible plates. After studies of modeling the bolt connections, it was found that analytical results modeling the bolts with finite stiffness in the axial direction only, and infinite stiffness in the other directions, agree well with the experimental ones. Based on this, numerical analysis of the actual support structures of the ITER VV and TF coil was performed. The support structure, composed of flexible plates and connection bolts, was modeled as a spring composed of only two spring elements simulating the in-plane and out-of-plane stiffness of the support structure, including the effect of the connection bolts. The stiffness of both spring models for the VV and TF coil agrees well with that of shell models simulating the actual structures, such as flexible plates and connection bolts, based on the experimental results. It was therefore found that the spring model, with only two values of stiffness, makes it possible to simplify the complicated support structure with flexible plates for the dynamic analysis of the VV and TF coil. Using the proposed spring model, dynamic analyses of the VV and TF coil for ITER were performed to estimate their integrity under the design earthquake. As a result, it was found that the maximum relative displacement of 8.6 mm between the VV and TF coil is much less than 100 mm, so that the integrity of the VV and TF coil is ensured.
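
    The two-element spring idealization described above can be caricatured in a few lines. The stiffness and mass values below are placeholders, not ITER design figures; the point is only that a diagonal two-spring model reduces the support to two natural frequencies.

    ```python
    import numpy as np

    # Toy version of the two-spring idealization of a flexible-plate support:
    # one spring for in-plane stiffness, one for out-of-plane stiffness.
    k_in  = 5.0e9   # in-plane stiffness [N/m]   (placeholder value)
    k_out = 8.0e8   # out-of-plane stiffness [N/m] (placeholder value)
    m     = 5.0e6   # supported mass [kg]          (placeholder value)

    # Diagonal 2-DOF system: each direction has its own natural frequency
    K = np.diag([k_in, k_out])
    M = np.diag([m, m])
    omega2 = np.linalg.eigvals(np.linalg.solve(M, K))
    f = np.sqrt(omega2) / (2 * np.pi)
    print("natural frequencies [Hz]:", np.sort(f.real))
    ```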

  12. Design and Test of an Oscillation-based System Architecture for DNA Sensor Arrays

    NARCIS (Netherlands)

    Liu, Hongyuan; Kerkhoff, Hans G.; Richardson, Andrew; Zhang, X.; Nouet, Pascal; Azais, Florence

    2005-01-01

    A DfT strategy for MEMS-based DNA sensors is investigated in this paper. Based on a fault-free model and a defect model developed for a single sensing element, together with the VHDL-AMS simulation results, the results imply that an oscillation-based interface might be a potential solution for both testing and read-out of the sensor array.

  13. Optimization of inverse model identification for multi-axial test rig control

    Directory of Open Access Journals (Sweden)

    Müller Tino

    2016-01-01

    Full Text Available Laboratory testing of multi-axial fatigue situations improves repeatability and, compared to field testing, condenses test time, allowing tests to be carried out until component failure. To achieve realistic and convincing durability results, precise load-data reconstruction is necessary. Cross-talk and a high number of degrees of freedom negatively affect control accuracy. Therefore a multiple-input/multiple-output (MIMO) model of the system, capturing all inherent cross-couplings, is identified. In a first step the model order is estimated based on the physical fundamentals of a single-channel hydraulic servo system. Subsequently, the structure of the MIMO model is optimized using correlation of the outputs, to increase control stability and reduce the complexity of the parameter optimization. The identification process is successfully applied to the iterative control of a multi-axial suspension rig. The results show accurate control, with increased stability compared to control without structure optimization.
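
    A minimal sketch of the MIMO identification idea, assuming a static cross-coupling gain matrix rather than the full dynamic model used on the rig; the true gain matrix, noise level and target vector are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated test-rig data: 3 actuator inputs, 3 measured outputs,
    # with cross-talk encoded in a "true" (unknown) gain matrix G_true.
    G_true = np.array([[1.0, 0.2, 0.1],
                       [0.3, 0.9, 0.2],
                       [0.1, 0.4, 1.1]])
    U = rng.normal(size=(500, 3))                          # input history
    Y = U @ G_true.T + 0.01 * rng.normal(size=(500, 3))    # noisy outputs

    # Least-squares MIMO model: Y ~ U @ G.T  ->  solve for G
    G_hat, *_ = np.linalg.lstsq(U, Y, rcond=None)
    G_hat = G_hat.T
    print("identified gain matrix:\n", np.round(G_hat, 3))

    # Inverse model used for load reconstruction: u = G^{-1} y_target
    y_target = np.array([1.0, 0.5, -0.2])
    u_cmd = np.linalg.solve(G_hat, y_target)
    print("drive signal:", np.round(u_cmd, 3))
    ```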

  14. Temperature Buffer Test. Final THM modelling

    Energy Technology Data Exchange (ETDEWEB)

    Aakesson, Mattias; Malmberg, Daniel; Boergesson, Lennart; Hernelind, Jan [Clay Technology AB, Lund (Sweden); Ledesma, Alberto; Jacinto, Abel [UPC, Universitat Politecnica de Catalunya, Barcelona (Spain)

    2012-01-15

    The Temperature Buffer Test (TBT) is a joint project between SKB and ANDRA, supported by ENRESA (modelling) and DBE (instrumentation), which aims at improving the understanding and modelling of the thermo-hydro-mechanical behaviour of buffers made of swelling clay submitted to high temperatures (over 100 deg C) during the water saturation process. The test has been carried out in a KBS-3 deposition hole at Aespoe HRL. It was installed during the spring of 2003. Two heaters (3 m long, 0.6 m diameter) and two buffer arrangements have been investigated: the lower heater was surrounded by bentonite only, whereas the upper heater was surrounded by a composite barrier, with a sand shield between the heater and the bentonite. The test was dismantled and sampled during the winter of 2009/2010. This report presents the final THM modelling, which was resumed subsequent to the dismantling operation. The main part of this work has been numerical modelling of the field test. Three modelling teams have presented several model cases for different geometries and different degrees of process complexity. Two numerical codes, Code_Bright and Abaqus, have been used. The modelling performed by UPC-Cimne using Code_Bright has been divided into three subtasks: i) analysis of the response observed in the lower part of the test, by inclusion of a number of considerations: (a) the use of the Barcelona Expansive Model for MX-80 bentonite; (b) updated parameters in the vapour diffusive flow term; (c) the use of a non-conventional water retention curve for MX-80 at high temperature; ii) assessment of a possible relation between the cracks observed in the bentonite blocks in the upper part of TBT and the cycles of suction and stresses registered in that zone at the start of the experiment; and iii) analysis of the performance, observations and interpretation of the entire test. It was however not possible to carry out a full THM analysis until the end of the test due to …

  15. Temperature Buffer Test. Final THM modelling

    International Nuclear Information System (INIS)

    Aakesson, Mattias; Malmberg, Daniel; Boergesson, Lennart; Hernelind, Jan; Ledesma, Alberto; Jacinto, Abel

    2012-01-01

    The Temperature Buffer Test (TBT) is a joint project between SKB and ANDRA, supported by ENRESA (modelling) and DBE (instrumentation), which aims at improving the understanding and modelling of the thermo-hydro-mechanical behaviour of buffers made of swelling clay submitted to high temperatures (over 100 deg C) during the water saturation process. The test has been carried out in a KBS-3 deposition hole at Aespoe HRL. It was installed during the spring of 2003. Two heaters (3 m long, 0.6 m diameter) and two buffer arrangements have been investigated: the lower heater was surrounded by bentonite only, whereas the upper heater was surrounded by a composite barrier, with a sand shield between the heater and the bentonite. The test was dismantled and sampled during the winter of 2009/2010. This report presents the final THM modelling, which was resumed subsequent to the dismantling operation. The main part of this work has been numerical modelling of the field test. Three modelling teams have presented several model cases for different geometries and different degrees of process complexity. Two numerical codes, Code_Bright and Abaqus, have been used. The modelling performed by UPC-Cimne using Code_Bright has been divided into three subtasks: i) analysis of the response observed in the lower part of the test, by inclusion of a number of considerations: (a) the use of the Barcelona Expansive Model for MX-80 bentonite; (b) updated parameters in the vapour diffusive flow term; (c) the use of a non-conventional water retention curve for MX-80 at high temperature; ii) assessment of a possible relation between the cracks observed in the bentonite blocks in the upper part of TBT and the cycles of suction and stresses registered in that zone at the start of the experiment; and iii) analysis of the performance, observations and interpretation of the entire test. It was however not possible to carry out a full THM analysis until the end of the test due to …

  16. Modification of Concrete Damaged Plasticity model. Part II: Formulation and numerical tests

    Directory of Open Access Journals (Sweden)

    Kamińska Inez

    2017-01-01

    Full Text Available A refined model for an elastoplastic damaged material is formulated based on the plastic potential introduced in Part I [1]. The considered model is an extension of the Concrete Damaged Plasticity material implemented in Abaqus [2]. In the paper the stiffness tensor for elastoplastic damaged behaviour is derived. In order to validate the model, computations for uniaxial tests are performed. The response of the model for various choices of parameters is shown and compared to the response of the CDP model.

  17. Numerical Well Testing Interpretation Model and Applications in Crossflow Double-Layer Reservoirs by Polymer Flooding

    Directory of Open Access Journals (Sweden)

    Haiyang Yu

    2014-01-01

    Full Text Available This work presents a numerical well testing interpretation model and analysis techniques to evaluate formations by using pressure transient data acquired with logging tools in crossflow double-layer reservoirs under polymer flooding. A well testing model is established based on rheology experiments and by considering shear, diffusion, convection, inaccessible pore volume (IPV), permeability reduction, wellbore storage effects, and skin factors. Type curves were then developed based on this model, and parameter sensitivity was analysed. Our research shows that the type curves have five segments corresponding to different flow regimes: (I) a wellbore storage section, (II) an intermediate (transient) flow section, (III) a mid-radial flow section, (IV) a crossflow section (from the low-permeability layer to the high-permeability layer), and (V) a systematic radial flow section. Polymer flooding field tests prove that our model can accurately determine formation parameters in crossflow double-layer reservoirs under polymer flooding. Moreover, formation damage caused by polymer flooding can also be evaluated by comparing the interpreted permeability with the initial layered permeability before polymer flooding. Comparison of the numerical solution based on flow mechanisms with observed polymer flooding field test data highlights the potential of this interpretation method in formation evaluation and enhanced oil recovery (EOR).

  18. Horns Rev II, 2D-Model Tests

    DEFF Research Database (Denmark)

    Andersen, Thomas Lykke; Brorsen, Michael

    This report presents the results of 2D physical model tests carried out in the shallow wave flume at the Dept. of Civil Engineering, Aalborg University (AAU), Denmark. The starting point for the present report is the previously carried out run-up tests described in Lykke Andersen & Frigaard, 2006. The objective of the tests was to investigate the impact pressures generated on a horizontal platform and a cone platform for selected sea states calibrated by Lykke Andersen & Frigaard, 2006. The measurements should be used for assessment of slamming coefficients for the design of horizontal and cone-shaped access platforms on piles. The model tests include mainly regular waves and a few irregular wave tests. These tests were conducted at Aalborg University from 9 November 2006 to 17 November 2006.

  19. Development and Test of TQC models, LARP Technological Quadrupole Magnets

    Energy Technology Data Exchange (ETDEWEB)

    Bossert, R.C.; Ambrosio, G.; Andreev, N.; Barzi, E.; Carcagno, R.; Feher, S.; Kashikhin, V.S.; Kashikhin, V.V.; Nobrega, F.; Novitski, I.; Orris, D.; Tartaglia, M.; Zlobin, A.V.; Caspi, S.; Dietderich, D.; Ferracin, P.; Hafalia, A.R.; Sabbi, G.

    2008-06-01

    In support of the development of a large-aperture Nb3Sn superconducting quadrupole for the Large Hadron Collider (LHC) luminosity upgrade, two-layer quadrupole models (TQC and TQS) with 90mm aperture are being constructed at Fermilab and LBNL within the framework of the US LHC Accelerator Research Program (LARP). This paper describes the development and test of TQC01b, the second TQC model, and the experience during construction of TQE02 and TQC02, subsequent models in the series. ANSYS analysis of the mechanical structure, its underlying assumptions, and changes based on experience with TQC01 are presented and discussed. Construction experience, in-process measurements, and modifications to the assembly since TQC01 are described. The test results presented here include magnet strain and quench performance during training of TQC01b, as well as quench studies of current ramp rate dependence.

  20. Development and Test of TQC models, LARP Technological Quadrupole Magnets

    International Nuclear Information System (INIS)

    Bossert, R.C.; Ambrosio, G.; Andreev, N.; Barzi, E.; Carcagno, R.; Feher, S.; Kashikhin, V.S.; Kashikhin, V.V.; Nobrega, F.; Novitski, I.; Orris, D.; Tartaglia, M.; Zlobin, A.V.; Caspi, S.; Dietderich, D.; Ferracin, P.; Hafalia, A.R.; Sabbi, G.

    2008-01-01

    In support of the development of a large-aperture Nb3Sn superconducting quadrupole for the Large Hadron Collider (LHC) luminosity upgrade, two-layer quadrupole models (TQC and TQS) with 90 mm aperture are being constructed at Fermilab and LBNL within the framework of the US LHC Accelerator Research Program (LARP). This paper describes the development and test of TQC01b, the second TQC model, and the experience during construction of TQE02 and TQC02, subsequent models in the series. ANSYS analysis of the mechanical structure, its underlying assumptions, and changes based on experience with TQC01 are presented and discussed. Construction experience, in-process measurements, and modifications to the assembly since TQC01 are described. The test results presented here include magnet strain and quench performance during training of TQC01b, as well as quench studies of current ramp rate dependence.

  1. Testing constancy of unconditional variance in volatility models by misspecification and specification tests

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Terasvirta, Timo

    The topic of this paper is testing the hypothesis of constant unconditional variance in GARCH models against the alternative that the unconditional variance changes deterministically over time. Tests of this hypothesis have previously been performed as misspecification tests after fitting a GARCH model … An application to exchange rate returns is included.

  2. Testing and reference model analysis of FTTH system

    Science.gov (United States)

    Feng, Xiancheng; Cui, Wanlong; Chen, Ying

    2009-08-01

    With the rapid development of the Internet and broadband access networks, technologies such as xDSL, FTTx+LAN and WLAN have found wider application, and new network services emerge continuously, notably online gaming, video conferencing and video on demand. With its enormous bandwidth, FTTH supports all present and future services, including traditional telecommunication, data and TV services as well as future digital TV and VOD; it is therefore regarded as the final solution for broadband access and the development goal of the optical access network. Fiber to the Home (FTTH) will be the goal of broadband access over telecommunications cable. In accordance with the development trend of telecommunication services, and to enhance the capacity of the integrated access network to achieve triple play (voice, data, image), the optical fiber can be extended from the existing fiber-to-the-curb (FTTC), fiber-to-the-zone (FTTZ) and fiber-to-the-building (FTTB) cable networks to the end user in an FTTH system using EPON technology. The article first introduces the basic components of an FTTH system, and then explains the reference model and reference points for testing of the FTTH system. Finally, by means of the test connection diagram, the testing process and the expected results, it analyses SNI interface testing, PON interface testing, Ethernet performance testing, UNI interface testing, Ethernet functional testing, PON functional testing, equipment functional testing, telephone functional testing, operational support capability testing and other tests of the FTTH system.

  3. Deformation modeling and the strain transient dip test

    International Nuclear Information System (INIS)

    Jones, W.B.; Rohde, R.W.; Swearengen, J.C.

    1980-01-01

    Recent efforts in material deformation modeling reveal a trend toward unifying creep and plasticity in a single rate-dependent formulation. While such models can describe actual material deformation, most require a number of different experiments to generate model parameter information. Recently, however, a new model has been proposed in which most of the requisite constants may be found by examining creep transients brought about through abrupt changes in creep stress (the strain transient dip test). The critical measurement in this test is the absence of a resolvable creep rate after a stress drop. As a consequence, the result is extraordinarily sensitive to strain resolution as well as machine mechanical response. This paper presents the design of a machine in which these spurious effects have been minimized and discusses the nature of the strain transient dip test using the example of aluminum. It is concluded that the strain transient dip test is not useful as the primary test for verifying any micromechanical model of deformation. Nevertheless, if a model can be developed which is verifiable by other experiments, data from a dip test machine may be used to generate model parameters.

  4. Methods for testing transport models

    International Nuclear Information System (INIS)

    Singer, C.; Cox, D.

    1993-01-01

    This report documents progress to date under a three-year contract for developing ''Methods for Testing Transport Models.'' The work described includes (1) choice of best methods for producing ''code emulators'' for analysis of very large global energy confinement databases, (2) recent applications of stratified regressions for treating individual measurement errors as well as calibration/modeling errors randomly distributed across various tokamaks, (3) Bayesian methods for utilizing prior information due to previous empirical and/or theoretical analyses, (4) extension of code emulator methodology to profile data, (5) application of nonlinear least squares estimators to simulation of profile data, (6) development of more sophisticated statistical methods for handling profile data, (7) acquisition of a much larger experimental database, and (8) extensive exploratory simulation work on a large variety of discharges using recently improved models for transport theories and boundary conditions. From all of this work, it has been possible to define a complete methodology for testing new sets of reference transport models against much larger multi-institutional databases.

  5. SABATPG-A Structural Analysis Based Automatic Test Generation System

    Institute of Scientific and Technical Information of China (English)

    李忠诚; 潘榆奇; 闵应骅

    1994-01-01

    A TPG system, SABATPG, is presented based on a generic structural model of large circuits. Three techniques, namely partial implication, the aftereffect of identified undetectable faults, and shared sensitization with the new concepts of localization and aftereffect, are employed in the system to improve the FAN algorithm. Experiments on the 10 ISCAS benchmark circuits show that the test-generation computing time of SABATPG is 19.42% less than that of the FAN algorithm.

  6. Solving large test-day models by iteration on data and preconditioned conjugate gradient.

    Science.gov (United States)

    Lidauer, M; Strandén, I; Mäntysaari, E A; Pösö, J; Kettunen, A

    1999-12-01

    A preconditioned conjugate gradient (PCG) method was implemented into an iteration-on-data program for the estimation of breeding values, and its convergence characteristics were studied. An algorithm was used as a reference in which one fixed effect was solved by the Gauss-Seidel method, and other effects were solved by a second-order Jacobi method. Implementation of the preconditioned conjugate gradient required storing four vectors (size equal to the number of unknowns in the mixed model equations) in random access memory and reading the data at each round of iteration. The preconditioner comprised diagonal blocks of the coefficient matrix. Comparison of the algorithms was based on solutions of mixed model equations obtained by a single-trait animal model and a single-trait, random regression test-day model. Data sets for both models used milk yield records of primiparous Finnish dairy cows. The animal model data comprised 665,629 lactation milk yields, and the random regression test-day model data 6,732,765 test-day milk yields. Both models included pedigree information on 1,099,622 animals. The animal model (random regression test-day model) required 122 (305) rounds of iteration to converge with the reference algorithm, but only 88 (149) were required with the preconditioned conjugate gradient. Solving the random regression test-day model with the preconditioned conjugate gradient required 237 megabytes of random access memory and took 14% of the computation time needed by the reference algorithm.
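
    A minimal PCG solver with a diagonal (Jacobi-type) preconditioner, standing in for the block-diagonal preconditioner of the paper; the test system is a small random symmetric positive-definite matrix, not actual mixed model equations.

    ```python
    import numpy as np

    def pcg(A, b, M_inv, tol=1e-10, max_iter=1000):
        """Preconditioned conjugate gradient for a SPD matrix A.
        M_inv applies the inverse of the preconditioner to a vector."""
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv(r)
        p = z.copy()
        rz = r @ z
        for it in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = M_inv(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p   # update search direction
            rz = rz_new
        return x, it + 1

    # Small SPD test system standing in for mixed-model equations
    rng = np.random.default_rng(1)
    B = rng.normal(size=(50, 50))
    A = B @ B.T + 50 * np.eye(50)
    b = rng.normal(size=50)

    d = np.diag(A)                       # diagonal (Jacobi) preconditioner
    x, iters = pcg(A, b, lambda r: r / d)
    print("iterations:", iters, "residual:", np.linalg.norm(b - A @ x))
    ```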

  7. Modelling of LOCA Tests with the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L [Idaho National Laboratory; Pastore, Giovanni [Idaho National Laboratory; Novascone, Stephen Rhead [Idaho National Laboratory; Spencer, Benjamin Whiting [Idaho National Laboratory; Hales, Jason Dean [Idaho National Laboratory

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.

  8. Modelling the pile load test

    OpenAIRE

    Prekop Ľubomír

    2017-01-01

    This paper deals with the modelling of a load test of the horizontal resistance of reinforced concrete piles. The pile belongs to a group of piles with reinforced concrete heads. The head is pressed by the steel arches of a bridge on motorway D1 Jablonov - Studenec. The pile model was created in ANSYS with several foundation models whose properties were determined from a geotechnical survey. Finally, some crucial results obtained from the computer models are presented and compared with those obtained from experiment.

  9. Thermal Testing and Model Correlation of the Magnetospheric Multiscale (MMS) Observatories

    Science.gov (United States)

    Kim, Jong S.; Teti, Nicholas M.

    2015-01-01

    The Magnetospheric Multiscale (MMS) mission is a Solar Terrestrial Probes mission comprising four identically instrumented spacecraft that will use Earth's magnetosphere as a laboratory to study the microphysics of three fundamental plasma processes: magnetic reconnection, energetic particle acceleration, and turbulence. This paper presents the complete thermal balance (TB) test performed on the first of four observatories to go through thermal vacuum (TV) testing and the mini-balance testing that was performed on the subsequent observatories to provide a comparison of all four. The TV and TB tests were conducted in a thermal vacuum chamber at the Naval Research Laboratory (NRL) in Washington, D.C., with a vacuum level higher than 1.3 × 10⁻⁴ Pa (10⁻⁶ torr) and the surrounding temperature reaching -180 degrees Centigrade. Three TB test cases were performed: a hot operational science case, a cold operational science case and a cold survival case. In addition to the three balance cases, two-hour and four-hour eclipse simulations were performed during the TV test to provide additional transient data points representing the orbit in eclipse (Earth's shadow). The goal was to perform testing such that the flight orbital environments could be simulated as closely as possible. A thermal model correlation between the thermal analysis and the test results was completed. Over 400 1-Wire temperature sensors, 200 thermocouples and 125 flight thermistor temperature sensors recorded data during TV and TB testing. These temperature-versus-time profiles and their agreement with the analytical results obtained using Thermal Desktop and SINDA/FLUINT are discussed. The model correlation for the thermal mathematical model (TMM) is conducted based on the numerical analysis results and the test data. The philosophy of model correlation was to correlate the model to within 3 degrees Centigrade of the test data using the standard deviation and mean deviation error.

  10. Statistical model based gender prediction for targeted NGS clinical panels

    Directory of Open Access Journals (Sweden)

    Palani Kannan Kandavel

    2017-12-01

    The reference test dataset is used to test the model. The sensitivity of gender prediction has been increased relative to the current approach based on genotype composition in ChrX. In addition, the prediction score given by the model can be used to evaluate the quality of a clinical dataset: a higher prediction score towards the respective gender indicates higher quality of the sequenced data.
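
    The record does not give the model's features, so the sketch below assumes, purely for illustration, that normalized chrY read coverage is the predictor; it fits a logistic model whose predicted probability doubles as a quality score. All values are simulated.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)

    # Hypothetical training feature: normalized read coverage on chrY.
    # A real panel would derive such features from aligned sequencing data.
    n = 200
    male   = rng.uniform(0.40, 0.60, size=(n, 1))   # chrY coverage, males
    female = rng.uniform(0.00, 0.05, size=(n, 1))   # chrY coverage, females
    X = np.vstack([male, female])
    y = np.array([1] * n + [0] * n)                 # 1 = male, 0 = female

    model = LogisticRegression().fit(X, y)

    # Probabilities near 0.5 flag ambiguous (possibly low-quality) samples.
    samples = np.array([[0.45], [0.02], [0.20]])
    print(model.predict(samples), np.round(model.predict_proba(samples)[:, 1], 3))
    ```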

  11. A calibration mechanism based on the principles of the Michelson interferometer micro-thrust test device

    Science.gov (United States)

    Yan, Biao; Wang, Hai; Yang, Chunlai; Wen, Li

    2017-08-01

    A micro-thrust test system based on a Michelson interferometer was proposed and tested. The relationship between thrust and the output voltage of the calibration component in the system was calculated and verified with numerical modeling. The fitting function of the calibration component was obtained and will be tested in future thrust test experiments.

  12. Oscillation-based test in mixed-signal circuits

    CERN Document Server

    Sánchez, Gloria Huertas; Rueda, Adoración Rueda

    2007-01-01

    This book presents the development and experimental validation of the structural test strategy called Oscillation-Based Test, OBT for short. The results presented here assert, not only from a theoretical point of view, but also based on wide experimental support, that OBT is an efficient defect-oriented test solution, complementing the existing functional test techniques for mixed-signal circuits.

  13. On Two Mixture-Based Clustering Approaches Used in Modeling an Insurance Portfolio

    Directory of Open Access Journals (Sweden)

    Tatjana Miljkovic

    2018-05-01

    Full Text Available We review two complementary mixture-based clustering approaches for modeling unobserved heterogeneity in an insurance portfolio: the generalized linear mixed cluster-weighted model (CWM) and mixture-based clustering for an ordered stereotype model (OSM). The latter is for modeling ordinal variables, and the former for modeling losses as a function of mixed-type covariates. The article extends the idea of mixture modeling to multivariate classification for the purpose of testing unobserved heterogeneity in an insurance portfolio. The application of both methods is illustrated on a well-known French automobile portfolio, in which the model fitting is performed using the expectation-maximization (EM) algorithm. Our findings show that these mixture-based clustering methods can be used to further test unobserved heterogeneity in an insurance portfolio and as such may be considered in insurance pricing, underwriting, and risk management.
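
    As a generic stand-in for the mixture-based clustering idea (not the CWM or OSM models themselves), the sketch below fits Gaussian mixtures with an increasing number of components and uses BIC to probe for unobserved heterogeneity; the claim data are simulated.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(3)

    # Simulated portfolio: two latent risk groups with different mean
    # log-loss and exposure (both columns are illustrative covariates).
    g1 = rng.normal([7.0, 0.5], [0.4, 0.1], size=(300, 2))
    g2 = rng.normal([8.5, 1.2], [0.5, 0.2], size=(150, 2))
    X = np.vstack([g1, g2])

    # Fit mixtures with 1..4 components; a BIC minimum above one component
    # is evidence of unobserved heterogeneity in the portfolio.
    for k in range(1, 5):
        gm = GaussianMixture(n_components=k, random_state=0).fit(X)
        print(k, "components, BIC =", round(gm.bic(X), 1))
    ```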

  14. Testing the standard model

    International Nuclear Information System (INIS)

    Gordon, H.; Marciano, W.; Williams, H.H.

    1982-01-01

    We summarize here the results of the standard model group, which has studied the ways in which different facilities may be used to test in detail what we now call the standard model, that is, SU_c(3) × SU(2) × U(1). The topics considered are: W±, Z⁰ mass and width; sin²θ_W and neutral-current couplings; W⁺W⁻, Wγ; Higgs; QCD; toponium and naked quarks; glueballs; mixing angles; and heavy ions.

  15. Test and application of a general process-based dynamic coastal mass-balance model for contaminants using data for radionuclides in the Dnieper-Bug estuary

    International Nuclear Information System (INIS)

    Hakanson, Lars; Lindgren, Dan

    2009-01-01

    In this work a general, process-based mass-balance model for water contaminants in coastal areas at the ecosystem scale (CoastMab) is presented and for the first time tested for radionuclides. The model is dynamic, based on ordinary differential equations, and gives monthly predictions. Connected to the core model there is also a sub-model for contaminant concentrations in fish. CoastMab calculates sedimentation, resuspension, diffusion, mixing, burial and retention of the given contaminant. The model contains both general algorithms, which apply to all contaminants, and substance-specific parts (such as algorithms for the particulate fraction, diffusion, biouptake and biological half-life). CoastMab and the sub-model for fish are simple to apply in practice since all driving variables may be readily accessed from maps or regular monitoring programs. The separation between the surface-water layer and the deep-water layer is done not, as in most traditional models, from water temperature data but from sedimentological criteria. Previous versions of the models for phosphorus and suspended particulate matter (in the Baltic Sea) have been validated and shown to predict well. This work presents modifications of the model and tests using two tracers, radiocesium and radiostrontium (from the Chernobyl fallout), in the Dnieper-Bug estuary (Black Sea). Good correlations are shown between modeled and empirical data, except for the month directly after the fallout. We have, for example, shown that: (1) the conditions in the sea outside the bay are important for the concentrations of the substances in water, sediments and fish within the bay; (2) 'biological', 'chemical' and 'water' dilution can be demonstrated; (3) the water chemical conditions in the bay influence the biouptake and the concentrations of the radionuclides in fish; and (4) the feeding behaviour of the coastal fish is very important for the biouptake of the radionuclides.
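
    A one-box caricature of the dynamic mass-balance idea: a single ordinary differential equation for contaminant mass in the water column, with invented inflow, sedimentation and water-exchange rates. The real CoastMab model resolves many more compartments and processes.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # One-box monthly mass balance for a dissolved contaminant (sketch):
    #   dM/dt = inflow - (sedimentation + outflow) * M
    # All rate constants below are placeholders, not CoastMab parameters.
    Q_in  = 2.0e3   # contaminant inflow [Bq/month]
    k_sed = 0.10    # sedimentation rate [1/month]
    k_out = 0.25    # water-exchange (outflow) rate [1/month]

    def dMdt(t, M):
        return Q_in - (k_sed + k_out) * M

    sol = solve_ivp(dMdt, (0, 24), [0.0], t_eval=np.arange(0, 25))
    print("mass in water column by month [Bq]:", np.round(sol.y[0], 1))
    ```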

  16. lmerTest Package: Tests in Linear Mixed Effects Models

    DEFF Research Database (Denmark)

    Kuznetsova, Alexandra; Brockhoff, Per B.; Christensen, Rune Haubo Bojesen

    2017-01-01

    One of the frequent questions by users of the mixed model function lmer of the lme4 package has been: how can I get p values for the F and t tests for objects returned by lmer? The lmerTest package extends the 'lmerMod' class of the lme4 package by overloading the anova and summary functions, providing p values for tests for fixed effects. We have implemented Satterthwaite's method for approximating degrees of freedom for the t and F tests, as well as the construction of Type I-III ANOVA tables. Furthermore, one may also obtain the summary as well as the anova table using …

  17. The implementation of assessment model based on character building to improve students’ discipline and achievement

    Science.gov (United States)

    Rusijono; Khotimah, K.

    2018-01-01

    The purpose of this research was to investigate the effect of implementing an assessment model based on character building on students' discipline and achievement. The assessment model based on character building includes three components: the behaviour of students, their efforts, and their achievement. The model was implemented in the philosophy of science and educational assessment courses in the Graduate Program of the Educational Technology Department, Faculty of Education, Universitas Negeri Surabaya. The research used a control-group pre-test and post-test design. The data collection methods were observation and testing: observation was used to collect data on student discipline in the instructional process, while tests were used to collect data on student achievement. A t-test was applied in the data analysis. The results showed that the assessment model based on character building improved students' discipline and achievement.

  18. Real-time cavity simulator-based low-level radio-frequency test bench and applications for accelerators

    Science.gov (United States)

    Qiu, Feng; Michizono, Shinichiro; Miura, Takako; Matsumoto, Toshihiro; Liu, Na; Wibowo, Sigit Basuki

    2018-03-01

    A low-level radio-frequency (LLRF) control system is required to regulate the rf field in the rf cavity used for beam acceleration. As the LLRF system is usually complex, testing its basic functions and control algorithms in real time, in advance of beam commissioning, is strongly recommended. However, the equipment necessary to test the LLRF system, such as superconducting cavities and high-power rf sources, is very expensive; therefore, we have developed a field-programmable gate array (FPGA)-based cavity simulator as a substitute for real rf cavities. Digital models of the cavity and other rf systems are implemented in the FPGA. The main components include cavity baseband models for the fundamental and parasitic modes, a mechanical model of the Lorentz force detuning, and a model of the beam current. Furthermore, the disturbance model used to simulate power-supply ripples and microphonics is also carefully considered. Based on the presented cavity simulator, we have established an LLRF system test bench that can be applied under different cavity operating conditions. The simulator performance has been verified by comparison with real cavities in KEK accelerators. In this paper, the development and implementation of the cavity simulator are presented first, and the LLRF test bench based on the simulator is constructed. The results are then compared with those for KEK accelerators. Finally, several LLRF applications of the cavity simulator are illustrated.
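
    A minimal baseband cavity model of the kind described, reduced to a single first-order complex filter for the fundamental mode; bandwidth, detuning and drive are placeholder values, and Lorentz detuning, parasitic modes and disturbances are omitted.

    ```python
    import numpy as np

    # Discrete-time baseband cavity model (first-order complex filter):
    #   dV/dt = (-w_h + 1j*dw) * V + w_h * V_drive
    # where w_h is the cavity half-bandwidth and dw the detuning [rad/s].
    f_half = 200.0            # cavity half-bandwidth [Hz] (placeholder)
    detune = 50.0             # static detuning [Hz]       (placeholder)
    w_h, dw = 2 * np.pi * f_half, 2 * np.pi * detune
    dt = 1e-6                 # simulation step [s]

    V = 0.0 + 0.0j
    drive = 1.0 + 0.0j        # constant normalized RF drive
    trace = []
    for _ in range(20000):    # 20 ms of cavity filling (explicit Euler)
        V += dt * ((-w_h + 1j * dw) * V + w_h * drive)
        trace.append(V)

    print("steady-state amplitude:", abs(trace[-1]))
    print("steady-state phase [deg]:", np.angle(trace[-1], deg=True))
    ```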

  19. Real-time cavity simulator-based low-level radio-frequency test bench and applications for accelerators

    Directory of Open Access Journals (Sweden)

    Feng Qiu

    2018-03-01

    Full Text Available A low-level radio-frequency (LLRF) control system is required to regulate the rf field in the rf cavity used for beam acceleration. As the LLRF system is usually complex, testing its basic functions and control algorithms in real time, in advance of beam commissioning, is strongly recommended. However, the equipment necessary to test the LLRF system, such as superconducting cavities and high-power rf sources, is very expensive; therefore, we have developed a field-programmable gate array (FPGA)-based cavity simulator as a substitute for real rf cavities. Digital models of the cavity and other rf systems are implemented in the FPGA. The main components include cavity baseband models for the fundamental and parasitic modes, a mechanical model of the Lorentz force detuning, and a model of the beam current. Furthermore, the disturbance model used to simulate power-supply ripples and microphonics is also carefully considered. Based on the presented cavity simulator, we have established an LLRF system test bench that can be applied under different cavity operating conditions. The simulator performance has been verified by comparison with real cavities in KEK accelerators. In this paper, the development and implementation of the cavity simulator are presented first, and the LLRF test bench based on the simulator is constructed. The results are then compared with those for KEK accelerators. Finally, several LLRF applications of the cavity simulator are illustrated.

  20. Computational model for simulation small testing launcher, technical solution

    Energy Technology Data Exchange (ETDEWEB)

    Chelaru, Teodor-Viorel, E-mail: teodor.chelaru@upb.ro [University POLITEHNICA of Bucharest - Research Center for Aeronautics and Space, Str. Ghe Polizu, nr. 1, Bucharest, Sector 1 (Romania); Cristian, Barbu, E-mail: barbucr@mta.ro [Military Technical Academy, Romania, B-dul. George Coşbuc, nr. 81-83, Bucharest, Sector 5 (Romania); Chelaru, Adrian, E-mail: achelaru@incas.ro [INCAS -National Institute for Aerospace Research Elie Carafoli, B-dul Iuliu Maniu 220, 061126, Bucharest, Sector 6 (Romania)

    2014-12-10

    The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT), used to test spatial equipment and scientific measurements. The computational model consists in the numerical simulation of the SLT evolution for different start conditions. The launcher model has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical possibility of realizing a small multistage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. While classical suborbital sounding rockets are unguided, use solid-fuel motors and follow an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. The project itself can therefore be considered an intermediate step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as the title shows, has two major objectives: first, a short-term objective, which consists in obtaining a suborbital launching system able to go into service in a predictable period of time, and a long-term objective that consists in the development and testing of some unconventional subsystems which will be integrated later into the satellite launcher as part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital …
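
    As a much-reduced illustration of the simulation concept (a single-axis vertical point mass with variable mass, not the 6DOF model of the paper), the following sketch integrates a thrust-drag-gravity balance to apogee; all vehicle parameters are invented and atmospheric density is held constant.

    ```python
    # Point-mass vertical-ascent sketch with variable mass (illustrative only).
    g, dt = 9.81, 0.01
    m, m_dry = 500.0, 200.0        # initial / burnout mass [kg]
    mdot, thrust = 10.0, 15000.0   # propellant flow [kg/s], thrust [N]
    cd_a, rho = 0.3, 1.2           # drag coeff * area [m^2], air density [kg/m^3]

    v = h = t = 0.0
    while v >= 0.0:                # integrate until apogee
        drag = 0.5 * rho * cd_a * v * abs(v)
        T = thrust if m > m_dry else 0.0
        a = (T - drag) / m - g
        v += a * dt
        h += v * dt
        if m > m_dry:              # mass decreases only while burning
            m -= mdot * dt
        t += dt

    print(f"apogee ~ {h / 1000:.1f} km after {t:.1f} s")
    ```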

  1. Operational statistical analysis of the results of computer-based testing of students

    Directory of Open Access Journals (Sweden)

    Виктор Иванович Нардюжев

    2018-12-01

    Full Text Available The article is devoted to the statistical analysis of the results of computer-based testing for the evaluation of students' educational achievements. The issue is relevant because computer-based testing in Russian universities has become an important method for evaluating students' educational achievements and the quality of the modern educational process. The use of modern methods and programs for the statistical analysis of computer-based testing results and the assessment of the quality of developed tests is a pressing problem for every university teacher. The article shows how the authors solve this problem using their own program, "StatInfo". For several years the program has been successfully applied in a credit system of education at such technological stages as loading computer-based testing protocols into a database, forming queries, and generating reports, lists, and matrices of answers for the statistical analysis of the quality of test items. The methodology, experience and some results of its usage by university teachers are described in the article. Related topics of test development, models, algorithms, technologies, and software for large-scale computer-based testing have been discussed by the authors in their previous publications, which are presented in the reference list.
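
    A small sketch of the kind of item-level statistics such a program computes from a matrix of scored answers: classical difficulty (proportion correct) and point-biserial discrimination. The response matrix below is made up for illustration.

    ```python
    import numpy as np

    # Scored response matrix: rows = students, columns = items; 1 = correct.
    R = np.array([
        [1, 1, 0, 1],
        [1, 0, 0, 1],
        [0, 1, 1, 1],
        [1, 1, 1, 1],
        [0, 0, 0, 0],
    ])
    total = R.sum(axis=1)                  # each student's total score

    difficulty = R.mean(axis=0)            # proportion correct per item
    # Point-biserial discrimination: correlation of each item with total score
    disc = [np.corrcoef(R[:, j], total)[0, 1] for j in range(R.shape[1])]
    print("difficulty:     ", np.round(difficulty, 2))
    print("discrimination: ", np.round(disc, 2))
    ```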

  2. Space Launch System Scale Model Acoustic Test Ignition Overpressure Testing

    Science.gov (United States)

    Nance, Donald; Liever, Peter; Nielsen, Tanner

    2015-01-01

    The overpressure phenomenon is a transient fluid dynamic event occurring during rocket propulsion system ignition. This phenomenon results from fluid compression of the accelerating plume gas, subsequent rarefaction, and subsequent propagation from the exhaust trench and duct holes. The high-amplitude unsteady fluid-dynamic perturbations can adversely affect the vehicle and surrounding structure. Commonly known as ignition overpressure (IOP), this is an important design-to environment for the Space Launch System (SLS) that NASA is currently developing. Subscale testing is useful in validating and verifying the IOP environment. This was one of the objectives of the Scale Model Acoustic Test, conducted at Marshall Space Flight Center. The test data quantifies the effectiveness of the SLS IOP suppression system and improves the analytical models used to predict the SLS IOP environments. The reduction and analysis of the data gathered during the SMAT IOP test series requires identification and characterization of multiple dynamic events and scaling of the event waveforms to provide the most accurate comparisons to determine the effectiveness of the IOP suppression systems. The identification and characterization of the overpressure events, the waveform scaling, the computation of the IOP suppression system knockdown factors, and preliminary comparisons to the analytical models are discussed.

  3. Space Launch System Scale Model Acoustic Test Ignition Overpressure Testing

    Science.gov (United States)

    Nance, Donald K.; Liever, Peter A.

    2015-01-01

    The overpressure phenomenon is a transient fluid dynamic event occurring during rocket propulsion system ignition. This phenomenon results from fluid compression of the accelerating plume gas, subsequent rarefaction, and subsequent propagation from the exhaust trench and duct holes. The high-amplitude unsteady fluid-dynamic perturbations can adversely affect the vehicle and surrounding structure. Commonly known as ignition overpressure (IOP), this is an important design-to environment for the Space Launch System (SLS) that NASA is currently developing. Subscale testing is useful in validating and verifying the IOP environment. This was one of the objectives of the Scale Model Acoustic Test (SMAT), conducted at Marshall Space Flight Center (MSFC). The test data quantifies the effectiveness of the SLS IOP suppression system and improves the analytical models used to predict the SLS IOP environments. The reduction and analysis of the data gathered during the SMAT IOP test series requires identification and characterization of multiple dynamic events and scaling of the event waveforms to provide the most accurate comparisons to determine the effectiveness of the IOP suppression systems. The identification and characterization of the overpressure events, the waveform scaling, the computation of the IOP suppression system knockdown factors, and preliminary comparisons to the analytical models are discussed.

  4. Recommendations for rheological testing and modelling of DWPF melter feed slurries

    International Nuclear Information System (INIS)

    Shadday, M.A. Jr.

    1994-08-01

    The melter feed in the DWPF process is a non-Newtonian slurry. In the melter feed system and the sampling system, this slurry is pumped at a wide range of flow rates through pipes of various diameters, and both laminar and turbulent flows are encountered. Good rheology models of the melter feed slurries are necessary for useful hydraulic models of the melter feed and sampling systems. A concentric cylinder viscometer is presently used to characterize the stress/strain-rate behavior of the melter feed slurries and to provide the data for developing rheology models of the fluids. The slurries exhibit yield stresses, and they are therefore modelled as Bingham plastics. The ranges of strain rates covered by the viscometer tests fall far short of the entire laminar flow range, and therefore hydraulic modelling applications of the present rheology models frequently require considerable extrapolation beyond the range of the database. Since the rheology models are empirical, this cannot be done with confidence in the validity of the results. Axial pressure-drop versus flow-rate measurements in a straight pipe can easily fill in the rest of the laminar flow range with stress/strain-rate data. The two types of tests would be complementary, with the concentric cylinder viscometer providing accurate data at low strain rates, near the yield point if one exists, and pipe flow tests providing data at high strain rates up to and including the transition to turbulence. With data that cover the laminar flow range, useful rheological models can be developed. In the Bingham plastic model, linear behavior of the shear stress as a function of the strain rate is assumed once the yield stress is exceeded. Both shear-thinning and shear-thickening behavior have been observed in viscometer tests. Bingham plastic models cannot handle this non-linear behavior, but a slightly more complicated yield/power-law model can.
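
    The yield/power-law model mentioned above is commonly written as the Herschel-Bulkley relation tau = tau0 + K * gamma_dot**n, which reduces to the Bingham plastic for n = 1. The sketch below fits it to invented viscometer-style data, not DWPF measurements.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Yield/power-law (Herschel-Bulkley) model: tau = tau0 + K * gamma_dot**n
    def herschel_bulkley(gamma_dot, tau0, K, n):
        return tau0 + K * gamma_dot**n

    # Illustrative shear-rate [1/s] and shear-stress [Pa] data (made up)
    gamma_dot = np.array([1, 5, 10, 50, 100, 300, 600, 1000], dtype=float)
    tau = np.array([6.1, 7.4, 8.3, 12.9, 16.2, 26.5, 37.8, 49.0])

    p, _ = curve_fit(herschel_bulkley, gamma_dot, tau, p0=[5.0, 0.5, 0.8])
    tau0, K, n = p
    print(f"yield stress = {tau0:.2f} Pa, consistency K = {K:.3f}, index n = {n:.2f}")
    # n < 1 indicates shear thinning, n > 1 shear thickening;
    # n = 1 recovers the Bingham plastic with plastic viscosity K.
    ```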

  5. Notes on power of normality tests of error terms in regression models

    International Nuclear Information System (INIS)

    Střelec, Luboš

    2015-01-01

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results from the usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow exact inferences, which explains the necessity and importance of robust tests of normality. The aim of this contribution is therefore to discuss normality testing of error terms in regression models. We introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.

  6. Notes on power of normality tests of error terms in regression models

    Energy Technology Data Exchange (ETDEWEB)

    Střelec, Luboš [Department of Statistics and Operation Analysis, Faculty of Business and Economics, Mendel University in Brno, Zemědělská 1, Brno, 61300 (Czech Republic)

    2015-03-10

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results from the usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow exact inferences, which explains the necessity and importance of robust tests of normality. The aim of this contribution is therefore to discuss normality testing of error terms in regression models. We introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.

  7. Modelling the pile load test

    Directory of Open Access Journals (Sweden)

    Prekop Ľubomír

    2017-01-01

    Full Text Available This paper deals with the modelling of a load test of the horizontal resistance of reinforced concrete piles. The pile belongs to a group of piles with reinforced concrete heads. The head is pressed by the steel arches of a bridge on motorway D1 Jablonov - Studenec. The pile model was created in ANSYS with several foundation models whose properties were determined from a geotechnical survey. Finally, some crucial results obtained from the computer models are presented and compared with those obtained from experiment.

  8. A family systems-based model of organizational intervention.

    Science.gov (United States)

    Shumway, Sterling T; Kimball, Thomas G; Korinek, Alan W; Arredondo, Rudy

    2007-04-01

    Employee assistance professionals are expected to be proficient at intervening in organizations and creating meaningful behavioral change in interpersonal functioning. Because of their training in family systems theories and concepts, marriage and family therapists (MFTs) are well suited to serve organizations as "systems consultants." Unfortunately, the authors were unable to identify any family systems-based models for organizational intervention that have been empirically tested and supported. In this article, the authors present a family systems-based model of intervention that they developed while working in an employee assistance program (EAP). They also present research that was used to refine the model and to provide initial support for its effectiveness.

  9. A sediment graph model based on SCS-CN method

    Science.gov (United States)

    Singh, P. K.; Bhunya, P. K.; Mishra, S. K.; Chaube, U. C.

    2008-01-01

    This paper proposes new conceptual sediment graph models based on the coupling of popular and extensively used methods, viz., the Nash-model-based instantaneous unit sediment graph (IUSG), the Soil Conservation Service curve number (SCS-CN) method, and a power law. These models vary in their complexity, and this paper tests their performance using data from the Nagwan watershed (area = 92.46 km²) in India. The sensitivity of total sediment yield and peak sediment flow rate computations to model parameterisation is analysed. The exponent of the power law, β, is more sensitive than the other model parameters. The models are found to have substantial potential for computing sediment graphs (temporal sediment flow rate distributions) as well as total sediment yield.
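
    The SCS-CN component of such a model computes direct runoff from rainfall with the standard curve-number relation Q = (P - Ia)² / (P - Ia + S), where S is the potential maximum retention and Ia the initial abstraction. A minimal sketch follows; the example storm and curve number are chosen arbitrarily.

    ```python
    def scs_cn_runoff(P, CN, lam=0.2):
        """SCS-CN direct runoff Q [mm] from rainfall P [mm] and curve number CN.
        S is potential maximum retention; Ia = lam * S is initial abstraction."""
        S = 25400.0 / CN - 254.0           # metric form of S [mm]
        Ia = lam * S
        if P <= Ia:
            return 0.0                     # all rainfall abstracted, no runoff
        return (P - Ia) ** 2 / (P - Ia + S)

    # Example: a 75 mm storm on a watershed with CN = 78
    print(f"direct runoff = {scs_cn_runoff(75.0, 78):.1f} mm")
    ```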

  10. Do sediment type and test durations affect results of laboratory-based, accelerated testing studies of permeable pavement clogging?

    Science.gov (United States)

    Nichols, Peter W B; White, Richard; Lucke, Terry

    2015-04-01

    Previous studies have attempted to quantify the clogging processes of Permeable Interlocking Concrete Pavers (PICPs) using accelerated testing methods, but the results have been variable. This study investigated the effects that three different sediment types (natural and silica), different simulated rainfall intensities, and different testing durations had on the observed clogging processes (and measured surface infiltration rates) in laboratory-based, accelerated PICP testing studies. Results showed that accelerated laboratory testing results are highly dependent on the type and size of sediment used in the experiments. For example, when using real stormwater sediment up to 1.18 mm in size, neither testing duration nor stormwater application rate had any significant effect on PICP clogging. However, the study clearly showed that shorter testing durations generally increased clogging and reduced the surface infiltration rates of the models when artificial silica sediment was used. Longer testing durations also generally increased clogging of the models when fine sediment (<300 μm) was used. Results from this study will help researchers and designers better anticipate when and why PICPs are susceptible to clogging, reduce maintenance and extend the useful life of these increasingly common stormwater best management practices.

  11. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Graham, Paul S.; Morgan, Keith S.; Caffrey, Michael P.

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-sensitive memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper describes a three-tiered methodology for testing FPGA user designs for space-readiness. We describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
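
    The majority voting at the heart of TMR can be illustrated in a few lines; the sketch below masks a simulated single-event upset in one of three redundant copies of a word. It is a conceptual illustration, not the FPGA tool flow or fault-injection framework discussed in the paper.

    ```python
    def tmr_vote(a: int, b: int, c: int) -> int:
        """Bitwise majority voter over three redundant copies of a word."""
        return (a & b) | (a & c) | (b & c)

    # Simulated single-event upset: one copy gets a single bit flipped
    golden = 0b1011_0010
    upset  = golden ^ 0b0000_0100          # bit 2 flipped in one copy
    assert tmr_vote(golden, golden, upset) == golden
    print("voter masks the upset:", bin(tmr_vote(golden, golden, upset)))
    ```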

  12. The Couplex test cases: models and lessons

    International Nuclear Information System (INIS)

    Bourgeat, A.; Kern, M.; Schumacher, S.; Talandier, J.

    2003-01-01

    The Couplex test cases are a set of numerical test models for nuclear waste deep geological disposal simulation. They are centered around the numerical issues arising in the near and far field transport simulation. They were used in an international contest, and are now becoming a reference in the field. We present the models used in these test cases, and show sample results from the award winning teams. (authors)

  13. Impact of Learning Model Based on Cognitive Conflict toward Student’s Conceptual Understanding

    Science.gov (United States)

    Mufit, F.; Festiyed, F.; Fauzan, A.; Lufri, L.

    2018-04-01

    A frequent problem in physics learning is misconception and poor conceptual understanding. Misconceptions occur not only among school students, but also among college students and teachers. Existing learning models have had little impact on improving conceptual understanding or on remediating student misconceptions. This study examines the impact of a cognitive conflict-based learning model on improving conceptual understanding and remediating student misconceptions. The research method used is design/development research. The product developed is a cognitive conflict-based learning model together with its components. This article reports the product design results, validity tests, and practicality tests. The study produced a design for a cognitive conflict-based learning model with four learning syntaxes, namely (1) preconception activation, (2) presentation of cognitive conflict, (3) discovery of concepts and equations, and (4) reflection. Validity tests by several experts on the aspects of content, didactics, and appearance or language indicate very valid criteria. Product trial results also show the product is very practical to use. Based on pretest and posttest results, the cognitive conflict-based learning model has a good impact on improving conceptual understanding and remediating misconceptions, especially among high-ability students.

  14. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    Science.gov (United States)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
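
    The core of the technique is thresholding residuals between sensed and model-predicted outputs. A minimal sketch of that residual check (Python; the k-sigma rule and the use of a nominal-data baseline are illustrative assumptions, not the paper's exact detection logic):

        import numpy as np

        def detect_anomalies(y_sensed, y_model, nominal_residuals, k=3.0):
            """Flag samples whose residual (sensed minus model-predicted output)
            deviates by more than k standard deviations of the residuals
            observed on nominal, fault-free data."""
            resid = np.asarray(y_sensed) - np.asarray(y_model)
            mu = np.mean(nominal_residuals)
            sigma = np.std(nominal_residuals)
            return np.abs(resid - mu) > k * sigma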

  15. Correlation between the model accuracy and model-based SOC estimation

    International Nuclear Information System (INIS)

    Wang, Qianqian; Wang, Jiao; Zhao, Pengju; Kang, Jianqiang; Yan, Few; Du, Changqing

    2017-01-01

    State-of-charge (SOC) estimation is a core technology for battery management systems. Considerable progress has been achieved in the study of SOC estimation algorithms, especially algorithms based on the Kalman filter, to meet the increasing demand of model-based battery management systems. The Kalman filter weakens the influence of white noise and initial error during SOC estimation but cannot eliminate the inherent error of the battery model itself. As such, the accuracy of SOC estimation is directly related to the accuracy of the battery model. Thus far, the quantitative relationship between model accuracy and model-based SOC estimation remains unknown. This study summarizes three equivalent circuit lithium-ion battery models, namely, the Thevenin, PNGV, and DP models. The model parameters are identified through hybrid pulse power characterization tests. The three models are evaluated, and SOC estimation conducted by the EKF-Ah method under three operating conditions is quantitatively studied. The regression and correlation between model error and SOC estimation error are studied and compared in terms of standard deviation and normalized RMSE; these quantities exhibit a strong linear relationship. Results indicate that the model accuracy affects the SOC estimation accuracy mainly in two ways: the dispersion of the frequency distribution of the error and the overall level of the error. On the basis of the relationship between model error and SOC estimation error, our study provides a strategy for selecting a suitable cell model to meet SOC precision requirements when using a Kalman filter.
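
    To make the model/estimator coupling concrete, the following is a one-state extended Kalman filter for SOC with a linearised OCV curve (Python). It is deliberately simpler than the Thevenin, PNGV, and DP models studied in the paper, and all numerical parameters are hypothetical:

        import numpy as np

        Q_AS, R0 = 2.3 * 3600, 0.05              # capacity (A s) and ohmic resistance (assumed)
        ocv = lambda soc: 3.2 + 0.9 * soc        # assumed OCV(SOC) map
        docv = lambda soc: 0.9                   # its slope dOCV/dSOC

        def ekf_soc_step(soc, p, i_amp, v_meas, dt, q=1e-7, r=1e-3):
            """One predict/update step of a one-state EKF for SOC (discharge: i_amp > 0)."""
            soc_pred = soc - i_amp * dt / Q_AS   # coulomb-counting prediction
            p_pred = p + q
            h = docv(soc_pred)                   # measurement Jacobian
            k = p_pred * h / (h * p_pred * h + r)
            innov = v_meas - (ocv(soc_pred) - R0 * i_amp)
            return soc_pred + k * innov, (1.0 - k * h) * p_pred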

  16. The Clinical and Economic Benefits of Co-Testing Versus Primary HPV Testing for Cervical Cancer Screening: A Modeling Analysis.

    Science.gov (United States)

    Felix, Juan C; Lacey, Michael J; Miller, Jeffrey D; Lenhart, Gregory M; Spitzer, Mark; Kulkarni, Rucha

    2016-06-01

    Consensus United States cervical cancer screening guidelines recommend use of combination Pap plus human papillomavirus (HPV) testing for women aged 30 to 65 years. An HPV test was approved by the Food and Drug Administration in 2014 for primary cervical cancer screening in women aged 25 years and older. Here, we present the results of clinical-economic comparisons of Pap plus HPV mRNA testing including genotyping for HPV 16/18 (co-testing) versus DNA-based primary HPV testing with HPV 16/18 genotyping and reflex cytology (HPV primary) for cervical cancer screening. A health state transition (Markov) model with 1-year cycling was developed using epidemiologic, clinical, and economic data from healthcare databases and published literature. A hypothetical cohort of one million women receiving triennial cervical cancer screening was simulated from ages 30 to 70 years. Screening strategies compared HPV primary to co-testing. Outcomes included total and incremental differences in costs, invasive cervical cancer (ICC) cases, ICC deaths, number of colposcopies, and quality-adjusted life years for cost-effectiveness calculations. Comprehensive sensitivity analyses were performed. In a simulation cohort of one million 30-year-old women modeled up to age 70 years, the model predicted that screening with HPV primary testing instead of co-testing could lead to as many as 2,141 more ICC cases and 2,041 more ICC deaths. In the simulation, co-testing demonstrated a greater number of lifetime quality-adjusted life years (22,334) and yielded $39.0 million in savings compared with HPV primary, thereby conferring greater effectiveness at lower cost. Model results demonstrate that co-testing has the potential to provide improved clinical and economic outcomes when compared with HPV primary. While actual cost and outcome data are evaluated, these findings are relevant to U.S. healthcare payers and women's health policy advocates seeking cost-effective cervical cancer screening.
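
    The mechanics of a health state transition (Markov) model with 1-year cycling can be sketched in a few lines (Python). The three states, transition probabilities, costs, and utilities below are placeholders chosen only to show the cohort bookkeeping; they are not the study's inputs, and discounting is omitted:

        import numpy as np

        P = np.array([[0.9989, 0.0010, 0.0001],   # well -> (well, ICC, dead), assumed
                      [0.0,    0.85,   0.15  ],   # ICC  -> (well, ICC, dead), assumed
                      [0.0,    0.0,    1.0   ]])  # dead is absorbing
        cost = np.array([50.0, 30000.0, 0.0])     # per-cycle cost per state (assumed)
        qaly = np.array([1.0, 0.7, 0.0])          # per-cycle utility per state (assumed)

        state = np.array([1.0, 0.0, 0.0])         # cohort starts screened and healthy
        total_cost = total_qaly = 0.0
        for year in range(40):                    # ages 30 to 70, 1-year cycles
            total_cost += state @ cost
            total_qaly += state @ qaly
            state = state @ P                     # propagate the cohort one cycle
        print(total_cost, total_qaly)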

  17. Testing a hydraulic trait based model of stomatal control: results from a controlled drought experiment on aspen (Populus tremuloides, Michx.) and ponderosa pine (Pinus ponderosa, Douglas)

    Science.gov (United States)

    Love, D. M.; Venturas, M.; Sperry, J.; Wang, Y.; Anderegg, W.

    2017-12-01

    Modeling approaches for tree stomatal control often rely on empirical fitting to provide accurate estimates of whole-tree transpiration (E) and assimilation (A), which are limited in their predictive power by the data envelope used to calibrate model parameters. Optimization-based models hold promise as a means to predict stomatal behavior under novel climate conditions. We designed an experiment to test a hydraulic trait based optimization model, which predicts stomatal conductance from a gain/risk approach. Optimal stomatal conductance is expected to maximize the potential carbon gain from photosynthesis and minimize the risk to hydraulic transport imposed by cavitation. The modeled risk to the hydraulic network is assessed from cavitation vulnerability curves, a commonly measured physiological trait in woody plant species. Over a growing season, garden-grown plots of aspen (Populus tremuloides, Michx.) and ponderosa pine (Pinus ponderosa, Douglas) were subjected to three distinct drought treatments (moderate, severe, severe with rehydration) relative to a control plot to test model predictions. Model outputs of predicted E, A, and xylem pressure can be directly compared to both continuous data (whole-tree sap flux, soil moisture) and point measurements (leaf-level E, A, xylem pressure). The model also predicts levels of whole-tree hydraulic impairment expected to increase mortality risk; this threshold is used to estimate survivorship in the drought treatment plots. The model can be run at two scales, either entirely from climate (meteorological inputs, irrigation) or using the physiological measurements as a starting point. These data will be used to study model performance and utility, and to aid in developing the model for larger scale applications.
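
    The gain/risk idea can be sketched schematically (Python, normalised units). Every functional form and constant below is an illustrative assumption, not the model's calibration: carbon gain saturates with conductance, leaf xylem pressure falls as transpiration rises, and risk follows a Weibull-style vulnerability curve.

        import numpy as np

        g = np.linspace(0.001, 2.0, 500)             # candidate stomatal conductance
        p_soil, vpd, k_plant = -0.5, 1.5, 1.0        # soil potential, VPD, hydraulic conductance
        p_leaf = p_soil - g * vpd / k_plant          # steady-state leaf xylem pressure
        gain = g / (g + 0.3)                         # saturating, normalised carbon gain
        risk = 1.0 - np.exp(-(-p_leaf / 2.0) ** 3)   # Weibull-type loss of conductivity
        g_opt = g[np.argmax(gain - risk)]            # conductance maximising gain minus risk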

  18. On school choice and test-based accountability.

    Directory of Open Access Journals (Sweden)

    Damian W. Betebenner

    2005-10-01

    Among the two most prominent school reform measures currently being implemented in the United States are school choice and test-based accountability. Until recently, the two policy initiatives remained relatively distinct from one another. With the passage of the No Child Left Behind Act of 2001 (NCLB), a mutualism between choice and accountability emerged whereby school choice complements test-based accountability. In the first portion of this study we present a conceptual overview of school choice and test-based accountability and explicate connections between the two that are explicit in reform implementations like NCLB or implicit within the market-based reform literature in which school choice and test-based accountability reside. In the second portion we scrutinize the connections between school choice and test-based accountability using a large western school district with a popular choice system in place. Data from three sources are combined to explore the ways in which school choice and test-based accountability draw on each other: state assessment data for children in the district, school choice data for every participating student in the district choice program, and a parental survey of both participants and non-participants of choice asking their attitudes concerning the use of school report cards in the district. Results suggest that choice is of academic benefit only to the lowest-achieving students, that choice participation is not uniform across different ethnic groups in the district, and that parents' primary motivations for participation in choice, as reported on the survey, are not test scores, although this is inconsistent with the choice preferences parents reveal in the district. As such, our results generally confirm the hypotheses of choice critics more than those of advocates. Keywords: school choice; accountability; student testing.

  19. Cross-Mode Comparability of Computer-Based Testing (CBT) versus Paper-Pencil Based Testing (PPT): An Investigation of Testing Administration Mode among Iranian Intermediate EFL Learners

    Science.gov (United States)

    Khoshsima, Hooshang; Hosseini, Monirosadat; Toroujeni, Seyyed Morteza Hashemi

    2017-01-01

    The advent of technology has caused growing interest in using computers to convert conventional paper and pencil-based testing (henceforth PPT) into computer-based testing (henceforth CBT) in the field of education during the last decades. This constant promulgation of computers to reshape conventional tests into a computerized format permeated the…

  20. A Test of Two Alternative Cognitive Processing Models: Learning Styles and Dual Coding

    Science.gov (United States)

    Cuevas, Joshua; Dawson, Bryan L.

    2018-01-01

    This study tested two cognitive models, learning styles and dual coding, which make contradictory predictions about how learners process and retain visual and auditory information. Learning styles-based instructional practices are common in educational environments despite a questionable research base, while the use of dual coding is less…

  1. Linear Logistic Test Modeling with R

    Science.gov (United States)

    Baghaei, Purya; Kubinger, Klaus D.

    2015-01-01

    The present paper gives a general introduction to the linear logistic test model (Fischer, 1973), an extension of the Rasch model with linear constraints on item parameters, along with eRm (an R package to estimate different types of Rasch models; Mair, Hatzinger, & Mair, 2014) functions to estimate the model and interpret its parameters. The…
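
    For reference, the Rasch model expresses the probability that person v solves item i through the ability parameter θ_v and the difficulty parameter β_i, and the LLTM adds the linear constraint that decomposes each item difficulty into basic parameters η_j with known weights q_ij (c is a normalization constant):

        \[
        P(X_{vi}=1) = \frac{\exp(\theta_v - \beta_i)}{1 + \exp(\theta_v - \beta_i)},
        \qquad
        \beta_i = \sum_{j=1}^{p} q_{ij}\,\eta_j + c .
        \]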

  2. A standard protocol for describing individual-based and agent-based models

    Science.gov (United States)

    Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.

    2006-01-01

    Simulation models that describe autonomous individual organisms (individual-based models, IBMs) or agents (agent-based models, ABMs) have become a widely used tool, not only in ecology but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD a first step towards establishing a more detailed common format for the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.

  3. A Spline-Based Lack-Of-Fit Test for Independent Variable Effect in Poisson Regression.

    Science.gov (United States)

    Li, Chin-Shang; Tu, Wanzhu

    2007-05-01

    In regression analysis of count data, independent variables are often modeled by their linear effects under the assumption of log-linearity. In reality, the validity of such an assumption is rarely tested, and its use is at times unjustifiable. A lack-of-fit test is proposed for the adequacy of a postulated functional form of an independent variable within the framework of semiparametric Poisson regression models based on penalized splines. It offers added flexibility in accommodating the potentially non-loglinear effect of the independent variable. A likelihood ratio test is constructed for the adequacy of the postulated parametric form, for example log-linearity, of the independent variable effect. Simulations indicate that the proposed test performs well and that a misspecified parametric model has much reduced power. An example is given.
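
    A simplified version of such a likelihood ratio test can be run with off-the-shelf tools (Python). Note the hedge: the paper uses penalized splines, whereas this sketch uses an unpenalized B-spline basis, and the data are synthetic:

        import numpy as np
        import statsmodels.api as sm
        from patsy import dmatrix
        from scipy.stats import chi2

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 3, 500)
        y = rng.poisson(np.exp(0.3 + 0.5 * np.sqrt(x)))       # truth is not log-linear

        X_lin = np.asarray(dmatrix("x", {"x": x}))            # H0: log-linear effect
        X_spl = np.asarray(dmatrix("bs(x, df=5)", {"x": x}))  # H1: spline effect
        fit_lin = sm.GLM(y, X_lin, family=sm.families.Poisson()).fit()
        fit_spl = sm.GLM(y, X_spl, family=sm.families.Poisson()).fit()

        lr = 2 * (fit_spl.llf - fit_lin.llf)                  # likelihood-ratio statistic
        print(lr, chi2.sf(lr, X_spl.shape[1] - X_lin.shape[1]))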

  4. Team-Based Testing Improves Individual Learning

    Science.gov (United States)

    Vogler, Jane S.; Robinson, Daniel H.

    2016-01-01

    In two experiments, 90 undergraduates took six tests as part of an educational psychology course. Using a crossover design, students took three tests individually without feedback and then took the same test again, following the process of team-based testing (TBT), in teams in which the members reached consensus for each question and answered…

  5. A practical test for the choice of mixing distribution in discrete choice models

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Bierlaire, Michel

    2007-01-01

    The choice of a specific distribution for random parameters of discrete choice models is a critical issue in transportation analysis. Indeed, various pieces of research have demonstrated that an inappropriate choice of the distribution may lead to serious bias in model forecasts and in the estimated means of random parameters. In this paper, we propose a practical test based on seminonparametric techniques. The test is analyzed both on synthetic and real data, and is shown to be simple and powerful. (c) 2007 Elsevier Ltd. All rights reserved.

  6. A 45-second self-test for cardiorespiratory fitness: heart rate-based estimation in healthy individuals

    NARCIS (Netherlands)

    Sartor, F.; Bonato, M.; Papini, G.; Bosio, A.; Mohammed, R.; Bonomi, A.G.; Moore, J.P.; Merati, G.; Della Torre, A.; Kubis, H.P.

    2016-01-01

    Cardio-respiratory fitness (CRF) is a widespread essential indicator in Sports Science as well as in Sports Medicine. This study aimed to develop and validate a prediction model for CRF based on a 45-second self-test, which can be conducted anywhere. A criterion validity, test-retest study was set up to accomplish our objectives.

  7. A test of inflated zeros for Poisson regression models.

    Science.gov (United States)

    He, Hua; Zhang, Hui; Ye, Peng; Tang, Wan

    2017-01-01

    Excessive zeros are common in practice and may cause overdispersion and invalidate inference when fitting Poisson regression models. There is a large body of literature on zero-inflated Poisson models, but methods for testing whether there are excessive zeros are less well developed. The Vuong test comparing a Poisson and a zero-inflated Poisson model is commonly applied in practice. However, the type I error of this test often deviates seriously from the nominal level, casting serious doubt on the validity of the test in such applications. In this paper, we develop a new approach for testing inflated zeros under the Poisson model. Unlike the Vuong test for inflated zeros, our method does not require a zero-inflated Poisson model to perform the test. Simulation studies show that, compared with the Vuong test, our approach is not only better at controlling the type I error rate but also yields more power.
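
    A quick diagnostic in the same spirit, though not the authors' statistic, is to compare the observed number of zeros with the number expected under a fitted Poisson GLM (Python):

        import numpy as np
        import statsmodels.api as sm

        def excess_zero_check(y, X):
            """Observed vs. expected zeros under a fitted Poisson GLM; a ratio
            well above 1 hints at zero inflation (informal check only)."""
            fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
            expected = np.exp(-fit.fittedvalues).sum()   # sum_i P(Y_i = 0 | mu_i)
            observed = float((np.asarray(y) == 0).sum())
            return observed, expected, observed / expected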

  8. Kernel-based tests for joint independence

    DEFF Research Database (Denmark)

    Pfister, Niklas; Bühlmann, Peter; Schölkopf, Bernhard

    2018-01-01

    We investigate the problem of testing whether $d$ random variables, which may or may not be continuous, are jointly (or mutually) independent. Our method builds on ideas of the two-variable Hilbert-Schmidt independence criterion (HSIC) but allows for an arbitrary number of variables. We embed the $d$-dimensional joint distribution and the product of the marginals into a reproducing kernel Hilbert space and define the $d$-variable Hilbert-Schmidt independence criterion (dHSIC) as the squared distance between the embeddings. In the population case, the value of dHSIC is zero if and only if the $d$ variables are jointly independent, as long as the kernel is characteristic. Based on an empirical estimate of dHSIC, we define three different non-parametric hypothesis tests: a permutation test, a bootstrap test and a test based on a Gamma approximation. We prove that the permutation test…
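
    The two-variable HSIC that dHSIC generalises is easy to sketch, together with the permutation test (Python; univariate inputs and a fixed Gaussian kernel bandwidth are simplifying assumptions):

        import numpy as np

        def _gram(x, sigma=1.0):
            d2 = (x[:, None] - x[None, :]) ** 2
            return np.exp(-d2 / (2.0 * sigma ** 2))

        def hsic_perm_test(x, y, n_perm=500, seed=0):
            """Biased HSIC estimate trace(KHLH)/n^2 with a permutation p-value."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            n = len(x)
            H = np.eye(n) - np.ones((n, n)) / n
            K = _gram(x)
            stat = np.trace(K @ H @ _gram(y) @ H) / n ** 2
            rng = np.random.default_rng(seed)
            null = np.array([np.trace(K @ H @ _gram(rng.permutation(y)) @ H) / n ** 2
                             for _ in range(n_perm)])
            return stat, float(np.mean(null >= stat))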

  9. Validity evidence based on test content.

    Science.gov (United States)

    Sireci, Stephen; Faulkner-Bond, Molly

    2014-01-01

    Validity evidence based on test content is one of the five forms of validity evidence stipulated in the Standards for Educational and Psychological Testing developed by the American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. In this paper, we describe the logic and theory underlying such evidence and describe traditional and modern methods for gathering and analyzing content validity data. A comprehensive review of the literature and of the aforementioned Standards is presented. For educational tests and other assessments targeting knowledge and skill possessed by examinees, validity evidence based on test content is necessary for building a validity argument to support the use of a test for a particular purpose. By following the methods described in this article, practitioners have a wide arsenal of tools available for determining how well the content of an assessment is congruent with and appropriate for the specific testing purposes.

  10. Testing a bioenergetics-based habitat choice model: bluegill (Lepomis macrochirus) responses to food availability and temperature

    Science.gov (United States)

    2011-01-01

    Using an automated shuttlebox system, we conducted patch choice experiments with 32 bluegill sunfish (Lepomis macrochirus) of 8-12 g to test a behavioral energetics hypothesis of habitat choice. When patch temperature and food levels were held constant within patches but differed between patches, we expected bluegill to choose patches that maximized growth based on the bioenergetic integration of food and temperature as predicted by a bioenergetics model. Alternative hypotheses were that bluegill may choose patches based only on food (optimal foraging) or only on temperature (behavioral thermoregulation). The behavioral energetics hypothesis was not a good predictor of short-term (minutes to weeks) patch choice by bluegill; the behavioral thermoregulation hypothesis was the best predictor. In the short term, food and temperature appeared to affect patch choice hierarchically; temperature was more important, although food can alter temperature preference during feeding periods. Over a 19-d experiment, mean temperatures occupied by fish offered low rations did decline as predicted by the behavioral energetics hypothesis, but the decline was less than 1.0 °C as opposed to a possible 5 °C decline. A short-term, bioenergetic response to food and temperature may be precluded by physiological costs of acclimation not considered explicitly in the behavioral energetics hypothesis.

  11. Test Review: Test of English as a Foreign Language™--Internet-Based Test (TOEFL iBT®)

    Science.gov (United States)

    Alderson, J. Charles

    2009-01-01

    In this article, the author reviews the TOEFL iBT which is the latest version of the TOEFL, whose history stretches back to 1961. The TOEFL iBT was introduced in the USA, Canada, France, Germany and Italy in late 2005. Currently the TOEFL test is offered in two testing formats: (1) Internet-based testing (iBT); and (2) paper-based testing (PBT).…

  12. Model Integrated Problem Solving Based Learning pada Perkuliahan Dasar-dasar Kimia Analitik

    OpenAIRE

    Indarini Dwi Pursitasari; Anna Permanasari

    2013-01-01

    Abstract: Integrated Problem Solving Based Learning Model on Foundation of Analytical Chemistry. This study was conducted to know the effects of the Integrated Problem Solving Based Learning (IPSBL) model on problem solving skills and cognitive ability of pre-service teachers. The subjects of the study were 41 pre-service teachers, 21 in the experimental group and 20 in the control group. The data were collected through a test on problem solving skills, a test on cognitive ability, and a questionnaire on the students' opinions on the use of the IPSBL model.

  14. Model-based monitoring of rotors with multiple coexisting faults

    International Nuclear Information System (INIS)

    Rossner, Markus

    2015-01-01

    Monitoring systems are applied to many rotors, but only few monitoring systems can separate coexisting errors and identify their quantity. This research project solves this problem using a combination of signal-based and model-based monitoring. The signal-based part performs a pre-selection of possible errors; these errors are further separated with model-based methods. This approach is demonstrated for the errors unbalance, bow, stator-fixed misalignment, rotor-fixed misalignment and roundness errors. For the model-based part, unambiguous error definitions and models are set up. The Ritz approach reduces the model order and therefore speeds up the diagnosis. Identification algorithms are developed for the different rotor faults; to this end, reliable damage indicators and proper sub-steps of the diagnosis have to be defined. For several monitoring problems, measuring both deflection and bearing force is very useful. The monitoring system is verified by experiments on an academic rotor test rig. The interpretation of the measurements requires extensive knowledge of the dynamics of the rotor. Due to the model-based approach, the system can separate errors with similar signal patterns and identify bow and roundness errors online at operating speed.

  15. Modeling of the Jacked Pile Static Load Test with PLAX 3D

    Directory of Open Access Journals (Sweden)

    Tautvydas Statkus

    2016-12-01

    In this article, jacked pile installation technology and its processes, which alter the physical and mechanical characteristics of the base soil, are discussed. Plax 3D software was selected for simulating the jacked pile static load test, and its capabilities and development are described. Model building, material models, model geometry, finite elements, boundary conditions, and the assumptions adopted in addressing the problems are described in detail. Three different tasks are formulated, and a comparison of the resulting load-settlement curves with the experiment is given. Conclusions are formulated according to the modeling results.

  16. Measurement of Function Post Hip Fracture: Testing a Comprehensive Measurement Model of Physical Function.

    Science.gov (United States)

    Resnick, Barbara; Gruber-Baldini, Ann L; Hicks, Gregory; Ostir, Glen; Klinedinst, N Jennifer; Orwig, Denise; Magaziner, Jay

    2016-07-01

    Measurement of physical function post hip fracture has been conceptualized using multiple different measures. This study tested a comprehensive measurement model of physical function. This was a descriptive secondary data analysis including 168 men and 171 women post hip fracture. Using structural equation modeling, a measurement model of physical function which included grip strength, activities of daily living, instrumental activities of daily living, and performance was tested for fit at 2 and 12 months post hip fracture, and among male and female participants. Validity of the measurement model of physical function was evaluated based on how well the model explained physical activity, exercise, and social activities post hip fracture. The measurement model of physical function fit the data. The amount of variance the model or individual factors of the model explained varied depending on the activity. Decisions about the ideal way in which to measure physical function should be based on outcomes considered and participants. The measurement model of physical function is a reliable and valid method to comprehensively measure physical function across the hip fracture recovery trajectory. © 2015 Association of Rehabilitation Nurses.

  17. The Development of Writing Learning Model Based on the Arces Motivation for Students of Senior High School

    Directory of Open Access Journals (Sweden)

    Andreas Kosasih

    2014-08-01

    This research yields several findings which can be summarized as follows. (1) The introduction (exploration) step: through a library study and observation, the quality of writing instruction and the need for a better writing learning model were identified, and a prototype of a writing learning model based on ARCES motivation was formulated after the draft was validated by Indonesian language experts and education technology experts. (2) The model development step: through preliminary and main model development, followed by monitoring, evaluation, focus group discussion and revision, an improved writing learning model based on ARCES motivation was produced. (3) The model effectiveness examination step: through pre-test, treatment, and post-test, the writing learning model based on ARCES motivation was evaluated. From the effectiveness test results, it can be concluded that writing instruction based on ARCES motivation is more effective (average post-test score 83.94) than conventional writing instruction (average post-test score 75.79).

  18. Semi-active control of magnetorheological elastomer base isolation system utilising learning-based inverse model

    Science.gov (United States)

    Gu, Xiaoyu; Yu, Yang; Li, Jianchun; Li, Yancheng

    2017-10-01

    Magnetorheological elastomer (MRE) base isolation has attracted considerable attention over the last two decades thanks to its self-adaptability and high-authority controllability in the semi-active control realm. Due to the inherent nonlinearity and hysteresis of the devices, it is challenging to obtain a mathematical model of reasonable complexity that describes the inverse dynamics of MRE base isolators and hence to realise control synthesis of the MRE base isolation system. Two aims are achieved in this paper: (i) development of an inverse model for the MRE base isolator based on an optimal general regression neural network (GRNN); (ii) numerical and experimental validation of a real-time semi-active controlled MRE base isolation system utilising an LQR controller and the GRNN inverse model. The superiority of the GRNN inverse model lies in its fewer input variables, faster training process and prompt calculation response, which make it suitable for online training and real-time control. The control system is integrated with a three-storey shear building model, and the control performance of the MRE base isolation system is compared with the bare building, a passive-on isolation system and a passive-off isolation system. Testing results show that the proposed GRNN inverse model is able to reproduce the desired control force accurately and that the MRE base isolation system can effectively suppress the structural responses when compared to the passive isolation systems.
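
    A GRNN is essentially a Nadaraya-Watson kernel regressor, which is why training amounts to storing samples and why online updating is cheap. A minimal sketch (Python); the input/output choice in the closing comment is hypothetical:

        import numpy as np

        class GRNN:
            """General regression neural network (Nadaraya-Watson kernel regression)."""
            def __init__(self, sigma=0.5):
                self.sigma = sigma
            def fit(self, X, y):
                self.X, self.y = np.asarray(X, float), np.asarray(y, float)
                return self
            def predict(self, Xq):
                d2 = ((np.asarray(Xq, float)[:, None, :] - self.X[None, :, :]) ** 2).sum(-1)
                w = np.exp(-d2 / (2.0 * self.sigma ** 2))
                return (w @ self.y) / w.sum(axis=1)

        # Hypothetical inverse-model use: inputs such as [displacement, velocity,
        # desired force], output the control current supplied to the MRE isolator.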

  19. Testing for direct genetic effects using a screening step in family-based association studies

    Directory of Open Access Journals (Sweden)

    Sharon M Lutz

    2013-11-01

    In genome-wide association studies (GWAS), family-based studies tend to have less power to detect genetic associations than population-based studies, such as case-control studies. This can be an issue when testing whether genes in a family-based GWAS have a direct effect on the phenotype of interest or act indirectly through a secondary phenotype. When multiple SNPs are tested for a direct effect in the family-based study, a screening step can be used to minimize the burden of multiple comparisons in the causal analysis. We propose a 2-stage screening step that can be incorporated into the family-based association test (FBAT) approach, similar to the conditional mean model approach in the VanSteen algorithm [1]. Simulations demonstrate that the type 1 error is preserved and that this method is advantageous when multiple markers are tested. The method is illustrated by an application to the Framingham Heart Study.

  20. The Latent Class Model as a Measurement Model for Situational Judgment Tests

    Directory of Open Access Journals (Sweden)

    Frank Rijmen

    2011-11-01

    In a situational judgment test, it is often debatable what constitutes a correct answer to a situation, and there is currently a multitude of scoring procedures. Establishing a measurement model can guide the selection of a scoring rule, and it is argued that the latent class model is a good candidate for such a measurement model. Two latent class models are applied to the Managing Emotions subtest of the Mayer, Salovey, Caruso Emotional Intelligence Test: a plain-vanilla latent class model, and a second-order latent class model that takes into account the clustering of several possible reactions within each hypothetical scenario of the situational judgment test. The results for both models indicated that there were three subgroups characterised by the degree to which differentiation occurred between possible reactions in terms of perceived effectiveness. Furthermore, the results for the second-order model indicated a moderate cluster effect.

  1. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash-Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against an independent dataset to that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent…
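
    The Nash-Sutcliffe statistic mentioned under point (1) is computed as follows (Python); a value of 1 is a perfect fit and 0 means the model does no better than the mean of the observations, which is part of why it can fail to discriminate between simulations:

        import numpy as np

        def nash_sutcliffe(obs, sim):
            """Nash-Sutcliffe efficiency: 1 minus squared error over observed variance."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)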

  2. A 45-Second Self-Test for Cardiorespiratory Fitness: Heart Rate-Based Estimation in Healthy Individuals.

    Science.gov (United States)

    Sartor, Francesco; Bonato, Matteo; Papini, Gabriele; Bosio, Andrea; Mohammed, Rahil A; Bonomi, Alberto G; Moore, Jonathan P; Merati, Giampiero; La Torre, Antonio; Kubis, Hans-Peter

    2016-01-01

    Cardio-respiratory fitness (CRF) is a widespread essential indicator in Sports Science as well as in Sports Medicine. This study aimed to develop and validate a prediction model for CRF based on a 45-second self-test, which can be conducted anywhere. A criterion validity, test-retest study was set up to accomplish our objectives. Data from 81 healthy volunteers (age: 29 ± 8 years, BMI: 24.0 ± 2.9), 18 of whom were female, were used to validate this test against the gold standard. Nineteen volunteers repeated the test twice in order to evaluate its repeatability. CRF estimation models were developed using heart rate (HR) features extracted from the resting, exercise, and recovery phases. The most predictive HR feature was the intercept of the linear equation fitting the HR values during the recovery phase, normalized for height² (r² = 0.30). The Ruffier-Dickson Index (RDI), which was originally developed for this squat test, showed a significant negative correlation with CRF (r = -0.40) but explained only 15% of the variability in CRF. A multivariate model based on RDI, sex, age and height increased the explained variability up to 53%, with a cross-validation (CV) error of 0.532 L·min⁻¹ and substantial repeatability (ICC = 0.91). The best predictive multivariate model made use of the linear intercept of HR at the beginning of the recovery, normalized for height² and age²; this had an adjusted r² = 0.59, a CV error of 0.495 L·min⁻¹ and substantial repeatability (ICC = 0.93). It also had higher agreement in classifying CRF levels (κ = 0.42) than the RDI-based model (κ = 0.29). In conclusion, this simple 45 s self-test can be used to estimate and classify CRF in healthy individuals with moderate accuracy and large repeatability when HR recovery features are included.
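
    The most predictive single feature can be reconstructed from the description (Python): fit a straight line to heart rate over the recovery phase and normalize its intercept by height squared. The published multivariate model coefficients are not reproduced here:

        import numpy as np

        def recovery_intercept_feature(t_s, hr_bpm, height_m):
            """Intercept of a linear fit to recovery heart rate, normalised by height^2."""
            slope, intercept = np.polyfit(np.asarray(t_s, float),
                                          np.asarray(hr_bpm, float), 1)
            return intercept / height_m ** 2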

  4. An effective automatic procedure for testing parameter identifiability of HIV/AIDS models.

    Science.gov (United States)

    Saccomani, Maria Pia

    2011-08-01

    Realistic HIV models tend to be rather complex and many recent models proposed in the literature could not yet be analyzed by traditional identifiability testing techniques. In this paper, we check a priori global identifiability of some of these nonlinear HIV models taken from the recent literature, by using a differential algebra algorithm based on previous work of the author. The algorithm is implemented in a software tool, called DAISY (Differential Algebra for Identifiability of SYstems), which has been recently released (DAISY is freely available on the web site http://www.dei.unipd.it/~pia/ ). The software can be used to automatically check global identifiability of (linear and) nonlinear models described by polynomial or rational differential equations, thus providing a general and reliable tool to test global identifiability of several HIV models proposed in the literature. It can be used by researchers with a minimum of mathematical background.

  5. Uptake of Community-Based Peer Administered HIV Point-of-Care Testing: Findings from the PROUD Study.

    Directory of Open Access Journals (Sweden)

    Lisa Lazarus

    Full Text Available HIV prevalence among people who inject drugs (PWID in Ottawa is estimated at about 10%. The successful integration of peers into outreach efforts and wider access to HIV point-of-care testing (POCT create opportunities to explore the role of peers in providing HIV testing. The PROUD study, in partnership with Ottawa Public Health (OPH, sought to develop a model for community-based peer-administered HIV POCT.PROUD draws on community-based participatory research methods to better understand the HIV risk environment of people who use drugs in Ottawa. From March-October 2013, 593 people who reported injecting drugs or smoking crack cocaine were enrolled through street-based recruitment. Trained peer or medical student researchers administered a quantitative survey and offered an HIV POCT (bioLytical INSTI test to participants who did not self-report as HIV positive.550 (92.7% of the 593 participants were offered a POCT, of which 458 (83.3% consented to testing. Of those participants, 74 (16.2% had never been tested for HIV. There was no difference in uptake between testing offered by a peer versus a non-peer interviewer (OR = 1.05; 95% CI = 0.67-1.66. Despite testing those at high risk for HIV, only one new reactive test was identified.The findings from PROUD demonstrate high uptake of community-based HIV POCT. Peers were able to successfully provide HIV POCT and reach participants who had not previously been tested for HIV. Community-based and peer testing models provide important insights on ways to scale-up HIV prevention and testing among people who use drugs.

  6. Performance Measurement Model A TarBase model with ...

    Indian Academy of Sciences (India)

    rohit

    Model A 8.0 2.0 94.52% 88.46% 76 108 12 12 0.86 0.91 0.78 0.94. Model B 2.0 2.0 93.18% 89.33% 64 95 10 9 0.88 0.90 0.75 0.98. The above results for TEST – 1 show details for our two models (Model A and Model B).Performance of Model A after adding of 32 negative dataset of MiRTif on our testing set(MiRecords) ...

  7. Object-Oriented Modeling of an Energy Harvesting System Based on Thermoelectric Generators

    Science.gov (United States)

    Nesarajah, Marco; Frey, Georg

    This paper deals with the modeling of an energy harvesting system based on thermoelectric generators (TEGs), and the validation of the model by means of a test bench. TEGs can improve the overall energy efficiency of energy systems, e.g. combustion engines or heating systems, by using the remaining waste heat to generate electrical power. Previously, a component-oriented model of the TEG itself was developed in the Modelica® language. With this model, any TEG can be described and simulated given the material properties and the physical dimensions. Now, this model has been extended by the surrounding components to a complete model of a thermoelectric energy harvesting system. In addition to the TEG, the model contains the cooling system, the heat source, and the power electronics. To validate the simulation model, a test bench was built and installed on an oil-fired household heating system. The paper reports results of the measurements and discusses the validity of the developed simulation models. Furthermore, the efficiency of the proposed energy harvesting system is derived, and possible improvements based on design variations tested in the simulation model are proposed.
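
    The electrical side of a TEG reduces to the Seebeck relation and a voltage divider, which a few lines capture (Python; the Seebeck coefficient, temperature difference, and resistances below are illustrative values, not the test bench's parameters):

        def teg_power(alpha_v_per_k, delta_t, r_internal, r_load):
            """Power delivered by a TEG into a resistive load; V_oc = alpha * dT."""
            v_oc = alpha_v_per_k * delta_t
            i = v_oc / (r_internal + r_load)
            return i ** 2 * r_load

        # Maximum power transfer occurs for r_load == r_internal:
        print(teg_power(0.05, 100.0, 2.0, 2.0))  # 5 V open-circuit -> ~3.1 W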

  8. Horns Rev II, 2D-Model Tests

    DEFF Research Database (Denmark)

    Andersen, Thomas Lykke; Frigaard, Peter

    This report presents the results of 2D physical model tests carried out in the shallow wave flume at the Dept. of Civil Engineering, Aalborg University (AAU). The objective of the tests was to investigate the combined influence of the pile diameter to water depth ratio and the wave height to water depth ratio on wave run-up on piles. The measurements are to be used to design access platforms on piles. The model tests include: calibration of regular and irregular sea states at the location of the pile (without the structure in place), and measurement of wave run-up for the calibrated sea states on the front side of the pile (0 to 90 degrees). The tests were conducted at Aalborg University from 9 October 2006 to 8 November 2006. Unless otherwise mentioned, all values given in this report are in model scale.

  9. Development of a lifetime prediction model for lithium-ion batteries based on extended accelerated aging test data

    Science.gov (United States)

    Ecker, Madeleine; Gerschler, Jochen B.; Vogel, Jan; Käbitz, Stefan; Hust, Friedrich; Dechent, Philipp; Sauer, Dirk Uwe

    2012-10-01

    Battery lifetime prognosis is a key requirement for the successful market introduction of electric and hybrid vehicles. This work aims at the development of a lifetime prediction approach based on an aging model for lithium-ion batteries. A multivariable analysis of a detailed series of accelerated lifetime experiments representing typical operating conditions in hybrid electric vehicles is presented. The impact of temperature and state of charge on impedance rise and capacity loss is quantified. The investigations are based on a high-power NMC/graphite lithium-ion battery with good cycle lifetime. The resulting mathematical functions are physically motivated by the underlying aging effects and are used for the parameterization of a semi-empirical aging model. An impedance-based electric-thermal model is coupled to the aging model to simulate the dynamic interaction between aging of the battery and its thermal as well as electric behavior. Based on these models, different drive cycles and management strategies can be analyzed with regard to their impact on lifetime, which makes the approach an important tool for vehicle designers and for the implementation of business models. A key contribution of the paper is the parameterization of the aging model by experimental data, whereas aging simulation in the literature usually lacks a robust empirical foundation.
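
    Such semi-empirical aging laws typically combine an Arrhenius factor in temperature with a power law in time. A sketch of that functional form (Python); all constants are placeholders, not the paper's fitted parameters:

        import numpy as np

        def capacity_loss(t_days, temp_k, a=2e5, ea=50000.0, z=0.75):
            """Relative capacity loss: Arrhenius in temperature, power law in time."""
            R = 8.314  # gas constant, J/(mol K)
            return a * np.exp(-ea / (R * temp_k)) * t_days ** z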

  10. Continuous validation of ASTEC containment models and regression testing

    International Nuclear Information System (INIS)

    Nowack, Holger; Reinke, Nils; Sonnenkalb, Martin

    2014-01-01

    The focus of the ASTEC (Accident Source Term Evaluation Code) development at GRS is primarily on the containment module CPA (Containment Part of ASTEC), whose modelling is to a large extent based on the GRS containment code COCOSYS (COntainment COde SYStem). Validation is usually understood as the approval of the modelling capabilities through calculations of appropriate experiments by external users different from the code developers. During the development of ASTEC CPA, bugs and unintended side effects may occur, which lead to changes in the results of the initially conducted validation. Due to the involvement of a considerable number of developers in the coding of ASTEC modules, validation of the code alone, even if executed repeatedly, is not sufficient. Therefore, a regression testing procedure has been implemented to ensure that the initially obtained validation results remain valid in succeeding code versions. Within the regression testing procedure, calculations of experiments and plant sequences are performed with the same input deck but applying two different code versions. For every test case, the up-to-date code version is compared to the preceding one on the basis of physical parameters deemed to be characteristic for the test case under consideration. In the case of post-calculations of experiments, a comparison to experimental data is also carried out. Three validation cases from the regression testing procedure are presented within this paper. The very good post-calculation of the HDR E11.1 experiment shows the high quality of the thermal-hydraulics modelling in ASTEC CPA. Aerosol behaviour is validated on the BMC VANAM M3 experiment, and the results also show very good agreement with experimental data. Finally, iodine behaviour is checked against the THAI IOD-11 experiment; within this test case, the comparison of the ASTEC versions V2.0r1 and V2.0r2 shows how an error was detected by the regression testing procedure.
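
    The version-to-version comparison at the heart of such a regression testing procedure can be illustrated as follows (Python; the harness is hypothetical, since ASTEC itself is a separate production code, but the tolerance-based comparison is the relevant idea):

        import numpy as np

        def regression_check(results_new, results_old, exp_data=None, rtol=0.02):
            """Compare characteristic parameters (e.g. pressure or temperature traces)
            from two code versions run on the same input deck; optionally report
            the RMS deviation from experimental data as well."""
            new, old = np.asarray(results_new, float), np.asarray(results_old, float)
            report = {"versions_agree": bool(np.allclose(new, old, rtol=rtol))}
            if exp_data is not None:
                diff = new - np.asarray(exp_data, float)
                report["rms_vs_experiment"] = float(np.sqrt(np.mean(diff ** 2)))
            return report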

  11. Model tests on dynamic performance of RC shear walls

    International Nuclear Information System (INIS)

    Nagashima, Toshio; Shibata, Akenori; Inoue, Norio; Muroi, Kazuo.

    1991-01-01

    For the inelastic dynamic response analysis of a reactor building subjected to earthquakes, it is essential to properly evaluate its restoring force characteristics under dynamic loading conditions and its damping performance. Reinforced concrete shear walls are the main structural members of a reactor building and dominate its seismic behavior. In order to obtain basic information on the dynamic restoring force characteristics and damping performance of shear walls, a dynamic test using a large shaking table, a static displacement control test, and a pseudo-dynamic test were conducted on shear wall models. In the dynamic test, four specimens were tested on a large shaking table; in the static test, four specimens were tested; and in the pseudo-dynamic test, three specimens were tested. These tests are outlined. The results of the tests were compared, placing emphasis on the restoring force characteristics and damping performance of the RC wall models. The strength was higher in the dynamic test models than in the static test models, mainly due to the effect of loading rate. (K.I.)

  12. Adaptive MPC based on MIMO ARX-Laguerre model.

    Science.gov (United States)

    Ben Abdelwahed, Imen; Mbarek, Abdelkader; Bouzrara, Kais

    2017-03-01

    This paper proposes a method for synthesizing an adaptive predictive controller using a reduced-complexity model. The latter is given by the projection of the ARX model onto Laguerre bases. The resulting model, termed MIMO ARX-Laguerre, is characterized by an easy recursive representation. The adaptive predictive control law is computed based on multi-step-ahead finite-element predictors, identified directly from experimental input/output data. The model is tuned in each iteration by online identification algorithms for both the model parameters and the Laguerre poles. The proposed approach avoids the time-consuming numerical optimization algorithms associated with most common linear predictive control strategies, which makes it suitable for real-time implementation. The method is used to synthesize and test, in numerical simulations, adaptive predictive controllers for the CSTR process benchmark. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
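
    The Laguerre projection itself is compact: the input is passed through a first-order low-pass filter followed by a cascade of all-pass sections, and the model output is a linear combination of the resulting states. A SISO sketch (Python); the pole and order are arbitrary, and the paper's MIMO case stacks such blocks per channel:

        import numpy as np
        from scipy.signal import lfilter

        def laguerre_states(u, a=0.8, order=4):
            """Columns are the discrete Laguerre filter states of input u (pole a)."""
            states = []
            x = lfilter([np.sqrt(1.0 - a ** 2)], [1.0, -a], u)  # low-pass front end
            states.append(x)
            for _ in range(order - 1):
                x = lfilter([-a, 1.0], [1.0, -a], x)            # all-pass section
                states.append(x)
            return np.column_stack(states)

        # Least-squares fit of the output on the Laguerre states:
        # theta, *_ = np.linalg.lstsq(laguerre_states(u), y, rcond=None)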

  13. Measuring Japanese EFL Student Perceptions of Internet-Based Tests with the Technology Acceptance Model

    Science.gov (United States)

    Dizon, Gilbert

    2016-01-01

    The Internet has made it possible for teachers to administer online assessments with affordability and ease. However, little is known about Japanese English as a Foreign Language (EFL) students' attitudes toward internet-based tests (IBTs). Therefore, this study aimed to measure the perceptions of IBTs among Japanese English language learners with the…

  14. THE MISHKIN TEST: AN ANALYSIS OF MODEL EXTENSIONS

    Directory of Open Access Journals (Sweden)

    Diana MURESAN

    2015-04-01

    This paper reviews empirical research that applies the Mishkin test, using alternative approaches, to examine the existence of the accruals anomaly. The Mishkin test is a macro-econometric test of the rational expectations hypothesis, which tests for market efficiency. Starting with Sloan (1996), the model has been applied to the accruals anomaly literature. Since Sloan (1996), the model has seen various improvements and has been the subject of many debates in the literature regarding its efficacy; nevertheless, the current evidence underlines the pervasiveness of the model. The analysis of extended versions of the Mishkin test highlights that adding variables enhances the results, providing insightful information about the occurrence of the accruals anomaly.
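
    The statistic commonly reported for the Mishkin test is a likelihood-ratio form comparing the constrained (market efficiency imposed) and unconstrained systems; the joint nonlinear estimation of the forecasting and pricing equations is omitted in this sketch (Python):

        import numpy as np
        from scipy.stats import chi2

        def mishkin_lr(ssr_constrained, ssr_unconstrained, n_obs, n_constraints):
            """2n * ln(SSR_c / SSR_u), asymptotically chi-square distributed with
            as many degrees of freedom as constraints imposed."""
            stat = 2.0 * n_obs * np.log(ssr_constrained / ssr_unconstrained)
            return stat, chi2.sf(stat, n_constraints)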

  15. Model Integrated Problem Solving Based Learning pada Perkuliahan Dasar-dasar Kimia Analitik

    Directory of Open Access Journals (Sweden)

    Indarini Dwi Pursitasari

    2013-07-01

    Abstract: Integrated Problem Solving Based Learning Model on Foundation of Analytical Chemistry. This study was conducted to know the effects of the Integrated Problem Solving Based Learning (IPSBL) model on problem solving skills and cognitive ability of pre-service teachers. The subjects of the study were 41 pre-service teachers, 21 in the experimental group and 20 in the control group. The data were collected through a test on problem solving skills, a test on cognitive ability, and a questionnaire on the students' opinions on the use of the IPSBL model. The quantitative data were analyzed using t-tests and one-way ANOVA (with SPSS 16.0), and the qualitative data were analyzed by computing percentages. The results of the study show that the implementation of the IPSBL model increased the problem solving skills and cognitive ability of the pre-service teachers. The model was also responded to positively by the research subjects.

  16. USB environment measurements based on full-scale static engine ground tests. [Upper Surface Blowing for YC-14

    Science.gov (United States)

    Sussman, M. B.; Harkonen, D. L.; Reed, J. B.

    1976-01-01

    Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive-lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data and to establish a basis for future flight test comparisons.

  17. Model Testing - Bringing the Ocean into the Laboratory

    DEFF Research Database (Denmark)

    Aage, Christian

    2000-01-01

    Hydrodynamic model testing, the principle of bringing the ocean into the laboratory to study the behaviour of the ocean itself and the response of man-made structures in the ocean in reduced scale, has been known for centuries. Due to an insufficient understanding of the physics involved, however, the early model tests often gave incomplete or directly misleading results. This keynote lecture deals with some of the possibilities and problems within the field of hydrodynamic and hydraulic model testing.

  18. Photovoltaic Grid-Connected Modeling and Characterization Based on Experimental Results.

    Science.gov (United States)

    Humada, Ali M; Hojabri, Mojgan; Sulaiman, Mohd Herwan Bin; Hamada, Hussein M; Ahmed, Mushtaq N

    2016-01-01

    A grid-connected photovoltaic (PV) system operating under fluctuating weather conditions has been modeled and characterized based on a specific test bed. A mathematical model of a small-scale PV system has been developed mainly for residential usage, and the potential results have been simulated. The proposed PV model is based on three parameters: the photocurrent, IL; the reverse diode saturation current, Io; and the ideality factor of the diode, n. The accuracy of the proposed model and its parameters was evaluated against different benchmarks. The results showed that the proposed model fits the experimental results, including the I-V characteristic curve, with higher accuracy than the other models. The results of this study can be considered valuable for the installation of grid-connected PV systems under fluctuating climatic conditions.
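
    The three named parameters define the ideal single-diode equation. A minimal sketch follows, assuming a single cell near 25 °C; the example parameter values are placeholders, not those identified in the paper.

        import numpy as np

        def pv_current(v, il, io, n, t_cell=298.15):
            # Ideal single-diode model: I = IL - Io*(exp(V/(n*Vt)) - 1),
            # with thermal voltage Vt = kT/q scaled by the ideality factor n.
            k, q = 1.380649e-23, 1.602176634e-19
            vt = n * k * t_cell / q
            return il - io * (np.exp(v / vt) - 1.0)

        v = np.linspace(0.0, 0.65, 200)              # voltage sweep [V]
        i = pv_current(v, il=8.0, io=1e-9, n=1.3)    # I-V characteristic
        p = v * i                                    # power curve; its peak is the MPP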

  19. Output-Feedback Model Predictive Control of a Pasteurization Pilot Plant based on an LPV model

    Science.gov (United States)

    Karimi Pour, Fatemeh; Ocampo-Martinez, Carlos; Puig, Vicenç

    2017-01-01

    This paper presents model predictive control (MPC) of a pasteurization pilot plant based on an LPV model. Since not all the states are measured, an observer is also designed, which allows implementing an output-feedback MPC scheme. However, the model of the plant is not completely observable when augmented with the disturbance models. To solve this problem, two strategies are used: (i) the whole system is decoupled into two subsystems, and (ii) an inner state-feedback controller is embedded in the MPC control scheme. A real-time example based on the pasteurization pilot plant is simulated as a case study for testing the behavior of the approaches.
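
    The overall structure is a standard observer-plus-MPC loop. The sketch below shows one minimal form of it, with a toy linear plant, an assumed Luenberger observer gain, and a quadratic MPC solved with cvxpy; none of the matrices come from the pasteurization plant model in the paper.

        import numpy as np
        import cvxpy as cp

        A = np.array([[0.95, 0.10], [0.00, 0.90]])   # toy plant, not the paper's LPV model
        B = np.array([[0.00], [0.10]])
        C = np.array([[1.0, 0.0]])
        L = np.array([[0.4], [0.2]])                 # assumed observer gain
        N, x_ref = 10, np.array([1.0, 0.0])

        def mpc_step(x_hat):
            x, u = cp.Variable((2, N + 1)), cp.Variable((1, N))
            cost, cons = 0, [x[:, 0] == x_hat]
            for k in range(N):
                cost += cp.sum_squares(x[:, k + 1] - x_ref) + 0.1 * cp.sum_squares(u[:, k])
                cons += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k], cp.abs(u[:, k]) <= 1.0]
            cp.Problem(cp.Minimize(cost), cons).solve()
            return float(u.value[0, 0])

        x_true, x_hat = np.zeros(2), np.zeros(2)
        for t in range(30):
            u0 = mpc_step(x_hat)                      # control from the state estimate
            y = C @ x_true                            # plant measurement
            x_hat = A @ x_hat + B.ravel() * u0 + (L @ (y - C @ x_hat)).ravel()
            x_true = A @ x_true + B.ravel() * u0      # noise-free plant update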

  20. Modelling of the spallation reaction: analysis and testing of nuclear models; Simulation de la spallation: analyse et test des modeles nucleaires

    Energy Technology Data Exchange (ETDEWEB)

    Toccoli, C

    2000-04-03

    The spallation reaction is considered as a 2-step process. The first, very quick stage (10^-22 to 10^-29 s) corresponds to the individual interaction between the incident projectile and nucleons; this interaction is followed by a series of nucleon-nucleon collisions (the intranuclear cascade) during which fast particles are emitted and the nucleus is left in a strongly excited state. The second, slower stage (10^-18 to 10^-19 s) is the one during which the nucleus is expected to de-excite completely. This de-excitation proceeds by evaporation of light particles (n, p, d, t, 3He, 4He) and/or fission and/or fragmentation. The HETC code has been designed to simulate spallation reactions. This simulation is based on the 2-step process and on several models of the intranuclear cascade (the Bertini model, the Cugnon model, the Helder Duarte model); the evaporation model relies on the statistical theory of Weisskopf-Ewing. The purpose of this work is to evaluate the ability of the HETC code to predict experimental results. A methodology for comparing relevant experimental data with calculation results is presented, and a preliminary estimation of the systematic error of the HETC code is proposed. The main problem of cascade models originates in the difficulty of simulating inelastic nucleon-nucleon collisions: the emission of pions is over-estimated and the corresponding differential spectra are badly reproduced. This inaccuracy of the cascade models has a great impact on determining the excitation of the nucleus at the end of the first step and, indirectly, on the distribution of final residual nuclei. The test of the evaporation model has shown that the emission of high-energy light particles is under-estimated. (A.C.)

  1. The Model Identification Test: A Limited Verbal Science Test

    Science.gov (United States)

    McIntyre, P. J.

    1972-01-01

    Describes the production of a test with a low verbal load for use with elementary school science students. Animated films were used to present appropriate and inappropriate models of the behavior of particles of matter. (AL)

  2. Low-order model of the Loss-of-Fluid Test (LOFT) reactor plant for use in Kalman filter-based optimal estimators

    International Nuclear Information System (INIS)

    Tylee, J.L.

    1980-01-01

    A low-order, nonlinear model of the Loss-of-Fluid Test (LOFT) reactor plant, for use in Kalman filter estimators, is developed, described, and evaluated. This model consists of 31 differential equations and represents all major subsystems of both the primary and secondary sides of the LOFT plant. Comparisons between model calculations and available LOFT power range testing transients demonstrate the accuracy of the low-order model. The nonlinear model is numerically linearized for future implementation in Kalman filter and optimal control algorithms. The linearized model is shown to be an adequate representation of the nonlinear plant dynamics
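
    The linearization step described above is routinely done numerically: perturb each state and input in turn and difference the nonlinear dynamics. A generic finite-difference sketch follows, assuming dx/dt = f(x, u) with the plant model supplied as f; this is an illustration, not the 31-equation LOFT model itself.

        import numpy as np

        def linearize(f, x0, u0, eps=1e-6):
            # Finite-difference Jacobians A = df/dx, B = df/du at (x0, u0),
            # the matrices a Kalman filter or optimal controller would consume.
            n, m = len(x0), len(u0)
            f0 = f(x0, u0)
            A, B = np.zeros((n, n)), np.zeros((n, m))
            for j in range(n):
                dx = np.zeros(n); dx[j] = eps
                A[:, j] = (f(x0 + dx, u0) - f0) / eps
            for j in range(m):
                du = np.zeros(m); du[j] = eps
                B[:, j] = (f(x0, u0 + du) - f0) / eps
            return A, B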

  3. Correlations between power and test reactor data bases

    International Nuclear Information System (INIS)

    Guthrie, G.L.; Simonen, E.P.

    1989-02-01

    Differences between power reactor and test reactor data bases have been evaluated. Charpy shift data have been assembled from specimens irradiated in both high-flux test reactors and low-flux power reactors. Preliminary tests for the existence of a bias between the test and power reactor data bases indicate a possible bias between the weld data bases. The bias is nonconservative when test reactor data are used for power reactor predictive purposes. The lesser shift for test reactor data compared to power reactor data is interpreted primarily in terms of greater point defect recombination at test reactor fluxes than at power reactor fluxes. The possibility of greater thermal aging effects at lower damage rates is also discussed. 15 refs., 5 figs., 2 tabs

  4. Development of Test-Analysis Models (TAM) for correlation of dynamic test and analysis results

    Science.gov (United States)

    Angelucci, Filippo; Javeed, Mehzad; Mcgowan, Paul

    1992-01-01

    The primary objective of structural analysis for aerospace applications is to obtain a verified finite element model (FEM). The verified FEM can be used for loads analysis, evaluation of structural modifications, or control system design. Verification of the FEM is generally obtained by correlating test and FEM results. A test-analysis model (TAM) is very useful in the correlation process. A TAM is essentially a FEM reduced to the size of the test model that attempts to preserve the dynamic characteristics of the original FEM in the analysis range of interest. Numerous methods for generating TAMs have been developed in the literature. The major emphasis of this paper is a description of the procedures necessary for creation of the TAM and the correlation of the reduced models with the FEM or the test results. Herein, three methods are discussed, namely Guyan, Improved Reduced System (IRS), and Hybrid, together with the procedures for performing these analyses using MSC/NASTRAN. Finally, application of the TAM process is demonstrated with an experimental test configuration of a ten-bay cantilevered truss structure.
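
    The first of the three reduction methods, Guyan condensation, reduces the stiffness and mass matrices onto the measured (retained) degrees of freedom by assuming the omitted ones deform statically. A minimal NumPy sketch of that step follows; it is a generic illustration, not the MSC/NASTRAN implementation.

        import numpy as np

        def guyan_reduce(K, M, kept):
            # Partition K and M into retained (a) and omitted (o) DOFs and
            # condense with the static constraint x_o = -Koo^-1 Koa x_a.
            n = K.shape[0]
            kept = np.asarray(kept)
            omitted = np.setdiff1d(np.arange(n), kept)
            order = np.concatenate([kept, omitted])
            Kp, Mp = K[np.ix_(order, order)], M[np.ix_(order, order)]
            a = len(kept)
            Koa, Koo = Kp[a:, :a], Kp[a:, a:]
            T = np.vstack([np.eye(a), -np.linalg.solve(Koo, Koa)])
            return T.T @ Kp @ T, T.T @ Mp @ T   # reduced (TAM) stiffness and mass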

  5. A novel polar-based human face recognition computational model

    Directory of Open Access Journals (Sweden)

    Y. Zana

    2009-07-01

    Full Text Available Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance for FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was a higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.

  6. Improving Conceptual Understanding and Representation Skills Through Excel-Based Modeling

    Science.gov (United States)

    Malone, Kathy L.; Schunn, Christian D.; Schuchardt, Anita M.

    2018-02-01

    The National Research Council framework for science education and the Next Generation Science Standards have created a need for additional research on, and development of, curricula that are both technologically model-based and include engineering practices. This is especially the case for biology education. This paper describes a quasi-experimental design study to test the effectiveness of a model-based curriculum focused on the concepts of natural selection and population ecology that makes use of Excel modeling tools (Modeling Instruction in Biology with Excel, MBI-E). The curriculum revolves around the bio-engineering practice of controlling an invasive species. The study took place in the Midwest within ten high schools teaching a regular-level introductory biology class. A post-test was designed that targeted a number of common misconceptions in both concept areas as well as representational usage. The post-test results demonstrate that the MBI-E students significantly outperformed the traditional classes on both natural selection and population ecology concepts, thus overcoming a number of misconceptions. In addition, the implementing students made use of more multiple representations and demonstrated greater fascination for science.

  7. Modeling of a Parabolic Trough Solar Field for Acceptance Testing: A Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, M. J.; Mehos, M. S.; Kearney, D. W.; McMahan, A. C.

    2011-01-01

    As deployment of parabolic trough concentrating solar power (CSP) systems ramps up, the need for reliable and robust performance acceptance test guidelines for the solar field is also amplified. Project owners and/or EPC contractors often require extensive solar field performance testing as part of the plant commissioning process in order to ensure that actual solar field performance satisfies both technical specifications and performance guarantees between the involved parties. Performance test code work is currently underway at the National Renewable Energy Laboratory (NREL) in collaboration with the SolarPACES Task-I activity and within the ASME PTC-52 committee. One important aspect of acceptance testing is the selection of a robust technology performance model. NREL has developed a detailed parabolic trough performance model within the SAM software tool. This model is capable of predicting solar field, sub-system, and component performance, and it has been further modified for this work to support calculation at subhourly time steps. This paper presents the methodology and results of a case study comparing actual performance data for a parabolic trough solar field to the results predicted using the modified SAM trough model. Due to data limitations, the methodology is applied to a single collector loop, though it applies equally to larger subfields and entire solar fields. Special consideration is given to the model formulation, improvements to the formulation based on comparison with the collected data, and the uncertainty associated with the measured data. Additionally, this paper identifies modeling considerations that are of particular importance in the solar field acceptance testing process and uses the model to provide preliminary recommendations regarding acceptable steady-state testing conditions at the single-loop level.

  8. A user interface for the Kansas Geological Survey slug test model.

    Science.gov (United States)

    Esling, Steven P; Keller, John E

    2009-01-01

    The Kansas Geological Survey (KGS) developed a semianalytical solution for slug tests that incorporates the effects of partial penetration, anisotropy, and the presence of variable conductivity well skins. The solution can simulate either confined or unconfined conditions. The original model, written in FORTRAN, has a text-based interface with rigid input requirements and limited output options. We re-created the main routine for the KGS model as a Visual Basic macro that runs in most versions of Microsoft Excel and built a simple-to-use Excel spreadsheet interface that automatically displays the graphical results of the test. A comparison of the output from the original FORTRAN code to that of the new Excel spreadsheet version for three cases produced identical results.

  9. Wave Reflection Model Tests

    DEFF Research Database (Denmark)

    Burcharth, H. F.; Larsen, Brian Juul

    The investigation concerns the design of a new internal breakwater in the main port of Ibiza. The objective of the model tests was first of all to optimize the cross section to make the wave reflection low enough to ensure that unacceptable wave agitation will not occur in the port. Secondly...

  10. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee that requirements stated by the actual domain hold for the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the open issues in the field of verification/validation of model transformations.

  11. A Model of Statistics Performance Based on Achievement Goal Theory.

    Science.gov (United States)

    Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.

    2003-01-01

    Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…

  12. Model-Based Systems Engineering for Capturing Mission Architecture System Processes with an Application Case Study - Orion Flight Test 1

    Science.gov (United States)

    Bonanne, Kevin H.

    2011-01-01

    Model-based Systems Engineering (MBSE) is an emerging methodology that can be leveraged to enhance many system development processes. MBSE allows for the centralization of an architecture description that would otherwise be stored in various locations and formats, thus simplifying communication among the project stakeholders, inducing commonality in representation, and expediting report generation. This paper outlines the MBSE approach taken to capture the processes of two different, but related, architectures by employing the Systems Modeling Language (SysML) as a standard for architecture description and the modeling tool MagicDraw. The overarching goal of this study was to demonstrate the effectiveness of MBSE as a means of capturing and designing a mission systems architecture. The first portion of the project focused on capturing the necessary system engineering activities that occur when designing, developing, and deploying a mission systems architecture for a space mission. The second part applies activities from the first to an application problem - the system engineering of the Orion Flight Test 1 (OFT-1) End-to-End Information System (EEIS). By modeling the activities required to create a space mission architecture and then implementing those activities in an application problem, the utility of MBSE as an approach to systems engineering can be demonstrated.

  13. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in the types of defects that can be detected, isolated, and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance of systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine the benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
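
    As an illustration of the kind of fine-grained test the abstract advocates, the sketch below (assuming pytest and a toy 1-D upwind advection kernel, both placeholders) checks an analytic invariant and a conservation property instead of full-simulation output.

        import numpy as np

        def advect_upwind(q, c):
            # toy first-order upwind step on a periodic 1-D grid, Courant number c
            return q - c * (q - np.roll(q, 1))

        def test_constant_field_is_preserved():
            q = np.full(64, 3.0)
            assert np.allclose(advect_upwind(q, 0.5), q)

        def test_mass_is_conserved():
            q = np.random.default_rng(0).random(64)
            assert np.isclose(advect_upwind(q, 0.5).sum(), q.sum())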

  14. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

    In this paper the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data is investigated and discussed based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single-lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated with regard to its integration into simulation and modelling approaches. In this context we studied the potential of the agent-based modelling technique for the verification of people trajectories.

  15. Acceptability and feasibility of a social entrepreneurship testing model to promote HIV self-testing and linkage to care among men who have sex with men.

    Science.gov (United States)

    Zhong, F; Tang, W; Cheng, W; Lin, P; Wu, Q; Cai, Y; Tang, S; Fan, L; Zhao, Y; Chen, X; Mao, J; Meng, G; Tucker, J D; Xu, H

    2017-05-01

    HIV self-testing (HIVST) offers an opportunity to increase HIV testing among people not reached by facility-based services. However, the promotion of HIVST is limited as a consequence of insufficient community engagement. We built a social entrepreneurship testing (SET) model to promote HIVST and linkage to care among Chinese men who have sex with men (MSM) in Guangzhou. The SET model includes a few key steps. Each participant first completed an online survey and paid a US$23 (refundable) deposit to receive an HIVST kit and a syphilis self-testing (SST) kit. After the testing, the results were sent to the platform by the participants and interpreted by Center for Disease Control and Prevention (CDC) staff, and the deposit was returned to each participant. Finally, community-based organizations (CBOs) contacted the participants to provide counselling services, confirmation testing and linkage to care. During April-June 2015, a total of 198 MSM completed a preliminary survey and purchased self-testing kits. The majority were aged < 34 years (84.4%) and met partners online (93.1%). In addition, 68.9% of participants had ever been tested for HIV, and 19.5% had ever performed HIVST. Overall, feedback was received from 192 participants (97.0%). Of these, 14 did not use the kits; among those who did, the HIV and syphilis prevalences were 4.5% (eight of 178) and 3.7% (six of 178), respectively. All of the screened HIV-positive individuals sought further confirmation testing and were linked to care. Using an online SET model to promote HIV and syphilis self-testing among Chinese MSM is acceptable and feasible, and this model adds a new testing platform to the current testing service system. © 2016 British HIV Association.

  16. Fat Tail Model for Simulating Test Systems in Multiperiod Unit Commitment

    Directory of Open Access Journals (Sweden)

    J. A. Marmolejo

    2015-01-01

    Full Text Available This paper describes the use of the Chambers-Mallows-Stuck method for simulating stable random variables in the generation of test systems for economic analysis in power systems. A study focused on generating test electrical systems through a fat-tail model for the unit commitment problem in electrical power systems is presented. Usually, the instances of test systems in unit commitment are generated using the normal distribution, but in this work the simulated data are based on a new method. For the simulations, we used three original systems to obtain the demand behavior and thermal production costs. The estimation of the stable parameters for the simulation of stable random variables was based on three generally accepted methods: (a) regression, (b) quantiles, and (c) maximum likelihood, choosing the one with the best fit of the tails of the distribution. Numerical results illustrate the applicability of the proposed method by solving several unit commitment problems.
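
    The Chambers-Mallows-Stuck construction itself is compact: an alpha-stable variate is obtained from one uniform and one exponential draw. A sketch of the alpha != 1 branch follows (standardized variate; scale and location would be applied afterwards).

        import numpy as np

        def cms_stable(alpha, beta, size, rng=None):
            # Chambers-Mallows-Stuck sampler, alpha != 1 branch.
            rng = rng or np.random.default_rng()
            V = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
            W = rng.exponential(1.0, size)                 # unit exponential
            t = beta * np.tan(np.pi * alpha / 2)
            B = np.arctan(t) / alpha
            S = (1.0 + t ** 2) ** (1.0 / (2.0 * alpha))
            return (S * np.sin(alpha * (V + B)) / np.cos(V) ** (1.0 / alpha)
                    * (np.cos(V - alpha * (V + B)) / W) ** ((1.0 - alpha) / alpha))

        # e.g. heavy-tailed demand shocks: cms_stable(alpha=1.7, beta=0.0, size=1000)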

  17. Analysing model fit of psychometric process models: An overview, a new test and an application to the diffusion model.

    Science.gov (United States)

    Ranger, Jochen; Kuhn, Jörg-Tobias; Szardenings, Carsten

    2017-05-01

    Cognitive psychometric models embed cognitive process models into a latent trait framework in order to allow for individual differences. Due to their close relationship to the response process, these models allow for profound conclusions about the test takers. However, before such a model can be used its fit has to be checked carefully. In this manuscript we give an overview of existing tests of model fit and show their relation to the generalized moment test of Newey (Econometrica, 53, 1985, 1047) and Tauchen (J. Econometrics, 30, 1985, 415). We also present a new test, the Hausman test of misspecification (Hausman, Econometrica, 46, 1978, 1251). The Hausman test consists of a comparison of two estimates of the same item parameters, which should be similar if the model holds. The performance of the Hausman test is evaluated in a simulation study, in which we illustrate its application to two popular models in cognitive psychometrics, the Q-diffusion model and the D-diffusion model (van der Maas, Molenaar, Maris, Kievit, & Borsboom, Psychol. Rev., 118, 2011, 339; Molenaar, Tuerlinckx, & van der Maas, J. Stat. Softw., 66, 2015, 1). We also compare the performance of the test to alternative tests of model fit, namely the M2 test (Molenaar et al., J. Stat. Softw., 66, 2015, 1), the moment test (Ranger et al., Br. J. Math. Stat. Psychol., 2016) and the test for binned time (Ranger & Kuhn, Psychol. Test. Assess., 56, 2014b, 370). The simulation study indicates that the Hausman test is superior to the latter tests: it closely adheres to the nominal Type I error rate and has higher power in most simulation conditions. © 2017 The British Psychological Society.
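
    The generic form of the Hausman statistic is easy to state: for two estimates b1 (efficient under the model) and b2 (consistent under misspecification) with covariance matrices V1 and V2, the quadratic form below is asymptotically chi-square. This is a minimal generic sketch, not the item-parameter-specific implementation from the paper.

        import numpy as np
        from scipy.stats import chi2

        def hausman(b_eff, V_eff, b_rob, V_rob):
            # H = d' (V_rob - V_eff)^+ d with d = b_eff - b_rob; the
            # pseudo-inverse guards against a difference matrix that is
            # not positive definite in finite samples.
            d = b_eff - b_rob
            stat = float(d @ np.linalg.pinv(V_rob - V_eff) @ d)
            return stat, chi2.sf(stat, df=len(d))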

  18. Numerical simulations of rubber bearing tests and shaking table tests

    International Nuclear Information System (INIS)

    Hirata, K.; Matsuda, A.; Yabana, S.

    2002-01-01

    Test data concerning rubber bearing tests and shaking table tests of a base-isolated model conducted by CRIEPI were provided to the participants of the Coordinated Research Program (CRP) on 'Intercomparison of Analysis Methods for Predicting the Behaviour of Seismically Isolated Nuclear Structures', organized by the International Atomic Energy Agency (IAEA), for a comparison study of numerical simulations of base-isolated structures. In this paper, outlines of the test data provided and the numerical simulations of the bearing tests and shaking table tests are described. Using the computer code ABAQUS, numerical simulations of rubber bearing tests were conducted for NRBs and LRBs (data provided by CRIEPI) and for HDRs (data provided by ENEA/ENEL and KAERI). Several strain energy functions were specified according to the rubber material test corresponding to each rubber bearing; for the lead plug material in the LRB, the mechanical characteristics were re-evaluated and made use of. Simulation results for these rubber bearings show satisfactory agreement with the test results. The shaking table test conducted by CRIEPI was of a base-isolated rigid mass supported by LRBs. Acceleration time histories and displacement time histories of the isolators, as well as cyclic loading test data of the LRB used for the shaking table test, were provided to the participants of the CRP. Simulations of the shaking table tests were conducted for this rigid mass, and also for the steel frame model tested by ENEL/ENEA. In the simulation of the rigid mass model test, where LRBs are used, the isolators were modeled either by a bilinear model or a polylinear model; in both cases the simulation results show good agreement with the test results. In the case of the steel frame model, where HDRs are used as isolators, bilinear and polylinear models are also used for modeling the isolators. The response of the model is simulated comparatively well in the low frequency range of the floor response, however, in

  19. EVALUATION OF REINFORCING EFFECT ON FACEBOLTS FOR TUNNELING USING X-RAY CT AND CENTRIFUGE MODEL TEST

    Science.gov (United States)

    Takano, Daiki; Otani, Jun; Date, Kensuke; Yokot, Yasuhiro; Nagatani, Hideki

    The purpose of this paper is, first, to simulate tunnel face failure in the laboratory with four model tests in which a tunnel model is pulled out of a sandy ground (one case without any auxiliary method or facebolts, and three cases using facebolts with three different bolt lengths), and, second, to investigate the behavior of the model ground using an X-ray computed tomography (CT) scanner to visualize the failure zone in three dimensions. In addition, a series of centrifuge model tests is conducted to confirm the results of the X-ray CT tests and to discuss the ground behavior under a full-scale stress level. Finally, the effect of the face bolting method is evaluated based on all the test results.

  20. Kinematic tests of exotic flat cosmological models

    International Nuclear Information System (INIS)

    Charlton, J.C.; Turner, M.S.; NASA/Fermilab Astrophysics Center, Batavia, IL)

    1987-01-01

    Theoretical prejudice and inflationary models of the very early universe strongly favor the flat, Einstein-de Sitter model of the universe. At present the observational data conflict with this prejudice. This conflict can be resolved by considering flat models of the universe which possess a smooth component of energy density. The kinematics of such models, where the smooth component is relativistic particles, a cosmological term, a network of light strings, or fast-moving, light strings, is studied in detail. The observational tests which can be used to discriminate between these models are also discussed. These tests include the magnitude-redshift, lookback time-redshift, angular size-redshift, and comoving volume-redshift diagrams and the growth of density fluctuations. 58 references

  1. Kinematic tests of exotic flat cosmological models

    International Nuclear Information System (INIS)

    Charlton, J.C.; Turner, M.S.

    1986-05-01

    Theoretical prejudice and inflationary models of the very early Universe strongly favor the flat, Einstein-de Sitter model of the Universe. At present the observational data conflict with this prejudice. This conflict can be resolved by considering flat models of the Universe which possess a smooth component of energy density. We study in detail the kinematics of such models, where the smooth component is relativistic particles, a cosmological term, a network of light strings, or fast-moving, light strings. We also discuss the observational tests which can be used to discriminate between these models. These tests include the magnitude-redshift, lookback time-redshift, angular size-redshift, and comoving volume-redshift diagrams and the growth of density fluctuations

  3. Evaluation of model-based versus non-parametric monaural noise-reduction approaches for hearing aids.

    Science.gov (United States)

    Harlander, Niklas; Rosenkranz, Tobias; Hohmann, Volker

    2012-08-01

    Single-channel noise reduction has been well investigated and seems to have reached its limits in terms of speech intelligibility improvement; however, the quality of such schemes can still be advanced. This study tests to what extent novel model-based processing schemes might improve performance, in particular for non-stationary noise conditions. Two prototype model-based algorithms, a speech-model-based and an auditory-model-based algorithm, were compared to a state-of-the-art non-parametric minimum statistics algorithm. A speech intelligibility test, preference rating, and listening effort scaling were performed, and three objective quality measures for the signal, background, and overall distortions were applied. For a better comparison of all algorithms, particular attention was given to the use of a similar Wiener-based gain rule. The perceptual investigation was performed with fourteen hearing-impaired subjects. The results revealed that the non-parametric algorithm and the auditory-model-based algorithm did not affect speech intelligibility, whereas the speech-model-based algorithm slightly decreased it. In terms of subjective quality, both model-based algorithms performed better than the unprocessed condition and the reference, in particular for highly non-stationary noise environments. The data support the hypothesis that model-based algorithms are promising for improving performance in non-stationary noise conditions.
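
    The shared Wiener-based gain rule mentioned above maps an estimated signal-to-noise ratio to a per-frequency attenuation. A minimal sketch, assuming the noise power spectrum is already estimated (e.g. by minimum statistics in the non-parametric baseline):

        import numpy as np

        def wiener_gain(noisy_psd, noise_psd, gain_floor=0.1):
            # Maximum-likelihood SNR estimate per frequency bin, then the
            # Wiener rule G = SNR / (1 + SNR); the gain floor limits
            # musical noise and speech distortion.
            snr = np.maximum(noisy_psd / np.maximum(noise_psd, 1e-12) - 1.0, 0.0)
            return np.maximum(snr / (1.0 + snr), gain_floor)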

  4. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper addresses efficient test-pattern generation in a core-based design. A consistent Computer-Aided Test (CAT) flow is proposed based on the required core-test strategy. It generates a test-pattern set for the embedded cores with high fault coverage and low DfT area overhead. The CAT

  5. A model for optimal constrained adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Reese, Lynda M.

    2001-01-01

    A model for constrained computerized adaptive testing is proposed in which the information on the test at the ability estimate is maximized subject to a large variety of possible constraints on the contents of the test. At each item-selection step, a full test is first assembled to have maximum

  6. A model for optimal constrained adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Reese, Lynda M.

    1997-01-01

    A model for constrained computerized adaptive testing is proposed in which the information in the test at the ability estimate is maximized subject to a large variety of possible constraints on the contents of the test. At each item-selection step, a full test is first assembled to have maximum

  7. Ethernet-based test stand for a CAN network

    Science.gov (United States)

    Ziebinski, Adam; Cupek, Rafal; Drewniak, Marek

    2017-11-01

    This paper presents a test stand for the CAN-based systems used in automotive applications. The authors propose an Ethernet-based test system that supports the virtualisation of a CAN network. The proposed solution has several advantages over classical test beds based on dedicated CAN-PC interfaces: it avoids the physical constraint on the number of interfaces that can be connected to the tested system simultaneously, which shortens the time needed for parallel tests; the high speed of Ethernet transmission allows more frequent sampling of the messages transmitted on the CAN network (as the authors show in the experimental results section); and the cost of the proposed solution is much lower than that of traditional lab-based dedicated CAN-PC interfaces.
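
    One simple way to realise such virtualisation is to tunnel CAN frames over UDP so that any number of virtual nodes can join a test without physical interfaces. The sketch below uses an invented wire format and port number; it illustrates the idea only and is not the paper's protocol.

        import socket
        import struct

        def send_can_frame(sock, addr, can_id, data: bytes):
            # assumed wire format: 32-bit CAN ID, 1-byte DLC, 8 data bytes
            frame = struct.pack("!IB8s", can_id, len(data), data.ljust(8, b"\x00"))
            sock.sendto(frame, addr)

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        send_can_frame(sock, ("127.0.0.1", 20000), 0x123, b"\x01\x02")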

  8. Initial virtual flight test for a dynamically similar aircraft model with control augmentation system

    Directory of Open Access Journals (Sweden)

    Linliang Guo

    2017-04-01

    Full Text Available To satisfy the validation requirements of flight control laws for advanced aircraft, wind-tunnel-based virtual flight testing has been implemented in a low-speed wind tunnel. A 3-degree-of-freedom gimbal, ventrally installed in the model, was used in conjunction with an actively controlled, dynamically similar aircraft model equipped with an inertial measurement unit, an attitude and heading reference system, an embedded computer, and servo-actuators. The model, which could be rotated freely around its center of gravity by the aerodynamic moments, together with the flow field, the operator, and the real-time control system, made up the closed-loop testing circuit. The model is statically unstable in the longitudinal direction, but it can fly stably in the wind tunnel with the control augmentation function of the flight control laws. The experimental results indicate that the model responds well to the operator's instructions, and its response shows reasonable agreement with the simulation results; the difference in the angle-of-attack response is less than 0.5°. The effect of the stability augmentation and attitude control laws was validated in the tests, and the feasibility of the virtual flight test technique as a preliminary evaluation tool for advanced flight vehicle configuration research was also verified.

  9. Test results for three prototype models of a linear induction launcher

    International Nuclear Information System (INIS)

    Zabar, Z.; Lu, X.N.; He, J.L.; Birenbaum, L.; Levi, E.; Kuznetsov, S.B.; Nahemow, M.D.

    1991-01-01

    This paper reports on work on the linear induction launcher (LIL) that started with an analytical study, was followed by computer simulations, and was then tested with laboratory models. Two mathematical representations have been developed to describe the launcher. The first, based on the field approach with sinusoidal excitation, has been validated by static tests on a small-scale prototype fed at constant current and variable frequency. The second, a transient representation using computer simulation, allows consideration of energization by means of a capacitor bank and a power conditioner. Tests performed on three small-scale prototypes at muzzle velocities up to 100 m/s show good agreement with the predicted performance.

  10. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    Science.gov (United States)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden-state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update, and control sequence generation.

  11. A Model of Intelligent Fault Diagnosis of Power Equipment Based on CBR

    Directory of Open Access Journals (Sweden)

    Gang Ma

    2015-01-01

    Full Text Available The demand for power supply reliability has increased strongly as the power industry develops rapidly, and meeting such demand requires a substantial power grid. The running and testing data of power equipment contain vast amounts of information and underpin online monitoring and fault diagnosis, ultimately enabling condition-based maintenance. In this paper, an intelligent fault diagnosis model for power equipment based on case-based reasoning (IFDCBR) is proposed. The model intends to discover the potential rules of equipment faults by data mining. The intelligent model constructs a condition case base of equipment by analyzing four categories of data: online recording data, history data, basic test data, and environmental data. SVM regression analysis is then applied in mining the case base so as to establish the equipment condition fingerprint. Running data of equipment can be diagnosed against such a condition fingerprint to detect whether there is a fault. Finally, this paper verifies the intelligent model and the three-ratio method on a set of practical data. The results demonstrate that the intelligent model is more effective and accurate in fault diagnosis.
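
    The core CBR step is retrieval of the most similar stored condition cases for a new equipment record. A minimal sketch of weighted nearest-neighbour retrieval follows; the pandas layout and the feature weighting are assumptions for illustration, not the paper's implementation.

        import numpy as np
        import pandas as pd

        def retrieve_cases(case_base: pd.DataFrame, features, query, weights, k=3):
            # weighted Euclidean distance between the query record and
            # every stored case; the k closest cases drive the diagnosis
            X = case_base[features].to_numpy(dtype=float)
            q = np.array([query[f] for f in features], dtype=float)
            d = np.sqrt(((X - q) ** 2 * np.asarray(weights, dtype=float)).sum(axis=1))
            return case_base.iloc[np.argsort(d)[:k]]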

  12. An Industrial Model Based Disturbance Feedback Control Scheme

    DEFF Research Database (Denmark)

    Kawai, Fukiko; Nakazawa, Chikashi; Vinther, Kasper

    2014-01-01

    This paper presents a model-based disturbance feedback control scheme. Industrial process systems have traditionally been controlled by using relay and PID controllers. However, these controllers are affected by disturbances and model errors, and these effects degrade control performance. The authors propose a new control method that can decrease the negative impact of disturbances and model errors. The control method is motivated by industrial practice at Fuji Electric. Simulation tests are examined with a conventional PID controller and the disturbance feedback control. The simulation results

  13. Comparison of Simple Versus Performance-Based Fall Prediction Models

    Directory of Open Access Journals (Sweden)

    Shekhar K. Gadkaree BS

    2015-05-01

    Full Text Available Objective: To compare the predictive ability of standard falls prediction models based on physical performance assessments with more parsimonious prediction models based on self-reported data. Design: We developed a series of fall prediction models progressing in complexity and compared area under the receiver operating characteristic curve (AUC) across models. Setting: National Health and Aging Trends Study (NHATS), which surveyed a nationally representative sample of Medicare enrollees (age ≥65) at baseline (Round 1: 2011-2012) and 1-year follow-up (Round 2: 2012-2013). Participants: In all, 6,056 community-dwelling individuals participated in Rounds 1 and 2 of NHATS. Measurements: Primary outcomes were 1-year incidence of "any fall" and "recurrent falls." Prediction models were compared and validated in development and validation sets, respectively. Results: A prediction model that included demographic information, self-reported problems with balance and coordination, and previous fall history was the most parsimonious model that optimized AUC for both any fall (AUC = 0.69, 95% confidence interval [CI] = [0.67, 0.71]) and recurrent falls (AUC = 0.77, 95% CI = [0.74, 0.79]) in the development set. Physical performance testing provided marginal additional predictive value. Conclusion: A simple clinical prediction model that does not include physical performance testing could facilitate routine, widespread falls risk screening in the ambulatory care setting.

  14. Seismic Response Analysis and Test of 1/8 Scale Model for a Spent Fuel Storage Cask

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Han; Park, C. G.; Koo, G. H.; Seo, G. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Yeom, S. H. [Chungnam Univ., Daejeon (Korea, Republic of); Choi, B. I.; Cho, Y. D. [Korea Hydro and Nuclear Power Co. Ltd., Daejeon (Korea, Republic of)

    2005-07-15

    Seismic response tests of a 1/8-scale spent fuel dry storage cask model were performed for the typical 1940 El Centro and Kobe earthquakes. This report first focuses on the data generated by seismic response tests of a free-standing storage cask model to check the overturning possibility of a storage cask and the slipping displacement on a concrete slab bed. Variations in seismic load magnitude and cask/bed interface friction were considered in the tests. The test results show that the model overturns only under an extreme condition. An FEM model of the 1/8-scale cask was built using the available 3D contact conditions in ABAQUS/Explicit. The input load for this analysis is the El Centro earthquake, and the friction coefficients were obtained from the test results. Penalty and kinematic contact methods of ABAQUS were used for the mechanical contact formulation. The analysis method was verified against the rocking angle obtained in the seismic response tests; the kinematic contact method with an adequate normal contact stiffness showed good agreement with the tests. Based on the established analysis method for the 1/8-scale model, seismic response analyses of a full-scale model were performed for design and beyond-design seismic loads.

  15. Deformation Monitoring of Geomechanical Model Test and Its Application in Overall Stability Analysis of a High Arch Dam

    Directory of Open Access Journals (Sweden)

    Baoquan Yang

    2015-01-01

    Full Text Available Geomechanical model testing is an important method for studying the overall stability of high arch dams. The main task of a geomechanical model test is deformation monitoring. Currently, many types of deformation instruments are used for deformation monitoring of dam models, which provide valuable information on the deformation characteristics of the prototype dams. However, further investigation is required for assessing the overall stability of high arch dams through analyzing deformation monitoring data. First, a relationship for assessing the stability of dams is established based on the comprehensive model test method. Second, a stability evaluation system is presented based on the deformation monitoring data, together with the relationships between the deformation and overloading coefficient. Finally, the comprehensive model test method is applied to study the overall stability of the Jinping-I high arch dam. A three-dimensional destructive test of the geomechanical model dam is conducted under reinforced foundation conditions. The deformation characteristics and failure mechanisms of the dam abutments and foundation were investigated. The test results indicate that the stability safety factors of the dam abutments and foundation range from 5.2 to 6.0. These research results provide an important scientific insight into the design, construction, and operation stages of this project.

  16. Model Selection in Continuous Test Norming With GAMLSS.

    Science.gov (United States)

    Voncken, Lieke; Albers, Casper J; Timmerman, Marieke E

    2017-06-01

    To compute norms from reference group test scores, continuous norming is preferred over traditional norming. A suitable continuous norming approach for continuous data is the Box-Cox Power Exponential model, which belongs to the family of generalized additive models for location, scale, and shape (GAMLSS). Applying the Box-Cox Power Exponential model to test norming requires model selection, but it is unknown how well this can be done with an automatic selection procedure. In a simulation study, we compared the performance of two stepwise model selection procedures combined with four model-fit criteria (Akaike information criterion, Bayesian information criterion, generalized Akaike information criterion (3), and cross-validation), varying data complexity, sampling design, and sample size in a fully crossed design. The new procedure combined with the generalized Akaike information criterion was the most efficient model selection procedure (i.e., it required the smallest sample size). The advocated model selection procedure is illustrated with norming data of an intelligence test.
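
    Most of the criteria compared above share one penalized-likelihood form, sketched below (the function name is illustrative):

        def gaic(loglik: float, df: float, k: float = 3.0) -> float:
            # Generalized Akaike information criterion: -2*logL + k*df.
            # k = 2 gives AIC, k = log(n) gives BIC, and k = 3 is the
            # GAIC(3) variant referred to in the abstract; lower is better.
            return -2.0 * loglik + k * df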

  17. Modeling analysis of pulsed magnetization process of magnetic core based on inverse Jiles-Atherton model

    Science.gov (United States)

    Liu, Yi; Zhang, He; Liu, Siwei; Lin, Fuchang

    2018-05-01

    The J-A (Jiles-Atherton) model is widely used to describe the magnetization characteristics of magnetic cores in a low-frequency alternating field. However, this model is deficient in the quantitative analysis of the eddy current loss and residual loss in a high-frequency magnetic field. Based on the decomposition of magnetization intensity, an inverse J-A model is established which uses magnetic flux density B as an input variable. Static and dynamic core losses under high frequency excitation are separated based on the inverse J-A model. Optimized parameters of the inverse J-A model are obtained based on particle swarm optimization. The platform for the pulsed magnetization characteristic test is designed and constructed. The hysteresis curves of ferrite and Fe-based nanocrystalline cores at high magnetization rates are measured. The simulated and measured hysteresis curves are presented and compared. It is found that the inverse J-A model can be used to describe the magnetization characteristics at high magnetization rates and to separate the static loss and dynamic loss accurately.
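
    The parameter identification step described above can be sketched generically: fit the classical J-A parameter vector (Ms, a, k, c, alpha) by minimizing the mismatch between a measured B-H loop and the loop predicted by an inverse J-A implementation. The sketch below substitutes SciPy's differential evolution (a related population-based optimizer) for the particle swarm optimizer used in the paper; inverse_ja, b_meas, and h_meas are assumed to be supplied, and the bounds are illustrative.

        import numpy as np
        from scipy.optimize import differential_evolution

        def loop_error(params, b_meas, h_meas, inverse_ja):
            # mean squared error between measured H and the H(B) predicted
            # by an inverse Jiles-Atherton implementation
            h_sim = inverse_ja(b_meas, *params)
            return float(np.mean((h_sim - h_meas) ** 2))

        # illustrative bounds for (Ms, a, k, c, alpha)
        bounds = [(1e5, 2e6), (1.0, 1e4), (1.0, 1e4), (0.0, 1.0), (0.0, 1e-2)]
        # result = differential_evolution(loop_error, bounds,
        #                                 args=(b_meas, h_meas, inverse_ja), seed=0)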

  18. Refinement of protein termini in template-based modeling using conformational space annealing.

    Science.gov (United States)

    Park, Hahnbeom; Ko, Junsu; Joo, Keehyoung; Lee, Julian; Seok, Chaok; Lee, Jooyoung

    2011-09-01

    The rapid increase in the number of experimentally determined protein structures in recent years enables us to obtain more reliable protein tertiary structure models than ever by template-based modeling. However, refinement of template-based models beyond the limit available from the best templates is still needed for understanding protein function in atomic detail. In this work, we develop a new method for protein terminus modeling that can be applied to refinement of models with unreliable terminus structures. The energy function for terminus modeling consists of both physics-based and knowledge-based potential terms with carefully optimized relative weights. Effective sampling of both the framework and terminus is performed using the conformational space annealing technique. This method has been tested on a set of termini derived from a nonredundant structure database and two sets of termini from the CASP8 targets. The performance of the terminus modeling method is significantly improved over our previous method that does not employ terminus refinement. It is also comparable or superior to the best server methods tested in CASP8. The success of the current approach suggests that similar strategy may be applied to other types of refinement problems such as loop modeling or secondary structure rearrangement. Copyright © 2011 Wiley-Liss, Inc.

  19. Rate-control algorithms testing by using video source model

    DEFF Research Database (Denmark)

    Belyaev, Evgeny; Turlikov, Andrey; Ukhanova, Anna

    2008-01-01

    In this paper a method for testing rate-control algorithms by the use of a video source model is suggested. The proposed method allows algorithm testing to be significantly improved over a large test set.

  20. Uplift mechanism for a shallow-buried structure in liquefiable sand subjected to seismic load: centrifuge model test and DEM modeling

    Science.gov (United States)

    Zhou, Jian; Wang, Zihan; Chen, Xiaoliang; Zhang, Jiao

    2014-06-01

    Based on a centrifuge model test and the distinct element method (DEM), this study provides new insights into the uplift response of a shallow-buried structure and the liquefaction mechanism of the saturated sand around the structure under seismic action. In the centrifuge test, a high-speed microscopic camera was installed in the structure model, by which the movements of particles around the structure were monitored. A two-dimensional digital image processing technique was then used to analyze the microstructure of the saturated sand during the shaking event. A numerical simulation of the centrifuge experiment was also conducted using a two-phase (solid and fluid) fully coupled distinct element code. This code incorporates a particle-fluid coupling model by means of a "fixed coarse-grid" fluid scheme in PFC3D (Particle Flow Code in Three Dimensions), with the modeling parameters partially calibrated based on earlier studies. The physical and numerical models both reproduce the uplift of the shallow-buried structure and the sharp rise in excess pore pressure, and the corresponding micro-scale responses and explanations are provided. Overall, the uplift response of an underground structure and the occurrence of liquefaction in saturated sand are predicted successfully by the DEM modeling, although the dynamic responses during the shaking cannot be modeled accurately due to restricted computer power.