WorldWideScience

Sample records for model rule-performance testing

  1. Testing agile requirements models

    Institute of Scientific and Technical Information of China (English)

    BOTASCHANJAN Jewgenij; PISTER Markus; RUMPE Bernhard

    2004-01-01

    This paper discusses a model-based approach to validating software requirements in agile development processes by simulation and, in particular, automated testing. The use of models as the central development artifact needs to be added to the portfolio of software engineering techniques in order to further increase the efficiency and flexibility of development, beginning early in the requirements definition phase. Testing requirements is one of the most important techniques for giving feedback and increasing the quality of the result. Therefore, testing of artifacts should be introduced as early as possible, even in the requirements definition phase.

  2. Wave Reflection Model Tests

    DEFF Research Database (Denmark)

    Burcharth, H. F.; Larsen, Brian Juul

    The investigation concerns the design of a new internal breakwater in the main port of Ibiza. The objective of the model tests was primarily to optimize the cross section so that wave reflection is low enough to ensure that unacceptable wave agitation will not occur in the port. Secondly...

  3. Recent tests of realistic models

    Energy Technology Data Exchange (ETDEWEB)

    Brida, Giorgio; Degiovanni, Ivo Pietro; Genovese, Marco; Gramegna, Marco; Piacentini, Fabrizio; Schettini, Valentina; Traina, Paolo, E-mail: m.genovese@inrim.i [Istituto Nazionale di Ricerca Metrologica, Strada delle Cacce 91, 10135 Torino (Italy)

    2009-06-01

    In this article we present recent activity of our laboratories on testing specific hidden-variable models; in particular, we discuss realizations of the Alicki-van Ryn test and tests of SED and of Santos' models.

  4. Model-Based Security Testing

    CERN Document Server

    Schieferdecker, Ina; Schneider, Martin; DOI: 10.4204/EPTCS.80.1

    2012-01-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, and for automated test generation. Model-based security testing (MBST) is a relatively new field, especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing,...

  5. Ship Model Testing

    Science.gov (United States)

    2016-01-15

    OCR fragment of the report; recoverable details: new additions include a new material testing machine with an environmental chamber and a new dual-fuel test bed for the Haeberle Laboratory; other equipment mentioned includes an analyzer, universal tester, laser scanner and 3D printer, with planned purchases of further data acquisition equipment (FARO laser scanner, data telemetry, and velocity profiler).

  6. Testing and validating environmental models

    Science.gov (United States)

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series
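
    The abstract's central argument, that model performance claims need explicit benchmarks, can be made concrete with a benchmark-relative skill score. The sketch below is illustrative (the function name and the persistence baseline are our choices, not the paper's): a score of 1 is a perfect model, 0 means no better than the naive alternative, and negative values mean the benchmark wins.

```python
def skill_score(observed, modeled, baseline):
    """Benchmark-relative skill: 1 - SSE(model) / SSE(baseline)."""
    sse_model = sum((o - m) ** 2 for o, m in zip(observed, modeled))
    sse_base = sum((o - b) ** 2 for o, b in zip(observed, baseline))
    return 1.0 - sse_model / sse_base

obs = [2.0, 3.0, 5.0, 4.0]      # environmental time series
pred = [2.1, 2.9, 4.8, 4.2]     # model predictions
naive = [2.0, 2.0, 3.0, 5.0]    # persistence forecast: yesterday's value
print(round(skill_score(obs, pred, naive), 3))  # -> 0.983
```

    A vague claim of "acceptable agreement" corresponds to reporting only the model's own error; the score forces the comparison against an explicit alternative.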

  7. Model testing of Wave Dragon

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    Prior to this project, a 1:50 scale model of the wave energy converter (WEC) Wave Dragon was built by the Danish Maritime Institute and tested in a wave tank at Aalborg University (AAU). The test programs investigated the movements of the floating structure, mooring forces and forces in the reflectors. The first test was followed by tests establishing the efficiency in different sea states. The scale model has also been extensively tested in the EU Joule Craft project JOR-CT98-7027 (Low-Pressure Turbine and Control Equipment for Wave Energy Converters / Wave Dragon) at University College Cork, Hydraulics and Maritime Research Centre, Ireland. The results of the previous model tests have formed the basis for a redesign of the WEC. In this project a reconstruction of the scale 1:50 model and sequential tests of changes to the model geometry and mass distribution parameters will be performed. AAU will make the modifications to the model based on the revised Loewenmark design and perform the tests in their wave tank. Grid connection requirements have been established. A hydro turbine with no movable parts besides the rotor has been developed and a scale model 1:3.5 tested, with a high efficiency over the whole head range. The turbine itself has possibilities for being used in river systems with low head and variable flow, an area of interest for many countries around the world. Finally, a regulation strategy for the turbines has been developed, which is essential for the future deployment of Wave Dragon. The video includes the following: 1. Title, 2. Introduction of the Wave Dragon, 3. Model test series H, Hs = 3 m, Rc = 3 m, 4. Model test series H, Hs = 5 m, Rc = 4 m, 5. Model test series I, Hs = 7 m, Rc = 1.25 m, 6. Model test series I, Hs = 7 m, Rc = 4 m, 7. Rolling title. On this VCD additional versions of the video can be found in the directory 'addvideo' for playing the video on PCs. These versions are: Model testing of Wave Dragon, DVD version

  8. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, and for automated test generation. Model-based security testing (MBST) is a relatively new field, especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey on MBST techniques and the related models as well as samples of new methods and tools that are under development in the European ITEA2 project DIAMONDS.

  9. Testing the model for testing competency.

    Science.gov (United States)

    Keating, Sarah B; Rutledge, Dana N; Sargent, Arlene; Walker, Polly

    2003-05-01

    The pilot study to demonstrate the utility of the CBRDM in the practice setting was successful. Using a matrix evaluation tool based on the model's competencies, evaluators were able to observe specific performance behaviors of senior nursing students and new graduates at either the novice or competent level. The study faced the usual perils of pilot studies, including small sample size, a limited number of items from the total CBRDM, restricted financial resources, inexperienced researchers, unexpected barriers, and untested evaluation tools. It was understood from the beginning of the study that the research would be based on a program evaluation model, analyzing both processes and outcomes. However, the meager data findings led to the desire to continue to study use of the model for practice setting job expectations, career planning for nurses, and curriculum development for educators. Although the California Strategic Planning Committee for Nursing no longer has funding, we hope that others interested in role differentiation issues will take the results of this study and test the model in other practice settings. Its ability to measure levels of competency beyond novice and competent, i.e., proficient, expert, and advanced practice, should be studied. The CBRDM may be useful in evaluating student and nurse performance, defining role expectations, and identifying the preparation necessary for the roles. The initial findings related to the two functions as leader and teacher in the care provider and care coordinator roles led to much discussion about helping students and nurses develop competence. Additional discussion focused on the roles as they apply to settings such as critical care or primary health care. The model is useful for all of nursing as it continues to define its levels of practice and their relationship to on-the-job performance, curriculum development, and career planning.

  10. Silo model tests with sand

    DEFF Research Database (Denmark)

    Munch-Andersen, Jørgen

    Tests have been carried out in a large silo model with Leighton Buzzard Sand. Normal pressures and shear stresses have been measured during tests carried out with inlet and outlet geometry. The filling method is a very important parameter for the strength of the mass and thereby the pressures...

  11. Silo model tests with sand

    DEFF Research Database (Denmark)

    Munch-Andersen, Jørgen

    Tests have been carried out in a large silo model with Leighton Buzzard Sand. Normal pressures and shear stresses have been measured during tests carried out with inlet and outlet geometry. The filling method is a very important parameter for the strength of the mass and thereby the pressures, as well as the flow pattern during discharge of the silo. During discharge a mixed flow pattern has been identified...

  12. Computational Modeling of Simulation Tests.

    Science.gov (United States)

    1980-06-01

    OCR fragment of the report cover; recoverable details: "Computational Modeling of Simulation Tests" by G. Leigh, W. Chown and B. Harrison, Eric H. Wang Civil Engineering Research Facility, University of New Mexico, Albuquerque, June 1980. The remaining text consists of reference-list fragments (e.g. Kinney 1962; Courant and Friedrichs).

  13. Laboratory Tests of Chameleon Models

    CERN Document Server

    Brax, Philippe; Davis, Anne-Christine; Shaw, Douglas

    2009-01-01

    We present a cursory overview of chameleon models of dark energy and their laboratory tests with an emphasis on optical and Casimir experiments. Optical experiments measuring the ellipticity of an initially polarised laser beam are sensitive to the coupling of chameleons to photons. The next generation of Casimir experiments may be able to unravel the nature of the scalar force mediated by the chameleon between parallel plates.

  14. Proceedings Tenth Workshop on Model Based Testing

    OpenAIRE

    Pakulin, Nikolay; Petrenko, Alexander K.; Schlingloff, Bernd-Holger

    2015-01-01

    The workshop is devoted to model-based testing of both software and hardware. Model-based testing uses models describing the required behavior of the system under consideration to guide such efforts as test selection and test results evaluation. Testing validates the real system behavior against models and checks that the implementation conforms to them, but it is also capable of finding errors in the models themselves. The intent of this workshop is to bring together researchers and users of model...

  15. Remote control missile model test

    Science.gov (United States)

    Allen, Jerry M.; Shaw, David S.; Sawyer, Wallace C.

    1989-01-01

    An extremely large, systematic, axisymmetric body/tail fin data base was gathered through tests of an innovative missile model design which is described herein. These data were originally obtained for incorporation into a missile aerodynamics code based on engineering methods (Program MISSILE3), but can also be used as diagnostic test cases for developing computational methods because of the individual-fin data included in the data base. Detailed analysis of four sample cases from these data are presented to illustrate interesting individual-fin force and moment trends. These samples quantitatively show how bow shock, fin orientation, fin deflection, and body vortices can produce strong, unusual, and computationally challenging effects on individual fin loads. Comparisons between these data and calculations from the SWINT Euler code are also presented.

  16. Testing Strategies for Model-Based Development

    Science.gov (United States)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.

  17. Linear Logistic Test Modeling with R

    Directory of Open Access Journals (Sweden)

    Purya Baghaei

    2014-01-01

    The present paper gives a general introduction to the linear logistic test model (Fischer, 1973), an extension of the Rasch model with linear constraints on item parameters, along with functions from eRm (an R package to estimate different types of Rasch models; Mair, Hatzinger, & Mair, 2014) to estimate the model and interpret its parameters. The applications of the model in test validation, hypothesis testing, cross-cultural studies of test bias, rule-based item generation, and the investigation of construct-irrelevant factors that contribute to item difficulty are explained. The model is applied to an English as a foreign language reading comprehension test and the results are discussed.
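
    The linear constraint that distinguishes the LLTM from the plain Rasch model can be written out explicitly (standard notation; the weights q_ij come from a known design matrix):

```latex
% Rasch model: person ability \theta_v, item difficulty \beta_i
P(X_{vi} = 1) = \frac{\exp(\theta_v - \beta_i)}{1 + \exp(\theta_v - \beta_i)}
% LLTM: each item difficulty is a weighted sum of p basic parameters \eta_j
\beta_i = \sum_{j=1}^{p} q_{ij}\,\eta_j + c
```

    Hypotheses about what makes items difficult then become tests on the basic parameters, which is what enables the rule-based item generation and test-bias studies the abstract mentions.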

  18. Testing Linear Models for Ability Parameters in Item Response Models

    NARCIS (Netherlands)

    Glas, Cees A.W.; Hendrawan, Irene

    2005-01-01

    Methods for testing hypotheses concerning the regression parameters in linear models for the latent person parameters in item response models are presented. Three tests are outlined: a likelihood ratio test, a Lagrange multiplier test, and a Wald test. The tests are derived in a marginal maximum likelihood...

  19. Testing linearity against nonlinear moving average models

    NARCIS (Netherlands)

    de Gooijer, J.G.; Brännäs, K.; Teräsvirta, T.

    1998-01-01

    Lagrange multiplier (LM) test statistics are derived for testing a linear moving average model against an additive smooth transition moving average model. The latter model is introduced in the paper. The small-sample performance of the proposed tests is evaluated in a Monte Carlo study and compared...
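
    LM statistics in such settings generally take the standard auxiliary-regression form (a textbook result, not the paper's specific derivation): with T observations and q restrictions under test,

```latex
LM = T R^{2} \xrightarrow{\;d\;} \chi^{2}_{q} \quad \text{under } H_{0} ,
```

    where R^2 is obtained by regressing the residuals of the restricted (linear MA) model on the score contributions of the nonlinear alternative; only the model under the null hypothesis has to be estimated, which is the practical appeal of LM tests here.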

  20. Software Testing Method Based on Model Comparison

    Institute of Scientific and Technical Information of China (English)

    XIE Xiao-dong; LU Yan-sheng; MAO Cheng-yin

    2008-01-01

    A model comparison based software testing method (MCST) is proposed. In this method, the requirements and programs of the software under test are transformed into the same form and described by the same model description language (MDL). The requirements are transformed into a specification model and the programs into an implementation model. The elements and structures of the two models are then compared, and the differences between them are obtained. Based on the differences, a test suite is generated. Different MDLs can be chosen for the software under test. The usage of two classical MDLs in MCST, the equivalence classes model and the extended finite state machine (EFSM) model, is described with example applications. The results show that the test suites generated by MCST are smaller and more efficient than those of some other testing methods, such as the path-coverage testing method, the object state diagram testing method, etc.
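
    The comparison step can be sketched with a toy MDL in which both models are transition maps (the encoding and all names below are hypothetical, not from the paper): any transition on which the specification model and the implementation model disagree yields a test case.

```python
def diff_models(spec, impl):
    """Transitions that differ between two models, each given as a
    {(state, event): next_state} map."""
    keys = set(spec) | set(impl)
    return {k: (spec.get(k), impl.get(k))
            for k in keys if spec.get(k) != impl.get(k)}

def generate_tests(differences):
    """One test case per differing transition: drive the system to
    `state`, fire `event`, compare the reached state with the spec."""
    return [{"state": s, "event": e, "expected": want, "implemented": got}
            for (s, e), (want, got) in sorted(differences.items())]

spec = {("idle", "start"): "running", ("running", "stop"): "idle"}
impl = {("idle", "start"): "running", ("running", "stop"): "paused"}
print(generate_tests(diff_models(spec, impl)))
```

    Because only the differing elements generate tests, the suite stays small, consistent with the comparison the abstract draws against path-coverage methods.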

  1. Vehicle rollover sensor test modeling

    NARCIS (Netherlands)

    McCoy, R.W.; Chou, C.C.; Velde, R. van de; Twisk, D.; Schie, C. van

    2007-01-01

    A computational model of a mid-size sport utility vehicle was developed using MADYMO. The model includes a detailed description of the suspension system and tire characteristics that incorporated the Delft-Tyre magic formula description. The model was correlated by simulating a vehicle suspension kinematics...

  2. Propfan test assessment testbed aircraft flutter model test report

    Science.gov (United States)

    Jenness, C. M. J.

    1987-01-01

    The PropFan Test Assessment (PTA) program includes flight tests of a propfan power plant mounted on the left wing of a modified Gulfstream II testbed aircraft. A static balance boom is mounted on the right wing tip for lateral balance. Flutter analyses indicate that these installations reduce the wing flutter stabilizing speed and that torsional stiffening and the installation of a flutter stabilizing tip boom are required on the left wing for adequate flutter safety margins. Wind tunnel tests of a 1/9th-scale high-speed flutter model of the testbed aircraft were conducted. The test program included the design, fabrication, and testing of the flutter model and the correlation of the flutter test data with analysis results. Excellent correlations with the test data were achieved in posttest flutter analysis using actual model properties. It was concluded that the flutter analysis method used was capable of accurate flutter predictions for both the (symmetric) twin propfan configuration and the (unsymmetric) single propfan configuration. The flutter analysis also revealed that the differences between the tested model configurations and the current aircraft design caused the (scaled) model flutter speed to be significantly higher than that of the aircraft, at least for the single propfan configuration without a flutter boom. Verification of the aircraft final design should, therefore, be based on flutter predictions made with the test-validated analysis methods.

  3. Testing of constitutive models in LAME.

    Energy Technology Data Exchange (ETDEWEB)

    Hammerand, Daniel Carl; Scherzinger, William Mark

    2007-09-01

    Constitutive models for computational solid mechanics codes are implemented in LAME, the Library of Advanced Materials for Engineering. These models describe complex material behavior and are used in our finite deformation solid mechanics codes. To ensure the correct implementation of these models, regression tests have been created for constitutive models in LAME. A selection of these tests is documented here. Constitutive models are an important part of any solid mechanics code. If an analysis code is meant to provide accurate results, the constitutive models that describe the material behavior need to be implemented correctly. Ensuring the correct implementation of constitutive models is the goal of a testing procedure that is used with the Library of Advanced Materials for Engineering (LAME) (see [1] and [2]). A test suite for constitutive models can serve three purposes. First, the test problems provide the constitutive model developer a means to test the model implementation. This is an activity that is always done by any responsible constitutive model developer. Retaining the test problem in a repository where the problem can be run periodically is an excellent means of ensuring that the model continues to behave correctly. A second purpose of a test suite for constitutive models is that it gives application code developers confidence that the constitutive models work correctly. This is extremely important since any analyst that uses an application code for an engineering analysis will associate a constitutive model in LAME with the application code, not LAME. Therefore, ensuring the correct implementation of constitutive models is essential for application code teams. A third purpose of a constitutive model test suite is that it provides analysts with example problems that they can look at to understand the behavior of a specific model. Since the choice of a constitutive model, and the properties that are used in that model, has an enormous effect on the results of an...

  4. GEOCHEMICAL TESTING AND MODEL DEVELOPMENT - RESIDUAL TANK WASTE TEST PLAN

    Energy Technology Data Exchange (ETDEWEB)

    CANTRELL KJ; CONNELLY MP

    2010-03-09

    This Test Plan describes the testing and chemical analyses for release rate studies on tank residual samples collected following the retrieval of waste from the tank. This work will provide the data required to develop a contaminant release model for the tank residuals from both sludge and salt cake single-shell tanks. The data are intended for use in long-term performance assessment and conceptual model development.

  5. Hydraulic Model Tests on Modified Wave Dragon

    DEFF Research Database (Denmark)

    Hald, Tue; Lynggaard, Jakob

    A floating model of the Wave Dragon (WD) was built in autumn 1998 by the Danish Maritime Institute in scale 1:50, see Sørensen and Friis-Madsen (1999) for reference. This model was subjected to a series of model tests and subsequent modifications at Aalborg University and in the following... are found in Hald and Lynggaard (2001). Model tests and reconstruction are carried out during the phase 3 project: "Wave Dragon. Reconstruction of an existing model in scale 1:50 and sequential tests of changes to the model geometry and mass distribution parameters", sponsored by the Danish Energy Agency...

  6. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    The growing complexity of software and the demand for shorter time to market are two important challenges facing today's IT industry. These challenges demand increases in both the productivity and the quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models help to navigate from one model to another, and to trace back to the respective requirements and the design model when a test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose a relation definition markup language (RDML) for defining the relationships between models.

  7. Used Fuel Testing Transportation Model

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Steven B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Best, Ralph E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Maheras, Steven J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jensen, Philip J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); England, Jeffery L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); LeDuc, Dan [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2014-09-25

    This report identifies shipping packages/casks that might be used by the Used Nuclear Fuel Disposition Campaign Program (UFDC) to ship fuel rods and pieces of fuel rods taken from high-burnup used nuclear fuel (UNF) assemblies to and between research facilities for purposes of evaluation and testing. Also identified are the actions that would need to be taken, if any, to obtain U.S. Nuclear Regulatory (NRC) or other regulatory authority approval to use each of the packages and/or shipping casks for this purpose.

  8. Used Fuel Testing Transportation Model

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Steven B.; Best, Ralph E.; Maheras, Steven J.; Jensen, Philip J.; England, Jeffery L.; LeDuc, Dan

    2014-09-24

    This report identifies shipping packages/casks that might be used by the Used Nuclear Fuel Disposition Campaign Program (UFDC) to ship fuel rods and pieces of fuel rods taken from high-burnup used nuclear fuel (UNF) assemblies to and between research facilities for purposes of evaluation and testing. Also identified are the actions that would need to be taken, if any, to obtain U.S. Nuclear Regulatory (NRC) or other regulatory authority approval to use each of the packages and/or shipping casks for this purpose.

  9. Statistical Tests for Mixed Linear Models

    CERN Document Server

    Khuri, André I; Sinha, Bimal K

    2011-01-01

    An advanced discussion of linear models with mixed or random effects. In recent years a breakthrough has occurred in our ability to draw inferences from exact and optimum tests of variance component models, generating much research activity that relies on linear models with mixed and random effects. This volume covers the most important research of the past decade as well as the latest developments in hypothesis testing. It compiles all currently available results in the area of exact and optimum tests for variance component models and offers the only comprehensive treatment for these models...

  10. Colour Reconnection - Models and Tests

    CERN Document Server

    Christiansen, Jesper R

    2015-01-01

    Recent progress on colour reconnection within the Pythia framework is presented. A new model is introduced, based on the SU(3) structure of QCD and a minimization of the potential string energy. The inclusion of the epsilon structure of SU(3) gives a new baryon production mechanism and makes it possible simultaneously to describe hyperon production at both $e^+e^-$ and pp colliders. Finally, predictions for $e^+e^-$ colliders, both past and potential future ones, are presented.

  11. Model Based Testing for Agent Systems

    Science.gov (United States)

    Zhang, Zhiyong; Thangarajah, John; Padgham, Lin

    Although agent technology is gaining worldwide popularity, a hindrance to its uptake is the lack of proper testing mechanisms for agent-based systems. While many traditional software testing methods can be generalized to agent systems, there are many aspects that are different and which require an understanding of the underlying agent paradigm. In this paper we present certain aspects of a testing framework that we have developed for agent-based systems. The testing framework is a model-based approach using the design models of the Prometheus agent development methodology. In this paper we focus on model-based unit testing: we identify the appropriate units, present mechanisms for generating suitable test cases and for determining the order in which the units are to be tested, and give a brief overview of the unit testing process together with an example. Although we use the design artefacts from Prometheus, the approach is suitable for any plan- and event-based agent system.

  12. TESTING MONETARY EXCHANGE RATE MODELS WITH PANEL COINTEGRATION TESTS

    Directory of Open Access Journals (Sweden)

    Szabo Andrea

    2015-07-01

    The monetary exchange rate models explain the long-run behaviour of the nominal exchange rate. Their central assertion is that there is a long-run equilibrium relationship between the nominal exchange rate and monetary macro-fundamentals. Although these models are essential tools of international macroeconomics, their empirical validity is ambiguous. Previously, time series testing was prevalent in the literature, but it did not bring convincing results. The power of the unit root and cointegration tests is too low to reject the null hypothesis of no cointegration between the variables. This power can be enhanced by arranging the data in a panel data set, which allows several time series to be analysed simultaneously and increases the number of observations. We conducted a weak empirical test of the monetary exchange rate models by testing for cointegration between the variables in three panels. We investigated 6, 10 and 15 OECD countries during the following periods: 1976Q1-2011Q4, 1985Q1-2011Q4 and 1996Q1-2011Q4. We tested the reduced form of the monetary exchange rate models in three specifications: two restricted models and an unrestricted model. Since cointegration can only be interpreted among non-stationary processes, we investigate the order of integration of our variables with the IPS, Fisher-ADF and Fisher-PP panel unit root tests and the Hadri panel stationarity test. All the variables can be unit root processes; therefore we analyse cointegration with the Pedroni and Kao panel cointegration tests. The restricted models performed better than the unrestricted one, and we obtained the best results with the 1985Q1-2011Q4 panel. The Kao test rejects the null hypothesis of no cointegration between the variables in all the specifications and all the panels, but the Pedroni test does not show such a positive picture. Hence we found only moderate support for the monetary exchange rate models.
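
    The reduced form being tested is the standard flexible-price monetary equation (textbook notation, not copied from the paper; asterisks denote the foreign country):

```latex
s_t = \beta_0 + \beta_1 (m_t - m_t^{*}) + \beta_2 (y_t - y_t^{*}) + u_t
```

    where s_t is the log nominal exchange rate, m_t the log money supply and y_t log real income; the restricted specifications impose values such as \beta_1 = 1 on the coefficients, and the cointegration claim is that u_t is stationary even though the individual variables are unit root processes.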

  13. Linear Logistic Test Modeling with R

    Science.gov (United States)

    Baghaei, Purya; Kubinger, Klaus D.

    2015-01-01

    The present paper gives a general introduction to the linear logistic test model (Fischer, 1973), an extension of the Rasch model with linear constraints on item parameters, along with eRm (an R package to estimate different types of Rasch models; Mair, Hatzinger, & Mair, 2014) functions to estimate the model and interpret its parameters. The…
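
    The linear constraint at the heart of the LLTM can be shown in a few lines: item difficulties are linear combinations of basic operation parameters through a design (Q) matrix. The matrix and parameter values below are hypothetical; in practice they would be estimated from response data with eRm in R.

```python
import numpy as np

# Hypothetical design (Q) matrix: 4 items x 2 cognitive operations;
# q[i, k] counts how often item i requires operation k.
Q = np.array([[1, 0],
              [0, 1],
              [1, 1],
              [2, 1]])

# Hypothetical basic parameters (difficulty of each operation).
eta = np.array([0.5, -0.3])

# LLTM constraint: item difficulties are linear in eta.
beta = Q @ eta

def rasch_prob(theta, beta):
    """P(correct answer) at ability theta under the Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - beta)))

print(beta)                    # item difficulties implied by the constraint
print(rasch_prob(0.0, beta))   # response probabilities at theta = 0
```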

  14. Cost Modeling for SOC Modules Testing

    Directory of Open Access Journals (Sweden)

    Balwinder Singh

    2013-08-01

    Full Text Available The complexity of system design is increasing very rapidly as the number of transistors on Integrated Circuits (ICs) doubles as per Moore's law. There is a big challenge in testing such complex VLSI circuits, in which the whole system is integrated into a single chip called a System on Chip (SOC). The cost of testing an SOC also increases with complexity. Cost modeling plays a vital role in reducing test cost and time to market. This paper covers cost modeling of SOC module testing, where the SOC contains both analog and digital modules. The various test cost parameters and equations are taken from previous work. Mathematical relations are developed for the cost model, and the cost modeling equations are then implemented in a Graphical User Interface (GUI) in MATLAB, which can be used as a cost estimation tool. A case study calculates the cost of SOC testing due to Logic Built-In Self-Test (LBIST) and Memory Built-In Self-Test (MBIST). VLSI test engineers can benefit from such cost estimation tools for test planning.
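
    As a toy illustration of the kind of relation such a tool evaluates (the parameter names and values below are invented, not the paper's actual equations), test cost can be modeled as tester time plus fixed setup plus the silicon-area overhead of BIST logic:

```python
def soc_test_cost(tester_rate_per_s, test_time_s, fixed_cost,
                  bist_area_mm2, cost_per_mm2):
    """Illustrative SOC test cost: tester time, fixed setup, and the
    silicon-area overhead of built-in self-test (BIST) logic."""
    return (fixed_cost
            + tester_rate_per_s * test_time_s
            + bist_area_mm2 * cost_per_mm2)

# Hypothetical numbers for an LBIST + MBIST scenario.
lbist = soc_test_cost(0.05, 120.0, 200.0, 1.5, 40.0)
mbist = soc_test_cost(0.05, 60.0, 150.0, 0.8, 40.0)
total = lbist + mbist
print(total)
```

    A GUI cost estimator of the kind described would evaluate such relations over user-supplied parameters.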

  15. Biglan Model Test Based on Institutional Diversity.

    Science.gov (United States)

    Roskens, Ronald W.; Creswell, John W.

    The Biglan model, a theoretical framework for empirically examining the differences among subject areas, classifies according to three dimensions: adherence to common set of paradigms (hard or soft), application orientation (pure or applied), and emphasis on living systems (life or nonlife). Tests of the model are reviewed, and a further test is…

  16. Graphical Models and Computerized Adaptive Testing.

    Science.gov (United States)

    Mislevy, Robert J.; Almond, Russell G.

    This paper synthesizes ideas from the fields of graphical modeling and education testing, particularly item response theory (IRT) applied to computerized adaptive testing (CAT). Graphical modeling can offer IRT a language for describing multifaceted skills and knowledge, and disentangling evidence from complex performances. IRT-CAT can offer…
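
    A common IRT-CAT item-selection rule (one of several, and not necessarily the synthesis proposed in the paper) picks the item with maximal Fisher information at the current ability estimate. A sketch with a hypothetical 2PL item bank:

```python
import math

def p_2pl(theta, a, b):
    """2PL item response probability at ability theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_info(theta, a, b):
    """Fisher information of a 2PL item at theta: a^2 * p * (1 - p)."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

# Hypothetical item bank: (discrimination a, difficulty b) pairs.
bank = [(1.0, -1.5), (1.2, 0.0), (0.8, 0.5), (1.5, 1.8)]

def next_item(theta, bank):
    """Maximum-information selection: the usual greedy CAT rule."""
    return max(range(len(bank)), key=lambda i: fisher_info(theta, *bank[i]))

print(next_item(0.0, bank))   # prints the index of the most informative item
```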

  17. Multivariate Model for Test Response Analysis

    NARCIS (Netherlands)

    Krishnan, Shaji; Kerkhoff, Hans G.

    2010-01-01

    A systematic approach to construct an effective multivariate test response model for capturing manufacturing defects in electronic products is described. The effectiveness of the model is demonstrated by its capability in reducing the number of test-points, while achieving the maximal coverage

  18. Test-driven modeling of embedded systems

    DEFF Research Database (Denmark)

    Munck, Allan; Madsen, Jan

    2015-01-01

    To benefit maximally from model-based systems engineering (MBSE) trustworthy high quality models are required. From the software disciplines it is known that test-driven development (TDD) can significantly increase the quality of the products. Using a test-driven approach with MBSE may have a sim...

  20. Port Adriano, 2D-Model Tests

    DEFF Research Database (Denmark)

    Burcharth, Hans F.; Andersen, Thomas Lykke; Jensen, Palle Meinert

    This report presents the results of 2D physical model tests (length scale 1:50) carried out in a wave flume at the Dept. of Civil Engineering, Aalborg University (AAU).

  2. Testing Models for Structure Formation

    CERN Document Server

    Kaiser, N

    1993-01-01

    I review a number of tests of theories for structure formation. Large-scale flows and IRAS galaxies indicate a high density parameter $\Omega \simeq 1$, in accord with inflationary predictions, but it is not clear how this meshes with the uniformly low values obtained from virial analysis on scales $\sim$ 1 Mpc. Gravitational distortion of faint galaxies behind clusters allows one to construct maps of the mass surface density, and this should shed some light on the large vs small-scale $\Omega$ discrepancy. Power spectrum analysis reveals too red a spectrum (compared to standard CDM) on scales $\lambda \sim 10-100$ $h^{-1}$ Mpc, but the gaussian fluctuation hypothesis appears to be in good shape. These results suggest that the problem for CDM lies not in the very early universe --- the inflationary predictions of $\Omega = 1$ and gaussianity both seem to be OK; furthermore, the COBE result severely restricts modifications such as tilting the primordial spectrum --- but in the assumed matter content. The power s...

  3. The Couplex test cases: models and lessons

    Energy Technology Data Exchange (ETDEWEB)

    Bourgeat, A. [Lyon-1 Univ., MCS, 69 - Villeurbanne (France); Kern, M. [Institut National de Recherches Agronomiques (INRA), 78 - Le Chesnay (France); Schumacher, S.; Talandier, J. [Agence Nationale pour la Gestion des Dechets Radioactifs (ANDRA), 92 - Chatenay Malabry (France)

    2003-07-01

    The Couplex test cases are a set of numerical test models for nuclear waste deep geological disposal simulation. They are centered around the numerical issues arising in the near and far field transport simulation. They were used in an international contest, and are now becoming a reference in the field. We present the models used in these test cases, and show sample results from the award winning teams. (authors)

  4. FABASOFT BEST PRACTICES AND TEST METRICS MODEL

    Directory of Open Access Journals (Sweden)

    Nadica Hrgarek

    2007-06-01

    Full Text Available Software companies face serious problems in measuring the progress of test activities and the quality of software products, in order to estimate test completion criteria and whether the shipment milestone will be reached on time. Measurement is a key activity in the testing life cycle and requires an established, managed and well-documented test process, defined software quality attributes, quantitative measures, and the use of test management and bug tracking tools. Test metrics are a subset of software metrics (product metrics, process metrics) and enable the measurement and quality improvement of the test process and/or software product. The goal of this paper is to briefly present Fabasoft best practices and lessons learned during functional and system testing of big complex software products, and to describe a simple test metrics model applied to the software test process with the purpose of better controlling software projects and measuring and increasing software quality.

  5. The Vanishing Tetrad Test: Another Test of Model Misspecification

    Science.gov (United States)

    Roos, J. Micah

    2014-01-01

    The Vanishing Tetrad Test (VTT) (Bollen, Lennox, & Dahly, 2009; Bollen & Ting, 2000; Hipp, Bauer, & Bollen, 2005) is an extension of the Confirmatory Tetrad Analysis (CTA) proposed by Bollen and Ting (Bollen & Ting, 1993). VTT is a powerful tool for detecting model misspecification and can be particularly useful in cases in which…

  7. Cost Modeling for SOC Modules Testing

    OpenAIRE

    Balwinder Singh; Arun Khosla; Sukhleen B. Narang

    2013-01-01

    The complexity of the system design is increasing very rapidly as the number of transistors on Integrated Circuits (IC) doubles as per Moore’s law.There is big challenge of testing this complex VLSI circuit, in which whole system is integrated into a single chip called System on Chip (SOC). Cost of testing the SOC is also increasing with complexity. Cost modeling plays a vital role in reduction of test cost and time to market. This paper includes the cost modeling of the SOC Module testing...

  8. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used…

  9. Do Test Design and Uses Influence Test Preparation? Testing a Model of Washback with Structural Equation Modeling

    Science.gov (United States)

    Xie, Qin; Andrews, Stephen

    2013-01-01

    This study introduces Expectancy-value motivation theory to explain the paths of influences from perceptions of test design and uses to test preparation as a special case of washback on learning. Based on this theory, two conceptual models were proposed and tested via Structural Equation Modeling. Data collection involved over 870 test takers of…

  11. Standardized Tests and Froebel's Original Kindergarten Model

    Science.gov (United States)

    Jeynes, William H.

    2006-01-01

    The author argues that American educators rely on standardized tests at too early an age when administered in kindergarten, particularly given the original intent of kindergarten as envisioned by its founder, Friedrich Froebel. The author examines the current use of standardized tests in kindergarten and the Froebel model, including his emphasis…

  12. Horns Rev II, 2-D Model Tests

    DEFF Research Database (Denmark)

    Andersen, Thomas Lykke; Frigaard, Peter

    This report presents the results of 2D physical model tests carried out in the shallow wave flume at the Dept. of Civil Engineering, Aalborg University (AAU), on behalf of Energy E2 A/S, part of DONG Energy A/S, Denmark. The objective of the tests was to investigate the combined influence of the pile...

  13. Sample Size Determination for Rasch Model Tests

    Science.gov (United States)

    Draxler, Clemens

    2010-01-01

    This paper is concerned with supplementing statistical tests for the Rasch model so that, in addition to the probability of the error of the first kind (Type I probability), the probability of the error of the second kind (Type II probability) can be controlled at a predetermined level by basing the test on the appropriate number of observations.…
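
    The idea of choosing n to control both error probabilities can be sketched with the familiar normal-approximation formula n = ((z_{1-α/2} + z_{1-β}) / δ)². This is a generic power calculation under an assumed standardized effect size δ, not Draxler's Rasch-specific procedure:

```python
import math

def z_quantile(p):
    """Standard normal quantile, via Newton iteration on the CDF."""
    z = 0.0
    for _ in range(60):
        cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
        pdf = math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)
        z -= (cdf - p) / pdf
    return z

def sample_size(delta, alpha=0.05, beta=0.20):
    """Smallest n giving Type I level alpha (two-sided) and Type II
    level beta when detecting a standardized effect delta."""
    z_a = z_quantile(1.0 - alpha / 2.0)
    z_b = z_quantile(1.0 - beta)
    return math.ceil(((z_a + z_b) / delta) ** 2)

print(sample_size(0.5))   # observations needed for a medium effect, 80% power
```

    Smaller effects require disproportionately more observations, which is the trade-off the paper formalizes for Rasch model tests.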

  14. Modelling and Testing of Friction in Forging

    DEFF Research Database (Denmark)

    Bay, Niels

    2007-01-01

    Knowledge about friction in forging is still limited. The theoretical models presently applied for process analysis are not satisfactory compared to the advanced and detailed studies possible with plastic FEM analyses, and more refined models have to be based on experimental testing...

  15. Testing inequality constrained hypotheses in SEM Models

    NARCIS (Netherlands)

    Van de Schoot, R.; Hoijtink, H.J.A.; Dekovic, M.

    2010-01-01

    Researchers often have expectations that can be expressed in the form of inequality constraints among the parameters of a structural equation model. It is currently not possible to test these so-called informative hypotheses in structural equation modeling software. We offer a solution to this problem…

  16. Modeling Answer Changes on Test Items

    Science.gov (United States)

    van der Linden, Wim J.; Jeon, Minjeong

    2012-01-01

    The probability of test takers changing answers upon review of their initial choices is modeled. The primary purpose of the model is to check erasures on answer sheets recorded by an optical scanner for numbers and patterns that may be indicative of irregular behavior, such as teachers or school administrators changing answer sheets after their…

  17. Modeling Nonignorable Missing Data in Speeded Tests

    Science.gov (United States)

    Glas, Cees A. W.; Pimentel, Jonald L.

    2008-01-01

    In tests with time limits, items at the end are often not reached. Usually, the pattern of missing responses depends on the ability level of the respondents; therefore, missing data are not ignorable in statistical inference. This study models data using a combination of two item response theory (IRT) models: one for the observed response data and…

  18. Improved testing inference in mixed linear models

    CERN Document Server

    Melo, Tatiane F N; Cribari-Neto, Francisco; 10.1016/j.csda.2008.12.007

    2011-01-01

    Mixed linear models are commonly used in repeated measures studies. They account for the dependence amongst observations obtained from the same experimental unit. Oftentimes, the number of observations is small, and it is thus important to use inference strategies that incorporate small sample corrections. In this paper, we develop modified versions of the likelihood ratio test for fixed effects inference in mixed linear models. In particular, we derive a Bartlett correction to such a test and also to a test obtained from a modified profile likelihood function. Our results generalize those in Zucker et al. (Journal of the Royal Statistical Society B, 2000, 62, 827-838) by allowing the parameter of interest to be vector-valued. Additionally, our Bartlett corrections allow for random effects nonlinear covariance matrix structure. We report numerical evidence which shows that the proposed tests display superior finite sample behavior relative to the standard likelihood ratio test. An application is also presented...

  19. Graded CTL Model Checking for Test Generation

    CERN Document Server

    Napoli, Margherita

    2011-01-01

    Recently there has been great attention from the scientific community towards the use of the model-checking technique as a tool for test generation in the simulation field. This paper aims to provide a useful means of gaining more insight along these lines. By applying recent results in the field of graded temporal logics, we present a new efficient model-checking algorithm for Hierarchical Finite State Machines (HSM), a well-established formalism long and widely used for representing hierarchical models of discrete systems. Performing model checking against specifications expressed using graded temporal logics has the peculiarity of returning more counterexamples within a single run. We think that this can greatly improve the efficacy of automatically obtaining test cases. In particular we verify two different models of HSM against branching-time temporal properties.

  20. A 'Turing' Test for Landscape Evolution Models

    Science.gov (United States)

    Parsons, A. J.; Wise, S. M.; Wainwright, J.; Swift, D. A.

    2008-12-01

    Resolving the interactions among tectonics, climate and surface processes at long timescales has benefited from the development of computer models of landscape evolution. However, testing these Landscape Evolution Models (LEMs) has been piecemeal and partial. We argue that a more systematic approach is required. What is needed is a test that will establish how 'realistic' an LEM is and thus the extent to which its predictions may be trusted. We propose a test based upon the Turing Test of artificial intelligence as a way forward. In 1950 Alan Turing posed the question of whether a machine could think. Rather than attempt to address the question directly he proposed a test in which an interrogator asked questions of a person and a machine, with no means of telling which was which. If the machine's answer could not be distinguished from those of the human, the machine could be said to demonstrate artificial intelligence. By analogy, if an LEM cannot be distinguished from a real landscape it can be deemed to be realistic. The Turing test of intelligence is a test of the way in which a computer behaves. The analogy in the case of an LEM is that it should show realistic behaviour in terms of form and process, both at a given moment in time (punctual) and in the way both form and process evolve over time (dynamic). For some of these behaviours, tests already exist. For example there are numerous morphometric tests of punctual form and measurements of punctual process. The test discussed in this paper provides new ways of assessing dynamic behaviour of an LEM over realistically long timescales. However challenges remain in developing an appropriate suite of challenging tests, in applying these tests to current LEMs and in developing LEMs that pass them.

  1. Observation-Based Modeling for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.-G.

    2009-01-01

    One of the single most important reasons that modeling and model-based testing are not yet common practice in industry is the perceived difficulty of making the models up to the level of detail and quality required for their automated processing. Models unleash their full potential only through suffi…

  3. Engineering Abstractions in Model Checking and Testing

    DEFF Research Database (Denmark)

    Achenbach, Michael; Ostermann, Klaus

    2009-01-01

    Abstractions are used in model checking to tackle problems like state space explosion or modeling of IO. The application of these abstractions in real software development processes, however, lacks engineering support. This is one reason why model checking is not widely used in practice yet, and testing is still state of the art in falsification. We show how user-defined abstractions can be integrated into a Java PathFinder setting with tools like AspectJ or Javassist and discuss implications of remaining weaknesses of these tools. We believe that a principled engineering approach to designing and implementing abstractions will improve the applicability of model checking in practice.

  4. Unit testing, model validation, and biological simulation

    Science.gov (United States)

    Watts, Mark D.; Ghayoomie, S. Vahid; Larson, Stephen D.; Gerkin, Richard C.

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models. PMID:27635225
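
    The distinction the authors draw between unit tests and model validation tests can be illustrated with a toy example (hypothetical code, not OpenWorm's actual suite): a unit test checks a deterministic implementation detail, while a validation test checks agreement with plausible experimental bounds.

```python
import unittest

def membrane_resting_potential(leak_mV=-65.0, offset_mV=0.0):
    """Toy stand-in for a model computation (hypothetical, not OpenWorm code)."""
    return leak_mV + offset_mV

class TestModel(unittest.TestCase):
    # Unit test: deterministic check of an implementation detail.
    def test_offset_is_additive(self):
        self.assertEqual(membrane_resting_potential(-65.0, 5.0), -60.0)

    # Model validation test: agreement with plausible physiology.
    def test_resting_potential_in_range(self):
        self.assertTrue(-90.0 <= membrane_resting_potential() <= -40.0)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestModel)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all tests passed:", result.wasSuccessful())
```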

  5. A Specification Test of Stochastic Diffusion Models

    Institute of Scientific and Technical Information of China (English)

    Shu-lin ZHANG; Zheng-hong WEI; Qiu-xiang BI

    2013-01-01

    In this paper, we propose a hypothesis testing approach to checking model misspecification in continuous-time stochastic diffusion models. The key idea behind the development of our test statistic is rooted in the generalized information equality in the context of martingale estimating equations. We propose a bootstrap resampling method to implement the proposed diagnostic procedure numerically. Through intensive simulation studies, we show that our approach performs well in terms of Type I error control, power improvement and computational efficiency.

  6. Testing cosmological models with COBE data

    Energy Technology Data Exchange (ETDEWEB)

    Torres, S. [Observatorio Astronomico, Bogotá (Colombia)]|[Centro Internacional de Fisica, Bogotá (Colombia); Cayon, L. [Lawrence Berkeley Laboratory and Center for Particle Astrophysics, Berkeley (United States); Martinez-Gonzalez, E.; Sanz, J. L. [Santander, Univ. de Cantabria (Spain). Instituto de Fisica. Consejo Superior de Investigaciones Cientificas

    1997-02-01

    The authors test cosmological models with Ω < 1 using the COBE two-year cross-correlation function by means of a maximum-likelihood test with Monte Carlo realizations of several Ω models. Assuming a Harrison-Zel'dovich primordial power spectrum with amplitude ∝ Q, it is found that there is a large region in the (Ω, Q) parameter space that fits the data equally well. They find that the flatness of the universe is not implied by the data. A summary of other analyses of COBE data to constrain the shape of the primordial spectrum is presented.

  7. Design, modeling and testing of data converters

    CERN Document Server

    Kiaei, Sayfe; Xu, Fang

    2014-01-01

    This book presents a scientific discussion of state-of-the-art techniques and designs for the modeling, testing and performance analysis of data converters. The focus is on sustainable data conversion. Sustainability has become a public issue that industries and users cannot ignore. Devising environmentally friendly solutions for data converter design, modeling and testing is nowadays a requirement that researchers and practitioners must consider in their activities. This book presents the outcome of the IWADC workshop 2011, held in Orvieto, Italy.

  8. Experimental Concepts for Testing Seismic Hazard Models

    Science.gov (United States)

    Marzocchi, W.; Jordan, T. H.

    2015-12-01

    Seismic hazard analysis is the primary interface through which useful information about earthquake rupture and wave propagation is delivered to society. To account for the randomness (aleatory variability) and limited knowledge (epistemic uncertainty) of these natural processes, seismologists must formulate and test hazard models using the concepts of probability. In this presentation, we will address the scientific objections that have been raised over the years against probabilistic seismic hazard analysis (PSHA). Owing to the paucity of observations, we must rely on expert opinion to quantify the epistemic uncertainties of PSHA models (e.g., in the weighting of individual models from logic-tree ensembles of plausible models). The main theoretical issue is a frequentist critique: subjectivity is immeasurable; ergo, PSHA models cannot be objectively tested against data; ergo, they are fundamentally unscientific. We have argued (PNAS, 111, 11973-11978) that the Bayesian subjectivity required for casting epistemic uncertainties can be bridged with the frequentist objectivity needed for pure significance testing through "experimental concepts." An experimental concept specifies collections of data, observed and not yet observed, that are judged to be exchangeable (i.e., with a joint distribution independent of the data ordering) when conditioned on a set of explanatory variables. We illustrate, through concrete examples, experimental concepts useful in the testing of PSHA models for ontological errors in the presence of aleatory variability and epistemic uncertainty. In particular, we describe experimental concepts that lead to exchangeable binary sequences that are statistically independent but not identically distributed, showing how the Bayesian concept of exchangeability generalizes the frequentist concept of experimental repeatability. We also address the issue of testing PSHA models using spatially correlated data.

  9. [Thurstone model application to difference sensory tests].

    Science.gov (United States)

    Angulo, Ofelia; O'Mahony, Michael

    2009-12-01

    Part of understanding why judges perform better on some difference tests than others requires an understanding of how information coming from the mouth to the brain is processed. For some tests it is processed more efficiently than others. This is described by what has been called Thurstonian modeling. This brief review introduces the concepts and ideas involved in Thurstonian modeling as applied to sensory difference measurement. It summarizes the literature concerned with the theorizing and confirmation of Thurstonian models. It introduces the important concept of stimulus variability and the fundamental measure of sensory difference: d'. It indicates how the paradox of discriminatory non-discriminators, which had puzzled researchers for years, can be simply explained using the model. It considers how memory effects and the complex interactions in the mouth can reduce d' by increasing the variance of sensory distributions.
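
    For the simplest protocols the Thurstonian computation is short. A hedged sketch for the 2-AFC (paired comparison) test, where d' = √2 · z(pc); the proportion correct below is illustrative:

```python
from statistics import NormalDist
import math

def d_prime_2afc(proportion_correct):
    """Thurstonian d' for the 2-AFC protocol: d' = sqrt(2) * z(pc),
    where z is the standard normal quantile function."""
    return math.sqrt(2.0) * NormalDist().inv_cdf(proportion_correct)

# Hypothetical panel result: 76% correct in a paired-comparison test.
print(round(d_prime_2afc(0.76), 2))
```

    Protocols such as the triangle test map the same proportion correct to a larger d', which is how Thurstonian modeling explains the paradox of discriminatory non-discriminators mentioned above.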

  10. Electroweak tests of the Standard Model

    CERN Document Server

    Erler, Jens

    2012-01-01

    Electroweak precision tests of the Standard Model of the fundamental interactions are reviewed ranging from the lowest to the highest energy experiments. Results from global fits are presented with particular emphasis on the extraction of fundamental parameters such as the Fermi constant, the strong coupling constant, the electroweak mixing angle, and the mass of the Higgs boson. Constraints on physics beyond the Standard Model are also discussed.

  12. Testing mechanistic models of growth in insects

    OpenAIRE

    Maino, James L.; Kearney, Michael R.

    2015-01-01

    Insects are typified by their small size, large numbers, impressive reproductive output and rapid growth. However, insect growth is not simply rapid; rather, insects follow a qualitatively distinct trajectory to many other animals. Here we present a mechanistic growth model for insects and show that increasing specific assimilation during the growth phase can explain the near-exponential growth trajectory of insects. The presented model is tested against growth data on 50 insects, and compare...

  13. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  14. Modeling nonignorable missing data in speeded tests

    NARCIS (Netherlands)

    Glas, Cees A.W.; Pimentel, Jonald L.

    2008-01-01

    In tests with time limits, items at the end are often not reached. Usually, the pattern of missing responses depends on the ability level of the respondents; therefore, missing data are not ignorable in statistical inference. This study models data using a combination of two item response theory (IRT) models: one for the observed response data and…

  15. Mechanism test bed. Flexible body model report

    Science.gov (United States)

    Compton, Jimmy

    1991-01-01

    The Space Station Mechanism Test Bed is a six degree-of-freedom motion simulation facility used to evaluate docking and berthing hardware mechanisms. A generalized rigid body math model was developed which allowed the computation of vehicle relative motion in six DOF due to forces and moments from mechanism contact, attitude control systems, and gravity. No vehicle size limitations were imposed in the model. The equations of motion were based on Hill's equations for translational motion with respect to a nominal circular earth orbit and Newton-Euler equations for rotational motion. This rigid body model and supporting software were being refined.
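
    The translational part of such a model can be sketched directly from the unforced Hill (Clohessy-Wiltshire) equations; the mean motion and initial state below are illustrative, not the facility's actual parameters.

```python
import math

def hill_derivs(state, n):
    """Unforced Hill (Clohessy-Wiltshire) equations: motion relative to a
    circular reference orbit with mean motion n (x radial, y along-track,
    z cross-track)."""
    x, y, z, vx, vy, vz = state
    ax = 3.0 * n * n * x + 2.0 * n * vy
    ay = -2.0 * n * vx
    az = -(n * n) * z
    return (vx, vy, vz, ax, ay, az)

def rk4_step(state, n, dt):
    """One classical fourth-order Runge-Kutta step."""
    def shift(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = hill_derivs(state, n)
    k2 = hill_derivs(shift(state, k1, dt / 2.0), n)
    k3 = hill_derivs(shift(state, k2, dt / 2.0), n)
    k4 = hill_derivs(shift(state, k3, dt), n)
    return tuple(s + dt / 6.0 * (a + 2.0 * b + 2.0 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

n = 2.0 * math.pi / 5400.0                 # illustrative 90-minute orbit
state = (0.0, 0.0, 100.0, 0.0, 0.0, 0.0)   # 100 m cross-track offset, at rest
for _ in range(5400):                      # propagate one full orbit, dt = 1 s
    state = rk4_step(state, n, 1.0)
print(round(state[2], 1))                  # cross-track motion is periodic
```

    Mechanism contact forces and attitude-control moments would enter as forcing terms on the right-hand sides.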

  16. Testing models of tree canopy structure

    Energy Technology Data Exchange (ETDEWEB)

    Martens, S.N. (Los Alamos National Laboratory, NM (United States))

    1994-06-01

    Models of tree canopy structure are difficult to test because of a lack of data which are suitably detailed. Previously, I have made three-dimensional reconstructions of individual trees from measured data. These reconstructions have been used to test assumptions about the dispersion of canopy elements in two- and three-dimensional space. Lacunarity analysis has also been used to describe the texture of the reconstructed canopies. Further tests regarding models of the nature of tree branching structures have been made. Results using probability distribution functions for branching measured from real trees show that branching in Juglans is not Markovian. Specific constraints or rules are necessary to achieve simulations of branching structure which are faithful to the originally measured trees.

  17. Testing mechanistic models of growth in insects.

    Science.gov (United States)

    Maino, James L; Kearney, Michael R

    2015-11-22

Insects are typified by their small size, large numbers, impressive reproductive output and rapid growth. However, insect growth is not simply rapid; rather, insects follow a qualitatively distinct trajectory from that of many other animals. Here we present a mechanistic growth model for insects and show that increasing specific assimilation during the growth phase can explain the near-exponential growth trajectory of insects. The presented model is tested against growth data on 50 insects, and compared against other mechanistic growth models. Unlike the other mechanistic models, our growth model predicts energy reserves per biomass to increase with age, which implies a higher production efficiency and energy density of biomass in later instars. These predictions are tested against data compiled from the literature, whereby it is confirmed that insects increase their production efficiency (by 24 percentage points) and energy density (by 4 J mg^-1) between hatching and the attainment of full size. The model suggests that insects achieve greater production efficiencies and enhanced growth rates by increasing specific assimilation and increasing energy reserves per biomass, which are less costly to maintain than structural biomass. Our findings illustrate how the explanatory and predictive power of mechanistic growth models comes from their grounding in underlying biological processes.
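The contrast drawn here, between assimilation that scales with mass (near-exponential growth) and classical surface-area-limited assimilation (decelerating, von Bertalanffy-style growth), can be illustrated with a toy integration. The coefficients A and B are arbitrary assumptions, not parameters from the paper.

```python
from scipy.integrate import solve_ivp

A, B = 0.3, 0.1  # illustrative assimilation and maintenance coefficients (assumed)

def von_bertalanffy(t, m):
    # surface-area-limited assimilation (~ m^(2/3)) gives decelerating growth
    return [A * m[0] ** (2 / 3) - B * m[0]]

def mass_scaled(t, m):
    # assimilation scaling with mass gives near-exponential growth
    return [A * m[0] - B * m[0]]

vb = solve_ivp(von_bertalanffy, (0, 20), [1.0]).y[0]
ex = solve_ivp(mass_scaled, (0, 20), [1.0]).y[0]
print(vb[-1], ex[-1])  # the mass-scaled trajectory grows much faster (~ e^(0.2*20))
```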

  18. Interpretation of test data with dynamic modeling

    Energy Technology Data Exchange (ETDEWEB)

    Biba, P. [Southern California Edison, San Clemente, CA (United States). San Onofre Nuclear Generating Station

    1999-11-01

The in-service testing of many important-to-safety components, such as valves and pumps, is often performed while the plant is either shut down or the particular system is in a test mode. Thus the test conditions may differ from the actual operating conditions under which the components would be required to operate. In addition, the components must function under various postulated accident scenarios, which cannot be duplicated during normal plant operation. This paper deals with a method of interpreting the test data with a dynamic model, which allows the evaluation of the many factors affecting system performance, in order to assure component and system operability.

  19. Testing Parametric versus Semiparametric Modelling in Generalized Linear Models

    NARCIS (Netherlands)

    Härdle, W.K.; Mammen, E.; Müller, M.D.

    1996-01-01

    We consider a generalized partially linear model E(Y|X,T) = G{X'b + m(T)} where G is a known function, b is an unknown parameter vector, and m is an unknown function.The paper introduces a test statistic which allows to decide between a parametric and a semiparametric model: (i) m is linear, i.e. m(

  20. Parametric Testing of Launch Vehicle FDDR Models

    Science.gov (United States)

    Schumann, Johann; Bajwa, Anupa; Berg, Peter; Thirumalainambi, Rajkumar

    2011-01-01

For the safe operation of a complex system like a (manned) launch vehicle, real-time information about the state of the system and potential faults is extremely important. The on-board FDDR (Failure Detection, Diagnostics, and Response) system is a software system to detect and identify failures, provide real-time diagnostics, and initiate fault recovery and mitigation. The ERIS (Evaluation of Rocket Integrated Subsystems) failure simulation is a unified Matlab/Simulink model of the Ares I Launch Vehicle with modular, hierarchical subsystems and components. With this model, the nominal flight performance characteristics can be studied. Additionally, failures can be injected to see their effects on vehicle state and vehicle behavior. A comprehensive test and analysis of such a complicated model is virtually impossible. In this paper, we describe how parametric testing (PT) can be used to support testing and analysis of the ERIS failure simulation. PT uses a combination of Monte Carlo techniques with n-factor combinatorial exploration to generate a small, yet comprehensive set of parameters for the test runs. For the analysis of the high-dimensional simulation data, we use multivariate clustering to automatically find structure in this high-dimensional data space. Our tools can generate detailed HTML reports that facilitate the analysis.
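The PT strategy, Monte Carlo sampling combined with n-factor combinatorial coverage, can be sketched for n = 2 (pairwise coverage) as a greedy random search. The factor names and values below are invented stand-ins, not the actual ERIS failure-injection parameters.

```python
import itertools
import random

random.seed(0)

# Hypothetical test factors (illustrative only).
params = {
    "engine_fault": ["none", "underthrust", "shutdown"],
    "sensor_bias": [0.0, 0.5, 1.0],
    "wind_gust": ["low", "high"],
}

def uncovered_pairs(suite, params):
    """All value pairs of factor combinations not yet exercised by the suite."""
    pairs = set()
    for (fa, va), (fb, vb) in itertools.combinations(params.items(), 2):
        pairs.update((fa, x, fb, y) for x in va for y in vb)
    for run in suite:
        for fa, fb in itertools.combinations(params, 2):
            pairs.discard((fa, run[fa], fb, run[fb]))
    return pairs

# Greedy Monte Carlo: sample random runs, keep each one that covers new pairs.
suite = []
while uncovered_pairs(suite, params):
    run = {k: random.choice(v) for k, v in params.items()}
    if len(uncovered_pairs(suite + [run], params)) < len(uncovered_pairs(suite, params)):
        suite.append(run)

print(len(suite))  # number of runs needed to cover every pairwise interaction
```

For this small space a pairwise-covering suite typically needs on the order of 9 to 12 runs, versus 18 for the exhaustive 3 × 3 × 2 product; the saving grows rapidly with the number of factors.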

  1. A Method to Test Model Calibration Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, Ron; Polly, Ben; Neymark, Joel

    2016-08-26

    This paper describes a method for testing model calibration techniques. Calibration is commonly used in conjunction with energy retrofit audit models. An audit is conducted to gather information about the building needed to assemble an input file for a building energy modeling tool. A calibration technique is used to reconcile model predictions with utility data, and then the 'calibrated model' is used to predict energy savings from a variety of retrofit measures and combinations thereof. Current standards and guidelines such as BPI-2400 and ASHRAE-14 set criteria for 'goodness of fit' and assume that if the criteria are met, then the calibration technique is acceptable. While it is logical to use the actual performance data of the building to tune the model, it is not certain that a good fit will result in a model that better predicts post-retrofit energy savings. Therefore, the basic idea here is that the simulation program (intended for use with the calibration technique) is used to generate surrogate utility bill data and retrofit energy savings data against which the calibration technique can be tested. This provides three figures of merit for testing a calibration technique, 1) accuracy of the post-retrofit energy savings prediction, 2) closure on the 'true' input parameter values, and 3) goodness of fit to the utility bill data. The paper will also discuss the pros and cons of using this synthetic surrogate data approach versus trying to use real data sets of actual buildings.
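The three figures of merit can be demonstrated on a toy surrogate. Here a deliberately simple two-parameter "simulation program" generates the surrogate utility bills and the true retrofit savings, and a hypothetical calibration result is scored against them; every number below is invented for illustration.

```python
import numpy as np

# Toy "simulation program": monthly energy use from two audit inputs (illustrative).
def energy_model(infiltration, insulation, months=12):
    base = 800 + 40 * infiltration - 30 * insulation
    return base + 50 * np.sin(2 * np.pi * np.arange(months) / 12)

def retrofit(p):
    # hypothetical retrofit measure: a 20% infiltration reduction
    return (0.8 * p[0], p[1])

true_params = (1.5, 2.0)
bills = energy_model(*true_params)  # surrogate utility-bill data
true_savings = bills.sum() - energy_model(*retrofit(true_params)).sum()

# Pretend a calibration technique returned these estimates.
cal_params = (1.8, 2.1)
cal_bills = energy_model(*cal_params)
cal_savings = cal_bills.sum() - energy_model(*retrofit(cal_params)).sum()

savings_err = abs(cal_savings - true_savings)  # 1) savings-prediction accuracy
param_err = [abs(c - t) for c, t in zip(cal_params, true_params)]  # 2) closure on true inputs
cvrmse = np.sqrt(np.mean((cal_bills - bills) ** 2)) / bills.mean()  # 3) bill goodness of fit
print(savings_err, param_err, cvrmse)
```

Note how the toy already illustrates the paper's point: a calibration can fit the bills closely (low CV(RMSE)) while still missing the true input parameters and the true savings.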

  2. Temperature Buffer Test. Final THM modelling

    Energy Technology Data Exchange (ETDEWEB)

    Aakesson, Mattias; Malmberg, Daniel; Boergesson, Lennart; Hernelind, Jan [Clay Technology AB, Lund (Sweden); Ledesma, Alberto; Jacinto, Abel [UPC, Universitat Politecnica de Catalunya, Barcelona (Spain)

    2012-01-15

The Temperature Buffer Test (TBT) is a joint project between SKB/ANDRA and supported by ENRESA (modelling) and DBE (instrumentation), which aims at improving the understanding and modelling of the thermo-hydro-mechanical behavior of buffers made of swelling clay submitted to high temperatures (over 100 deg C) during the water saturation process. The test has been carried out in a KBS-3 deposition hole at Aespoe HRL. It was installed during the spring of 2003. Two heaters (3 m long, 0.6 m diameter) and two buffer arrangements have been investigated: the lower heater was surrounded by bentonite only, whereas the upper heater was surrounded by a composite barrier, with a sand shield between the heater and the bentonite. The test was dismantled and sampled during the winter of 2009/2010. This report presents the final THM modelling which was resumed subsequent to the dismantling operation. The main part of this work has been numerical modelling of the field test. Three different modelling teams have presented several model cases for different geometries and different degrees of process complexity. Two different numerical codes, Code_Bright and Abaqus, have been used. The modelling performed by UPC-Cimne using Code_Bright has been divided into three subtasks: i) analysis of the response observed in the lower part of the test, by inclusion of a number of considerations: (a) the use of the Barcelona Expansive Model for MX-80 bentonite; (b) updated parameters in the vapour diffusive flow term; (c) the use of a non-conventional water retention curve for MX-80 at high temperature; ii) assessment of a possible relation between the cracks observed in the bentonite blocks in the upper part of TBT, and the cycles of suction and stresses registered in that zone at the start of the experiment; and iii) analysis of the performance, observations and interpretation of the entire test. It was however not possible to carry out a full THM analysis until the end of the test due to

  3. A Blind Test of Hapke's Photometric Model

    Science.gov (United States)

    Helfenstein, P.; Shepard, M. K.

    2003-01-01

Hapke's bidirectional reflectance equation is a versatile analytical tool for predicting (i.e. forward modeling) the photometric behavior of a particulate surface from the observed optical and structural properties of its constituents. Remote sensing applications of Hapke's model, however, generally seek to predict the optical and structural properties of particulate soil constituents from the observed photometric behavior of a planetary surface (i.e. inverse modeling). Our confidence in the latter approach can be established only if we ruthlessly test and optimize it. Here, we summarize preliminary results from a blind test of the Hapke model using laboratory measurements obtained with the Bloomsburg University Goniometer (B.U.G.). The first author selected eleven well-characterized powder samples and measured the spectrophotometric behavior of each. A subset of twenty undisclosed examples of the photometric measurement sets were sent to the second author, who fit the data using the Hapke model and attempted to interpret their optical and mechanical properties from photometry alone.

  4. Testing the Correlated Random Coefficient Model*

    Science.gov (United States)

    Heckman, James J.; Schmierer, Daniel; Urzua, Sergio

    2010-01-01

The recent literature on instrumental variables (IV) features models in which agents sort into treatment status on the basis of gains from treatment as well as on baseline-pretreatment levels. Components of the gains known to the agents and acted on by them may not be known by the observing economist. Such models are called correlated random coefficient models. Sorting on unobserved components of gains complicates the interpretation of what IV estimates. This paper examines testable implications of the hypothesis that agents do not sort into treatment based on gains. In it, we develop new tests to gauge the empirical relevance of the correlated random coefficient model to examine whether the additional complications associated with it are required. We examine the power of the proposed tests. We derive a new representation of the variance of the instrumental variable estimator for the correlated random coefficient model. We apply the methods in this paper to the prototypical empirical problem of estimating the return to schooling and find evidence of sorting into schooling based on unobserved components of gains. PMID:21057649
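The benchmark case the tests are built against, a homogeneous (non-correlated) coefficient, can be sketched with simulated data: under a valid instrument and a common treatment effect, the Wald/IV ratio recovers that effect despite confounding. The data-generating process below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

z = rng.binomial(1, 0.5, n)  # instrument, independent of unobservables
u = rng.normal(0.0, 1.0, n)  # unobserved confounder
# Treatment choice depends on both the instrument and the confounder.
d = (0.2 + 0.4 * z + 0.5 * u > rng.normal(0.0, 1.0, n)).astype(float)
beta = 2.0  # homogeneous gain: no sorting on gains
y = 1.0 + beta * d + u + rng.normal(0.0, 1.0, n)

# Wald/IV estimate: cov(y, z) / cov(d, z). With homogeneous gains this
# converges to beta; with sorting on gains it would instead identify a LATE.
iv = np.cov(y, z)[0, 1] / np.cov(d, z)[0, 1]
print(iv)
```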

  6. Physical model tests for floating wind turbines

    DEFF Research Database (Denmark)

    Bredmose, Henrik; Mikkelsen, Robert Flemming; Borg, Michael

Floating offshore wind turbines are relevant at sites where the depth is too large for the installation of a bottom-fixed substructure. While 3200 bottom-fixed offshore turbines have been installed in Europe (EWEA 2016), only a handful of floating wind turbines exist worldwide and it is still...... an open question which floater concept is the most economically feasible. The design of the floaters for the floating turbines relies heavily on numerical modelling. While several coupled models exist, data sets for their validation are scarce. Validation, however, is important since the turbine behaviour...... is complex due to the combined actions of aero- and hydrodynamic loads, mooring loads and blade pitch control. The present talk outlines two recent test campaigns with a floating wind turbine in waves and wind. Two floaters were tested, a compact TLP floater designed at DTU (Bredmose et al 2015, Pegalajar...

  7. 2-D Model Test of Dolosse Breakwater

    DEFF Research Database (Denmark)

    Burcharth, Hans F.; Liu, Zhou

    1994-01-01

The rational design diagram for Dolos armour should incorporate both the hydraulic stability and the structural integrity. The previous tests performed by Aalborg University (AU) made available such a design diagram for the trunk of a Dolos breakwater without superstructures (Burcharth et al. 1992......). To extend the design diagram to cover Dolos breakwaters with superstructures, 2-D model tests of a Dolos breakwater with a wave wall are included in the project Rubble Mound Breakwater Failure Modes sponsored by the Directorate General XII of the Commission of the European Communities under Contract MAS-CT92...... was on the Dolos breakwater with a high superstructure, where there was almost no overtopping. This case is believed to be the most dangerous one. The test of the Dolos breakwater with a low superstructure was also performed. The objective of the last part of the experiment is to investigate the influence

  8. Damage modeling in Small Punch Test specimens

    DEFF Research Database (Denmark)

    Martínez Pañeda, Emilio; Cuesta, I.I.; Peñuelas, I.

    2016-01-01

Ductile damage modeling within the Small Punch Test (SPT) is extensively investigated. The capabilities of the SPT to reliably estimate fracture and damage properties are thoroughly discussed and emphasis is placed on the use of notched specimens. First, different notch profiles are analyzed...... and constraint conditions quantified. The role of the notch shape is comprehensively examined from both triaxiality and notch fabrication perspectives. Afterwards, a methodology is presented to extract the micromechanical-based ductile damage parameters from the load-displacement curve of notched SPT samples...

  9. Overload prevention in model supports for wind tunnel model testing

    Directory of Open Access Journals (Sweden)

    Anton IVANOVICI

    2015-09-01

Preventing overloads in wind tunnel model supports is crucial to the integrity of the tested system. Results can only be interpreted as valid if the model support, conventionally called a sting, remains sufficiently rigid during testing. Modeling and preliminary calculation can only give an estimate of the sting's behavior under known forces and moments, but sometimes unpredictable, aerodynamically caused model behavior can cause large transient overloads that cannot be taken into account at the sting design phase. To ensure model integrity and data validity, an analog fast protection circuit was designed and tested. A post-factum analysis was carried out to optimize the overload detection, and a short discussion on aeroelastic phenomena is included to show why such a detector has to be very fast. The last refinement of the concept consists of a fast detector coupled with a slightly slower one to differentiate between transient overloads that decay in time and those that are the result of unwanted aeroelastic phenomena. The decision to stop or continue the test is therefore taken conservatively, preserving data and model integrity while allowing normal startup loads and transients to manifest.
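The dual-detector logic, a fast instantaneous trip backed by a slower averaged one, can be sketched as follows. Thresholds, window length, and the example signals are all invented for illustration; the article's actual implementation is an analog circuit.

```python
import numpy as np

FAST_LIMIT = 100.0  # instantaneous trip level (illustrative units)
SLOW_LIMIT = 40.0   # sustained-load trip level (illustrative)
WINDOW = 50         # samples seen by the slower, averaged detector

def should_stop(load):
    """Trip immediately on a large spike, or on a smaller load that persists
    (a decaying startup transient passes; a sustained oscillation does not)."""
    load = np.asarray(load)
    if abs(load[-1]) > FAST_LIMIT:
        return True
    return len(load) >= WINDOW and np.abs(load[-WINDOW:]).mean() > SLOW_LIMIT

t = np.arange(200)
decaying = 80 * np.exp(-t / 20) * np.cos(t / 3)  # startup transient that dies out
sustained = 80 * np.cos(t / 3)                   # suspected aeroelastic oscillation
print(should_stop(decaying), should_stop(sustained))  # False True
```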

  10. Movable scour protection. Model test report

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, R.

    2002-07-01

This report presents the results of a series of model tests with scour protection of marine structures. The objective of the model tests is to investigate the integrity of the scour protection during a general lowering of the surrounding seabed, for instance in connection with movement of a sand bank or with general subsidence. The scour protection in the tests is made of stone material. Two different fractions have been used: 4 mm and 40 mm. Tests with current, with waves and with combined current and waves were carried out. The scour protection material was placed after an initial scour hole had evolved in the seabed around the structure. This design philosophy has been selected because the situation often is that the scour hole starts to develop immediately after the structure has been placed. It is therefore difficult to establish a scour protection at the undisturbed seabed if the scour material is placed after the main structure. Further, placing the scour material in the scour hole increases the stability of the material. Two types of structure have been used for the tests, a Monopile and a Tripod foundation. Tests with protection mats around the Monopile model were also carried out. The following main conclusions have emerged from the model tests with flat bed (i.e. no general seabed lowering): 1. The maximum scour depth found in steady current on a sand bed was 1.6 times the cylinder diameter, 2. The minimum horizontal extension of the scour hole (upstream direction) was 2.8 times the cylinder diameter, corresponding to a slope of 30 degrees, 3. Concrete protection mats do not meet the criteria for a strongly erodible seabed. In the present tests virtually no reduction in the scour depth was obtained. The main problem is the interface to the cylinder. If there is a void between the mats and the cylinder, scour will develop. Even with protection mats that are tightly connected to the cylinder, scour is expected to develop as long as the mats allow for
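Conclusions 1 and 2 are mutually consistent: a scour hole 1.6 diameters deep with a 2.8-diameter upstream extension implies a side slope of about 30 degrees, as a quick check shows (the pile diameter here is an arbitrary choice).

```python
import math

D = 4.0                # pile diameter in metres (arbitrary illustrative value)
scour_depth = 1.6 * D  # conclusion 1: maximum scour depth in steady current
extension = 2.8 * D    # conclusion 2: minimum upstream horizontal extension

slope = math.degrees(math.atan(scour_depth / extension))
print(round(slope, 1))  # ~29.7 degrees, matching the reported 30-degree slope
```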

  11. Thurstonian models for sensory discrimination tests as generalized linear models

    DEFF Research Database (Denmark)

    Brockhoff, Per B.; Christensen, Rune Haubo Bojesen

    2010-01-01

Sensory discrimination tests such as the triangle, duo-trio, 2-AFC and 3-AFC tests produce binary data, and the Thurstonian decision rule links the underlying sensory difference δ to the observed number of correct responses. In this paper it is shown how each of these four situations can be viewed...... as a so-called generalized linear model. The underlying sensory difference δ becomes directly a parameter of the statistical model, and the estimate d' and its standard error become the "usual" output of the statistical analysis. The d' for the monadic A-NOT A method is shown to appear as a standard...... linear contrast in a generalized linear model using the probit link function. All methods developed in the paper are implemented in our free R-package sensR (http://www.cran.r-project.org/package=sensR/). This includes the basic power and sample size calculations for these four discrimination tests...
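For the simplest of the four protocols, the 2-AFC test, the Thurstonian link is pc = Φ(d′/√2) and can be inverted in closed form; the sketch below uses only the standard library. The triangle and duo-trio tests have more involved psychometric functions, which is what the sensR package implements.

```python
from math import sqrt
from statistics import NormalDist

def dprime_2afc(correct, total):
    """Invert the 2-AFC Thurstonian link pc = Phi(d'/sqrt(2))."""
    pc = correct / total
    if pc <= 0.5:
        return 0.0  # at or below chance: no detectable sensory difference
    return sqrt(2) * NormalDist().inv_cdf(pc)

# 70 correct responses out of 100 2-AFC trials:
print(round(dprime_2afc(70, 100), 3))  # d' of about 0.742
```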

  12. Dynamic model of Fast Breeder Test Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Vaidyanathan, G., E-mail: vaidya@igcar.gov.i [Fast Reactor Technology Group, Indira Gandhi Center for Atomic Research, Kalpakkam (India); Kasinathan, N.; Velusamy, K. [Fast Reactor Technology Group, Indira Gandhi Center for Atomic Research, Kalpakkam (India)

    2010-04-15

Fast Breeder Test Reactor (FBTR) is a 40 MWt/13.2 MWe sodium-cooled reactor operating since 1985. It is a loop-type reactor. As part of the safety analysis the response of the plant to various transients is needed. In this connection a computer code named DYNAM was developed to model the reactor core, the intermediate heat exchanger, steam generator, piping, etc. This paper deals with the mathematical model of the various components of FBTR, the numerical techniques to solve the model, and comparison of the predictions of the code with plant measurements. Also presented is the benign response of the plant to a station blackout condition, which brings out the role of the various reactivity feedback mechanisms combined with a gradual coast down of reactor sodium flow.

  13. Statistical tests of simple earthquake cycle models

    Science.gov (United States)

    DeVries, Phoebe M. R.; Evans, Eileen L.

    2016-12-01

A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM 4.6 × 10^20 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
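The central statistical step, comparing two empirical distributions with a Kolmogorov-Smirnov test at α = 0.05, looks as follows in SciPy; the two samples here are invented stand-ins for model-predicted and observed slip-rate distributions.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

# Stand-ins for predicted vs. observed slip-rate distributions (illustrative only).
model_pred = rng.normal(1.0, 0.1, 500)
observed = rng.normal(1.3, 0.1, 500)

stat, p = ks_2samp(model_pred, observed)
if p < 0.05:
    print("reject: predicted distribution is inconsistent with the observations")
```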

  14. Testing the Model of Oscillating Magnetic Traps

    Science.gov (United States)

    Szaforz, Ż.; Tomczak, M.

    2015-01-01

The aim of this paper is to test the model of oscillating magnetic traps (the OMT model), proposed by Jakimiec and Tomczak (Solar Phys. 261, 233, 2010). This model describes the process of excitation of quasi-periodic pulsations (QPPs) observed during solar flares. In the OMT model energetic electrons are accelerated within a triangular, cusp-like structure situated between the reconnection point and the top of a flare loop as seen in soft X-rays. We analyzed QPPs in hard X-ray light curves for 23 flares as observed by Yohkoh. Three independent methods were used. We also used hard X-ray images to localize magnetic traps and soft X-ray images to diagnose thermal plasmas inside the traps. We found that the majority of the observed pulsation periods correlates with the diameters of oscillating magnetic traps, as was predicted by the OMT model. We also found that the electron number density of plasma inside the magnetic traps in the time of pulsation disappearance is strongly connected with the pulsation period. We conclude that the observations are consistent with the predictions of the OMT model for the analyzed set of flares.

  15. Testing substellar models with dynamical mass measurements

    Directory of Open Access Journals (Sweden)

    Liu M.C.

    2011-07-01

We have been using Keck laser guide star adaptive optics to monitor the orbits of ultracool binaries, providing dynamical masses at lower luminosities and temperatures than previously available and enabling strong tests of theoretical models. We have identified three specific problems with theory: (1) We find that model color–magnitude diagrams cannot be reliably used to infer masses, as they do not accurately reproduce the colors of ultracool dwarfs of known mass. (2) Effective temperatures inferred from evolutionary model radii are typically inconsistent with temperatures derived from fitting atmospheric models to observed spectra by 100–300 K. (3) For the only known pair of field brown dwarfs with a precise mass (3%) and age determination (≈25%), the measured luminosities are ~2–3× higher than predicted by model cooling rates (i.e., masses inferred from Lbol and age are 20–30% larger than measured). To make progress in understanding the observed discrepancies, more mass measurements spanning a wide range of luminosity, temperature, and age are needed, along with more accurate age determinations (e.g., via asteroseismology) for primary stars with brown dwarf binary companions. Also, resolved optical and infrared spectroscopy are needed to measure lithium depletion and to characterize the atmospheres of binary components in order to better assess model deficiencies.

  16. Seepage Calibration Model and Seepage Testing Data

    Energy Technology Data Exchange (ETDEWEB)

    P. Dixon

    2004-02-17

    The purpose of this Model Report is to document the Seepage Calibration Model (SCM). The SCM is developed (1) to establish the conceptual basis for the Seepage Model for Performance Assessment (SMPA), and (2) to derive seepage-relevant, model-related parameters and their distributions for use in the SMPA and seepage abstraction in support of the Total System Performance Assessment for License Application (TSPA-LA). The SCM is intended to be used only within this Model Report for the estimation of seepage-relevant parameters through calibration of the model against seepage-rate data from liquid-release tests performed in several niches along the Exploratory Studies Facility (ESF) Main Drift and in the Cross Drift. The SCM does not predict seepage into waste emplacement drifts under thermal or ambient conditions. Seepage predictions for waste emplacement drifts under ambient conditions will be performed with the SMPA (see upcoming REV 02 of CRWMS M&O 2000 [153314]), which inherits the conceptual basis and model-related parameters from the SCM. Seepage during the thermal period is examined separately in the Thermal Hydrologic (TH) Seepage Model (see BSC 2003 [161530]). 
The scope of this work is (1) to evaluate seepage rates measured during liquid-release experiments performed in several niches in the Exploratory Studies Facility (ESF) and in the Cross Drift, which was excavated for enhanced characterization of the repository block (ECRB); (2) to evaluate air-permeability data measured in boreholes above the niches and the Cross Drift to obtain the permeability structure for the seepage model; (3) to use inverse modeling to calibrate the SCM and to estimate seepage-relevant, model-related parameters on the drift scale; (4) to estimate the epistemic uncertainty of the derived parameters, based on the goodness-of-fit to the observed data and the sensitivity of calculated seepage with respect to the parameters of interest; (5) to characterize the aleatory uncertainty of
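The inverse-modeling step, estimating seepage-relevant parameters by calibrating against liquid-release test data, can be sketched in spirit as a nonlinear least-squares fit. The one-parameter "seepage model" and the data below are fabricated for illustration and are unrelated to the actual SCM.

```python
import numpy as np
from scipy.optimize import curve_fit

def seepage_rate(q, alpha):
    # Toy model: seepage occurs only once the release rate q exceeds a
    # capillary-strength threshold alpha (hypothetical parameterization).
    return np.maximum(q - alpha, 0.0)

q_obs = np.array([2.0, 4.0, 6.0, 8.0])     # liquid-release rates
rate_obs = np.array([0.6, 2.4, 4.5, 6.6])  # synthetic "measured" seepage rates

(alpha_hat,), cov = curve_fit(seepage_rate, q_obs, rate_obs, p0=[1.0])
print(alpha_hat, float(np.sqrt(cov[0, 0])))  # calibrated value and 1-sigma uncertainty
```

The covariance returned by the fit plays the role of the epistemic-uncertainty estimate described in item (4) of the scope, driven by the goodness of fit and the sensitivity of the model to the parameter.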

  17. Model Tests of Pile Defect Detection

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

The pile, as an important foundation type, is widely used in engineering practice. Defects of different types and damage of different degrees easily occur during pile construction, so detecting defects of the pile is very important. So far, some difficult problems remain in pile defect detection. Based on stress wave theory, some of these typical difficult problems were studied through model tests. Analyses of the test results were carried out and some significant results for the low-strain method were obtained: when a pile has a gradually decreasing cross-section, the amplitude of the reflective signal originating from the defect depends on the rate of decrease of the cross-section, β. No apparent signal reflected from the necking appears on the velocity response curve when the value of β is less than about 3.5%.

  18. 2-D Model Test of Dolosse Breakwater

    DEFF Research Database (Denmark)

    Burcharth, Hans F.; Liu, Zhou

    1994-01-01

The rational design diagram for Dolos armour should incorporate both the hydraulic stability and the structural integrity. The previous tests performed by Aalborg University (AU) made available such a design diagram for the trunk of a Dolos breakwater without superstructures (Burcharth et al. 1992......). To extend the design diagram to cover Dolos breakwaters with superstructures, 2-D model tests of a Dolos breakwater with a wave wall are included in the project Rubble Mound Breakwater Failure Modes sponsored by the Directorate General XII of the Commission of the European Communities under Contract MAS-CT92......-0042. Furthermore, Task IA will give the design diagram for Tetrapod breakwaters without a superstructure. The more complete research results on Dolosse can certainly give some insight into the behaviour of the Tetrapod armour layer of breakwaters with a superstructure. The main part of the experiment

  19. Model-independent tests of cosmic gravity.

    Science.gov (United States)

    Linder, Eric V

    2011-12-28

    Gravitation governs the expansion and fate of the universe, and the growth of large-scale structure within it, but has not been tested in detail on these cosmic scales. The observed acceleration of the expansion may provide signs of gravitational laws beyond general relativity (GR). Since the form of any such extension is not clear, from either theory or data, we adopt a model-independent approach to parametrizing deviations to the Einstein framework. We explore the phase space dynamics of two key post-GR functions and derive a classification scheme, and an absolute criterion on accuracy necessary for distinguishing classes of gravity models. Future surveys will be able to constrain the post-GR functions' amplitudes and forms to the required precision, and hence reveal new aspects of gravitation.

  20. Modeling and testing of ethernet transformers

    Science.gov (United States)

    Bowen, David

    2011-12-01

    Twisted-pair Ethernet is now the standard home and office last-mile network technology. For decades, the IEEE standard that defines Ethernet has required electrical isolation between the twisted pair cable and the Ethernet device. So, for decades, every Ethernet interface has used magnetic core Ethernet transformers to isolate Ethernet devices and keep users safe in the event of a potentially dangerous fault on the network media. The current state-of-the-art Ethernet transformers are miniature (explored which are capable of exceptional miniaturization or on-chip fabrication. This dissertation thoroughly explores the performance of the current commercial Ethernet transformers to both increase understanding of the device's behavior and outline performance parameters for replacement devices. Lumped element and distributed circuit models are derived; testing schemes are developed and used to extract model parameters from commercial Ethernet devices. Transfer relation measurements of the commercial Ethernet transformers are compared against the model's behavior and it is found that the tuned, distributed models produce the best transfer relation match to the measured data. Process descriptions and testing results on fabricated thin-film dielectric-core toroid transformers are presented. The best results were found for a 32-turn transformer loaded with 100Ω, the impedance of twisted pair cable. This transformer gave a flat response from about 10MHz to 40MHz with a height of approximately 0.45. For the fabricated transformer structures, theoretical methods to determine resistance, capacitance and inductance are presented. A special analytical and numerical analysis of the fabricated transformer inductance is presented. Planar cuts of magnetic slope fields around the dielectric-core toroid are shown that describe the effect of core height and winding density on flux uniformity without a magnetic core.
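A minimal first-order version of such a lumped-element model shows why the magnetizing inductance sets the low-frequency edge of the passband: the shunt inductance against the line impedance forms a high-pass with |H| = ωL/√(R² + (ωL)²). Both component values below are assumed, not measured from the dissertation's devices.

```python
import numpy as np

R = 100.0    # ohms: twisted-pair line impedance
Lm = 350e-6  # henries: assumed magnetizing inductance

f = np.array([1e5, 1e6, 1e7])  # Hz
w = 2 * np.pi * f
H = w * Lm / np.sqrt(R**2 + (w * Lm) ** 2)  # first-order high-pass magnitude
print(np.round(H, 3))  # the response climbs toward 1 as frequency rises
```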

  1. Seepage Calibration Model and Seepage Testing Data

    Energy Technology Data Exchange (ETDEWEB)

    S. Finsterle

    2004-09-02

    The purpose of this Model Report is to document the Seepage Calibration Model (SCM). The SCM was developed (1) to establish the conceptual basis for the Seepage Model for Performance Assessment (SMPA), and (2) to derive seepage-relevant, model-related parameters and their distributions for use in the SMPA and seepage abstraction in support of the Total System Performance Assessment for License Application (TSPA-LA). This Model Report has been revised in response to a comprehensive, regulatory-focused evaluation performed by the Regulatory Integration Team [''Technical Work Plan for: Regulatory Integration Evaluation of Analysis and Model Reports Supporting the TSPA-LA'' (BSC 2004 [DIRS 169653])]. The SCM is intended to be used only within this Model Report for the estimation of seepage-relevant parameters through calibration of the model against seepage-rate data from liquid-release tests performed in several niches along the Exploratory Studies Facility (ESF) Main Drift and in the Cross-Drift. The SCM does not predict seepage into waste emplacement drifts under thermal or ambient conditions. Seepage predictions for waste emplacement drifts under ambient conditions will be performed with the SMPA [''Seepage Model for PA Including Drift Collapse'' (BSC 2004 [DIRS 167652])], which inherits the conceptual basis and model-related parameters from the SCM. Seepage during the thermal period is examined separately in the Thermal Hydrologic (TH) Seepage Model [see ''Drift-Scale Coupled Processes (DST and TH Seepage) Models'' (BSC 2004 [DIRS 170338])]. The scope of this work is (1) to evaluate seepage rates measured during liquid-release experiments performed in several niches in the Exploratory Studies Facility (ESF) and in the Cross-Drift, which was excavated for enhanced characterization of the repository block (ECRB); (2) to evaluate air-permeability data measured in boreholes above the niches and the Cross

  2. Finds in Testing Experiments for Model Evaluation

    Institute of Scientific and Technical Information of China (English)

    WU Ji; JIA Xiaoxia; LIU Chang; YANG Haiyan; LIU Chao

    2005-01-01

    To evaluate the fault location and the failure prediction models, simulation-based and code-based experiments were conducted to collect the required failure data. The PIE model was applied to simulate failures in the simulation-based experiment. Based on syntax and semantic level fault injections, a hybrid fault injection model is presented. To analyze the injected faults, the difficulty to inject (DTI) and difficulty to detect (DTD) are introduced and are measured from the programs used in the code-based experiment. Three interesting results were obtained from the experiments: 1) Failures simulated by the PIE model without consideration of the program and testing features are unreliably predicted; 2) There is no obvious correlation between the DTI and DTD parameters; 3) The DTD for syntax level faults changes in a different pattern to that for semantic level faults when the DTI increases. The results show that the parameters have a strong effect on the failures simulated, and the measurement of DTD is not strict.

  3. Integrated outburst detector sensor-model tests

    Institute of Scientific and Technical Information of China (English)

    DZIURZYŃSKI Wacław; WASILEWSKI Stanisław

    2011-01-01

    Outbursts of methane and rocks are, similarly to rock bursts, among the biggest hazards in deep mines and are equally difficult to predict. The violent process of the outburst itself, along with the scale and range of hazards following the rapid discharge of gas and rocks, requires solutions which would enable quick and unambiguous detection of the hazard, immediate power supply cut-off and evacuation of personnel from potentially hazardous areas. For this purpose, an integrated outburst detector was developed. The sensor was equipped with three measuring and detection elements: a chamber for continuous measurement of methane concentration, a pressure sensor and a microphone. Tests of the sensor model were carried out to estimate the parameters which characterize the dynamic properties of the sensor. Given the impossibility of carrying out a full-scale experimental outburst, the sensor was tested during methane and coal dust explosions in the testing gallery at KD Barbara. The obtained results proved that the applied solutions are appropriate.

  4. Applying Model Checking to Generate Model-Based Integration Tests from Choreography Models

    Science.gov (United States)

    Wieczorek, Sebastian; Kozyura, Vitaly; Roth, Andreas; Leuschel, Michael; Bendisposto, Jens; Plagge, Daniel; Schieferdecker, Ina

    Choreography models describe the communication protocols between services. Testing of service choreographies is an important task for the quality assurance of service-based systems as used e.g. in the context of service-oriented architectures (SOA). The formal modeling of service choreographies enables a model-based integration testing (MBIT) approach. We present MBIT methods for our service choreography modeling approach called Message Choreography Models (MCM). For the model-based testing of service choreographies, MCMs are translated into Event-B models and used as input for our test generator which uses the model checker ProB.

  5. Experimentally testing the standard cosmological model

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, D.N. (Chicago Univ., IL (USA) Fermi National Accelerator Lab., Batavia, IL (USA))

    1990-11-01

    The standard model of cosmology, the big bang, is now being tested and confirmed to remarkable accuracy. Recent high precision measurements relate to the microwave background and big bang nucleosynthesis. This paper focuses on the latter since it relates more directly to high energy experiments. In particular, the recent LEP (and SLC) results on the number of neutrinos are discussed as a positive laboratory test of the standard cosmology scenario. Discussion is presented on the improved light element observational data as well as the improved neutron lifetime data. Alternate nucleosynthesis scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, {Omega}{sub b}, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the standard model conclusion that {Omega}{sub b} {approximately} 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming {Omega}{sub total} = 1) and the need for dark baryonic matter, since {Omega}{sub visible} < {Omega}{sub b}. Recent accelerator constraints on non-baryonic matter are discussed, showing that any massive cold dark matter candidate must now have a mass M{sub x} {approx gt} 20 GeV and an interaction weaker than the Z{sup 0} coupling to a neutrino. It is also noted that recent hints regarding the solar neutrino experiments coupled with the see-saw model for {nu}-masses may imply that the {nu}{sub {tau}} is a good hot dark matter candidate. 73 refs., 5 figs.

  6. Ablative Rocket Deflector Testing and Computational Modeling

    Science.gov (United States)

    Allgood, Daniel C.; Lott, Jeffrey W.; Raines, Nickey

    2010-01-01

    A deflector risk mitigation program was recently conducted at the NASA Stennis Space Center. The primary objective was to develop a database that characterizes the behavior of industry-grade refractory materials subjected to rocket plume impingement conditions commonly experienced on static test stands. The program consisted of short and long duration engine tests where the supersonic exhaust flow from the engine impinged on an ablative panel. Quasi time-dependent erosion depths and patterns generated by the plume impingement were recorded for a variety of different ablative materials. The erosion behavior was found to be highly dependent on the material's composition and corresponding thermal properties. For example, in the case of the HP CAST 93Z ablative material, the erosion rate actually decreased under continued thermal heating conditions due to the formation of a low thermal conductivity "crystallization" layer. The "crystallization" layer produced near the surface of the material provided effective insulation from the hot rocket exhaust plume. To gain further insight into the complex interaction of the plume with the ablative deflector, computational fluid dynamic modeling was performed in parallel with the ablative panel testing. The results from the current study demonstrated that locally high heating occurred due to shock reflections. These localized regions of shock-induced heat flux resulted in non-uniform erosion of the ablative panels. In turn, it was observed that the non-uniform erosion exacerbated the localized shock heating, causing eventual plume separation and reversed flow for long duration tests under certain conditions. Overall, the flow simulations compared very well with the available experimental data obtained during this project.

  7. Solar system tests of brane world models

    CERN Document Server

    Boehmer, Christian G; Lobo, Francisco S N

    2008-01-01

    The classical tests of general relativity (perihelion precession, deflection of light, and the radar echo delay) are considered for the Dadhich, Maartens, Papadopoulos and Rezania (DMPR) solution of the spherically symmetric static vacuum field equations in brane world models. For this solution the metric in the vacuum exterior to a brane world star is similar to the Reissner-Nordstrom form of classical general relativity, with the role of the charge played by the tidal effects arising from projections of the fifth dimension. The existing observational solar system data on the perihelion shift of Mercury, on the light bending around the Sun (obtained using long-baseline radio interferometry), and ranging to Mars using the Viking lander, constrain the numerical values of the bulk tidal parameter and of the brane tension.

  8. Solar system tests of brane world models

    Energy Technology Data Exchange (ETDEWEB)

    Boehmer, Christian G [Department of Mathematics, University College London, Gower Street, London WC1E 6BT (United Kingdom); Harko, Tiberiu [Department of Physics and Center for Theoretical and Computational Physics, University of Hong Kong, Pok Fu Lam Road (Hong Kong); Lobo, Francisco S N [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth PO1 2EG (United Kingdom)], E-mail: c.boehmer@ucl.ac.uk, E-mail: harko@hkucc.hku.hk, E-mail: francisco.lobo@port.ac.uk

    2008-02-21

    The classical tests of general relativity (perihelion precession, deflection of light and the radar echo delay) are considered for the Dadhich, Maartens, Papadopoulos and Rezania (DMPR) solution of the spherically symmetric static vacuum field equations in brane world models. For this solution the metric in the vacuum exterior to a brane world star is similar to the Reissner-Nordstroem form of classical general relativity, with the role of the charge played by the tidal effects arising from projections of the fifth dimension. The existing observational solar system data on the perihelion shift of Mercury, on the light bending around the Sun (obtained using long-baseline radio interferometry), and ranging to Mars using the Viking lander, constrain the numerical values of the bulk tidal parameter and of the brane tension.

  9. Two Bayesian tests of the GLOMOsys Model.

    Science.gov (United States)

    Field, Sarahanne M; Wagenmakers, Eric-Jan; Newell, Ben R; Zeelenberg, René; van Ravenzwaaij, Don

    2016-12-01

    Priming is arguably one of the key phenomena in contemporary social psychology. Recent retractions and failed replication attempts have led to a division in the field between proponents and skeptics and have reinforced the importance of confirming certain priming effects through replication. In this study, we describe the results of 2 preregistered replication attempts of 1 experiment by Förster and Denzler (2012). In both experiments, participants first processed letters either globally or locally, then were tested using a typicality rating task. Bayes factor hypothesis tests were conducted for both experiments: Experiment 1 (N = 100) yielded an indecisive Bayes factor of 1.38, indicating that the in-lab data are 1.38 times more likely to have occurred under the null hypothesis than under the alternative. Experiment 2 (N = 908) yielded a Bayes factor of 10.84, indicating strong support for the null hypothesis that global priming does not affect participants' mean typicality ratings. The failure to replicate this priming effect challenges existing support for the GLOMOsys model.
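    The Bayes factors reported above can be approximated from a classical t statistic via the BIC/unit-information approximation (Wagenmakers, 2007). The sketch below is illustrative only and is not the study's actual preregistered analysis; the function name and simulated data are assumptions.

    ```python
    import numpy as np
    from scipy import stats

    def bf01_from_ttest(t, n1, n2):
        """Approximate Bayes factor BF01 (evidence for the null) from a
        two-sample t statistic, using the BIC/unit-information
        approximation: BF01 ~= sqrt(n) * (1 + t^2 / v) ** (-n / 2)."""
        n = n1 + n2
        v = n1 + n2 - 2  # degrees of freedom of the t test
        return np.sqrt(n) * (1.0 + t * t / v) ** (-n / 2.0)

    # Two simulated groups with no real difference: evidence should
    # generally favour the null (BF01 > 1).
    rng = np.random.default_rng(0)
    a = rng.normal(0.0, 1.0, 50)
    b = rng.normal(0.0, 1.0, 50)
    t, _ = stats.ttest_ind(a, b)
    bf01 = bf01_from_ttest(t, len(a), len(b))
    ```

    With t = 0 the approximation reduces to sqrt(n), so for two groups of 50 the null is favoured by a factor of exactly 10; larger |t| shifts the evidence toward the alternative.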

  10. Trace-Based Code Generation for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.-G.

    2009-01-01

    Paper Submitted for review at the Eighth International Conference on Generative Programming and Component Engineering. Model-based testing can be a powerful means to generate test cases for the system under test. However, creating a useful model for model-based testing requires expertise in the (fo

  12. Designing healthy communities: Testing the walkability model

    Directory of Open Access Journals (Sweden)

    Adriana A. Zuniga-Teran

    2017-03-01

    Research from multiple domains has provided insights into how neighborhood design can be improved to have a more favorable effect on physical activity, a concept known as walkability. The relevant research findings/hypotheses have been integrated into a Walkability Framework, which organizes the design elements into nine walkability categories. The purpose of this study was to test whether this conceptual framework can be used as a model to measure the interactions between the built environment and physical activity. We explored correlations between the walkability categories and physical activity reported through a survey of residents of Tucson, Arizona (n = 486). The results include significant correlations between the walkability categories and physical activity as well as between the walkability categories and the two motivations for walking (recreation and transportation). To our knowledge, this is the first study that reports links between walkability and walking for recreation. Additionally, the use of the Walkability Framework allowed us to identify the walkability categories most strongly correlated with the two motivations for walking. The results of this study support the use of the Walkability Framework as a model to measure the built environment in relation to its ability to promote physical activity.
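    Correlations of this kind are typically computed as rank correlations between category scores and reported activity. A toy sketch with made-up numbers (not the Tucson survey data), assuming scipy is available:

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical survey data: one walkability category score per
    # respondent vs. reported weekly minutes of walking (invented values).
    category_score = np.array([2.1, 3.4, 1.8, 4.0, 3.1, 2.7, 3.9, 1.5])
    walking_minutes = np.array([60, 150, 45, 210, 120, 90, 180, 30])

    # Spearman's rho is robust to the ordinal nature of survey scores.
    rho, p = spearmanr(category_score, walking_minutes)
    ```

    In a real analysis one would compute this per walkability category and per walking motivation, then correct for multiple comparisons.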

  13. Computerized classification testing with the Rasch model

    NARCIS (Netherlands)

    Eggen, Theo J.H.M.

    2011-01-01

    If classification in a limited number of categories is the purpose of testing, computerized adaptive tests (CATs) with algorithms based on sequential statistical testing perform better than estimation-based CATs (e.g., Eggen & Straetmans, 2000). In these computerized classification tests (CCTs), the
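    The sequential-testing idea behind such CCTs can be sketched with a Rasch item response model and Wald's sequential probability ratio test (SPRT) between two ability levels around the classification cutoff. All parameter values below are hypothetical illustrations, not the operational settings of Eggen & Straetmans (2000).

    ```python
    import math

    def rasch_p(theta, b):
        """Rasch model: probability of a correct response for a test
        taker of ability theta on an item of difficulty b."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    def sprt_classify(responses, difficulties, theta0=-0.5, theta1=0.5,
                      alpha=0.05, beta=0.05):
        """SPRT between 'nonmaster' (theta0) and 'master' (theta1).
        Stops as soon as the accumulated log-likelihood ratio crosses
        a Wald boundary; returns (decision, items administered)."""
        lo = math.log(beta / (1.0 - alpha))
        hi = math.log((1.0 - beta) / alpha)
        llr = 0.0
        for i, (x, b) in enumerate(zip(responses, difficulties), start=1):
            p0, p1 = rasch_p(theta0, b), rasch_p(theta1, b)
            llr += math.log(p1 / p0) if x == 1 else math.log((1 - p1) / (1 - p0))
            if llr >= hi:
                return "master", i
            if llr <= lo:
                return "nonmaster", i
        return "continue", len(responses)
    ```

    With these (assumed) settings and items of difficulty 0, each correct answer adds exactly 0.5 to the log-likelihood ratio, so six consecutive correct answers are enough to classify a test taker as a master.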

  14. Accelerated testing statistical models, test plans, and data analysis

    CERN Document Server

    Nelson, Wayne B

    2009-01-01

    The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists. ". . . a goldmine of knowledge on accelerated life testing principles and practices . . . one of the very few capable of advancing the science of reliability. It definitely belongs in every bookshelf on engineering." -Dev G.

  15. Putting hydrological modelling practice to the test

    NARCIS (Netherlands)

    Melsen, Lieke Anna

    2017-01-01

    Six steps can be distinguished in the process of hydrological modelling: the perceptual model (deciding on the processes), the conceptual model (deciding on the equations), the procedural model (get the code to run on a computer), calibration (identify the parameters), evaluation (confronting output

  16. Hydraulic model tests on modified Wave Dragon. Phase 3

    Energy Technology Data Exchange (ETDEWEB)

    Hald, T.; Lynggaard, J.

    2002-11-01

    The purpose of this report is to describe the model tests conducted with a newly designed 2nd-generation WD model as well as the obtained model test results. Tests are conducted as a sequential reconstruction followed by physical model tests. All details concerning the reconstruction are found in Hald and Lynggaard (2001). Model tests and reconstruction are carried out during the phase 3 project: 'Wave Dragon. Reconstruction of an existing model in scale 1:50 and sequential tests of changes to the model geometry and mass distribution parameters' sponsored by the Danish Energy Agency (DEA) wave energy programme. The tests will establish a well documented basis for the development of a 1:4.5 scale prototype planned for testing in Nissum Bredning, a sea inlet on the Danish West Coast. (au)

  17. Testing AGN feedback models in galaxy evolution

    Science.gov (United States)

    Shin, Min-Su

    Galaxy formation and evolution have been one of the most challenging problems in astrophysics. A single galaxy has various components (stars, atomic and molecular gas, a supermassive black hole, and dark matter) and has interacted with its cosmic environment throughout its history. A key issue in understanding galaxy evolution is to find the dominant physical processes in the interactions between the components of a galaxy and between a galaxy and its environment. AGN feedback has been proposed as a key process to suppress late star formation in massive elliptical galaxies and as a general consequence of galaxy mergers and interactions. In this thesis, I investigate feedback effects from active galactic nuclei (AGN) using a new simulation code and data from the Sloan Digital Sky Survey. In the first chapter, I test purely mechanical AGN feedback models via a nuclear wind around the central SMBH in elliptical galaxies by comparing simulation results to four well-defined observational constraints: the mass ratio between the SMBH and its host galaxy, the lifetime of the quasar phase, the X-ray luminosity from the hot interstellar medium, and the mass fraction of young stars. Even though purely mechanical AGN feedback is commonly assumed in cosmological simulations, I find that it is inadequate, and cannot reproduce all four observational constraints simultaneously. This result suggests that both mechanical and radiative feedback modes are important physical processes. In the second chapter, I simulate the coevolution of the SMBH and its host galaxy under different environments, represented by different amounts of gas stripping. Though the connection between environment and galaxy evolution has been well-studied, the question of environmental effects on the growth of the SMBH has not yet been answered. I find that strong gas stripping, which satellite galaxies might experience, highly suppresses SMBH mass accretion and AGN activity. Moreover, the suppression of the SMBH growth is

  18. The Internationalization of Testing and New Models of Test Delivery on the Internet

    Science.gov (United States)

    Bartram, Dave

    2006-01-01

    The Internet has opened up a whole new set of opportunities for advancing the science of psychometrics and the technology of testing. It has also created some new challenges for those of us involved in test design and testing. In particular, we are seeing impacts from internationalization of testing and new models for test delivery. These are…

  19. Methods and models for the construction of weakly parallel tests

    NARCIS (Netherlands)

    Adema, Jos J.

    1992-01-01

    Several methods are proposed for the construction of weakly parallel tests [i.e., tests with the same test information function (TIF)]. A mathematical programming model that constructs tests containing a prespecified TIF and a heuristic that assigns items to tests with information functions that are

  20. TESTING FOR VARYING DISPERSION IN DISCRETE EXPONENTIAL FAMILY NONLINEAR MODELS

    Institute of Scientific and Technical Information of China (English)

    LinJinguan; WeiBocheng; ZhangNansong

    2003-01-01

    It is necessary to test for varying dispersion in generalized nonlinear models. Wei ,et al(1998) developed a likelihood ratio test,a score test and their adjustments to test for varying dispersion in continuous exponential family nonlinear models. This type of problem in the framework of general discrete exponential family nonlinear models is discussed. Two types of varying dispersion, which are random coefficients model and random effects model, are proposed,and corresponding score test statistics are constructed and expressed in simple ,easy to use ,matrix formulas.

  1. State of the art hydraulic turbine model test

    Science.gov (United States)

    Fabre, Violaine; Duparchy, Alexandre; Andre, Francois; Larroze, Pierre-Yves

    2016-11-01

    Model tests are essential in hydraulic turbine development and related fields. The methods and technologies used to perform these tests show constant progress and provide access to further information. In addition, due to its contractual nature, the test demand evolves continuously in terms of quantity and accuracy. Keeping in mind that the principal aim of model testing is the transposition of the model measurements to the real machine, the measurements should be performed accurately, and a critical analysis of the model test results is required to distinguish the transposable hydraulic phenomena from the test rig interactions. Although the resonances' effects are known and described in the IEC standard, their identification is difficult. Drawing on extensive experience of model testing, we illustrate with a few examples how to identify the potential problems induced by the test rig. This paper contains some of our best practices for obtaining the most accurate, relevant, and test-rig-independent measurements.

  2. Optimization models for flight test scheduling

    Science.gov (United States)

    Holian, Derreck

    As threats around the world increase with nations developing new generations of warfare technology, the United States is keen on maintaining its position on top of the defense technology curve. This in turn indicates that the U.S. military/government must research, develop, procure, and sustain new systems in the defense sector to safeguard this position. Currently, the Lockheed Martin F-35 Joint Strike Fighter (JSF) Lightning II is being developed, tested, and deployed to the U.S. military at Low Rate Initial Production (LRIP). The simultaneous act of testing and deployment is due to the contracted procurement process intended to provide a rapid Initial Operating Capability (IOC) release of the 5th-generation fighter. For this reason, many factors go into the determination of what is to be tested, in what order, and at which time due to the military requirements. A certain system or envelope of the aircraft must be assessed prior to releasing that capability into service. The objective of this praxis is to aid in the determination of what testing can be achieved on an aircraft at a point in time. Furthermore, it will define the optimum allocation of test points to aircraft and determine a prioritization of restrictions to be mitigated so that the test program can be best supported. The system described in this praxis has been deployed across the F-35 test program and testing sites. It has discovered hundreds of available test points for an aircraft to fly when it was thought none existed, thus preventing an aircraft from being grounded. Additionally, it has saved hundreds of labor hours and greatly reduced the occurrence of test point reflight. Due to the proprietary nature of the JSF program, details regarding the actual test points, test plans, and all other program specific information have not been presented. Generic, representative data is used for example and proof-of-concept purposes. Apart from the data correlation algorithms, the optimization associated

  3. Horns Rev II, 2D-Model Tests

    DEFF Research Database (Denmark)

    Andersen, Thomas Lykke; Brorsen, Michael

    This report presents the results of 2D physical model tests carried out in the shallow wave flume at the Dept. of Civil Engineering, Aalborg University (AAU), Denmark. The starting point for the present report is the previously carried out run-up tests described in Lykke Andersen & Frigaard, 2006 ... -shaped access platforms on piles. The model tests include mainly regular waves and a few irregular wave tests. These tests were conducted at Aalborg University from 9 November 2006 to 17 November 2006.

  4. Testing for Causality in Variance Using Multivariate GARCH Models

    OpenAIRE

    Christian M. Hafner; Herwartz, Helmut

    2008-01-01

    Tests of causality in variance in multiple time series have been proposed recently, based on residuals of estimated univariate models. Although such tests are applied frequently, little is known about their power properties. In this paper we show that a convenient alternative to residual based testing is to specify a multivariate volatility model, such as multivariate GARCH (or BEKK), and construct a Wald test on noncausality in variance. We compare both approaches to testing causality in var...
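    The residual-based approach that this paper compares against can be sketched as the Cheung–Ng cross-correlation test: under no causality in variance, scaled squared cross-correlations of the squared standardized residuals are asymptotically chi-square. This is a sketch of the baseline method, not the paper's own Wald test within a multivariate GARCH/BEKK model; it assumes standardized residuals have already been obtained from univariate GARCH fits.

    ```python
    import numpy as np
    from scipy.stats import chi2

    def cheung_ng_test(e1, e2, m=5):
        """Residual-based test for causality in variance (in the spirit
        of Cheung & Ng, 1996): do past squared shocks in e2 predict the
        variance of e1? Uses cross-correlations of the centered squared
        residuals at lags 1..m; n * sum(r_k^2) ~ chi2(m) under the null."""
        u = e1 ** 2 - np.mean(e1 ** 2)
        v = e2 ** 2 - np.mean(e2 ** 2)
        n = len(u)
        denom = np.sqrt(np.sum(u ** 2) * np.sum(v ** 2))
        r = np.array([np.sum(u[k:] * v[:n - k]) / denom
                      for k in range(1, m + 1)])
        stat = n * np.sum(r ** 2)
        return stat, chi2.sf(stat, m)

    # Two independent simulated residual series: the null should
    # typically not be rejected.
    rng = np.random.default_rng(42)
    e1 = rng.standard_normal(500)
    e2 = rng.standard_normal(500)
    stat, p = cheung_ng_test(e1, e2)
    ```

    The paper's point is that a Wald test on the off-diagonal (noncausality) restrictions of a fitted BEKK model can have better power than this residual-based statistic.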

  5. Testing for causality in variance using multivariate GARCH models

    OpenAIRE

    Hafner, Christian; Herwartz, H.

    2004-01-01

    Tests of causality in variance in multiple time series have been proposed recently, based on residuals of estimated univariate models. Although such tests are applied frequently, little is known about their power properties. In this paper we show that a convenient alternative to residual based testing is to specify a multivariate volatility model, such as multivariate GARCH (or BEKK), and construct a Wald test on noncausality in variance. We compare both approaches to testing causa...

  6. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines Math- Works and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  7. A Lagrange Multiplier Test for Testing the Adequacy of the Constant Conditional Correlation GARCH Model

    DEFF Research Database (Denmark)

    Catani, Paul; Teräsvirta, Timo; Yin, Meiqun

    A Lagrange multiplier test for testing the parametric structure of a constant conditional correlation generalized autoregressive conditional heteroskedasticity (CCC-GARCH) model is proposed. The test is based on decomposing the CCC-GARCH model multiplicatively into two components, one of which...

  8. 2-D Model Test Study of the Suape Breakwater, Brazil

    DEFF Research Database (Denmark)

    Andersen, Thomas Lykke; Burcharth, Hans F.; Sopavicius, A.;

    This report deals with a two-dimensional model test study of the extension of the breakwater in Suape, Brazil. One cross-section was tested for stability and overtopping in various sea conditions. The length scale used for the model tests was 1:35. Unless otherwise specified all values given...

  9. Airship Model Tests in the Variable Density Wind Tunnel

    Science.gov (United States)

    Abbott, Ira H

    1932-01-01

    This report presents the results of wind tunnel tests conducted to determine the aerodynamic characteristics of airship models. Eight Goodyear-Zeppelin airship models were tested in the original closed-throat tunnel. After the tunnel was rebuilt with an open throat a new model was tested, and one of the Goodyear-Zeppelin models was retested. The results indicate that much may be done to determine the drag of airships from evaluations of the pressure and skin-frictional drags on models tested at large Reynolds number.

  10. ADBT Frame Work as a Testing Technique: An Improvement in Comparison with Traditional Model Based Testing

    Directory of Open Access Journals (Sweden)

    Mohammed Akour

    2016-05-01

    Software testing is an embedded activity in all software development life cycle phases. Due to the difficulties and high costs of software testing, many testing techniques have been developed with the common goal of testing software in the most optimal and cost-effective manner. Model-based testing (MBT) is used to direct testing activities such as test verification and selection. MBT is employed to encapsulate and understand the behavior of the system under test, which supports and helps software engineers to validate the system with various likely actions. The widespread usage of models has influenced the usage of MBT in the testing process, especially with UML. In this research, we proposed an improved model-based testing strategy, which involves and uses four different diagrams in the testing process. This paper also discusses and explains the activities in the proposed model with the finite state model (FSM). Comparisons have been made with traditional model-based testing in terms of test case generation and results.
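    The FSM-based test generation step can be illustrated with a transition-coverage criterion: one test case per transition, reached via a shortest event path from the initial state. The login-form machine below is a made-up example, not a model from the paper.

    ```python
    from collections import deque

    # Hypothetical login-form FSM: (source state, event) -> target state.
    FSM = {
        ("idle", "open"): "form",
        ("form", "submit_valid"): "done",
        ("form", "submit_invalid"): "error",
        ("error", "retry"): "form",
    }

    def transition_cover(fsm, start):
        """Generate one test case (event sequence) per transition: the
        shortest event path from the start state to the transition's
        source state, followed by the transition's own event."""
        # BFS to find shortest event paths from the start state.
        paths, queue = {start: []}, deque([start])
        while queue:
            s = queue.popleft()
            for (src, ev), dst in fsm.items():
                if src == s and dst not in paths:
                    paths[dst] = paths[s] + [ev]
                    queue.append(dst)
        # One test per reachable transition.
        return [paths[src] + [ev] for (src, ev) in fsm if src in paths]

    tests = transition_cover(FSM, "idle")
    ```

    For this machine the criterion yields four test cases, the longest being open, submit_invalid, retry, which exercises the error-recovery transition.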

  11. Observational Tests of Planet Formation Models

    CERN Document Server

    Sozzetti, A; Latham, D W; Carney, B W; Laird, J B; Stefanik, R P; Boss, A P; Charbonneau, D; O'Donovan, F T; Holman, M J; Winn, J N

    2007-01-01

    We summarize the results of two experiments to address important issues related to the correlation between planet frequencies and properties and the metallicity of the hosts. Our results can usefully inform formation, structural, and evolutionary models of gas giant planets.

  12. A Model for Random Student Drug Testing

    Science.gov (United States)

    Nelson, Judith A.; Rose, Nancy L.; Lutz, Danielle

    2011-01-01

    The purpose of this case study was to examine random student drug testing in one school district relevant to: (a) the perceptions of students participating in competitive extracurricular activities regarding drug use and abuse; (b) the attitudes and perceptions of parents, school staff, and community members regarding student drug involvement; (c)…

  13. Regression Test-Selection Technique Using Component Model Based Modification: Code to Test Traceability

    Directory of Open Access Journals (Sweden)

    Ahmad A. Saifan

    2016-04-01

    Full Text Available Regression testing is a safeguarding procedure to validate and verify adapted software, and guarantee that no errors have emerged. However, regression testing is very costly when testers need to re-execute all the test cases against the modified software. This paper proposes a new approach in the regression test selection domain. The approach is based on meta-models (test models and structured models) to decrease the number of test cases to be used in the regression testing process. The approach has been evaluated using three Java applications. To measure the effectiveness of the proposed approach, we compare the results with the retest-all approach. The results have shown that our approach reduces the size of the test suite without negative impact on the effectiveness of fault detection.
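
    The core of component-traceability-based test selection can be sketched in a few lines. The coverage map, test names, and component names below are invented for illustration and are not from the evaluated Java applications:

    ```python
    # Hypothetical traceability map: test case -> components it exercises.
    coverage = {
        "test_checkout": {"Cart", "Payment"},
        "test_browse":   {"Catalog"},
        "test_refund":   {"Payment", "Ledger"},
    }

    def select_tests(coverage, modified):
        """Keep only tests whose covered components intersect the set of
        modified components; everything else is safely skipped."""
        return sorted(t for t, comps in coverage.items() if comps & modified)

    selected = select_tests(coverage, {"Payment"})
    # -> ["test_checkout", "test_refund"]; test_browse is skipped
    ```

    The savings over retest-all grow with the fraction of components left untouched by the change.
    
    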

  14. Model Testing - Bringing the Ocean into the Laboratory

    DEFF Research Database (Denmark)

    Aage, Christian

    2000-01-01

    Hydrodynamic model testing, the principle of bringing the ocean into the laboratory to study the behaviour of the ocean itself and the response of man-made structures in the ocean in reduced scale, has been known for centuries. Due to an insufficient understanding of the physics involved, however, the early model tests often gave incomplete or directly misleading results. This keynote lecture deals with some of the possibilities and problems within the field of hydrodynamic and hydraulic model testing.

  15. Landing Procedure in Model Ditching Tests of Bf 109

    Science.gov (United States)

    Sottorf, W.

    1949-01-01

    The purpose of the model tests is to clarify the motions in the alighting on water of a land plane. After discussion of the model laws, the test method and test procedure are described. The deceleration-time diagrams of the landing of a model of the Bf 109 show a high deceleration peak of greater than 20g, which can be lowered to 4 to 6g by radiator cowling and brake skid.

  16. Testing turbulent closure models with convection simulations

    CERN Document Server

    Snellman, J E; Mantere, M J; Rheinhardt, M; Dintrans, B

    2012-01-01

    Aims: To compare simple analytical closure models of turbulent Boussinesq convection for stellar applications with direct three-dimensional simulations both in homogeneous and inhomogeneous (bounded) setups. Methods: We use simple analytical closure models to compute the fluxes of angular momentum and heat as a function of rotation rate measured by the Taylor number. We also investigate cases with varying angles between the angular velocity and gravity vectors, corresponding to locating the computational domain at different latitudes ranging from the pole to the equator of the star. We perform three-dimensional numerical simulations in the same parameter regimes for comparison. The free parameters appearing in the closure models are calibrated by two fit methods using simulation data. Unique determination of the closure parameters is possible only in the non-rotating case and when the system is placed at the pole. In the other cases the fit procedures yield somewhat differing results. The quality of the closu...

  17. A Test of the Combined Effects Model.

    Science.gov (United States)

    Kimsey, William D.; And Others

    Two waves of telephone interviews with a sample of 141 voters were used in a study of political communication effects during the 1974 congressional election in the Illinois 24th Congressional District. Seven variables specified by the combined-effects model were derived from the interviews and factor analyzed. Two factors were found and…

  18. TESTS FOR VARIANCE COMPONENTS IN VARYING COEFFICIENT MIXED MODELS

    National Research Council Canada - National Science Library

    Zaixing Li; Yuedong Wang; Ping Wu; Wangli Xu; Lixing Zhu

    2012-01-01

    To address the question of whether a varying coefficient mixed model can be reduced to a simpler varying coefficient model, we develop one-sided tests for the null hypothesis that all the variance components are zero...

  19. Mixed Portmanteau Test for Diagnostic Checking of Time Series Models

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2014-01-01

    Full Text Available Model criticism is an important stage of model building, and goodness-of-fit tests thus provide a set of tools for diagnostic checking of the fitted model. Several tests have been suggested in the literature for diagnostic checking. These tests use the autocorrelation or partial autocorrelation of the residuals to assess the adequacy of the fitted model. The main idea underlying these portmanteau tests is to identify whether there is any dependence structure yet unexplained by the fitted model. In this paper, we suggest mixed portmanteau tests based on the autocorrelation and partial autocorrelation functions of the residuals. We derive the asymptotic distribution of the mixture test and study its size and power using Monte Carlo simulations.
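
    For orientation, a minimal sketch of the classical autocorrelation-based portmanteau statistic in the Ljung-Box form is given below. This is the standard single-function building block, not the mixed test the paper derives; the data are simulated:

    ```python
    import numpy as np

    def acf(x, k):
        """Lag-k sample autocorrelation of a series."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        return float(np.dot(x[k:], x[:-k]) / np.dot(x, x))

    def ljung_box(residuals, m):
        """Ljung-Box portmanteau statistic Q over lags 1..m.

        Under an adequate ARMA fit, Q is approximately chi-squared with
        m minus the number of fitted parameters degrees of freedom.
        """
        n = len(residuals)
        return n * (n + 2) * sum(
            acf(residuals, k) ** 2 / (n - k) for k in range(1, m + 1)
        )

    rng = np.random.default_rng(0)
    q = ljung_box(rng.standard_normal(500), m=10)  # small for white noise
    ```

    Residuals with leftover serial dependence (e.g. a strictly alternating series) yield a large Q and hence a rejection of model adequacy.
    
    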

  20. Thermal modelling of Advanced LIGO test masses

    OpenAIRE

    Wang, Haoyu; Blair, Carl; Álvarez, Miguel Dovale; Brooks, Aidan; Kasprzack, Marie F.; Ramette, Joshua; Meyers, Patrick M.; Kaufer, Steffen; O'Reilly, Brian; Mow-Lowry, Conor M.; Freise, Andreas

    2016-01-01

    High-reflectivity fused silica mirrors are at the epicentre of today's advanced gravitational wave detectors. In these detectors, the mirrors interact with high power laser beams. As a result of finite absorption in the high reflectivity coatings the mirrors suffer from a variety of thermal effects that impact on the detectors' performance. We propose a model of the Advanced LIGO mirrors that introduces an empirical term to account for the radiative heat transfer between the mirror and its su...

  1. Modeling Reliability Growth in Accelerated Stress Testing

    Science.gov (United States)

    2013-12-01

    projection models for both continuous use and discrete use systems found anywhere in the literature. The review comprises a synopsis of over 80...

  2. A magnetorheological actuation system: test and model

    Science.gov (United States)

    John, Shaju; Chaudhuri, Anirban; Wereley, Norman M.

    2008-04-01

    Self-contained actuation systems, based on frequency rectification of the high frequency motion of an active material, can produce high force and stroke output. Magnetorheological (MR) fluids are active fluids whose rheological properties can be altered by the application of a magnetic field. By using MR fluids as the energy transmission medium in such hybrid devices, a valving system with no moving parts can be implemented and used to control the motion of an output cylinder shaft. The MR fluid based valves are configured in the form of an H-bridge to produce bi-directional motion in an output cylinder by alternately applying magnetic fields in the two opposite arms of the bridge. The rheological properties of the MR fluid are modeled using both Bingham plastic and bi-viscous models. In this study, the primary actuation is performed using a compact terfenol-D rod driven pump and frequency rectification of the rod motion is done using passive reed valves. The pump and reed valve configuration along with MR fluidic valves form a compact hydraulic actuation system. Actuator design, analysis and experimental results are presented in this paper. A time domain model of the actuator is developed and validated using experimental data.
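
    The Bingham plastic law mentioned above has a simple closed form: below the yield stress the fluid does not flow, and beyond it the stress grows linearly with shear rate. A minimal sketch follows; the numerical values are illustrative only, not parameters from the paper:

    ```python
    import math

    def bingham_stress(gamma_dot, tau_y, mu):
        """Bingham plastic shear stress for a flowing MR fluid:
        tau = tau_y * sign(gamma_dot) + mu * gamma_dot  (gamma_dot != 0).
        tau_y is the field-dependent yield stress, mu the plastic viscosity.
        """
        return tau_y * math.copysign(1.0, gamma_dot) + mu * gamma_dot if gamma_dot else 0.0

    # Illustrative values: yield stress 40 kPa, viscosity 0.5 Pa*s,
    # shear rate 100 1/s.
    tau = bingham_stress(100.0, tau_y=40e3, mu=0.5)  # 40050.0 Pa
    ```

    Raising the applied magnetic field raises tau_y, which is what lets an MR valve throttle flow with no moving parts.
    
    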

  3. Testing the planetary models of HU Aquarii

    CERN Document Server

    Bours, M; Breedt, E; Copperwheat, C; Dhillon, V; Leckngam, A; Littlefair, S; Parsons, S; Prasit, A

    2014-01-01

    We present new eclipse observations of the polar (i.e. semi-detached magnetic white dwarf + M-dwarf binary) HU Aqr, and mid-egress times for each eclipse, which continue to be observed increasingly early. Recent eclipses occurred more than 70 seconds earlier than the prediction from the latest model that invoked a single circumbinary planet to explain the observed orbital period variations, thereby conclusively proving this model to be incorrect. Using ULTRACAM data, we show that mid-egress times determined for simultaneous data taken at different wavelengths agree with each other. The large variations in the observed eclipse times cannot be explained by planetary models containing up to three planets, because of poor fits to the data as well as orbital instability on short time scales. The peak-to-peak amplitude of the O-C diagram of almost 140 seconds is also too great to be caused by Applegate's mechanism, movement of the accretion spot on the surface of the white dwarf, or by asynchronous rotation of the ...

  4. Testing R&D-Based Endogenous Growth Models

    DEFF Research Database (Denmark)

    Kruse-Andersen, Peter Kjær

    2017-01-01

    R&D-based growth models are tested using US data for the period 1953-2014. A general growth model is developed which nests the model varieties of interest. The model implies a cointegrating relationship between multifactor productivity, research intensity, and employment. This relationship is estimated... The long-run growth rate of GDP per worker converges to between zero and 1.1 pct.

  5. Tests of Hypotheses Arising In the Correlated Random Coefficient Model.

    Science.gov (United States)

    Heckman, James J; Schmierer, Daniel

    2010-11-01

    This paper examines the correlated random coefficient model. It extends the analysis of Swamy (1971), who pioneered the uncorrelated random coefficient model in economics. We develop the properties of the correlated random coefficient model and derive a new representation of the variance of the instrumental variable estimator for that model. We develop tests of the validity of the correlated random coefficient model against the null hypothesis of the uncorrelated random coefficient model.

  6. Upgraded Analytical Model of the Cylinder Test

    Energy Technology Data Exchange (ETDEWEB)

    Souers, P. Clark; Lauderbach, Lisa; Garza, Raul; Ferranti, Louis; Vitello, Peter

    2013-03-15

    A Gurney-type equation was previously corrected for wall thinning and angle of tilt, and now we have added shock wave attenuation in the copper wall and air gap energy loss. Extensive calculations were undertaken to calibrate the two new energy loss mechanisms across all explosives. The corrected Gurney equation is recommended for cylinder use over the original 1943 form. The effect of these corrections is to add more energy to the adiabat values from a relative volume of 2 to 7, with low energy explosives having the largest correction. The data was pushed up to a relative volume of about 15 and the JWL parameter ω was obtained directly. The total detonation energy density was locked to the v=7 adiabat energy density, so that the Cylinder test gives all necessary values needed to make a JWL.
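
    For orientation, the original 1943-form Gurney relation referred to above, which the corrected equation builds on, gives the terminal wall velocity of a cylindrical charge. This is the standard textbook form, not a quotation from the report:

    ```latex
    % Gurney cylinder equation: V is the wall velocity, M the metal (wall)
    % mass, C the explosive charge mass, and \sqrt{2E} the Gurney constant
    % characteristic of the explosive.
    V = \sqrt{2E}\,\left(\frac{M}{C} + \frac{1}{2}\right)^{-1/2}
    ```

    The corrections described in the abstract modify this relation for wall thinning, tilt, shock attenuation in the wall, and air-gap energy loss.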

  7. Upgraded Analytical Model of the Cylinder Test

    Energy Technology Data Exchange (ETDEWEB)

    Souers, P. Clark [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Energetic Materials Center; Lauderbach, Lisa [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Energetic Materials Center; Garza, Raul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Energetic Materials Center; Ferranti, Louis [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Energetic Materials Center; Vitello, Peter [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Energetic Materials Center

    2013-03-15

    A Gurney-type equation was previously corrected for wall thinning and angle of tilt, and now we have added shock wave attenuation in the copper wall and air gap energy loss. Extensive calculations were undertaken to calibrate the two new energy loss mechanisms across all explosives. The corrected Gurney equation is recommended for cylinder use over the original 1943 form. The effect of these corrections is to add more energy to the adiabat values from a relative volume of 2 to 7, with low energy explosives having the largest correction. The data was pushed up to a relative volume of about 15 and the JWL parameter ω was obtained directly. Finally, the total detonation energy density was locked to the v = 7 adiabat energy density, so that the Cylinder test gives all necessary values needed to make a JWL.

  8. Multicomponent Equilibrium Models for Testing Geothermometry Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Carl D. Palmer; Robert W. Smith; Travis L. McLing

    2013-02-01

    Geothermometry is an important tool for estimating deep reservoir temperature from the geochemical composition of shallower and cooler waters. The underlying assumption of geothermometry is that the waters collected from shallow wells and seeps maintain a chemical signature that reflects equilibrium in the deeper reservoir. Many of the geothermometers used in practice are based on correlation between water temperatures and composition or on thermodynamic calculations based on a subset (typically silica, cations or cation ratios) of the dissolved constituents. An alternative approach is to use complete water compositions and equilibrium geochemical modeling to calculate the degree of disequilibrium (saturation index) for a large number of potential reservoir minerals as a function of temperature. We have constructed several “forward” geochemical models using The Geochemist’s Workbench to simulate the change in chemical composition of reservoir fluids as they migrate toward the surface. These models explicitly account for the formation (mass and composition) of a steam phase and equilibrium partitioning of volatile components (e.g., CO2, H2S, and H2) into the steam as a result of pressure decreases associated with upward fluid migration from depth. We use the synthetic data generated from these simulations to determine the advantages and limitations of various geothermometry and optimization approaches for estimating the likely conditions (e.g., temperature, pCO2) to which the water was exposed in the deep subsurface. We demonstrate the magnitude of errors that can result from boiling, loss of volatiles, and analytical error from sampling and instrumental analysis. The estimated reservoir temperatures for these scenarios are also compared to conventional geothermometers. These results can help improve estimation of geothermal resource temperature during exploration and early development.
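
    As an example of the "subset of constituents" approach the abstract contrasts with full multicomponent modeling, the widely used Fournier-type quartz geothermometer estimates reservoir temperature from dissolved silica alone. The correlation below is the standard no-steam-loss form; the sample concentration is invented for illustration:

    ```python
    import math

    def quartz_geothermometer(sio2_mg_per_kg):
        """Classical quartz (silica) geothermometer, no steam loss:
        T(degC) = 1309 / (5.19 - log10(SiO2 in mg/kg)) - 273.15.
        Valid roughly between 25 and 250 degC."""
        return 1309.0 / (5.19 - math.log10(sio2_mg_per_kg)) - 273.15

    # A hypothetical spring water with 300 mg/kg dissolved silica:
    t_est = quartz_geothermometer(300.0)  # roughly 210 degC
    ```

    Multicomponent equilibrium geothermometry instead evaluates saturation indices for many minerals at once, which is less sensitive to a single analyte being perturbed by boiling or dilution.
    
    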

  9. Port Adriano, 2D-Model tests

    DEFF Research Database (Denmark)

    Burcharth, Hans F.; Meinert, Palle; Andersen, Thomas Lykke

    the crown wall have been measured. The model has been subjected to irregular waves corresponding to typical conditions offshore from the intended prototype location. Characteristic situations have been video recorded. The stability of the toe has been investigated. The wave-generated forces on the caisson and the crown have been recorded. The maximum of horizontal wave force and the related tilting moment together with the pressure distribution are documented for waves in the range of design conditions. The parameters and results in the report are given in full-scale values, if nothing else is stated.

  10. A Space Radiation Test Model Study

    Science.gov (United States)

    1989-02-17


  11. Petroleum reservoir data for testing simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Lloyd, J.M.; Harrison, W.

    1980-09-01

    This report consists of reservoir pressure and production data for 25 petroleum reservoirs. Included are 5 data sets for single-phase (liquid) reservoirs, 1 data set for a single-phase (liquid) reservoir with pressure maintenance, 13 data sets for two-phase (liquid/gas) reservoirs and 6 for two-phase reservoirs with pressure maintenance. Also given are ancillary data for each reservoir that could be of value in the development and validation of simulation models. A bibliography is included that lists the publications from which the data were obtained.

  12. Steel Containment Vessel Model Test: Results and Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Costello, J.F.; Hashimote, T.; Hessheimer, M.F.; Luk, V.K.

    1999-03-01

    A high pressure test of the steel containment vessel (SCV) model was conducted on December 11-12, 1996 at Sandia National Laboratories, Albuquerque, NM, USA. The test model is a mixed-scaled model (1:10 in geometry and 1:4 in shell thickness) of an improved Mark II boiling water reactor (BWR) containment. A concentric steel contact structure (CS), installed over the SCV model and separated at a nominally uniform distance from it, provided a simplified representation of a reactor shield building in the actual plant. The SCV model and contact structure were instrumented with strain gages and displacement transducers to record the deformation behavior of the SCV model during the high pressure test. This paper summarizes the conduct and the results of the high pressure test and discusses the posttest metallurgical evaluation results on specimens removed from the SCV model.

  13. Model-based robustness testing for avionics-embedded software

    Institute of Scientific and Technical Information of China (English)

    Yang Shunkun; Liu Bin; Wang Shihai; Lu Minyan

    2013-01-01

    Robustness testing for safety-critical embedded software is still a challenge in its nascent stages. In this paper, we propose a practical methodology and implement an environment employing model-based robustness testing for embedded software systems. It is a system-level black-box testing approach in which the fault behaviors of embedded software are triggered with the aid of model-based fault injection, supported by an executable model-driven hardware-in-the-loop (HIL) testing environment. The prototype implementation of the robustness testing environment based on the proposed approach is experimentally discussed and illustrated by industrial case studies of several avionics-embedded software systems. The results show that the proposed robustness testing method and environment are effective in finding more bugs and reduce the burden on testing engineers, enhancing the efficiency of testing tasks, especially for complex embedded systems.

  14. Engineering model development and test results

    Science.gov (United States)

    Wellman, John A.

    1993-08-01

    The correctability of the primary mirror spherical error in the Wide Field/Planetary Camera (WF/PC) is sensitive to the precise alignment of the incoming aberrated beam onto the corrective elements. Articulating fold mirrors that provide +/- 1 milliradian of tilt in 2 axes are required to allow for alignment corrections in orbit as part of the fix for the Hubble space telescope. An engineering study was made by Itek Optical Systems and the Jet Propulsion Laboratory (JPL) to investigate replacement of fixed fold mirrors within the existing WF/PC optical bench with articulating mirrors. The study contract developed the base line requirements, established the suitability of lead magnesium niobate (PMN) actuators and evaluated several tilt mechanism concepts. Two engineering model articulating mirrors were produced to demonstrate the function of the tilt mechanism to provide +/- 1 milliradian of tilt, packaging within the space constraints and manufacturing techniques including the machining of the invar tilt mechanism and lightweight glass mirrors. The success of the engineering models led to the follow on design and fabrication of 3 flight mirrors that have been incorporated into the WF/PC to be placed into the Hubble Space Telescope as part of the servicing mission scheduled for late 1993.

  15. Using Built-In Domain-Specific Modeling Support to Guide Model-Based Test Generation

    CERN Document Server

    Kanstrén, Teemu; 10.4204/EPTCS.80.5

    2012-01-01

    We present a model-based testing approach to support automated test generation with domain-specific concepts. This includes a language expert who is an expert at building test models and domain experts who are experts in the domain of the system under test. First, we provide a framework to support the language expert in building test models using a full (Java) programming language with the help of simple but powerful modeling elements of the framework. Second, based on the model built with this framework, the toolset automatically forms a domain-specific modeling language that can be used to further constrain and guide test generation from these models by a domain expert. This makes it possible to generate a large set of test cases covering the full model, chosen (constrained) parts of the model, or manually define specific test cases on top of the model while using concepts familiar to the domain experts.

  16. Using Built-In Domain-Specific Modeling Support to Guide Model-Based Test Generation

    Directory of Open Access Journals (Sweden)

    Teemu Kanstrén

    2012-02-01

    Full Text Available We present a model-based testing approach to support automated test generation with domain-specific concepts. This includes a language expert who is an expert at building test models and domain experts who are experts in the domain of the system under test. First, we provide a framework to support the language expert in building test models using a full (Java) programming language with the help of simple but powerful modeling elements of the framework. Second, based on the model built with this framework, the toolset automatically forms a domain-specific modeling language that can be used to further constrain and guide test generation from these models by a domain expert. This makes it possible to generate a large set of test cases covering the full model, chosen (constrained) parts of the model, or manually define specific test cases on top of the model while using concepts familiar to the domain experts.

  17. Applications of the Linear Logistic Test Model in Psychometric Research

    Science.gov (United States)

    Kubinger, Klaus D.

    2009-01-01

    The linear logistic test model (LLTM) breaks down the item parameter of the Rasch model as a linear combination of some hypothesized elementary parameters. Although the original purpose of applying the LLTM was primarily to generate test items with specified item difficulty, there are still many other potential applications, which may be of use…

  18. A Bootstrap Cointegration Rank Test for Panels of VAR Models

    DEFF Research Database (Denmark)

    Callot, Laurent

    functions of the individual cointegrated VAR (CVAR) models. A bootstrap-based procedure is used to compute empirical distributions of the trace test statistics for these individual models. From these empirical distributions two panel trace test statistics are constructed. The satisfying small sample

  19. Development of computer simulation models for pedestrian subsystem impact tests

    NARCIS (Netherlands)

    Kant, R.; Konosu, A.; Ishikawa, H.

    2000-01-01

    The European Enhanced Vehicle-safety Committee (EEVC/WG10 and WG17) proposed three component subsystem tests for cars to assess pedestrian protection. The objective of this study is to develop computer simulation models of the EEVC pedestrian subsystem tests. These models are available to develop a

  20. Model-Based GUI Testing Using Uppaal at Novo Nordisk

    DEFF Research Database (Denmark)

    H. Hjort, Ulrik; Rasmussen, Jacob Illum; Larsen, Kim Guldstrand

    2009-01-01

    This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML state machine model and generates

  1. Testing of IDA’s CBStrike Model

    Science.gov (United States)

    2011-05-01

    CBStrike is a modeling tool developed by the Institute for Defense Analyses (IDA) to stage chemical or biological attacks against specific force...

  2. Classical Ising model test for quantum circuits

    Science.gov (United States)

    Geraci, Joseph; Lidar, Daniel A.

    2010-07-01

    We exploit a recently constructed mapping between quantum circuits and graphs in order to prove that circuits corresponding to certain planar graphs can be efficiently simulated classically. The proof uses an expression for the Ising model partition function in terms of quadratically signed weight enumerators (QWGTs), which are polynomials that arise naturally in an expansion of quantum circuits in terms of rotations involving Pauli matrices. We combine this expression with a known efficient classical algorithm for the Ising partition function of any planar graph in the absence of an external magnetic field, and the Robertson-Seymour theorem from graph theory. We give as an example a set of quantum circuits with a small number of non-nearest-neighbor gates which admit an efficient classical simulation.
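
    To make the object concrete: the Ising partition function in zero external field, which the efficient planar-graph algorithm referenced above computes in polynomial time, can be evaluated by brute force for tiny graphs. The sketch below is illustrative only (exponential in the number of spins, unlike the planar algorithm the paper relies on):

    ```python
    from itertools import product
    from math import exp

    def ising_partition(edges, n, beta=1.0, J=1.0):
        """Brute-force Ising partition function in zero field:
        Z = sum over all spin configs s in {-1,+1}^n of
            exp(beta * J * sum_{(i,j) in edges} s_i * s_j)."""
        return sum(
            exp(beta * J * sum(s[i] * s[j] for i, j in edges))
            for s in product((-1, 1), repeat=n)
        )

    # Single bond: Z = 2*e^{beta*J} + 2*e^{-beta*J} = 4*cosh(beta*J).
    z = ising_partition([(0, 1)], n=2)
    ```

    For planar graphs the same quantity is computable without this exponential enumeration, which is the classical result the circuit-simulation argument leans on.
    
    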

  3. Boron-10 ABUNCL Models of Fuel Testing

    Energy Technology Data Exchange (ETDEWEB)

    Siciliano, Edward R.; Lintereur, Azaree T.; Kouzes, Richard T.; Ely, James H.

    2013-10-01

    The Department of Energy Office of Nuclear Safeguards and Security (NA-241) is supporting the project Coincidence Counting With Boron-Based Alternative Neutron Detection Technology at Pacific Northwest National Laboratory (PNNL) for the development of a 3He proportional counter alternative neutron coincidence counter. The goal of this project is to design, build and demonstrate a system based upon 10B-lined proportional tubes in a configuration typical for 3He-based coincidence counter applications. This report provides results from MCNP simulations of the General Electric Reuter-Stokes Alternative Boron-Based Uranium Neutron Coincidence Collar (ABUNCL) active configuration model with fuel pins previously measured at Los Alamos National Laboratory. A comparison of the GE-ABUNCL simulations and simulations of 3He based UNCL-II active counter (the system for which the GE-ABUNCL was targeted to replace) with the same fuel pin assemblies is also provided.

  4. Precision electroweak tests of the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Renton, Peter B. [Denys Wilkinson Building, Oxford (United Kingdom)]. E-mail: p.renton1@physics.ox.ac.uk

    2002-09-01

    The present status of precision electroweak data is reviewed. These data include measurements of e+e- → ff̄, taken at the Z resonance at LEP, which are used to determine the mass and width of the Z boson. In addition, measurements have also been made of the forward-backward asymmetries for leptons and heavy quarks, and also the final-state polarization of the τ lepton. At SLAC, where the electron beam was polarized, measurements were made of the left-right polarized asymmetry, A_LR, and the left-right forward-backward asymmetries for b- and c-quarks. The mass, m_W, and width, Γ_W, of the W boson have been measured at the Tevatron and at LEP, and the mass of the top quark, m_t, has been measured at the Tevatron. These data, plus other electroweak data, are used in global electroweak fits in which various Standard Model (SM) parameters are determined. A comparison is made between the results of the direct measurements of m_W and m_t and the indirect results coming from electroweak radiative corrections. Using all precision electroweak data, fits are also made to determine limits on the mass of the Higgs boson, m_H. The influence on these limits of specific measurements, particularly those which are somewhat inconsistent with the SM, is explored. The data are also analysed in terms of the quasi-model-independent ε variables. Finally, the impact on the electroweak fits of the improvements in the determination of the W-boson and top-quark masses, expected from Tevatron Run 2, is examined. (author)

  5. Testing for a Threshold in Models with Endogenous Regressors

    NARCIS (Netherlands)

    Rothfelder, Mario; Boldea, Otilia

    2016-01-01

    Using 2SLS estimation, we propose two tests for a threshold in models with endogenous regressors: a sup LR test and a sup Wald test. Here, the 2SLS estimation is not conventional because it uses additional information about the first-stage being linear or not. Because of this additional information,

  6. Development of the GPM Observatory Thermal Vacuum Test Model

    Science.gov (United States)

    Yang, Kan; Peabody, Hume

    2012-01-01

    A software-based thermal modeling process was documented for generating the thermal panel settings necessary to simulate worst-case on-orbit flight environments in an observatory-level thermal vacuum test setup. The method for creating such a thermal model involved four major steps: (1) determining the major thermal zones for test as indicated by the major dissipating components on the spacecraft, then mapping the major heat flows between these components; (2) finding the flight equivalent sink temperatures for these test thermal zones; (3) determining the thermal test ground support equipment (GSE) design and initial thermal panel settings based on the equivalent sink temperatures; and (4) adjusting the panel settings in the test model to match heat flows and temperatures with the flight model. The observatory test thermal model developed from this process allows quick predictions of the performance of the thermal vacuum test design. In this work, the method described above was applied to the Global Precipitation Measurement (GPM) core observatory spacecraft, a joint project between NASA and the Japanese Aerospace Exploration Agency (JAXA) which is currently being integrated at NASA Goddard Space Flight Center for launch in early 2014. From preliminary results, the thermal test model generated from this process shows that the heat flows and temperatures match fairly well with the flight thermal model, indicating that the test model can simulate the on-orbit conditions fairly accurately. However, further analysis is needed to determine the best test configuration possible to validate the GPM thermal design before the start of environmental testing later this year. Also, while this analysis method has been applied solely to GPM, it should be emphasized that the same process can be applied to any mission to develop an effective test setup and panel settings which accurately simulate on-orbit thermal environments.
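
    Step (2) above, the equivalent sink temperature, is commonly defined by a radiative balance: the blackbody surroundings temperature that would deliver the same net absorbed environmental heat to the surface. The sketch below uses that standard definition with invented panel values, not figures from the GPM test:

    ```python
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    def equivalent_sink_temperature(q_absorbed_w, emissivity, area_m2):
        """Radiative-equivalent sink temperature from the balance
        q_absorbed = sigma * emissivity * area * T_sink**4."""
        return (q_absorbed_w / (SIGMA * emissivity * area_m2)) ** 0.25

    # Hypothetical zone: 150 W of absorbed environmental heat over 1 m^2
    # of surface with emissivity 0.85.
    t_sink = equivalent_sink_temperature(150.0, 0.85, 1.0)  # ~236 K
    ```

    The thermal vacuum panel facing each zone is then driven toward its zone's T_sink, and the settings are iterated (step 4) until model heat flows match the flight predictions.
    
    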

  7. Hidden Markov Model Based Automated Fault Localization for Integration Testing

    OpenAIRE

    Ge, Ning; NAKAJIMA, SHIN; Pantel, Marc

    2013-01-01

    International audience; Integration testing is an expensive activity in software testing, especially for fault localization in complex systems. Model-based diagnosis (MBD) provides various benefits in terms of scalability and robustness. In this work, we propose a novel MBD approach for automated fault localization in integration testing. Our method is based on a Hidden Markov Model (HMM), an abstraction of a system component used to simulate that component's behaviour. The core of this metho...

  8. Model Checking-Based Testing of Web Applications

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2007-01-01

    A formal model representing the navigation behavior of a Web application as a Kripke structure is proposed, and an approach that applies model checking to test case generation is presented. The Object Relation Diagram, as the object model, is employed to describe the object structure of a Web application design and can be translated into the behavior model. A key problem of model checking-based test generation for a Web application is how to construct a set of trap properties that are intended to be violated when model-checked against the behavior model, so that the resulting counterexamples can be used to construct test sequences. We give an algorithm that derives trap properties from the object model with respect to node and edge coverage criteria.
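The trap-property idea can be illustrated with a small sketch. For node coverage, each property asserts a node is never reached; a model checker refuting it returns a counterexample path that visits the node, which serves as a test sequence. The CTL-like syntax and the `visited`/`at` atomic propositions below are hypothetical, not the paper's notation:

```python
def trap_properties(nodes, edges):
    """Generate trap properties (CTL-style strings, hypothetical syntax)
    for node and edge coverage of a navigation graph.  Refuting
    'AG !visited(n)' yields a counterexample path reaching node n;
    refuting the edge property yields a path traversing edge (u, v)."""
    props = [f"AG !visited({n})" for n in nodes]
    props += [f"AG !(at({u}) & EX at({v}))" for (u, v) in edges]
    return props
```

Each generated string would be handed to a model checker against the behavior model; only the refuted properties contribute test sequences.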

  9. Complementary cosmological tests of RSII brane models

    CERN Document Server

    Holanda, R F L; Dahia, F

    2013-01-01

    In this paper we explore observational bounds on flat and non-flat cosmological models in Type II Randall-Sundrum (RSII) branes. In a first analysis, we consider current measurements of the expansion rate H(z) (with two priors on the local Hubble parameter) and 288 Type Ia supernovae from the Sloan Digital Sky Survey (within the framework of the mlcs2k2 light-curve fitting method). We find that the joint analysis involving these data is an interesting tool to impose limits on the brane tension density parameter (Omega_{lambda}) and that the spatial curvature has a negligible influence on Omega_{lambda} estimates. In order to obtain stronger bounds on the Omega_{lambda} contribution, we also add to our analysis the baryon acoustic oscillation (BAO) peak and cosmic microwave background (CMB) observations by using the so-called CMB/BAO ratio. From this analysis we find that the Omega_{lambda} contribution is less than 4 x 10^{-5} (1 sigma).

  10. Economic contract theory tests models of mutualism.

    Science.gov (United States)

    Weyl, E Glen; Frederickson, Megan E; Yu, Douglas W; Pierce, Naomi E

    2010-09-01

    Although mutualisms are common in all ecological communities and have played key roles in the diversification of life, our current understanding of the evolution of cooperation applies mostly to social behavior within a species. A central question is whether mutualisms persist because hosts have evolved costly punishment of cheaters. Here, we use the economic theory of employment contracts to formulate and distinguish between two mechanisms that have been proposed to prevent cheating in host-symbiont mutualisms, partner fidelity feedback (PFF) and host sanctions (HS). Under PFF, positive feedback between host fitness and symbiont fitness is sufficient to prevent cheating; in contrast, HS posits the necessity of costly punishment to maintain mutualism. A coevolutionary model of mutualism finds that HS are unlikely to evolve de novo, and published data on legume-rhizobia and yucca-moth mutualisms are consistent with PFF and not with HS. Thus, in systems considered to be textbook cases of HS, we find poor support for the theory that hosts have evolved to punish cheating symbionts; instead, we show that even horizontally transmitted mutualisms can be stabilized via PFF. PFF theory may place previously underappreciated constraints on the evolution of mutualism and explain why punishment is far from ubiquitous in nature.

  11. Glide back booster wind tunnel model testing

    Science.gov (United States)

    Pricop, M. V.; Cojocaru, M. G.; Stoica, C. I.; Niculescu, M. L.; Neculaescu, A. M.; Persinaru, A. G.; Boscoianu, M.

    2017-07-01

    Affordable space access requires partial or ideally full launch vehicle reuse, which is in line with the clean environment requirement. Although the idea is old, the practical use is difficult, requiring very large technology investment for qualification. Rocket gliders like the Space Shuttle have been successfully operated, but the price and correspondingly the energy footprint were found not sustainable. For medium launchers, there is finally a very promising platform in Falcon 9. For very small launchers the situation is more complex, because the performance index (payload to start mass) is already small compared with medium and heavy launchers. For partially reusable micro launchers this index is even smaller. However, the challenge has to be taken because it is likely that, in a multiyear effort, technology will enable the performance recovery needed to make such a system economically and environmentally feasible. The current paper is devoted to a small unitary glide back booster which is foreseen to be assembled in a number of possible configurations. Although the level of analysis is not deep, the solution is analyzed from the aerodynamic point of view. A wind tunnel model is designed, with an active canard, to enable a more efficient wind tunnel campaign, as a national level premiere.

  12. HOW TO BUILD TESTS IN THE IMITATION MODEL FOR TEST-BASED KNOWLEDGE CONTROL

    Directory of Open Access Journals (Sweden)

    Oleksandr M. Aleksieiev

    2011-02-01

    Full Text Available The principles of an imitation model for test-based knowledge control are developed, specifically the procedure for constructing a test. The authors suggest taking into account the difficulty of a question when deciding whether to include it in the test. They also suggest using iterative calculations to build the test, employing optimization algorithms based on random search, so that the sum of the task complexity indices of the resulting test meets the criterion of joint value and difficulty. The article gives a detailed explanation of the mathematical apparatus used for decision-making during test construction, as well as an example demonstrating the main steps of the iterative calculations and the mechanism for achieving an optimal test structure under the joint value and difficulty criterion.
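The random-search test assembly described above can be sketched in a few lines. The objective here, minimizing the gap between the summed difficulty of the selected questions and a target, is a simplified stand-in for the paper's joint value and difficulty criterion, and all parameter names are illustrative:

```python
import random

def assemble_test(difficulties, n_items, target_sum, iters=2000, seed=1):
    """Random-search test assembly: repeatedly sample n_items question
    indices and keep the subset whose summed difficulty is closest to
    target_sum.  Returns (selected indices, absolute error)."""
    rng = random.Random(seed)
    pool = list(range(len(difficulties)))
    best, best_err = None, float("inf")
    for _ in range(iters):
        pick = rng.sample(pool, n_items)
        err = abs(sum(difficulties[i] for i in pick) - target_sum)
        if err < best_err:
            best, best_err = pick, err
    return sorted(best), best_err
```

With a small question bank the search quickly finds an exact match when one exists; for realistic banks the iteration count and objective would need tuning.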

  13. Bankruptcy risk model and empirical tests.

    Science.gov (United States)

    Podobnik, Boris; Horvatic, Davor; Petersen, Alexander M; Urosevic, Branko; Stanley, H Eugene

    2010-10-26

    We analyze the size dependence and temporal stability of firm bankruptcy risk in the US economy by applying Zipf scaling techniques. We focus on a single risk factor--the debt-to-asset ratio R--in order to study the stability of the Zipf distribution of R over time. We find that the Zipf exponent increases during market crashes, implying that firms go bankrupt with larger values of R. Based on the Zipf analysis, we employ Bayes's theorem and relate the conditional probability that a bankrupt firm has a ratio R with the conditional probability of bankruptcy for a firm with a given R value. For 2,737 bankrupt firms, we demonstrate size dependence in assets change during the bankruptcy proceedings. Prepetition firm assets and petition firm assets follow Zipf distributions but with different exponents, meaning that firms with smaller assets adjust their assets more than firms with larger assets during the bankruptcy process. We compare bankrupt firms with nonbankrupt firms by analyzing the assets and liabilities of two large subsets of the US economy: 2,545 Nasdaq members and 1,680 New York Stock Exchange (NYSE) members. We find that both assets and liabilities follow a Pareto distribution. The finding is not a trivial consequence of the Zipf scaling relationship of firm size quantified by employees--although the market capitalization of Nasdaq stocks follows a Pareto distribution, the same distribution does not describe NYSE stocks. We propose a coupled Simon model that simultaneously evolves both assets and debt with the possibility of bankruptcy, and we also consider the possibility of firm mergers.
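The Bayes-theorem step in the abstract, relating the conditional probability that a bankrupt firm has ratio R to the conditional probability of bankruptcy given R, is a one-line identity. The numbers below are made up purely for illustration:

```python
def p_bankrupt_given_r(p_r_given_b, p_b, p_r):
    """Bayes' theorem: P(bankrupt | R) = P(R | bankrupt) * P(bankrupt) / P(R)."""
    return p_r_given_b * p_b / p_r

# Illustrative (made-up) numbers: 2% of firms go bankrupt, 30% of bankrupt
# firms have a debt-to-asset ratio above 0.9, and 5% of all firms do.
p = p_bankrupt_given_r(0.30, 0.02, 0.05)  # 0.12
```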

  14. Matrix diffusion model. In situ tests using natural analogues

    Energy Technology Data Exchange (ETDEWEB)

    Rasilainen, K. [VTT Energy, Espoo (Finland)

    1997-11-01

    Matrix diffusion is an important retarding and dispersing mechanism for substances carried by groundwater in fractured bedrock. Natural analogues provide, unlike laboratory or field experiments, a possibility to test the model of matrix diffusion in situ over long periods of time. This thesis documents quantitative model tests against in situ observations, done to support modelling of matrix diffusion in performance assessments of nuclear waste repositories. 98 refs. The thesis includes also eight previous publications by author.

  15. Testing and Inference in Nonlinear Cointegrating Vector Error Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbek, Anders

    In this paper, we consider a general class of vector error correction models which allow for asymmetric and non-linear error correction. We provide asymptotic results for (quasi-)maximum likelihood (QML) based estimators and tests. General hypothesis testing is considered, where testing...... symmetric non-linear error correction are considered. A simulation study shows that the finite sample properties of the bootstrapped tests are satisfactory with good size and power properties for reasonable sample sizes....

  16. Modal test and analysis: Multiple tests concept for improved validation of large space structure mathematical models

    Science.gov (United States)

    Wada, B. K.; Kuo, C-P.; Glaser, R. J.

    1986-01-01

    For the structural dynamic analysis of large space structures, the technology in structural synthesis and the development of structural analysis software have increased the capability to predict the dynamic characteristics of the structural system. The various subsystems which comprise the system are represented by various displacement functions; the displacement functions are then combined to represent the total structure. Experience has indicated that even when subsystem mathematical models are verified by test, the mathematical representations of the total system are often in error because the mathematical model of the structural elements which are significant when loads are applied at the interconnection points are not adequately verified by test. A multiple test concept, based upon the Multiple Boundary Condition Test (MBCT), is presented which will increase the accuracy of the system mathematical model by improving the subsystem test and test/analysis correlation procedure.

  17. A lognormal model for response times on test items

    NARCIS (Netherlands)

    van der Linden, Willem J.

    2006-01-01

    A lognormal model for the response times of a person on a set of test items is investigated. The model has a parameter structure analogous to the two-parameter logistic response models in item response theory, with a parameter for the speed of each person as well as parameters for the time intensity
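In the commonly cited parameterization of this model, the log response time of person j on item i is normal with mean beta_i - tau_j (item time intensity minus person speed) and standard deviation 1/alpha_i (the item's discrimination). A minimal simulation sketch under that assumed parameterization:

```python
import math
import random

def simulate_response_time(alpha, beta, tau, rng):
    """Draw one response time under the lognormal model:
    ln T ~ Normal(beta - tau, (1/alpha)**2), where beta is the item's
    time intensity, tau the person's speed, and alpha the item's
    discrimination (larger alpha = less spread in log time)."""
    log_t = rng.gauss(beta - tau, 1.0 / alpha)
    return math.exp(log_t)
```

Averaging the log of many simulated draws recovers beta - tau, which is how the parameter structure mirrors the two-parameter logistic model's person/item decomposition.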

  18. ASSESSING THE QUALITY OF TESTS IN SPAIN: REVISION OF THE SPANISH TEST REVIEW MODEL

    Directory of Open Access Journals (Sweden)

    Ana Hernández

    2016-09-01

    Full Text Available In order for practitioners to be able to use tests appropriately, they must have rigorous information on the quality of the tests. This is why the Spanish test review model (Prieto & Muñiz, 2000) has been applied for a number of years. The goal of this paper is to update and revise this model in order to incorporate the recommendations that have been provided since the original model was applied and to incorporate the latest psychometric and technological innovations. The original model was revised following a series of steps, and the revised proposal was reviewed by a number of experts. After incorporating their suggestions, we have arrived at the final version, which is described in this paper. With the application of the revised model, and the publication of the corresponding results, we hope to continue to improve the use of tests, and consequently, the professional practice of psychology.

  19. Using Virtual ATE Model to Migrate Test Programs

    Institute of Scientific and Technical Information of China (English)

    王晓明; 杨乔林

    1995-01-01

    Because of the high development costs of IC (Integrated Circuit) test programs, recycling existing test programs from one kind of ATE (Automatic Test Equipment) to another, or generating them directly from CAD simulation modules, is more and more valuable. In this paper, a new approach to migrating test programs is presented. A virtual ATE model based on the object-oriented paradigm is developed; it runs Test C++ (an intermediate test control language) programs and TeIF (Test Intermediate Format, an intermediate pattern), migrates test programs among three kinds of ATE (Ando DIC8032, Schlumberger S15 and GenRad 1732) and generates test patterns from two kinds of CAD (Daisy and Panda) automatically.

  20. Model tests on a semi-axial pump turbine

    Energy Technology Data Exchange (ETDEWEB)

    Strohmer, F.; Horacek, G.

    1984-03-01

    Due to their good hydraulic characteristic semi-axial pump turbines are used in the medium head range of pumped storage plants. This paper describes model tests performed on a semiaxial pump turbine model and shows the results of these tests. The aim of the model tests was the optimization of the hydraulic water passage, the measurement of the hydraulic characteristics over the whole operating range, the investigation of the cavitation behaviour, the investigation of the hydraulic forces and torques as well as the proof of the values guaranteed to the customer.

  1. A tutorial on testing the race model inequality

    DEFF Research Database (Denmark)

    Gondan, Matthias; Minakata, Katsumi

    2016-01-01

    , to faster responses to redundant signals. In contrast, coactivation models assume integrated processing of the combined stimuli. To distinguish between these two accounts, Miller (1982) derived the well-known race model inequality, which has become a routine test for behavioral data in experiments...... with redundant signals. In this tutorial, we review the basic properties of redundant signals experiments and current statistical procedures used to test the race model inequality during the period between 2011 and 2014. We highlight and discuss several issues concerning study design and the test of the race...
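Miller's race model inequality states that the redundant-signals response time distribution is bounded by the sum of the single-signal distributions, F_RS(t) <= F_A(t) + F_B(t). A naive empirical check over a grid of time points (this is the raw inequality only, not the inferential procedures the tutorial reviews):

```python
def ecdf(sample, t):
    """Empirical cumulative distribution function of a sample at time t."""
    return sum(x <= t for x in sample) / len(sample)

def race_model_violations(rt_redundant, rt_a, rt_b, grid):
    """Time points at which the observed data violate Miller's race
    model inequality F_RS(t) <= F_A(t) + F_B(t)."""
    return [t for t in grid
            if ecdf(rt_redundant, t) > ecdf(rt_a, t) + ecdf(rt_b, t)]
```

Violations at early time points are the classic signature used to argue against race (parallel, first-terminating) processing in favor of coactivation.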

  2. A new fit-for-purpose model testing framework: Decision Crash Tests

    Science.gov (United States)

    Tolson, Bryan; Craig, James

    2016-04-01

    Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have identified that a good standard framework for model testing called the Klemes Crash Tests (KCTs), which are the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) rename as KCTs, have yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCT and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing if the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests or DCTs. Key DCT elements are i) the model purpose (i.e., decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model building case study on the Grand River in Ontario, Canada. A hypothetical binary decision scenario is analysed (upgrade or not upgrade the existing flood control structure) under two different sets of model building

  3. Assessment of Galileo modal test results for mathematical model verification

    Science.gov (United States)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity discovered in the course of the test necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.

  4. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    Science.gov (United States)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program of the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system level environmental testing. The JUNO magnetic cleanliness program required setting-up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system and a testing program with facility for testing system parts and subsystems at JPL. The magnetic modeling, simulation and analysis capability was set up and performed by Aerospace to provide qualitative and quantitative magnetic assessments of the magnetic parts, components, and subsystems prior to or in lieu of magnetic tests. Because of the sensitive nature of the fields and particles scientific measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not magnetically contaminated by flight system magnetic interferences. With Aerospace's magnetic modeling, simulation and analysis and JPL's system modeling and testing approach, and LMSS's test support, the project achieved a cost effective approach to achieving a magnetically clean spacecraft. This paper presents lessons learned from the JUNO magnetic testing approach and Aerospace's modeling, simulation and analysis activities used to solve problems such as remnant magnetization, performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.

  5. Testing models of triggered star formation: theory and observation

    CERN Document Server

    Haworth, Thomas J; Acreman, David M

    2012-01-01

    One of the main reasons that triggered star formation is contentious is the failure to accurately link the observations with models in a detailed, quantitative, way. It is therefore critical to continuously test and improve the model details and methods with which comparisons to observations are made. We use a Monte Carlo radiation transport and hydrodynamics code TORUS to show that the diffuse radiation field has a significant impact on the outcome of radiatively driven implosion (RDI) models. We also calculate SEDs and synthetic images from the models to test observational diagnostics that are used to determine bright rimmed cloud conditions and search for signs of RDI.

  6. THE MISHKIN TEST: AN ANALYSIS OF MODEL EXTENSIONS

    Directory of Open Access Journals (Sweden)

    Diana MURESAN

    2015-04-01

    Full Text Available This paper reviews empirical research that applies the Mishkin test to examine the existence of the accruals anomaly using alternative approaches. The Mishkin test is a test used in macro-econometrics for the rational expectations hypothesis, which tests for market efficiency. Starting with Sloan (1996), the model has been applied to the accruals anomaly literature. Since Sloan (1996), the model has seen various improvements and has been the subject of many debates in the literature regarding its efficacy. Nevertheless, the current evidence strengthens the pervasiveness of the model. The analysis of the extended studies on the Mishkin test highlights that adding additional variables enhances the results, providing insightful information about the occurrence of the accruals anomaly.

  7. A permutation test for the race model inequality

    DEFF Research Database (Denmark)

    Gondan, Matthias

    2010-01-01

    signals. Several statistical procedures have been used for testing the race model inequality. However, the commonly employed procedure does not control the Type I error. In this article a permutation test is described that keeps the Type I error at the desired level. Simulations show that the power...... of such experiments is whether the observed redundancy gains can be explained by parallel processing of the two stimuli in a race-like fashion. To test the parallel processing model, Miller derived the well-known race model inequality which has become a routine test for behavioral data in experiments with redundant...... of the test is reasonable even for small samples. The scripts discussed in this article may be downloaded as supplemental materials from http://brm.psychonomic-journals.org/content/supplemental....

  9. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
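Prediction-quality testing of this kind typically scores a baseline model with normalized error metrics such as CV(RMSE) and NMBE, the standard measurement-and-verification statistics; whether these are the exact metrics of the LBNL protocol is an assumption here. A minimal sketch:

```python
import math

def cvrmse(actual, predicted):
    """Coefficient of variation of the RMSE, as a percent of mean load."""
    n = len(actual)
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    return 100.0 * rmse / (sum(actual) / n)

def nmbe(actual, predicted):
    """Normalized mean bias error (%): positive when the model
    under-predicts on average."""
    n = len(actual)
    mean = sum(actual) / n
    return 100.0 * sum(a - p for a, p in zip(actual, predicted)) / (n * mean)
```

Both metrics depend only on the predictions and the metered data, which is what lets a vendor's software be scored without disclosing its internal algorithms.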

  10. Generalized F test and generalized deviance test in two-way ANOVA models for randomized trials.

    Science.gov (United States)

    Shen, Juan; He, Xuming

    2014-01-01

    We consider the problem of detecting treatment effects in a randomized trial in the presence of an additional covariate. By reexpressing a two-way analysis of variance (ANOVA) model in a logistic regression framework, we derive generalized F tests and generalized deviance tests, which provide better power in detecting common location-scale changes of treatment outcomes than the classical F test. The null distributions of the test statistics are independent of the nuisance parameters in the models, so the critical values can be easily determined by Monte Carlo methods. We use simulation studies to demonstrate how the proposed tests perform compared with the classical F test. We also use data from a clinical study to illustrate possible savings in sample sizes.
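The baseline the proposed tests are compared against is the classical F test. As a reference point, a stdlib sketch of the one-way ANOVA F statistic (the paper's setting is a two-way model with a covariate; this simplified one-way version only illustrates the statistic's between/within form):

```python
def f_statistic(groups):
    """Classical one-way ANOVA F statistic: between-group mean square
    divided by within-group mean square."""
    all_obs = [x for g in groups for x in g]
    n, k = len(all_obs), len(groups)
    grand = sum(all_obs) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

The statistic is then referred to an F distribution with (k - 1, n - k) degrees of freedom; the generalized tests in the abstract instead obtain critical values by Monte Carlo since their null distributions are free of the nuisance parameters.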

  11. Testing and Inference in Nonlinear Cointegrating Vector Error Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbek, Anders

    In this paper, we consider a general class of vector error correction models which allow for asymmetric and non-linear error correction. We provide asymptotic results for (quasi-)maximum likelihood (QML) based estimators and tests. General hypothesis testing is considered, where testing...... for linearity is of particular interest as parameters of non-linear components vanish under the null. To solve the latter type of testing, we use the so-called sup tests, which here requires development of new (uniform) weak convergence results. These results are potentially useful in general for analysis...... of non-stationary non-linear time series models. Thus the paper provides a full asymptotic theory for estimators as well as standard and non-standard test statistics. The derived asymptotic results prove to be new compared to results found elsewhere in the literature due to the impact of the estimated...

  12. Modeling Student Test-Taking Motivation in the Context of an Adaptive Achievement Test

    Science.gov (United States)

    Wise, Steven L.; Kingsbury, G. Gage

    2016-01-01

    This study examined the utility of response time-based analyses in understanding the behavior of unmotivated test takers. For the data from an adaptive achievement test, patterns of observed rapid-guessing behavior and item response accuracy were compared to the behavior expected under several types of models that have been proposed to represent…

  13. Testing for Equivalence: A Methodology for Computational Cognitive Modelling

    Science.gov (United States)

    Stewart, Terrence; West, Robert

    2010-12-01

    The equivalence test (Stewart and West, 2007; Stewart, 2007) is a statistical measure for evaluating the similarity between a model and the system being modelled. It is designed to avoid over-fitting and to generate an easily interpretable summary of the quality of a model. We apply the equivalence test to two tasks: Repeated Binary Choice (Erev et al., 2010) and Dynamic Stocks and Flows (Gonzalez and Dutt, 2007). In the first case, we find a broad range of statistically equivalent models (and win a prediction competition) while identifying particular aspects of the task that are not yet adequately captured. In the second case, we re-evaluate results from the Dynamic Stocks and Flows challenge, demonstrating how our method emphasizes the breadth of coverage of a model and how it can be used for comparing different models. We argue that the explanatory power of models hinges on numerical similarity to empirical data over a broad set of measures.

  14. Towards a pragmatic human migraine model for drug testing

    DEFF Research Database (Denmark)

    Hansen, Emma Katrine; Olesen, Jes

    2017-01-01

    BACKGROUND: A model for the testing of novel anti-migraine drugs should preferably use healthy volunteers for ease of recruiting. Isosorbide-5-mononitrate (5-ISMN) provokes headache in healthy volunteers with some migraine features such as pulsating pain quality and aggravation by physical activity...... drug testing....

  15. 1g Model Tests with Foundations in Sand

    DEFF Research Database (Denmark)

    Krabbenhøft, Sven; Damkilde, Lars; Clausen, Johan

    2010-01-01

    This paper presents the results of a series of 1g model tests with both a circular and a strip foundation on dense sand. The test results have been compared with the results from finite element calculations based on a non-linear Mohr-Coulomb yield criterion taking into account the dependence

  16. A Human Capital Model of Educational Test Scores

    DEFF Research Database (Denmark)

    McIntosh, James; D. Munk, Martin

    Latent class Poisson count models are used to analyze a sample of Danish test score results from a cohort of individuals born in 1954-55 and tested in 1968. The procedure takes account of unobservable effects as well as excessive zeros in the data. The bulk of unobservable effects are uncorrelated

  17. Linearity and Misspecification Tests for Vector Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    Teräsvirta, Timo; Yang, Yukai

    The purpose of the paper is to derive Lagrange multiplier and Lagrange multiplier type specification and misspecification tests for vector smooth transition regression models. We report results from simulation studies in which the size and power properties of the proposed asymptotic tests in small...

  19. Testing for causality in variance using multivariate GARCH models

    NARCIS (Netherlands)

    C.M. Hafner (Christian); H. Herwartz

    2004-01-01

    textabstractTests of causality in variance in multiple time series have been proposed recently, based on residuals of estimated univariate models. Although such tests are applied frequently little is known about their power properties. In this paper we show that a convenient alternative to residual

  20. Horns Rev II, 2D-Model Tests

    DEFF Research Database (Denmark)

    Andersen, Thomas Lykke; Brorsen, Michael

    This report is an extension of the study presented in Lykke Andersen and Brorsen, 2006 and includes results from the irregular wave tests, where Lykke Andersen & Brorsen, 2006 focused on regular waves. The 2D physical model tests were carried out in the shallow wave flume at Dept. of Civil...

  1. Model Driven Testing of Web Applications Using Domain Specific Language

    OpenAIRE

    Viet-Cuong Nguyen

    2015-01-01

    As more and more systems move to the cloud, the importance of web applications has increased. Web applications must meet stricter requirements in order to support higher availability. Quality assurance techniques for these applications hence become essential, and the role of testing for web applications becomes more significant. Model-driven testing is a promising paradigm for the automation of software testing. In the web domain, the challenge however remains in the creation of mode...

  2. Testing of a Buran flight-model fuel cell

    Science.gov (United States)

    Schautz, M.; Dudley, G.; Baron, F.; Popov, V.; Pospelov, B.

    A demonstration test program has been performed at European Space Research & Technology Center (ESTEC) on a flight-model Russian 'Photon' fuel cell. The tests, conducted at various power levels up to 23 kW, included current/voltage characteristics, transient behavior, autothermal startup, and impedance measurements. In addition, the product water and the purge gas were analyzed. All test goals were met and no electrochemical limitations were apparent.

  3. Improved animal models for testing gene therapy for atherosclerosis.

    Science.gov (United States)

    Du, Liang; Zhang, Jingwan; De Meyer, Guido R Y; Flynn, Rowan; Dichek, David A

    2014-04-01

    Gene therapy delivered to the blood vessel wall could augment current therapies for atherosclerosis, including systemic drug therapy and stenting. However, identification of clinically useful vectors and effective therapeutic transgenes remains at the preclinical stage. Identification of effective vectors and transgenes would be accelerated by availability of animal models that allow practical and expeditious testing of vessel-wall-directed gene therapy. Such models would include humanlike lesions that develop rapidly in vessels that are amenable to efficient gene delivery. Moreover, because human atherosclerosis develops in normal vessels, gene therapy that prevents atherosclerosis is most logically tested in relatively normal arteries. Similarly, gene therapy that causes atherosclerosis regression requires gene delivery to an existing lesion. Here we report development of three new rabbit models for testing vessel-wall-directed gene therapy that either prevents or reverses atherosclerosis. Carotid artery intimal lesions in these new models develop within 2-7 months after initiation of a high-fat diet and are 20-80 times larger than lesions in a model we described previously. Individual models allow generation of lesions that are relatively rich in either macrophages or smooth muscle cells, permitting testing of gene therapy strategies targeted at either cell type. Two of the models include gene delivery to essentially normal arteries and will be useful for identifying strategies that prevent lesion development. The third model generates lesions rapidly in vector-naïve animals and can be used for testing gene therapy that promotes lesion regression. These models are optimized for testing helper-dependent adenovirus (HDAd)-mediated gene therapy; however, they could be easily adapted for testing of other vectors or of different types of molecular therapies, delivered directly to the blood vessel wall. Our data also supports the promise of HDAd to deliver long

  4. Inferential permutation tests for maximum entropy models in ecology.

    Science.gov (United States)

    Shipley, Bill

    2010-09-01

    Maximum entropy (maxent) models assign probabilities to states that (1) agree with measured macroscopic constraints on attributes of the states and (2) are otherwise maximally uninformative and are thus as close as possible to a specified prior distribution. Such models have recently become popular in ecology, but classical inferential statistical tests require assumptions of independence during the allocation of entities to states that are rarely fulfilled in ecology. This paper describes a new permutation test for such maxent models that is appropriate for very general prior distributions and for cases in which many states have zero abundance and that can be used to test for conditional relevance of subsets of constraints. Simulations show that the test gives correct probability estimates under the null hypothesis. Power under the alternative hypothesis depends primarily on the number and strength of the constraints and on the number of states in the model; the number of empty states has only a small effect on power. The test is illustrated using two empirical data sets to test the community assembly model of B. Shipley, D. Vile, and E. Garnier and the species abundance distribution models of S. Pueyo, F. He, and T. Zillio.

  5. Proceedings 7th Workshop on Model-Based Testing

    CERN Document Server

    Petrenko, Alexander K; 10.4204/EPTCS.80

    2012-01-01

    This volume contains the proceedings of the Seventh Workshop on Model-Based Testing (MBT 2012), which was held on 25 March, 2012 in Tallinn, Estonia, as a satellite event of the European Joint Conferences on Theory and Practice of Software, ETAPS 2012. The workshop is devoted to model-based testing of both software and hardware. Model-based testing uses models describing the required behavior of the system under consideration to guide such efforts as test selection and test results evaluation. Testing validates the real system behavior against models and checks that the implementation conforms to them, but is capable also to find errors in the models themselves. The first MBT workshop was held in 2004, in Barcelona. At that time MBT already had become a hot topic, but the MBT workshop was the first event devoted mostly to this topic. Since that time the area has generated enormous scientific interest, and today there are several specialized workshops and more broad conferences on software and hardware design ...

  6. A tutorial on testing the race model inequality

    DEFF Research Database (Denmark)

    Gondan, Matthias; Minakata, Katsumi

    2016-01-01

    , to faster responses to redundant signals. In contrast, coactivation models assume integrated processing of the combined stimuli. To distinguish between these two accounts, Miller (1982) derived the well-known race model inequality, which has become a routine test for behavioral data in experiments...... with redundant signals. In this tutorial, we review the basic properties of redundant signals experiments and current statistical procedures used to test the race model inequality during the period between 2011 and 2014. We highlight and discuss several issues concerning study design and the test of the race...... model inequality, such as inappropriate control of Type I error, insufficient statistical power, wrong treatment of omitted responses or anticipations and the interpretation of violations of the race model inequality. We make detailed recommendations on the design of redundant signals experiments...

  7. Bayesian model evidence for order selection and correlation testing.

    Science.gov (United States)

    Johnston, Leigh A; Mareels, Iven M Y; Egan, Gary F

    2011-01-01

    Model selection is a critical component of data analysis procedures, and is particularly difficult for small numbers of observations such as is typical of functional MRI datasets. In this paper we derive two Bayesian evidence-based model selection procedures that exploit the existence of an analytic form for the linear Gaussian model class. Firstly, an evidence information criterion is proposed as a model order selection procedure for auto-regressive models, outperforming the commonly employed Akaike and Bayesian information criteria in simulated data. Secondly, an evidence-based method for testing change in linear correlation between datasets is proposed, which is demonstrated to outperform both the traditional statistical test of the null hypothesis of no correlation change and the likelihood ratio test.
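The Akaike and Bayesian information criteria that the paper benchmarks against are straightforward to reproduce for auto-regressive order selection. A minimal sketch, assuming least-squares AR fitting on a simulated AR(2) series (the evidence information criterion itself is the paper's contribution and is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
n = 500
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()

def fit_ar(x, p):
    """Least-squares AR(p) fit; returns residual variance."""
    X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    y = x[p:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return (y - X @ beta).var()

def bic(x, p):
    n_eff = len(x) - p
    return n_eff * np.log(fit_ar(x, p)) + p * np.log(n_eff)

def aic(x, p):
    n_eff = len(x) - p
    return n_eff * np.log(fit_ar(x, p)) + 2 * p

orders = range(1, 6)
best_bic = min(orders, key=lambda p: bic(x, p))
best_aic = min(orders, key=lambda p: aic(x, p))
```

With 500 observations both criteria should recover an order near the true value of 2; AIC is known to overfit more often than BIC in small samples, which is the behavior the proposed evidence criterion aims to improve on.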

  8. A general diagnostic model applied to language testing data.

    Science.gov (United States)

    von Davier, Matthias

    2008-11-01

    Probabilistic models with one or more latent variables are designed to report on a corresponding number of skills or cognitive attributes. Multidimensional skill profiles offer additional information beyond what a single test score can provide, if the reported skills can be identified and distinguished reliably. Many recent approaches to skill profile models are limited to dichotomous data and have made use of computationally intensive estimation methods such as Markov chain Monte Carlo, since standard maximum likelihood (ML) estimation techniques were deemed infeasible. This paper presents a general diagnostic model (GDM) that can be estimated with standard ML techniques and applies to polytomous response variables as well as to skills with two or more proficiency levels. The paper uses one member of a larger class of diagnostic models, a compensatory diagnostic model for dichotomous and partial credit data. Many well-known models, such as univariate and multivariate versions of the Rasch model and the two-parameter logistic item response theory model, the generalized partial credit model, as well as a variety of skill profile models, are special cases of this GDM. In addition to an introduction to this model, the paper presents a parameter recovery study using simulated data and an application to real data from the field test for TOEFL Internet-based testing.

  9. Testing and reference model analysis of FTTH system

    Science.gov (United States)

    Feng, Xiancheng; Cui, Wanlong; Chen, Ying

    2009-08-01

    With the rapid development of the Internet and broadband access networks, technologies such as xDSL, FTTx+LAN, and WLAN have found more applications, and new network services emerge in an endless stream, especially network gaming, conference TV, video on demand, etc. FTTH supports all present and future services with enormous bandwidth, including traditional telecommunication, data, and TV services, as well as future digital TV and VOD. With its huge bandwidth, FTTH is regarded as the ultimate solution for broadband networks and the final goal of optical access network development. Fiber to the Home (FTTH) will thus be the goal of broadband telecommunications access. In accordance with the development trend of telecommunication services, to enhance the capacity of the integrated access network and to achieve triple play (voice, data, image), the existing Fiber To The Curb (FTTC), Fiber To The Zone (FTTZ), and Fiber To The Building (FTTB) optical cable networks can be extended to the end user as an FTTH system using EPON technology. The article first introduces the basic components of an FTTH system; it then explains the reference model and reference points for testing the FTTH system. Finally, through the test connection diagram, the testing process, and expected results, it analyzes SNI interface testing, PON interface testing, Ethernet performance testing, UNI interface testing, Ethernet functional testing, PON functional testing, equipment functional testing, telephone functional testing, operational support capability testing, and other FTTH system tests. ...

  10. A Novel Testing Model for SOA based Services

    Directory of Open Access Journals (Sweden)

    Abhishek Kumar

    2015-01-01

    Full Text Available SOA (Service-Oriented Architecture) fills the gap between software and the commercial enterprise. SOA integrates multiple web services, so we must ensure the quality of those services and guarantee what the services do and what results they return. Some work has been done on automatic test case generation for SOA-based services, but full coverage of XML elements is missing: to the best of our knowledge, existing work does not attempt to cover all possible elements of the XML schema present in the WSDL file. There is also a need to apply different assertions to each service operation when generating test cases. To overcome this problem we propose a novel testing model for SOA-based applications, which helps us obtain automatic test cases for such applications. We build up our new test model with the aid of our proposed test case generation algorithm and test case selection algorithm. Finally, we generate the test suite execution results and determine the coverage of XML schema elements present in the WSDL file.

  11. Estrogen receptor testing and 10-year mortality from breast cancer: A model for determining testing strategy

    Directory of Open Access Journals (Sweden)

    Christopher Naugler

    2012-01-01

    Full Text Available Background: The use of adjuvant tamoxifen therapy in the treatment of estrogen receptor (ER)-expressing breast carcinomas represents a major advance in personalized cancer treatment. Because there is no benefit (and indeed increased morbidity and mortality) associated with the use of tamoxifen therapy in ER-negative breast cancer, its use is restricted to women with ER-expressing cancers. However, correctly classifying cancers as ER positive or negative has been challenging given the high reported false negative test rates for ER expression in surgical specimens. In this paper I model practice recommendations using published information from clinical trials to address the question of whether there is a false negative test rate above which it is more efficacious to forgo ER testing and instead treat all patients with tamoxifen regardless of ER test results. Methods: I used data from randomized clinical trials to model two different hypothetical treatment strategies: (1) the current strategy of treating only ER-positive women with tamoxifen and (2) an alternative strategy where all women are treated with tamoxifen regardless of ER test results. The variables used in the model are literature-derived survival rates of the different combinations of ER positivity and treatment with tamoxifen, varying true ER positivity rates, and varying false negative ER testing rates. The outcome variable was hypothetical 10-year survival. Results: The model predicted that there is a range of true ER rates and false negative test rates above which it would be more efficacious to treat all women with breast cancer with tamoxifen and forgo ER testing. This situation occurred with high true positive ER rates and false negative ER test rates in the range of 20-30%. Conclusions: It is hoped that this model will provide an example of the potential importance of diagnostic error on clinical outcomes and furthermore will give an example of how the effect of that

  12. Test Driven Development: Lessons from a Simple Scientific Model

    Science.gov (United States)

    Clune, T. L.; Kuo, K.

    2010-12-01

    In the commercial software industry, unit testing frameworks have emerged as a disruptive technology that has permanently altered the process by which software is developed. Unit testing frameworks significantly reduce traditional barriers, both practical and psychological, to creating and executing tests that verify software implementations. A new development paradigm, known as test driven development (TDD), has emerged from unit testing practices, in which low-level tests (i.e. unit tests) are created by developers prior to implementing new pieces of code. Although somewhat counter-intuitive, this approach actually improves developer productivity. In addition to reducing the average time for detecting software defects (bugs), the requirement to provide procedure interfaces that enable testing frequently leads to superior design decisions. Although TDD is widely accepted in many software domains, its applicability to scientific modeling still warrants reasonable skepticism. While the technique is clearly relevant for infrastructure layers of scientific models such as the Earth System Modeling Framework (ESMF), numerical and scientific components pose a number of challenges to TDD that are not often encountered in commercial software. Nonetheless, our experience leads us to believe that the technique has great potential not only for developer productivity, but also as a tool for understanding and documenting the basic scientific assumptions upon which our models are implemented. We will provide a brief introduction to test driven development and then discuss our experience in using TDD to implement a relatively simple numerical model that simulates the growth of snowflakes. Many of the lessons learned are directly applicable to larger scientific models.
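The test-first cycle described above can be illustrated with Python's built-in `unittest` framework. The growth function below is a hypothetical stand-in, not the actual snowflake model from the talk; in TDD the three tests would be written first and the implementation added until they pass:

```python
import unittest

# Hypothetical model component -- a stand-in to illustrate the TDD cycle,
# not the snowflake-growth model itself.
def ice_mass_after_step(mass, growth_rate, dt):
    """One explicit-Euler step of dm/dt = growth_rate * m (sketch)."""
    if mass < 0 or growth_rate < 0 or dt <= 0:
        raise ValueError("mass/rate must be non-negative, dt positive")
    return mass * (1.0 + growth_rate * dt)

class TestIceMassGrowth(unittest.TestCase):
    """In TDD, these tests are written *before* the implementation."""

    def test_zero_mass_stays_zero(self):
        self.assertEqual(ice_mass_after_step(0.0, 0.5, 0.1), 0.0)

    def test_growth_is_positive(self):
        self.assertGreater(ice_mass_after_step(1.0, 0.5, 0.1), 1.0)

    def test_rejects_negative_dt(self):
        with self.assertRaises(ValueError):
            ice_mass_after_step(1.0, 0.5, -0.1)
```

Running `python -m unittest` executes the suite; each new behavior gets a failing test before any implementation code is written, which also documents the model's basic physical assumptions (e.g., mass never grows from zero).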

  13. Crush testing, characterizing, and modeling the crashworthiness of composite laminates

    Science.gov (United States)

    Garner, David Michael, Jr.

    Research in the field of crashworthiness of composite materials is presented. A new crush test method was produced to characterize the crush behavior of composite laminates. In addition, a model of the crush behavior and a method for rank ordering the energy absorption capability of various laminates were developed. The new crush test method was used for evaluating the crush behavior of flat carbon/epoxy composite specimens at quasi-static and dynamic rates. The University of Utah crush test fixture was designed to support the flat specimen against catastrophic buckling. A gap, where the specimen is unsupported, allowed unhindered crushing of the specimen. In addition, the specimen's failure modes could be clearly observed during crush testing. Extensive crush testing was conducted wherein the crush force and displacement data were collected to calculate the energy absorption, and high speed video was captured during dynamic testing. Crush tests were also performed over a range of fixture gap heights. The basic failure modes were buckling, crack growth, and fracture. Gap height variations resulted in poorly, properly, and overly constrained specimens. In addition, guidelines for designing a composite laminate for crashworthiness were developed. Modeling of the crush behavior consisted of the delamination and fracture of a single ply or group of like plies during crushing. Delamination crack extension was modeled using the mode I energy release rate, GIc, where an elastica approach was used to obtain the strain energy. Variations in GIc were briefly explored with double cantilever beam tests wherein crack extension occurred along a multidirectional ply interface. The model correctly predicted the failure modes for most of the test cases, and offered insight into how the input parameters affect the model. The ranking method related coefficients of the laminate and sublaminate stiffness matrices, the ply locations within the laminate, and the laminate thickness. The

  14. Horns Rev II, 2D-Model Tests

    DEFF Research Database (Denmark)

    Andersen, Thomas Lykke; Frigaard, Peter

    This report present the results of 2D physical model tests carried out in the shallow wave flume at Dept. of Civil Engineering, Aalborg University (AAU). The objective of the tests was: To investigate the combined influence of the pile diameter to water depth ratio and the wave height to water...... on the front side of the pile (0 to 90 degrees). These tests have been conducted at Aalborg University from 9. October, 2006 to 8. November, 2006. Unless otherwise mentioned, all values given in this report are in model scale....

  15. Testing cosmological models with the Integrated Sachs-Wolfe effect

    Energy Technology Data Exchange (ETDEWEB)

    Raccanelli, Alvise, E-mail: alvise.raccanelli@port.ac.uk [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Portsmouth, PO1 3FX (United Kingdom)

    2011-02-01

    The cross correlation between the Cosmic Microwave Background and the Large Scale Structure of the Universe is a powerful probe to test our cosmological models. This correlation can be used to detect the Integrated Sachs-Wolfe effect, and it depends on both the geometry of the Universe and the properties of the clustering and evolution of structures; for this reason it can be used to test and constrain cosmological models and parameters as well as theories of gravity. In this proceeding we briefly introduce the ISW effect and present some of the recent cosmological tests done using it.

  16. SEMIOTIC MODELS AND AUTOMATIZATION OF PEDAGOGICAL TESTS DESIGN

    Directory of Open Access Journals (Sweden)

    Gennady N. Zverev

    2013-01-01

    Full Text Available The paper deals with the problems of constructing objective models of an educational course, learning processes, and the control of learning results. We consider the possibility of automated test generation using formalized concepts of testology and semiotic and mathematical models of pedagogical processes.

  17. Learning-Testing Process in Classroom: An Empirical Simulation Model

    Science.gov (United States)

    Buda, Rodolphe

    2009-01-01

    This paper presents an empirical micro-simulation model of the teaching and the testing process in the classroom (Programs and sample data are available--the actual names of pupils have been hidden). It is a non-econometric micro-simulation model describing informational behaviors of the pupils, based on the observation of the pupils'…

  18. Data Modeling for Measurements in the Metrology and Testing Fields

    CERN Document Server

    Pavese, Franco

    2009-01-01

    Offers a comprehensive set of modeling methods for data and uncertainty analysis. This work develops methods and computational tools to address general models that arise in practice, allowing for a more valid treatment of calibration and test data and providing an understanding of complex situations in measurement science

  19. Animal models for testing anti-prion drugs.

    Science.gov (United States)

    Fernández-Borges, Natalia; Elezgarai, Saioa R; Eraña, Hasier; Castilla, Joaquín

    2013-01-01

    Prion diseases belong to a group of fatal infectious diseases with no effective therapies available. Throughout the last 35 years, fewer than 50 different drugs have been tested in different experimental animal models without hopeful results. An important limitation when searching for new drugs is the availability of appropriate models of the disease. The three different possible origins of prion diseases require the existence of different animal models for testing anti-prion compounds. Wild-type mice, over-expressing transgenic mice, and other more sophisticated animal models have been used to evaluate a diversity of compounds, some of which were previously tested in different in vitro experimental models. The complexity of prion diseases will require more pre-screening studies, reliable sporadic (or spontaneous) animal models, and accurate chemical modifications of the selected compounds before an effective therapy against human prion diseases is available. This review is intended to showcase the most relevant animal models that have been used in the search for new anti-prion therapies and to describe some possible procedures for handling chemical compounds presumed to have anti-prion activity prior to testing them in animal models.

  20. Validating induced seismicity forecast models - Induced Seismicity Test Bench

    CERN Document Server

    Kiraly-Proag, Eszter; Gischig, Valentin; Wiemer, Stefan; Karvounis, Dimitrios; Doetsch, Joseph

    2016-01-01

    Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. In this study, we propose an Induced Seismicity Test Bench to test and rank such models; this test bench can be used for model development, model selection, and ensemble model building. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models: Shapiro and Smoothed Seismicity (SaSS) and Hydraulics and Seismics (HySei). These models incorporate a different mix of physics-based elements and stochastic representation of the induced sequences. Our results show that neither model is fully superior to the other. Generally, HySei forecasts the seismicity rate better after shut-in, but is only mediocre at forecasting the spatial distri...

  1. Direct cointegration testing in error-correction models

    NARCIS (Netherlands)

    F.R. Kleibergen (Frank); H.K. van Dijk (Herman)

    1994-01-01

    An error correction model is specified having only exactly identified parameters, some of which reflect a possible departure from a cointegration model. Wald, likelihood ratio, and Lagrange multiplier statistics are derived to test for the significance of these parameters. The con

  2. The Nominal Response Model in Computerized Adaptive Testing.

    Science.gov (United States)

    De Ayala, R. J.

    One important and promising application of item response theory (IRT) is computerized adaptive testing (CAT). The implementation of a nominal response model-based CAT (NRCAT) was studied. Item pool characteristics for the NRCAT as well as the comparative performance of the NRCAT and a CAT based on the three-parameter logistic (3PL) model were…

  3. Modeling and Testing of EVs - Preliminary Study and Laboratory Development

    DEFF Research Database (Denmark)

    Yang, Guang-Ya; Marra, Francesco; Nielsen, Arne Hejde;

    2010-01-01

    Electric vehicles (EVs) are expected to play a key role in the future energy management system to stabilize both supply and consumption with the presence of high penetration of renewable generation. A reasonably accurate model of battery is a key element for the study of EVs behavior and the grid...... tests, followed by the suggestions towards a feasible battery model for further studies....

  4. A Negative Binomial Regression Model for Accuracy Tests

    Science.gov (United States)

    Hung, Lai-Fa

    2012-01-01

    Rasch used a Poisson model to analyze errors and speed in reading tests. An important property of the Poisson distribution is that the mean and variance are equal. However, in social science research, it is very common for the variance to be greater than the mean (i.e., the data are overdispersed). This study embeds the Rasch model within an…
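Overdispersion, variance exceeding the mean in violation of the Poisson assumption, is easy to demonstrate with simulated counts. A sketch with made-up parameters (not the reading-test data from the study): a negative binomial with dispersion parameter r has variance mu + mu²/r, which always exceeds its mean mu.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical error counts from a reading test.
mu, r = 4.0, 2.0                       # mean and dispersion parameter
p = r / (r + mu)                       # numpy's (n, p) parameterization
counts = rng.negative_binomial(r, p, size=10_000)

mean, var = counts.mean(), counts.var()
dispersion_index = var / mean          # 1 for Poisson, > 1 if overdispersed
```

A dispersion index near 1 supports the Poisson model; values well above 1 (here about 3, since the theoretical variance is mu + mu²/r = 12 against a mean of 4) motivate the negative binomial extension.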

  5. Micromechanical model of the single fiber fragmentation test

    DEFF Research Database (Denmark)

    Sørensen, Bent F.

    2017-01-01

    A shear-lag model is developed for the analysis of single fiber fragmentation tests for the characterization of the mechanical properties of the fiber/matrix interface in composite materials. The model utilizes the relation for the loss in potential energy of Budiansky, Hutchinson and Evans...

  6. Premarital Contraceptive Use: A Test of Two Models

    Science.gov (United States)

    Delamater, John; Maccorquodale, Patricia

    1978-01-01

    Tests the utility of two models for explaining contraceptive use by sexually active women (N=391). Significant relationships were found between use and permissive premarital standards and standard-behavior consistency. Neither model is particularly applicable to the contraceptive reports of sexually active males (N=354). (Author)

  7. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated, and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.

  8. Modeling cross-hole slug tests in an unconfined aquifer

    Science.gov (United States)

    Malama, Bwalya; Kuhlman, Kristopher L.; Brauchler, Ralf; Bayer, Peter

    2016-09-01

    A modified version of a published slug test model for unconfined aquifers is applied to cross-hole slug test data collected in field tests conducted at the Widen site in Switzerland. The model accounts for water-table effects using the linearized kinematic condition. The model also accounts for inertial effects in source and observation wells. The primary objective of this work is to demonstrate applicability of this semi-analytical model to multi-well and multi-level pneumatic slug tests. The pneumatic perturbation was applied at discrete intervals in a source well and monitored at discrete vertical intervals in observation wells. The source and observation well pairs were separated by distances of up to 4 m. The analysis yielded vertical profiles of hydraulic conductivity, specific storage, and specific yield at observation well locations. The hydraulic parameter estimates are compared to results from prior pumping and single-well slug tests conducted at the site, as well as to estimates from particle size analyses of sediment collected from boreholes during well installation. The results are in general agreement with results from prior tests and are indicative of a sand and gravel aquifer. Sensitivity analysis shows that model identification of specific yield is strongest at late time. However, the usefulness of late-time data is limited due to low signal-to-noise ratios.

  9. Tests and models of nociception and pain in rodents.

    Science.gov (United States)

    Barrot, M

    2012-06-01

    Nociception and pain is a large field of both neuroscience and medical research. Over time, various tests and models were developed in rodents to provide tools for fundamental and translational research on the topic. Tests using thermal, mechanical, and chemical stimuli, measures of hyperalgesia and allodynia, and models of inflammatory or neuropathic pain constitute a toolbox available to researchers. These tests and models have allowed rapid progress on the anatomo-molecular basis of physiological and pathological pain, even though they have yet to translate into new analgesic drugs. More recently, a growing effort has been put forth to assess pain in rats or mice, rather than nociceptive reflexes, and to study complex states affected by chronic pain. This helps further improve the translational value of preclinical research in a field with balanced research efforts between fundamental research, preclinical work, and human studies. This review describes classical tests and models of nociception and pain in rodents. It also presents some recent and ongoing developments in nociceptive tests and recent trends in pain evaluation, and raises the question of the appropriateness between tests, models, and procedures.

  10. Rasch modeling of accuracy and confidence measures from cognitive tests.

    Science.gov (United States)

    Paek, Insu; Lee, Jihyun; Stankov, Lazar; Wilson, Mark

    2013-01-01

    IRT models have not been rigorously applied in studies of the relationship between test-takers' confidence and accuracy. This study applied Rasch measurement models to investigate the relationship between test-takers' confidence and accuracy on English proficiency tests, proposing potentially useful measures of under- or overconfidence. The Rasch approach provided the scaffolding to formulate indices that can assess the discrepancy between confidence and accuracy at the item or total test level, as well as locally at particular ability levels. In addition, a "disattenuated" measure of association between accuracy and confidence, which takes measurement error into account, was obtained through a multidimensional Rasch modeling of the two constructs in which the latent variance-covariance structure is directly estimated from the data. The results indicate that the participants tend to show overconfidence bias in their own cognitive abilities.
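    The two ingredients of such an analysis can be sketched in a few lines: the Rasch model's item response probability, and a naive overall bias index contrasting mean confidence with mean accuracy. The data below are hypothetical, and this index is far simpler than the disattenuated, error-adjusted measures the study proposes:

    ```python
    import math

    def rasch_p(theta, b):
        """Rasch model: probability of a correct response given ability theta and item difficulty b."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    # Hypothetical per-item results: scored accuracy (0/1) and confidence ratings (0..1)
    accuracy = [1, 0, 1, 1, 0, 1]
    confidence = [0.9, 0.8, 0.7, 0.95, 0.6, 0.85]

    # Naive bias index: mean confidence minus mean accuracy.
    # Positive values indicate overconfidence, negative values underconfidence.
    bias = sum(confidence) / len(confidence) - sum(accuracy) / len(accuracy)
    print(round(bias, 3))  # prints 0.133 (overconfidence for this toy data)
    ```

    A Rasch-based index would replace the raw accuracy mean with model-implied probabilities from jointly estimated person and item parameters.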

  11. A permutation test for the race model inequality

    DEFF Research Database (Denmark)

    Gondan, Matthias

    2010-01-01

    When participants are asked to respond in the same way to several stimulus identities, responses are often observed to be faster if two stimuli are presented simultaneously as opposed to when a single stimulus is presented (redundant signals effect; Miller, 1982). An important issue of such experiments is whether the observed redundancy gains can be explained by parallel processing of the two stimuli in a race-like fashion. To test the parallel processing model, Miller derived the well-known race model inequality, which has become a routine test for behavioral data in experiments with redundant signals. Several statistical procedures have been used for testing the race model inequality. However, the commonly employed procedure does not control the Type I error. In this article a permutation test is described that keeps the Type I error at the desired level. Simulations show that the power …
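    The quantity under test can be illustrated with a short sketch (simulated data; the permutation procedure that controls the Type I error is the subject of the article and is not reproduced here): Miller's inequality bounds the redundant-signal RT distribution F_AB(t) by the sum of the single-signal distributions F_A(t) + F_B(t).

    ```python
    import random

    def ecdf(sample, t):
        """Empirical cumulative distribution function of an RT sample at time t."""
        return sum(rt <= t for rt in sample) / len(sample)

    def race_violation(rt_a, rt_b, rt_ab, times):
        """Largest value of F_AB(t) - [F_A(t) + F_B(t)] over the test points;
        positive values violate Miller's (1982) race model inequality."""
        return max(ecdf(rt_ab, t) - min(1.0, ecdf(rt_a, t) + ecdf(rt_b, t)) for t in times)

    random.seed(1)
    rt_a = [random.gauss(320, 40) for _ in range(200)]   # single stimulus A
    rt_b = [random.gauss(330, 40) for _ in range(200)]   # single stimulus B
    # Redundant condition simulated as a pure race: the faster of two independent channels
    rt_ab = [min(random.gauss(320, 40), random.gauss(330, 40)) for _ in range(200)]

    print(race_violation(rt_a, rt_b, rt_ab, range(200, 420, 20)))
    ```

    A permutation test would recompute this statistic many times under random reassignments of trials to conditions and compare the observed value against that null distribution.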

  12. Reaction times to weak test lights. [psychophysics biological model

    Science.gov (United States)

    Wandell, B. A.; Ahumada, P.; Welsh, D.

    1984-01-01

    Maloney and Wandell (1984) describe a model of the response of a single visual channel to weak test lights. The initial channel response is a linearly filtered version of the stimulus. The filter output is randomly sampled over time. Each time a sample occurs, there is some probability - increasing with the magnitude of the sampled response - that a discrete detection event is generated. Maloney and Wandell derive the statistics of the detection events. In this paper, the hypothesis is tested that reaction time responses to the presence of a weak test light are initiated at the first detection event. This makes it possible to extend the application of the model to lights that are slightly above threshold, but still within the linear operating range of the visual system. A parameter-free prediction of the model proposed by Maloney and Wandell for lights detected by this statistic is tested. The data are in agreement with the prediction.

  13. Animal models of toxicology testing: the role of pigs.

    Science.gov (United States)

    Helke, Kristi L; Swindle, Marvin Michael

    2013-02-01

    In regulatory toxicological testing, both a rodent and a non-rodent species are required. Historically, dogs and non-human primates (NHP) have been the species of choice for the non-rodent portion of testing. The pig is an appropriate option for these tests based on the metabolic pathways utilized in xenobiotic biotransformation. This review focuses on the Phase I and Phase II biotransformation pathways in humans and pigs and highlights the similarities and differences of these models. This is a growing field and references are sparse. Numerous breeds of pigs are discussed, along with the specific breed differences in these enzymes that are known. While much of the available data is presented, it is grossly incomplete and sometimes contradictory depending on the methods used. There is no ideal species to use in toxicology. The use of dogs and NHP in xenobiotic testing continues to be the norm. Pigs present a viable and perhaps more reliable model for non-rodent testing.

  14. The Wave Dragon: tests on a modified model

    Energy Technology Data Exchange (ETDEWEB)

    Martinelli, Luca; Frigaard, Peter

    1999-09-01

    A modified floating model of the Wave Dragon was tested for movements, overtopping and forces at critical positions. The modifications and consequent testing of the model are part of an R and D programme. 18 tests (repetitions included) were carried out during May 1999. Forces at 7 different positions and movements for three degrees of freedom (heave, pitch and surge) were recorded for 7 wave situations. Total overtopping was measured for 5 different wave situations. Furthermore, the influence of crest freeboard was tested. Sensitivity to the energy spreading in multidirectional seas was investigated. A typical exponential equation describing overtopping was fitted to the data for the frequent wave conditions, and the formula is compared to the present tests. (au)

  15. Model of ASTM Flammability Test in Microgravity: Iron Rods

    Science.gov (United States)

    Steinberg, Theodore A; Stoltzfus, Joel M.; Fries, Joseph (Technical Monitor)

    2000-01-01

    There are extensive qualitative results from burning metallic materials in a NASA/ASTM flammability test system in normal gravity. However, these data were shown to be inconclusive for applications involving oxygen-enriched atmospheres under microgravity conditions by conducting tests using the 2.2-second Lewis Research Center (LeRC) Drop Tower. Data from neither type of test has been reduced to fundamental kinetic and dynamic system parameters. This paper reports the initial model analysis for burning iron rods under microgravity conditions, using data obtained at the LeRC drop tower and modeling the burning system after ignition. Under the conditions of the test, the burning mass regresses up the rod and is detached upon deceleration at the end of the drop. The model describes the burning system as a semi-batch, well-mixed reactor with product accumulation only. This model is consistent with the 2.0-second duration of the test. Transient temperature and pressure measurements are made on the chamber volume. The rod solid-liquid interface melting rate is obtained from film records. The model consists of a set of 17 non-linear, first-order differential equations which are solved using MATLAB. This analysis confirms that a rate first-order in oxygen concentration is consistent with the iron-oxygen kinetic reaction. An apparent activation energy of 246.8 kJ/mol is consistent with this model.
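    The full model couples 17 ODEs; the first-order kinetic assumption it confirms can be sketched in isolation. Only the activation energy below comes from the abstract; the pre-exponential factor, temperature, concentration, and time step are illustrative placeholders:

    ```python
    import math

    R = 8.314        # gas constant, J/(mol*K)
    Ea = 246.8e3     # apparent activation energy from the abstract, J/mol
    A = 1.0e9        # hypothetical pre-exponential factor, 1/s

    def rate(c_o2, temp):
        """Reaction rate: first-order in oxygen concentration, Arrhenius in temperature."""
        return A * math.exp(-Ea / (R * temp)) * c_o2

    # Forward-Euler depletion of oxygen in a well-mixed, constant-temperature volume
    c, temp, dt = 10.0, 1500.0, 1e-3    # mol/m^3, K, s (all illustrative)
    for _ in range(1000):                # integrates 1 s of burn time
        c -= rate(c, temp) * dt
    print(round(c, 3))
    ```

    In the actual model the temperature and pressure are themselves state variables, coupled to the melting rate and heat release, rather than held constant as here.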

  16. Penetration Testing Professional Ethics: a conceptual model and taxonomy

    Directory of Open Access Journals (Sweden)

    Justin Pierce

    2006-05-01

    Full Text Available In an environment where commercial software is continually patched to correct security flaws, penetration testing can provide organisations with a realistic assessment of their security posture. Penetration testing uses the same principles as criminal hackers to penetrate corporate networks and thereby verify the presence of software vulnerabilities. Network administrators can use the results of a penetration test to correct flaws and improve overall security. The use of hacking techniques, however, raises several ethical questions that centre on the integrity of the tester to maintain professional distance and uphold the profession. This paper discusses the ethics of penetration testing and presents our conceptual model and revised taxonomy.

  17. Preliminary results of steel containment vessel model test

    Energy Technology Data Exchange (ETDEWEB)

    Luk, V.K.; Hessheimer, M.F. [Sandia National Labs., Albuquerque, NM (United States); Matsumoto, T.; Komine, K.; Arai, S. [Nuclear Power Engineering Corp., Tokyo (Japan); Costello, J.F. [Nuclear Regulatory Commission, Washington, DC (United States)

    1998-04-01

    A high pressure test of a mixed-scale model (1:10 in geometry and 1:4 in shell thickness) of a steel containment vessel (SCV), representing an improved boiling water reactor (BWR) Mark II containment, was conducted on December 11--12, 1996 at Sandia National Laboratories. This paper describes the preliminary results of the high pressure test. In addition, the preliminary post-test measurement data and the preliminary comparison of test data with pretest analysis predictions are also presented.

  18. Testing the Dipole and Quadrupole Moments of Galactic Models

    OpenAIRE

    Briggs, Michael S.; Paciesas, William S.; Pendleton, Geoffrey N.; Meegan, Charles A.; Fishman, Gerald J.; Horack, John M.; Kouveliotou, Chryssa; Hartmann, Dieter H.; Hakkila, Jon

    1996-01-01

    If gamma-ray bursts originate in the Galaxy, at some level there should be a galactic pattern in their distribution on the sky. We test published galactic models by comparing their dipole and quadrupole moments with the moments of the BATSE 3B catalog. While many models have moments that are too large, several models are in acceptable or good agreement with the data.

  19. Initialization and Setup of the Coastal Model Test Bed: STWAVE

    Science.gov (United States)

    2017-01-01

    … coastal numerical models. Pertinent data types, including waves, water levels, nearshore currents, bathymetry, and meteorological measurements, are … correlation coefficients, and other statistics can be calculated between the observed data and the model output for any duration of time using the … (ERDC/CHL CHETN-I-93, January 2017. Approved for public release; distribution is unlimited.)

  20. Impaired reality testing in an animal model of schizophrenia.

    Science.gov (United States)

    McDannald, Michael A; Whitt, Joshua P; Calhoon, Gwendolyn G; Piantadosi, Patrick T; Karlsson, Rose-Marie; O'Donnell, Patricio; Schoenbaum, Geoffrey

    2011-12-15

    Schizophrenia is a chronic and devastating brain disorder characterized by hallucinations and delusions, symptoms reflecting impaired reality testing. Although animal models have captured negative symptoms and cognitive deficits associated with schizophrenia, none have addressed these defining, positive symptoms. Here we tested the performance of adults given neonatal ventral hippocampal lesions (NVHL), a neurodevelopmental model of schizophrenia, in two taste aversion procedures. Normal and NVHL rats formed aversions to a palatable food when the food was directly paired with nausea, but only NVHL rats formed a food aversion when the cue predicting that food was paired with nausea. The failure of NVHL rats to fully discriminate real from imagined food parallels the failure of people with schizophrenia to differentiate internal thoughts and beliefs from reality. These results further validate the NVHL model of schizophrenia and provide a means to assess impaired reality testing in a variety of animal models. © 2011 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  1. Old star clusters: Bench tests of low mass stellar models

    Directory of Open Access Journals (Sweden)

    Salaris M.

    2013-03-01

    Full Text Available Old star clusters in the Milky Way and external galaxies have been (and still are) traditionally used to constrain the age of the universe and the timescales of galaxy formation. A parallel avenue of old star cluster research considers these objects as bench tests of low-mass stellar models. This short review will highlight some recent tests of stellar evolution models that make use of photometric and spectroscopic observations of resolved old star clusters. In some cases these tests have pointed to additional physical processes efficient in low-mass stars that are not routinely included in model computations. Moreover, recent results from the Kepler mission on the old open cluster NGC 6791 are adding new tight constraints to the models.

  2. A tutorial on testing the race model inequality

    DEFF Research Database (Denmark)

    Gondan, Matthias; Minakata, Katsumi

    2016-01-01

    … effect. Several models have been proposed to explain this effect, including race models and coactivation models of information processing. In race models, the two stimulus components are processed in separate channels and the faster channel determines the processing time. This mechanism leads, on average, to faster responses to redundant signals. In contrast, coactivation models assume integrated processing of the combined stimuli. To distinguish between these two accounts, Miller (1982) derived the well-known race model inequality, which has become a routine test for behavioral data in experiments … model inequality, such as inappropriate control of Type I error, insufficient statistical power, wrong treatment of omitted responses or anticipations, and the interpretation of violations of the race model inequality. We make detailed recommendations on the design of redundant signals experiments …

  3. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

    Full Text Available Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example of a framework for integrated process modeling, not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW) representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000) evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  4. Aeroservoelastic Testing of Free Flying Wind Tunnel Models Part 2: A Centerline Supported Fullspan Model Tested for Gust Load Alleviation

    Science.gov (United States)

    Scott, Robert C.; Vetter, Travis K.; Penning, Kevin B.; Coulson, David A.; Heeg, Jennifer

    2014-01-01

    This is part 2 of a two-part document. Part 1 is titled: "Aeroservoelastic Testing of Free Flying Wind Tunnel Models Part 1: A Sidewall Supported Semispan Model Tested for Gust Load Alleviation and Flutter Suppression." A team comprised of the Air Force Research Laboratory (AFRL), Boeing, and the NASA Langley Research Center conducted three aeroservoelastic wind tunnel tests in the Transonic Dynamics Tunnel to demonstrate active control technologies relevant to large, flexible vehicles. In the first of these three tests, a full-span, aeroelastically scaled, wind tunnel model of a joined wing SensorCraft vehicle was mounted to a force balance to acquire a basic aerodynamic data set. In the second and third tests, the same wind tunnel model was mated to a new, two degree of freedom, beam mount. This mount allowed the full-span model to translate vertically and pitch. Trimmed flight at a 10 percent static margin and gust load alleviation were successfully demonstrated. The rigid body degrees of freedom required that the model be flown in the wind tunnel using an active control system. This risky mode of testing necessitated that a model arrestment system be integrated into the new mount. The safe and successful completion of these free-flying tests required the development and integration of custom hardware and software. This paper describes the many systems, software, and procedures that were developed as part of this effort. The balance and free flying wind tunnel tests will be summarized. The design of the trim and gust load alleviation control laws along with the associated results will also be discussed.

  5. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  6. SPSS and SAS programming for the testing of mediation models.

    Science.gov (United States)

    Dudley, William N; Benuzillo, Jose G; Carrico, Mineh S

    2004-01-01

    Mediation modeling can explain the nature of the relation among three or more variables. In addition, it can be used to show how a variable mediates the relation between levels of intervention and outcome. The Sobel test, developed in 1990, provides a statistical method for determining the influence of a mediator on an intervention or outcome. Although interactive Web-based and stand-alone methods exist for computing the Sobel test, SPSS and SAS programs that automatically run the required regression analyses and computations increase the accessibility of mediation modeling to nursing researchers. The purpose of this article is to illustrate the utility of the Sobel test and to make this programming available to the Nursing Research audience in both SAS and SPSS. The history, logic, and technical aspects of mediation testing are introduced. The syntax files sobel.sps and sobel.sas, created to automate the computation of the regression analysis and test statistic, are available from the corresponding author. The reported programming allows the user to complete mediation testing with the user's own data in a single-step fashion. A technical manual included with the programming provides instruction on program use and interpretation of the output. Mediation modeling is a useful tool for describing the relation between three or more variables. Programming and manuals for using this model are made available.
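    The statistic those syntax files compute can be sketched language-independently; here is a Python version (the article's own code is SPSS/SAS syntax) with illustrative, made-up path coefficients:

    ```python
    import math

    def sobel_z(a, sa, b, sb):
        """Sobel test statistic for an indirect (mediated) effect a*b.
        a, sa: coefficient and standard error for the X -> M path
        b, sb: coefficient and standard error for the M -> Y path (controlling for X)"""
        return (a * b) / math.sqrt(b**2 * sa**2 + a**2 * sb**2)

    def two_sided_p(z):
        """Two-sided p-value under the standard normal reference distribution."""
        return math.erfc(abs(z) / math.sqrt(2.0))

    z = sobel_z(0.5, 0.1, 0.4, 0.1)   # illustrative coefficients, not real study data
    print(round(z, 3), round(two_sided_p(z), 4))
    ```

    The regression analyses that produce a, b, and their standard errors are what the SPSS/SAS programs automate; the Sobel statistic itself is this single ratio.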

  7. Rat gingival model for testing drugs influencing inflammation

    Directory of Open Access Journals (Sweden)

    Shaju P Jacob

    2013-07-01

    Full Text Available Preclinical drug testing is an important area in new drug development where animals are used. An ideal animal model for this is one which is simple, reliable and can be extrapolated to humans. Topical drugs for inflammation are conventionally tested on the skin of animals after induction of inflammation. A gingival model would be simple, as inflammation can be induced naturally by the action of plaque. Rats are a popular animal model for testing drugs as well as for studying various diseases of the periodontium. Periodontal disease, including gingival inflammation, develops in rats in relation to indigenous plaque or experimentally induced bacterial products. A number of features of rats, ranging from anatomy and histology to the response to bacterial insult, can be seen mirrored to a great extent in humans. There is a lot of similarity in the development and resolution of inflammation as well as in the gingival wound healing of rats and humans. This paper explores the feasibility of using the rat gingival model for preclinical testing of drugs acting on or influencing inflammation and concludes by identifying potential areas of research using this model. The addition of such a simple and inexpensive model for preclinical testing of drugs will be welcomed by drug developers.

  8. Development of a fault test experimental facility model using Matlab

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Iraci Martinez; Moraes, Davi Almeida, E-mail: martinez@ipen.br, E-mail: dmoraes@dk8.com.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The Fault Test Experimental Facility was developed to simulate a PWR nuclear power plant and is instrumented with temperature, level and pressure sensors. The Fault Test Experimental Facility can be operated to generate both normal and fault data; faults can be introduced at an initially small magnitude that is then increased gradually. This work presents the Fault Test Experimental Facility model developed using the Matlab GUIDE (Graphical User Interface Development Environment) toolbox, which consists of a set of functions designed to create interfaces in an easy and fast way. The system model is based on the mass and energy inventory balance equations. Physical as well as operational aspects are taken into consideration. The interface layout looks like a process flowchart and the user can set the input variables. Besides the normal operation conditions, there is the possibility to choose a faulty variable from a list. The program also allows the user to set the noise level for the input variables. Using the model, data were generated for different operational conditions, both normal and faulty, with different noise levels added to the input variables. Data generated by the model will be compared with Fault Test Experimental Facility data. The Fault Test Experimental Facility theoretical model results will be used for the development of a Monitoring and Fault Detection System. (author)

  9. New Model Atmospheres: Testing the Solar Spectrum in the UV

    CERN Document Server

    Rodríguez-Merino, L H; Bertone, E; Chavez, M; Buzzoni, A

    2007-01-01

    We present preliminary results on the calculation of synthetic spectra obtained with the stellar model atmospheres developed by Cardona, Crivellari, and Simonneau. These new models have been used as input within the SYNTHE series of codes developed by Kurucz. As a first step we have tested if SYNTHE is able to handle these models which go down to log tau(Ross)= -13. We have successfully calculated a synthetic solar spectrum in the wavelength region 2000--4500 A at high resolution (R=522,000). Within this initial test we have found that layers at optical depths with log tau(Ross) < -7 significantly affect the mid-UV properties of a synthetic spectrum computed from a solar model. We anticipate that these new extended models will be a valuable tool for the analysis of UV stellar light arising from the outermost layers of the atmospheres.

  10. New Model Atmospheres: Testing the Solar Spectrum in the UV

    Science.gov (United States)

    Rodríguez-Merino, L. H.; Cardona, O.; Bertone, E.; Chávez, M.; Buzzoni, A.

    2009-03-01

    We present preliminary results on the calculation of synthetic spectra obtained with the stellar model atmospheres developed by Cardona, Crivellari, and Simonneau. These new models have been used as input within the Synthe series of codes developed by Kurucz. As a first step we have tested if Synthe is able to handle these models which go down to log{τ_{Ross}}= -13. We have successfully calculated a synthetic solar spectrum in the wavelength region 2000-4500 Å at high resolution (R=522 000). Within this initial test we have found that layers at optical depths with log{τ_{Ross}} < -7 significantly affect the mid-UV properties of a synthetic spectrum computed from a solar model. We anticipate that these new extended models will be a valuable tool for the analysis of UV stellar light arising from the outermost layers of the atmospheres.

  11. APPLYING BLACK-BOX TESTING TO MODEL TRANSFORMATIONS IN THE MODEL DRIVEN ARCHITECTURE CONTEXT

    Directory of Open Access Journals (Sweden)

    Luciane Telinski Wiedermann Agner

    2014-01-01

    Full Text Available Testing model transformations has taken on a leading role with the dissemination of MDA in software development processes. Software testing based on black-box testing, together with the "category partitioning" method, can be used efficiently to conduct the verification of model transformations. This study applies software testing techniques to an ATL model transformation in the MDA context and points out their benefits. The black-box testing method was adapted to the MT-PROAPES model transformation, which is based on profiles and platform models. The platform models define the range of input models of MT-PROAPES and are used for the creation of the test cases. The test cases were selected so as to meet certain requirements and increase the ability to detect errors in the model transformation. This approach makes the test process more agile and does not require any abstraction of behavioral properties of the transformations. The field of transformation testing and verification still faces significant challenges and requires a lot of research. Although it has some limitations, black-box testing adapts to various situations, besides allowing its integration with other test strategies.
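    The category-partition step can be sketched as a cross-product of input characteristics: each test frame combines one choice per category. The category and choice names below are hypothetical illustrations, not taken from the MT-PROAPES study:

    ```python
    from itertools import product

    # Hypothetical input categories for a model transformation, in the spirit of the
    # category-partition method: enumerate every combination of choices as a test frame.
    categories = {
        "profile": ["valid", "missing_stereotype"],
        "platform": ["platform_A", "platform_B"],
        "model_size": ["empty", "single_element", "many_elements"],
    }

    test_frames = [dict(zip(categories, combo)) for combo in product(*categories.values())]
    print(len(test_frames))   # 2 * 2 * 3 = 12 candidate test frames
    ```

    In practice the method also attaches constraints that prune infeasible combinations before the frames are turned into concrete input models.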

  12. Target Soil Impact Verification: Experimental Testing and Kayenta Constitutive Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Broome, Scott Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Flint, Gregory Mark [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Dewers, Thomas [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Newell, Pania [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This report details experimental testing and constitutive modeling of sandy soil deformation under quasi-static conditions. This is driven by the need to understand the constitutive response of soil to target/component behavior upon impact. An experimental and constitutive modeling program was followed to determine elastic-plastic properties and a compressional failure envelope of dry soil. One hydrostatic, one unconfined compressive stress (UCS), nine axisymmetric compression (ACS), and one uniaxial strain (US) test were conducted at room temperature. Elastic moduli, assuming isotropy, are determined from unload/reload loops and final unloading for all tests pre-failure and increase monotonically with mean stress. Very little modulus degradation was discernible from elastic results, even when exposed to mean stresses above 200 MPa. The failure envelope and initial yield surface were determined from peak stresses and the observed onset of plastic yielding from all test results. Soil elasto-plastic behavior is described using the Brannon et al. (2009) Kayenta constitutive model. As a validation exercise, the ACS-parameterized Kayenta model is used to predict the response of the soil material under uniaxial strain loading. The resulting parameterized and validated Kayenta model is of high quality and suitable for modeling sandy soil deformation under a range of conditions, including that for impact prediction.

  13. Hygrothermal modeling and testing of polymers and polymer matrix composites

    Science.gov (United States)

    Xu, Weiqun

    2000-10-01

    The dissertation, consisting of four papers, presents the results of the research investigation on environmental effects on polymers and polymer matrix composites. Hygrothermal models were developed that allow characterization of non-Fickian diffusion coefficients from moisture weight gain data. Hygrothermal testing was also conducted to provide the necessary data for characterization of the model coefficients and for model verification. In part 1, a methodology is proposed that allows characterization of non-Fickian diffusion coefficients from moisture weight gain data for a polymer adhesive below its Tg. Subsequently, these diffusion coefficients are used for predicting moisture concentration profiles through the thickness of a polymer. In part 2, a modeling methodology based on irreversible thermodynamics, applied within the framework of composite macro-mechanics, is presented that allows characterization of non-Fickian diffusion coefficients from moisture weight gain data for laminated composites with distributed uniaxial damage. Comparisons with test data for a 5-harness satin textile composite with uniaxial micro-cracks are provided for model verification. In part 3, the same modeling methodology based on irreversible thermodynamics is extended to the case of a bi-axially damaged laminate. The model allows characterization of non-Fickian diffusion coefficients as well as the moisture saturation level from moisture weight gain data for laminates with pre-existing damage. Comparisons with test data for a bi-axially damaged graphite/epoxy woven composite are provided for model verification. Finally, in part 4, hygrothermal tests conducted on AS4/PR500 5HS textile composite laminates are summarized. The objectives of the hygrothermal tests are to determine the diffusivity and maximum moisture content of the laminate.
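    The Fickian baseline against which non-Fickian behavior is characterized is the classical one-dimensional series solution for moisture uptake in a plate; a minimal sketch with illustrative values (not the dissertation's measured parameters):

    ```python
    import math

    def fickian_uptake(t, D, h, n_terms=50):
        """Fractional moisture gain M(t)/M_inf for 1-D Fickian diffusion into a plate
        of thickness h with diffusivity D (classical series solution)."""
        s = sum(math.exp(-(2 * n + 1)**2 * math.pi**2 * D * t / h**2) / (2 * n + 1)**2
                for n in range(n_terms))
        return 1.0 - (8.0 / math.pi**2) * s

    D = 1e-7   # diffusivity, mm^2/s (illustrative)
    h = 2.0    # plate thickness, mm (illustrative)
    for t in (1e5, 1e6, 1e7):   # seconds
        print(round(fickian_uptake(t, D, h), 3))
    ```

    Non-Fickian fits generalize this curve, e.g. with time-dependent diffusivity or a two-stage saturation level, and the dissertation's methodology recovers those coefficients from measured weight-gain data.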

  14. Asteroid modeling for testing spacecraft approach and landing.

    Science.gov (United States)

    Martin, Iain; Parkes, Steve; Dunstan, Martin; Rowell, Nick

    2014-01-01

    Spacecraft exploration of asteroids presents autonomous-navigation challenges; virtual models can aid in testing and developing guidance and hazard-avoidance systems. Researchers have extended and applied graphics techniques to create high-resolution asteroid models to simulate cameras and other spacecraft sensors approaching and descending toward asteroids. A scalable model structure with evenly spaced vertices simplifies terrain modeling, avoids distortion at the poles, and enables triangle-strip definition for efficient rendering. To create the base asteroid models, this approach uses two-phase Poisson faulting and Perlin noise. It creates realistic asteroid surfaces by adding both crater models adapted from lunar terrain simulation and multiresolution boulders. The researchers evaluated the virtual asteroids by comparing them with real asteroid images, examining the slope distributions, and applying a surface-relative feature-tracking algorithm to the models.

  15. Some tests for parameter constancy in cointegrated VAR-models

    DEFF Research Database (Denmark)

    Hansen, Henrik; Johansen, Søren

    1999-01-01

    Some methods for the evaluation of parameter constancy in vector autoregressive (VAR) models are discussed. Two different ways of re-estimating the VAR model are proposed; one in which all parameters are estimated recursively based upon the likelihood function for the first observations, and another … be applied to test the constancy of the long-run parameters in the cointegrated VAR-model. All results are illustrated using a model for the term structure of interest rates on US Treasury securities. …

  16. Modeling and Testing of EVs - Preliminary Study and Laboratory Development

    DEFF Research Database (Denmark)

    Yang, Guang-Ya; Marra, Francesco; Nielsen, Arne Hejde

    2010-01-01

Electric vehicles (EVs) are expected to play a key role in the future energy management system to stabilize both supply and consumption with the presence of high penetration of renewable generation. A reasonably accurate battery model is a key element for the study of EV behavior and the grid impact in different geographical areas, as well as driving and charging patterns. An electric circuit model is deployed in this work to represent the electrical properties of a lithium-ion battery. This paper reports the preliminary modeling and validation work based on the manufacturer data sheet and realistic tests, followed by suggestions towards a feasible battery model for further studies.

  17. Goodness-of-fit tests in mixed models

    KAUST Repository

    Claeskens, Gerda

    2009-05-12

    Mixed models, with both random and fixed effects, are most often estimated on the assumption that the random effects are normally distributed. In this paper we propose several formal tests of the hypothesis that the random effects and/or errors are normally distributed. Most of the proposed methods can be extended to generalized linear models where tests for non-normal distributions are of interest. Our tests are nonparametric in the sense that they are designed to detect virtually any alternative to normality. In case of rejection of the null hypothesis, the nonparametric estimation method that is used to construct a test provides an estimator of the alternative distribution. © 2009 Sociedad de Estadística e Investigación Operativa.
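As a rough illustration of the idea, not the paper's specific statistics: predict the random effects from grouped data and apply an omnibus normality test to the predictions, here with deliberately non-normal true effects (synthetic data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_groups, n_per = 50, 20
# True random intercepts drawn from a skewed (non-normal) distribution
u = rng.exponential(1.0, n_groups) - 1.0
y = u[:, None] + rng.normal(0.0, 1.0, (n_groups, n_per))

# Crude random-effect predictions: centered group means
u_hat = y.mean(axis=1) - y.mean()

# D'Agostino-Pearson omnibus normality test on the predictions
stat, p = stats.normaltest(u_hat)
```

A small p-value flags a departure from the normality assumption on the random effects; the paper's tests are designed to detect virtually any such alternative.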

  18. Automated particulate sampler field test model operations guide

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, S.M.; Miley, H.S.

    1996-10-01

    The Automated Particulate Sampler Field Test Model Operations Guide is a collection of documents which provides a complete picture of the Automated Particulate Sampler (APS) and the Field Test in which it was evaluated. The Pacific Northwest National Laboratory (PNNL) Automated Particulate Sampler was developed for the purpose of radionuclide particulate monitoring for use under the Comprehensive Test Ban Treaty (CTBT). Its design was directed by anticipated requirements of small size, low power consumption, low noise level, fully automatic operation, and most predominantly the sensitivity requirements of the Conference on Disarmament Working Paper 224 (CDWP224). This guide is intended to serve as both a reference document for the APS and to provide detailed instructions on how to operate the sampler. This document provides a complete description of the APS Field Test Model and all the activity related to its evaluation and progression.

  19. Classical tests of general relativity in brane world models

    CERN Document Server

    Boehmer, Christian G; Harko, Tiberiu; Lobo, Francisco S N

    2009-01-01

The classical tests of general relativity (perihelion precession, deflection of light, and the radar echo delay) are considered for several spherically symmetric static vacuum solutions in brane world models. Generally, the spherically symmetric vacuum solutions of the brane gravitational field equations have properties quite distinct from the standard black hole solutions of general relativity. As a first step, a general formalism that facilitates the analysis of general relativistic Solar System tests for any given spherically symmetric metric is developed. It is shown that the existing observational Solar System data on the perihelion shift of Mercury, on the light bending around the Sun (obtained using long-baseline radio interferometry), and ranging to Mars using the Viking lander constrain the numerical values of the parameters of the specific models. Hence Solar System tests represent very convenient and efficient tools to test the viability of the different black hole solutions in brane world models.

  20. Testing Modeling Assumptions in the West Africa Ebola Outbreak

    Science.gov (United States)

    Burghardt, Keith; Verzijl, Christopher; Huang, Junming; Ingram, Matthew; Song, Binyang; Hasne, Marie-Pierre

    2016-01-01

    The Ebola virus in West Africa has infected almost 30,000 and killed over 11,000 people. Recent models of Ebola Virus Disease (EVD) have often made assumptions about how the disease spreads, such as uniform transmissibility and homogeneous mixing within a population. In this paper, we test whether these assumptions are necessarily correct, and offer simple solutions that may improve disease model accuracy. First, we use data and models of West African migration to show that EVD does not homogeneously mix, but spreads in a predictable manner. Next, we estimate the initial growth rate of EVD within country administrative divisions and find that it significantly decreases with population density. Finally, we test whether EVD strains have uniform transmissibility through a novel statistical test, and find that certain strains appear more often than expected by chance. PMID:27721505
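The density result can be illustrated with a rank correlation on synthetic division-level data (hypothetical numbers, not the paper's):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_div = 60  # hypothetical administrative divisions
density = rng.lognormal(mean=4.0, sigma=1.0, size=n_div)  # people/km^2
# Hypothetical fitted initial growth rates, decreasing with log-density
growth = 0.9 - 0.12 * np.log(density) + rng.normal(0.0, 0.05, n_div)

# Nonparametric test of a monotone density-growth relationship
rho, p = stats.spearmanr(density, growth)
```

A significantly negative rank correlation, as in the paper, argues against uniform growth across divisions.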

  1. Initial CGE Model Results Summary Exogenous and Endogenous Variables Tests

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Brian Keith [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Boero, Riccardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rivera, Michael Kelly [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-07

The following discussion presents initial results of tests of the most recent version of the National Infrastructure Simulation and Analysis Center Dynamic Computable General Equilibrium (CGE) model developed by Los Alamos National Laboratory (LANL). The intent is to test and assess the model's behavioral properties. The tests evaluated whether the predicted impacts are reasonable from a qualitative perspective, that is, whether a predicted change, be it an increase or decrease in other model variables, is consistent with prior economic intuition and expectations about the predicted change. One purpose of this effort is to determine whether model changes are needed in order to improve its behavior qualitatively and quantitatively.

  2. Testing Modeling Assumptions in the West Africa Ebola Outbreak

    Science.gov (United States)

    Burghardt, Keith; Verzijl, Christopher; Huang, Junming; Ingram, Matthew; Song, Binyang; Hasne, Marie-Pierre

    2016-10-01

    The Ebola virus in West Africa has infected almost 30,000 and killed over 11,000 people. Recent models of Ebola Virus Disease (EVD) have often made assumptions about how the disease spreads, such as uniform transmissibility and homogeneous mixing within a population. In this paper, we test whether these assumptions are necessarily correct, and offer simple solutions that may improve disease model accuracy. First, we use data and models of West African migration to show that EVD does not homogeneously mix, but spreads in a predictable manner. Next, we estimate the initial growth rate of EVD within country administrative divisions and find that it significantly decreases with population density. Finally, we test whether EVD strains have uniform transmissibility through a novel statistical test, and find that certain strains appear more often than expected by chance.

  3. Modeling Spin Testing Using Location Specific Material Properties

    Science.gov (United States)

    2012-04-01

Three stages of creep. ... 2.4.1 Creep Testing. A virtual creep test begins with an elastoplastic coupon, and an initial prescribed ... and maximum single crystal Schmid factor. This then models the worst case of a potent damaged inclusion cluster located in a group of grains of high ... is the maximum plastic shear strain amplitude, while the coefficient K' takes into account the damaging effect of the local maximum normal stress

  4. A model based security testing method for protocol implementation.

    Science.gov (United States)

    Fu, Yu Long; Xin, Xiao Long

    2014-01-01

The security of protocol implementations is important but difficult to verify. Since penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them to generate suitable test cases for verifying the security of a protocol implementation.

  5. A Model Based Security Testing Method for Protocol Implementation

    Directory of Open Access Journals (Sweden)

    Yu Long Fu

    2014-01-01

Full Text Available The security of protocol implementations is important but difficult to verify. Since penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them to generate suitable test cases for verifying the security of a protocol implementation.

  6. Modeling cross-hole slug tests in an unconfined aquifer

    CERN Document Server

    Malama, Bwalya; Brauchler, Ralf; Bayer, Peter

    2016-01-01

    A modified version of a published slug test model for unconfined aquifers is applied to cross-hole slug test data collected in field tests conducted at the Widen site in Switzerland. The model accounts for water-table effects using the linearised kinematic condition. The model also accounts for inertial effects in source and observation wells. The primary objective of this work is to demonstrate applicability of this semi-analytical model to multi-well and multi-level pneumatic slug tests. The pneumatic perturbation was applied at discrete intervals in a source well and monitored at discrete vertical intervals in observation wells. The source and observation well pairs were separated by distances of up to 4 m. The analysis yielded vertical profiles of hydraulic conductivity, specific storage, and specific yield at observation well locations. The hydraulic parameter estimates are compared to results from prior pumping and single-well slug tests conducted at the site, as well as to estimates from particle size ...

  7. Dynamic Modeling, Testing, and Stability Analysis of an Ornithoptic Blimp

    Institute of Scientific and Technical Information of China (English)

    John Dietl; Thomas Herrmann; Gregory Reich; Ephrahim Garcia

    2011-01-01

In order to study ornithopter flight and to improve a dynamic model of flapping propulsion, a series of tests are conducted on a flapping-wing blimp. The blimp is designed and constructed from mylar plastic and balsa wood as a test platform for aerodynamics and flight dynamics. The blimp, 2.3 meters long and 420 grams in mass, is propelled by its flapping wings. Due to buoyancy, the wings have no lift requirement, so the distinction between lift and propulsion can be analyzed in a flight platform at low flight speeds. The blimp is tested using a Vicon motion tracking system, and various initial conditions are tested, including accelerating flight from standstill, decelerating from an initial speed higher than its steady state, and flight from its steady-state speed but disturbed in pitch angle. Test results are used to estimate parameters in a coupled quasi-steady aerodynamics/Newtonian flight dynamics model. This model is then analyzed using Floquet theory to determine local dynamic modes and stability. It is concluded that the dynamic model adequately describes the vehicle's nonlinear behavior near the steady-state velocity and that the vehicle's linearized modes are akin to those of a fixed-wing aircraft.

  8. Exact Hypothesis Tests for Log-linear Models with exactLoglinTest

    Directory of Open Access Journals (Sweden)

    Brian Caffo

    2006-11-01

    Full Text Available This manuscript overviews exact testing of goodness of fit for log-linear models using the R package exactLoglinTest. This package evaluates model fit for Poisson log-linear models by conditioning on minimal sufficient statistics to remove nuisance parameters. A Monte Carlo algorithm is proposed to estimate P values from the resulting conditional distribution. In particular, this package implements a sequentially rounded normal approximation and importance sampling to approximate probabilities from the conditional distribution. Usually, this results in a high percentage of valid samples. However, in instances where this is not the case, a Metropolis Hastings algorithm can be implemented that makes more localized jumps within the reference set. The manuscript details how some conditional tests for binomial logit models can also be viewed as conditional Poisson log-linear models and hence can be performed via exactLoglinTest. A diverse battery of examples is considered to highlight use, features and extensions of the software. Notably, potential extensions to evaluating disclosure risk are also considered.
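exactLoglinTest is an R package; as a rough language-neutral sketch of the underlying idea, the following Python code Monte Carlo samples from the distribution of a two-way table conditional on its margins (the minimal sufficient statistics under the independence log-linear model) and estimates an exact P value:

```python
import numpy as np

def mc_exact_independence(table, n_sim=2000, seed=0):
    """Monte Carlo exact test of independence in a two-way table,
    conditioning on the row/column margins (the minimal sufficient
    statistics under the independence log-linear model)."""
    rng = np.random.default_rng(seed)
    table = np.asarray(table)
    # Expand the table into row/column labels of individual observations
    rows = np.repeat(np.arange(table.shape[0]), table.sum(axis=1))
    cols = np.repeat(np.arange(table.shape[1]), table.sum(axis=0))

    def pearson_x2(t):
        e = np.outer(t.sum(axis=1), t.sum(axis=0)) / t.sum()
        return ((t - e) ** 2 / e).sum()

    x2_obs = pearson_x2(table)
    hits = 0
    for _ in range(n_sim):
        # Permuting column labels samples a table with the same margins
        t = np.zeros_like(table)
        np.add.at(t, (rows, rng.permutation(cols)), 1)
        hits += pearson_x2(t) >= x2_obs
    return (hits + 1) / (n_sim + 1)
```

The package itself uses a normal approximation with importance sampling (and optionally Metropolis-Hastings) instead of plain permutation, but the conditioning principle is the same.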

  9. Design verification and cold-flow modeling test report

    Energy Technology Data Exchange (ETDEWEB)

    1993-07-01

    This report presents a compilation of the following three test reports prepared by TRW for Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993, (2) Combustor Cold Flow Model Report, dated August 28, 1992, (3) Coal Feed System Cold Flow Model Report, October 28, 1992. In this compilation, these three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.

  10. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. The paper defines the translation procedure and rules from UNICOS to BIP, which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  11. Description of Model Tests Carried Out by Aalborg University

    DEFF Research Database (Denmark)

    Frigaard, Peter; Schlütter, F.; Andersen, H.

    1996-01-01

As associated partner, Aalborg University (AU) has participated in different aspects of "the Zeebrugge project". AU has carried out an extensive number of small-scale model tests (1:65) with the Zeebrugge breakwater with the aim of investigating scale effects.

  12. Proton Exchange Membrane Fuel Cell Engineering Model Powerplant. Test Report: Benchmark Tests in Three Spatial Orientations

    Science.gov (United States)

    Loyselle, Patricia; Prokopius, Kevin

    2011-01-01

    Proton exchange membrane (PEM) fuel cell technology is the leading candidate to replace the aging alkaline fuel cell technology, currently used on the Shuttle, for future space missions. This test effort marks the final phase of a 5-yr development program that began under the Second Generation Reusable Launch Vehicle (RLV) Program, transitioned into the Next Generation Launch Technologies (NGLT) Program, and continued under Constellation Systems in the Exploration Technology Development Program. Initially, the engineering model (EM) powerplant was evaluated with respect to its performance as compared to acceptance tests carried out at the manufacturer. This was to determine the sensitivity of the powerplant performance to changes in test environment. In addition, a series of tests were performed with the powerplant in the original standard orientation. This report details the continuing EM benchmark test results in three spatial orientations as well as extended duration testing in the mission profile test. The results from these tests verify the applicability of PEM fuel cells for future NASA missions. The specifics of these different tests are described in the following sections.

  13. Testing GNSS ionosphere models based on the position domain

    Science.gov (United States)

    Orus-Perez, Raul; Rovira, Adria

    2017-04-01

As is well known, the ionosphere is one of the main contributors to the navigation error of single-frequency users. Currently, there are many models available for correcting the ionosphere delay. Each GNSS provides its own ionosphere corrections in the Signal-in-Space, for instance NeQuick G for Galileo or Klobuchar for GPS. Other sources of ionosphere corrections are the Satellite Based Augmentation Systems (i.e. EGNOS or WAAS), Global Ionospheric Maps (i.e. provided by IGS), regional maps, and even climatological models such as NeQuick or IRI. With this large variety of models, there has been considerable effort to define a suitable strategy to test the accuracy of the different models. Usually, this testing has been done by computing a "reference ionosphere" using various GNSS techniques, ionosonde data, or altimeter data. These techniques are not bias free, and they may raise questions about the absolute accuracy they achieve. To complement these tests, a new methodology has been developed to test ionosphere models for GNSS. This methodology is based on the position domain: the observables on each frequency are modeled with geodetic accuracy, and the obtained least-squares solutions are then combined to determine the ionosphere error. The results of the testing for different GIMs from IGS and different Signal-in-Space models (GPS, Galileo, and EGNOS) will be presented for 2 years of the last solar maximum with more than 40 receivers worldwide. The weaknesses and strengths of the new methodology will also be shown to give a comprehensive idea of its capabilities.

  14. Mathematical modeling of variables involved in dissolution testing.

    Science.gov (United States)

    Gao, Zongming

    2011-11-01

    Dissolution testing is an important technique used for development and quality control of solid oral dosage forms of pharmaceutical products. However, the variability associated with this technique, especially with USP apparatuses 1 and 2, is a concern for both the US Food and Drug Administration and pharmaceutical companies. Dissolution testing involves a number of variables, which can be divided into four main categories: (1) analyst, (2) dissolution apparatus, (3) testing environment, and (4) sample. Both linear and nonlinear models have been used to study dissolution profiles, and various mathematical functions have been used to model the observed data. In this study, several variables, including dissolved gases in the dissolution medium, off-center placement of the test tablet, environmental vibration, and various agitation speeds, were modeled. Mathematical models including Higuchi, Korsmeyer-Peppas, Weibull, and the Noyes-Whitney equation were employed to study the dissolution profile of 10 mg prednisone tablets (NCDA #2) using the USP paddle method. The results showed that the nonlinear models (Korsmeyer-Peppas and Weibull) accurately described the entire dissolution profile. The results also showed that dissolution variables affected dissolution rate constants differently, depending on whether the tablets disintegrated or dissolved.
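A minimal sketch of fitting the Weibull function mentioned above to a dissolution profile (hypothetical data, assuming SciPy; not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

# Weibull dissolution model: F(t) = Fmax * (1 - exp(-(t/Td)^b))
def weibull_release(t, fmax, td, b):
    return fmax * (1.0 - np.exp(-((t / td) ** b)))

# Hypothetical cumulative-release data (% dissolved vs. minutes)
t = np.array([5.0, 10.0, 15.0, 20.0, 30.0, 45.0, 60.0])
f = np.array([22.0, 45.0, 61.0, 72.0, 85.0, 93.0, 96.0])

# Nonlinear least-squares fit of the three Weibull parameters
popt, _ = curve_fit(weibull_release, t, f, p0=[100.0, 15.0, 1.0])
fmax, td, b = popt
```

Comparing the fitted parameters across runs with, say, different agitation speeds or degassed versus gassed media is one way to quantify how each variable shifts the profile.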

  15. Stochastic Models for Strength of Wind Turbine Blades using Tests

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    2008-01-01

The structural cost of wind turbine blades is dependent on the values of the partial safety factors which reflect the uncertainties in the design values, including statistical uncertainty from a limited number of tests. This paper presents a probabilistic model for ultimate and fatigue strength of wind turbine blades, especially considering the influence of prior knowledge and test results, and how partial safety factors can be updated when additional full-scale tests are performed. This updating is performed by adopting a probabilistic design basis based on Bayesian statistical methods.
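The Bayesian updating idea can be sketched with a conjugate normal model (hypothetical numbers; actual blade-strength models typically use lognormal or Weibull distributions with uncertain scatter, which this sketch simplifies away):

```python
import numpy as np

def update_mean_strength(mu0, sig0, tests, sigma):
    """Posterior of the mean strength under a normal prior N(mu0, sig0^2)
    and test results with known test-to-test scatter sigma."""
    n = len(tests)
    prec = 1.0 / sig0**2 + n / sigma**2                      # posterior precision
    mu_n = (mu0 / sig0**2 + np.sum(tests) / sigma**2) / prec # posterior mean
    return mu_n, np.sqrt(1.0 / prec)

# Hypothetical numbers: prior mean strength 500 MPa, three full-scale tests
mu_n, sig_n = update_mean_strength(500.0, 50.0, [480.0, 510.0, 495.0], 30.0)
```

Because each full-scale test shrinks the posterior standard deviation, the design value moves toward the data and the statistical-uncertainty part of the partial safety factor can be reduced.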

  16. A new test on the conditional capital asset pricing model

    Institute of Scientific and Technical Information of China (English)

    LI Xia-fei; CAI Zong-wu; REN Yu

    2015-01-01

Testing the validity of the conditional capital asset pricing model (CAPM) is a puzzle in the finance literature. Lewellen and Nagel [14] find that the variation in betas and in the equity premium would have to be implausibly large to explain important asset-pricing anomalies. Unfortunately, they do not provide a rigorous test statistic. Based on a simulation study, the method proposed by Lewellen and Nagel [14] tends to reject the null too frequently. We develop a new test procedure and derive its limiting distribution under the null hypothesis. We also provide a bootstrap approach to the testing procedure to attain good finite-sample performance. Both simulations and empirical studies show that our test is necessary for making correct inferences with the conditional CAPM.
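A minimal sketch of a bootstrap test for beta constancy on synthetic data (illustrative only; not the authors' statistic or limiting distribution):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 250
mkt = rng.normal(0.0, 1.0, n)                       # market excess returns
beta_true = np.where(np.arange(n) < n // 2, 0.8, 1.4)
ret = beta_true * mkt + rng.normal(0.0, 0.5, n)     # asset with shifting beta

def beta_hat(r, m):
    return np.polyfit(m, r, 1)[0]                   # OLS slope estimate

# Observed difference in betas between the two subperiods
obs = beta_hat(ret[:n//2], mkt[:n//2]) - beta_hat(ret[n//2:], mkt[n//2:])

# Bootstrap under the null of a constant beta: resampling (r, m) pairs
# destroys the time ordering, so both halves share one pooled beta
diffs = []
for _ in range(499):
    b = rng.integers(0, n, n)
    diffs.append(beta_hat(ret[b[:n//2]], mkt[b[:n//2]])
                 - beta_hat(ret[b[n//2:]], mkt[b[n//2:]]))
p_value = float(np.mean(np.abs(diffs) >= abs(obs)))
```

A small p-value rejects constancy of beta; the paper's contribution is a test with a proper limiting distribution and calibrated size, which this toy version does not claim.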

  17. Mathematical models applied in inductive non-destructive testing

    Energy Technology Data Exchange (ETDEWEB)

    Wac-Wlodarczyk, A.; Goleman, R.; Czerwinski, D. [Technical University of Lublin, 20 618 Lublin, Nadbystrzycka St 38a (Poland); Gizewski, T. [Technical University of Lublin, 20 618 Lublin, Nadbystrzycka St 38a (Poland)], E-mail: t.gizewski@pollub.pl

    2008-10-15

Non-destructive testing comprises a wide group of investigative methods for non-homogeneous materials. Methods of computer tomography, ultrasonic testing, and magnetic and inductive methods, still under development, are widely applied in industry. In apparatus used for non-destructive tests, the analysis of signals is made on the basis of complex system answers. The answer is linearized using a model of the research system. In this paper, the authors discuss applications of mathematical models in investigations of inductive magnetic materials. Statistical models and others gathered in similarity classes are taken into consideration. Investigation of mathematical models allows the correct method to be chosen, which in consequence leads to a precise representation of the inner structure of the examined object. Inductive testing of conductive media, especially those with ferromagnetic properties, is run with a high-frequency magnetic field (eddy-current method), which considerably decreases the penetration depth.

  18. Empirical likelihood ratio tests for multivariate regression models

    Institute of Scientific and Technical Information of China (English)

    WU Jianhong; ZHU Lixing

    2007-01-01

This paper proposes some diagnostic tools for checking the adequacy of multivariate regression models, including classical regression and time series autoregression. In statistical inference, the empirical likelihood ratio method is well known to be a powerful tool for constructing tests and confidence regions. For model checking, however, the naive empirical likelihood (EL) based tests do not enjoy Wilks' phenomenon. Hence, we make use of bias correction to construct EL-based score tests and derive a nonparametric version of Wilks' theorem. Moreover, by combining the advantages of both the EL and the score test method, the EL-based score tests share many desirable features: they are self-scale invariant and can detect alternatives that converge to the null at rate n^(-1/2), the possibly fastest rate for lack-of-fit testing; and they involve weight functions, which provide the flexibility to choose scores for improving power performance, especially under directional alternatives. Furthermore, when the alternatives are not directional, we construct asymptotically distribution-free maximin tests for a large class of possible alternatives. A simulation study is carried out and an application to a real dataset is analyzed.

  19. The role of observational uncertainties in testing model hypotheses

    Science.gov (United States)

    Westerberg, I. K.; Birkel, C.

    2012-12-01

Knowledge about hydrological processes and the spatial and temporal distribution of water resources is needed as a basis for managing water for hydropower, agriculture and flood-protection. Conceptual hydrological models may be used to infer knowledge on catchment functioning but are affected by uncertainties in the model representation of reality as well as in the observational data used to drive the model and to evaluate model performance. Therefore, meaningful hypothesis testing of the hydrological functioning of a catchment requires such uncertainties to be carefully estimated and accounted for in model calibration and evaluation. The aim of this study was to investigate the role of observational uncertainties in hypothesis testing, in particular whether it was possible to detect model-structural representations that were wrong in an important way given the uncertainties in the observational data. We studied the relatively data-scarce tropical Sarapiqui catchment in Costa Rica, Central America, where water resources play a vital part for hydropower production and livelihood. We tested several model structures of varying complexity as hypotheses about catchment functioning, but also hypotheses about the nature of the modelling errors. The tests were made within a learning framework for uncertainty estimation which enabled insights into data uncertainties, suitable model-structural representations and appropriate likelihoods. The observational uncertainty in discharge data was estimated from a rating-curve analysis and precipitation measurement errors through scenarios relating the error to, for example, canopy interception, wind-driven rain and the elevation gradient. The hypotheses were evaluated in a posterior analysis of the simulations where the performance of each simulation was analysed relative to the observational uncertainties for the entire hydrograph as well as for different aspects of the hydrograph (e.g. peak flows, recession periods, and base flow).

  20. Operational Testing of Satellite based Hydrological Model (SHM)

    Science.gov (United States)

    Gaur, Srishti; Paul, Pranesh Kumar; Singh, Rajendra; Mishra, Ashok; Gupta, Praveen Kumar; Singh, Raghavendra P.

    2017-04-01

Incorporation of the concept of transposability in model testing is one of the prominent ways to check the credibility of a hydrological model. Successful testing ensures the ability of hydrological models to deal with changing conditions, along with their extrapolation capacity. For a newly developed model, a number of contradictions arise regarding its applicability; therefore, testing the credibility of the model is essential to proficiently assess its strengths and limitations. This concept motivates performing 'Hierarchical Operational Testing' of the Satellite based Hydrological Model (SHM), a newly developed surface water-groundwater coupled model, under the PRACRITI-2 program initiated by Space Application Centre (SAC), Ahmedabad. SHM aims at sustainable water resources management using remote sensing data from Indian satellites. It consists of grid cells of 5 km x 5 km resolution and comprises five modules, namely: Surface Water (SW), Forest (F), Snow (S), Groundwater (GW) and Routing (ROU). The SW module (which functions in grid cells with land cover other than forest and snow) deals with estimation of surface runoff, soil moisture and evapotranspiration using the NRCS-CN method, a water balance, and the Hargreaves method, respectively. The hydrology of the F module depends entirely on sub-surface processes, and its water balance is calculated on that basis. The GW module generates baseflow (depending on water table variation with the level of water in streams) using the Boussinesq equation. The ROU module is grounded on a cell-to-cell routing technique based on the principle of the Time Variant Spatially Distributed Direct Runoff Hydrograph (SDDH) to route the runoff and baseflow generated by the different modules up to the outlet. For this study the Subarnarekha river basin, a flood-prone zone of eastern India, has been chosen for the hierarchical operational testing scheme, which includes tests under stationary as well as transitory conditions. For this the basin has been divided into three sub-basins using three flow
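The NRCS-CN runoff step used by the SW module can be sketched as follows (a generic curve-number calculation with hypothetical inputs; not the SHM code):

```python
def scs_cn_runoff(p_mm, cn, lam=0.2):
    """NRCS (SCS) curve-number direct runoff depth in mm.
    S is the potential maximum retention; Ia = lam*S is the
    initial abstraction that must be filled before runoff starts."""
    s = 25400.0 / cn - 254.0          # retention S in mm from the CN
    ia = lam * s                      # initial abstraction
    if p_mm <= ia:
        return 0.0                    # all rainfall abstracted
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

For example, a 50 mm storm on a CN = 75 cell yields roughly 9 mm of direct runoff, while the same storm on an impervious CN = 100 cell runs off entirely.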

  1. Small Scale Drop Tower Test for Practice Torpedo Impact Modelling

    Science.gov (United States)

    2012-06-01

Table E3. Tensile test results for the aluminium 6061-T6 model hull plates. 2 coupons were cut with their length parallel to the long side of the ... and without stiffeners and aluminium plate without stiffeners. A qualitative comparison of the results for the three model hull forms shows that ... stiffeners tend to limit the extent of the dent, that aluminium plate has a greater elastic response than that of both stiffened and unstiffened steel

  2. DIAGNOSTIC TEST FOR GARCH MODELS BASED ON ABSOLUTE RESIDUAL AUTOCORRELATIONS

    Directory of Open Access Journals (Sweden)

    Farhat Iqbal

    2013-10-01

    Full Text Available In this paper the asymptotic distribution of the absolute residual autocorrelations from generalized autoregressive conditional heteroscedastic (GARCH models is derived. The correct asymptotic standard errors for the absolute residual autocorrelations are also obtained and based on these results, a diagnostic test for checking the adequacy of GARCH-type models are developed. Our results do not depend on the existence of higher moments and is therefore robust under heavy-tailed distributions.
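The diagnostic can be sketched as follows: compute autocorrelations of the absolute standardized residuals and aggregate them into a Ljung-Box-type portmanteau statistic (a generic version; the paper's contribution is the correct asymptotic standard errors, which this sketch omits):

```python
import numpy as np

def abs_residual_acf(resid, sigma, max_lag=10):
    """Sample autocorrelations of |standardized residuals| and a
    Ljung-Box-type portmanteau statistic built from them."""
    z = np.abs(np.asarray(resid) / np.asarray(sigma))  # |e_t / sigma_t|
    z = z - z.mean()
    n = len(z)
    denom = np.sum(z ** 2)
    acf = np.array([np.sum(z[k:] * z[:-k]) / denom
                    for k in range(1, max_lag + 1)])
    lags = np.arange(1, max_lag + 1)
    q = n * (n + 2) * np.sum(acf ** 2 / (n - lags))    # portmanteau statistic
    return acf, q
```

For an adequate GARCH fit the absolute-residual autocorrelations should be small and the statistic roughly chi-squared with max_lag degrees of freedom; working with absolute rather than squared residuals avoids requiring higher moments.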

  4. Model of risk assessment under ballistic statistical tests

    Science.gov (United States)

    Gabrovski, Ivan; Karakaneva, Juliana

The material presents the application of a mathematical method for risk assessment under statistical determination of the ballistic limits of protection equipment. The authors have implemented a mathematical model based on Pierson's criteria. The software implementation of the model allows evaluation of the V50 indicator and assessment of the reliability of the statistical hypothesis. The results supply specialists with information about the interval estimates of the probability determined during the testing process.

  5. Twodimensional model testing of the Zeebrugge NW Breakwater

    OpenAIRE

    Wens, F.; De Rouck, J.; P. A. Troch; Eelen, B.

    1996-01-01

    Breakwater design at this moment is based on physical scale modelling combined with the use of experimental design formulae. These formulae are all derived from scale model tests together with simplifying theoretical assumptions. Several breakwater failures in the past indicate that the state of the art on this subject suffers from a significant lack of full scale data which could give rise to safer design standards. Within the MAST II framework of the EU (Marine Science and Technology),...

  6. Updating the Finite Element Model of the Aerostructures Test Wing using Ground Vibration Test Data

    Science.gov (United States)

    Lung, Shun-fat; Pak, Chan-gi

    2009-01-01

    Improved and/or accelerated decision making is a crucial step during flutter certification processes. Unfortunately, most finite element structural dynamics models have uncertainties associated with model validity. Tuning the finite element model using measured data to minimize the model uncertainties is a challenging task in the area of structural dynamics. The model tuning process requires not only satisfactory correlations between analytical and experimental results, but also the retention of the mass and stiffness properties of the structures. Minimizing the difference between analytical and experimental results is a type of optimization problem. By utilizing a multidisciplinary design, analysis, and optimization (MDAO) tool to optimize the objective function subject to constraints, the mass properties, natural frequencies, and mode shapes can be matched to the target data while retaining mass-matrix orthogonality. This approach has been applied to minimize the model uncertainties for the structural dynamics model of the Aerostructures Test Wing (ATW), which was designed and tested at the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center (DFRC) (Edwards, California). This study has shown that natural frequencies and corresponding mode shapes from the updated finite element model have excellent agreement with corresponding measured data.
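
    As a toy illustration of the tuning idea (not the ATW model or NASA's MDAO tool), one can match the natural frequencies of a two-degree-of-freedom spring-mass model to "measured" targets by least squares over the stiffness parameters; all values below are made up:

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def natural_freqs(k1, k2, m1=1.0, m2=1.0):
        """Natural frequencies (rad/s) of a ground-k1-m1-k2-m2 spring-mass chain."""
        K = np.array([[k1 + k2, -k2], [-k2, k2]])
        M = np.diag([m1, m2])
        lam = np.linalg.eigvals(np.linalg.solve(M, K))
        return np.sqrt(np.sort(lam.real))

    # pretend these frequencies came from a ground vibration test
    f_target = natural_freqs(4.0, 2.0)

    def residual(k):
        # mismatch between model-predicted and "measured" frequencies
        return natural_freqs(k[0], k[1]) - f_target

    # tune the stiffnesses, starting from a deliberately wrong initial model
    sol = least_squares(residual, x0=[1.0, 1.0], bounds=(1e-3, np.inf))
    ```

    A real updating problem would add mode-shape correlation (e.g. MAC values) and mass-property constraints to the objective, as the abstract describes.
    
    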

  7. The Neuman Systems Model Institute: testing middle-range theories.

    Science.gov (United States)

    Gigliotti, Eileen

    2003-07-01

    The credibility of the Neuman systems model can only be established through the generation and testing of Neuman systems model-derived middle-range theories. However, due to the number and complexity of Neuman systems model concepts/concept interrelations and the diversity of middle-range theory concepts linked to these Neuman systems model concepts by researchers, no explicit middle-range theories have yet been derived from the Neuman systems model. This article describes the development of an organized program for the systematic study of the Neuman systems model. Preliminary work, already accomplished, is detailed, and a tentative plan for the completion of further preliminary work as well as beginning the actual research conduction phase is proposed.

  8. Testing and Modeling of Contact Problems in Resistance Welding

    DEFF Research Database (Denmark)

    Song, Quanfeng

    As a part of the efforts towards a professional and reliable numerical tool for resistance welding engineers, this Ph.D. project is dedicated to refining the numerical models related to the interface behavior. An FE algorithm for the contact problems in resistance welding has been developed...... together two or three cylindrical parts as well as disc-ring pairs of dissimilar metals. The tests have demonstrated the effectiveness of the model. A theoretical and experimental study is performed on the contact resistance aiming at a more reliable model for numerical simulation of resistance welding....... The model currently employed is evaluated. It is found that the model may underestimate the constriction resistance because it is based on the assumption of continual contact area. A new model is proposed on the constriction resistance in resistance welding. A parametric study is performed on the contact...

  9. Animal models for dengue vaccine development and testing.

    Science.gov (United States)

    Na, Woonsung; Yeom, Minjoo; Choi, Il-Kyu; Yook, Heejun; Song, Daesub

    2017-07-01

    Dengue fever is a tropical endemic disease; however, because of climate change, it may become a problem in South Korea in the near future. Research on vaccines for dengue fever and outbreak preparedness are currently insufficient. In addition, because there are no appropriate animal models, controversial results from vaccine efficacy assessments and clinical trials have been reported. Therefore, to study the mechanism of dengue fever and test the immunogenicity of vaccines, an appropriate animal model is urgently needed. In addition to mouse models, more suitable models using animals that can be humanized will need to be constructed. In this report, we look at the current status of model animal construction and discuss which models require further development.

  10. Reducing the Cost of Model-Based Testing through Test Case Diversity

    Science.gov (United States)

    Hemmati, Hadi; Arcuri, Andrea; Briand, Lionel

    Model-based testing (MBT) suffers from two main problems which in many real-world systems make MBT impractical: scalability and automatic oracle generation. When no automated oracle is available, or when testing must be performed on actual hardware or a restricted-access network, for example, only a small set of test cases can be executed and evaluated. However, MBT techniques usually generate large sets of test cases when applied to real systems, regardless of the coverage criteria. Therefore, one needs to select a small enough subset of these test cases that has the highest possible fault-revealing power. In this paper, we investigate and compare various techniques for rewarding diversity in the selected test cases as a way to increase the likelihood of fault detection. We use a similarity measure defined on the representation of the test cases and use it in several algorithms that aim at maximizing the diversity of test cases. Using an industrial system with actual faults, we found that rewarding diversity leads to higher fault detection compared to the techniques commonly reported in the literature: coverage-based and random selection. Among the investigated algorithms, diversification using Genetic Algorithms is the most cost-effective technique.
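
    The similarity-based selection described above can be approximated with a simple greedy max-min heuristic (a cheaper stand-in for the Genetic Algorithm the paper finds most cost-effective); encoding each test case as a set of triggered transitions and using Jaccard distance is an assumption for illustration:

    ```python
    def jaccard_distance(a, b):
        """1 - |A∩B| / |A∪B| on the sets of elements a test case triggers."""
        a, b = set(a), set(b)
        union = len(a | b)
        return 1.0 - (len(a & b) / union if union else 1.0)

    def select_diverse(tests, budget):
        """Greedy max-min selection: repeatedly add the test case farthest
        (in minimum distance) from everything already chosen."""
        chosen = [0]  # seed with the first test case
        while len(chosen) < min(budget, len(tests)):
            best, best_d = None, -1.0
            for i in range(len(tests)):
                if i in chosen:
                    continue
                d = min(jaccard_distance(tests[i], tests[j]) for j in chosen)
                if d > best_d:
                    best, best_d = i, d
            chosen.append(best)
        return chosen

    # hypothetical suite: each tuple lists the transitions a test exercises
    suite = [("a", "b"), ("a", "b", "c"), ("x", "y"), ("a", "y"), ("x", "y", "z")]
    picked = select_diverse(suite, 3)
    ```

    With this suite the heuristic picks indices [0, 2, 3], skipping the near-duplicates of already-chosen tests.
    
    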

  11. Fiber pull-out test and single fiber fragmentation test - analysis and modelling

    DEFF Research Database (Denmark)

    Sørensen, Bent F.; Lilholt, Hans

    2016-01-01

    for fiber/matrix debonding and a frictional sliding shear stress. Results for the debond length and fiber debond displacement are compared with results from similar models for single fiber pull-out experiments where the specimen is gripped at the end opposite to the end where the fiber is pulling......-out and with results for a single fiber fragmentation test....

  12. Stochastic order in dichotomous item response models for fixed tests, research adaptive tests, or multiple abilities

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1995-01-01

    Dichotomous item response theory (IRT) models can be viewed as families of stochastically ordered distributions of responses to test items. This paper explores several properties of such distributions. The focus is on the conditions under which stochastic order in families of conditional distribution
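
    A small numerical check of this stochastic-order property, assuming a 2PL response function and a three-item fixed test (both chosen for illustration): a higher ability should dominate a lower one in the number-correct distribution.

    ```python
    import numpy as np
    from itertools import product

    def p_correct(theta, a, b):
        """2PL item response function (monotone increasing in theta)."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    def score_dist(theta, a, b):
        """Exact number-correct distribution for a short fixed test."""
        p = p_correct(theta, np.asarray(a), np.asarray(b))
        n = len(a)
        dist = np.zeros(n + 1)
        for pattern in product([0, 1], repeat=n):
            prob = np.prod(np.where(pattern, p, 1.0 - p))
            dist[sum(pattern)] += prob
        return dist

    a, b = [1.0, 1.5, 0.8], [-0.5, 0.0, 0.5]   # arbitrary discriminations/difficulties
    lo = score_dist(-1.0, a, b)
    hi = score_dist(+1.0, a, b)
    # stochastic order: the survival function P(X >= k) is uniformly larger
    # for the higher ability
    sf_lo = np.cumsum(lo[::-1])[::-1]
    sf_hi = np.cumsum(hi[::-1])[::-1]
    ```
    
    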

  13. Testing ocean tide models using GGP superconducting gravimeter observations

    Science.gov (United States)

    Baker, T.; Bos, M.

    2003-04-01

    Observations from the global network of superconducting gravimeters in the Global Geodynamics Project (GGP) are used to test 10 ocean tide models (SCHW; FES94.1, 95.2, 98, 99; CSR3.0, 4.0; TPXO.5; GOT99.2b; and NAO.99b). In addition, observations are used from selected sites with LaCoste and Romberg gravimeters with electrostatic feedback, where special attention has been given to achieving a calibration accuracy of 0.1%. In Europe, there are several superconducting gravimeter stations in a relatively small area and this can be used to advantage in testing the ocean (and body) tide models and in identifying sites with anomalous observations. At some of the superconducting gravimeter sites there are anomalies in the in-phase components of the main tidal harmonics, which are due to calibration errors of up to 0.3%. It is shown that the recent ocean tide models are in better agreement with the tidal gravity observations than were the earlier models of Schwiderski and FES94.1. However, no single ocean tide model gives completely satisfactory results in all areas of the world. For example, for M2 the TPXO.5 and NAO99b models give anomalous results in Europe, whereas the FES95.2, FES98 and FES99 models give anomalous results in China and Japan. It is shown that the observations from this improved set of tidal gravity stations will provide an important test of the new ocean tide models that will be developed in the next few years. For further details see Baker, T.F. and Bos, M.S. (2003). "Validating Earth and ocean tide models using tidal gravity measurements", Geophysical Journal International, 152.

  14. Computerized Classification Testing under the Generalized Graded Unfolding Model

    Science.gov (United States)

    Wang, Wen-Chung; Liu, Chen-Wei

    2011-01-01

    The generalized graded unfolding model (GGUM) has been recently developed to describe item responses to Likert items (agree-disagree) in attitude measurement. In this study, the authors (a) developed two item selection methods in computerized classification testing under the GGUM, the current estimate/ability confidence interval method and the cut…

  15. Toward a pragmatic migraine model for drug testing

    DEFF Research Database (Denmark)

    Hansen, Emma Katrine; Guo, Song; Ashina, Messoud

    2016-01-01

    BACKGROUND: A model for the testing of novel antimigraine drugs should ideally use healthy volunteers for ease of recruiting. Cilostazol provokes headache in healthy volunteers with some migraine features such as pulsating pain quality and aggravation by physical activity. Therefore, this headache...

  16. Precision tests of quantum chromodynamics and the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Brodsky, S.J.; Lu, H.J.

    1995-06-01

    The authors discuss three topics relevant to testing the Standard Model to high precision: commensurate scale relations, which relate observables to each other in perturbation theory without renormalization scale or scheme ambiguity, the relationship of compositeness to anomalous moments, and new methods for measuring the anomalous magnetic and quadrupole moments of the W and Z.

  17. A test of Berggren's model of dental fear and anxiety

    NARCIS (Netherlands)

    de Jongh, A.; Schutjes, M.; Aartman, I.H.A.

    2011-01-01

    Berggren’s (1984) model of dental fear and anxiety predicts that dentally anxious individuals postpone treatment, leading to a deteriorating dental state and subsequently to fear of negative evaluations in relation to their oral condition. The present study aimed to test one of the core assumptions

  18. Predetermination of natural illumination by the model testing method.

    Science.gov (United States)

    PENA, WILLIAM A.

    New educational specifications have caused architects to use new forms, with their resulting natural lighting problems. The problem can be engineered with the use of models: prediction of lighting performance in a building can be made early in planning. This method provides for the testing of a variety of trial schemes economically and rapidly.

  19. Interactive comparison of hypothesis tests for statistical model checking

    NARCIS (Netherlands)

    de Boer, Pieter-Tjerk; Reijsbergen, D.P.; Scheinhardt, Willem R.W.

    2015-01-01

    We present a web-based interactive comparison of hypothesis tests as are used in statistical model checking, providing users and tool developers with more insight into their characteristics. Parameters can be modified easily and their influence is visualized in real time; an integrated simulation
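
    One of the hypothesis tests commonly compared in this setting is Wald's sequential probability ratio test (SPRT); a minimal sketch, with a deterministic stand-in for the simulator (the 90%-satisfying run pattern and the thresholds are assumptions for illustration):

    ```python
    import math
    from itertools import cycle

    def sprt(sample, p0, p1, alpha=0.05, beta=0.05, max_samples=100000):
        """Wald's SPRT for H0: p >= p0 versus H1: p <= p1 (with p1 < p0).

        `sample()` returns 1 when a simulation run satisfies the property.
        """
        log_a = math.log((1 - beta) / alpha)   # crossing above -> accept H1
        log_b = math.log(beta / (1 - alpha))   # crossing below -> accept H0
        llr = 0.0                              # log likelihood ratio of H1 vs H0
        for _ in range(max_samples):
            if sample():
                llr += math.log(p1 / p0)
            else:
                llr += math.log((1 - p1) / (1 - p0))
            if llr >= log_a:
                return "H1"
            if llr <= log_b:
                return "H0"
        return "undecided"

    # deterministic stand-in: runs satisfy the property 9 times out of 10
    runs = cycle([1] * 9 + [0])
    verdict = sprt(lambda: next(runs), p0=0.8, p1=0.7)
    ```

    Since the observed satisfaction rate (0.9) exceeds p0, the test accepts H0 after a few dozen runs; the number of runs needed, not just the error rates, is what such comparisons visualize.
    
    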

  20. Impression Formation and Modifiability: Testing a Theoretical Model

    Science.gov (United States)

    Mrug, Sylvie; Hoza, Betsy

    2007-01-01

    This study proposed and tested a developmental model of impression formation based on observed behavior, prior expectancies, and additional incongruent information. Participants were 51 kindergartners, 53 second graders, and 104 college students who provided trait and liking judgments after watching a child actor engage in behaviors from three…

  1. Project Physics Tests 5, Models of the Atom.

    Science.gov (United States)

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    Test items relating to Project Physics Unit 5 are presented in this booklet. Included are 70 multiple-choice and 23 problem-and-essay questions. Concepts of atomic model are examined on aspects of relativistic corrections, electron emission, photoelectric effects, Compton effect, quantum theories, electrolysis experiments, atomic number and mass,…

  2. Testing exact rational expectations in cointegrated vector autoregressive models

    DEFF Research Database (Denmark)

    Johansen, Søren; Swensen, Anders Rygh

    1999-01-01

    This paper considers the testing of restrictions implied by rational expectations hypotheses in a cointegrated vector autoregressive model for I(1) variables. If the rational expectations involve one-step-ahead observations only and the coefficients are known, an explicit parameterization...

  3. Transfer as a two-way process: testing a model

    NARCIS (Netherlands)

    Vermeulen, R.; Admiraal, W.

    2009-01-01

    Purpose - The purpose of this exploratory research is to test the model of training transfer as a two-way process. Design/methodology/approach - Based on self-report data gathered from 58 to 44 respondents in a field experiment, it is argued that there is not just learning in the context of training

  4. Investigating the characteristics of shutoff valves by model tests

    Energy Technology Data Exchange (ETDEWEB)

    Strohmer, F.

    1977-07-01

    High pressures, strict safety requirements, minimum wear and a decrease of head losses are nowadays the most essential criteria in the design and manufacture of shutoff valves for water powerplants. In the following, the results of such model tests carried out in the hydraulic laboratory of Voeest Alpine AG are described.

  5. A New Approach for Testing the Rasch Model

    Science.gov (United States)

    Kubinger, Klaus D.; Rasch, Dieter; Yanagida, Takuya

    2011-01-01

    Though calibration of an achievement test within psychological and educational context is very often carried out by the Rasch model, data sampling is hardly designed according to statistical foundations. However, Kubinger, Rasch, and Yanagida (2009) recently suggested an approach for the determination of sample size according to a given Type I and…
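
    A minimal sketch of the Rasch model's item response function and of simulating calibration data, the kind of simulation that sample-size planning for such tests builds on (the ability and difficulty values below are arbitrary):

    ```python
    import numpy as np

    def rasch_prob(theta, beta):
        """Rasch model: P(X=1 | theta, beta) = 1 / (1 + exp(-(theta - beta)))."""
        return 1.0 / (1.0 + np.exp(-(theta - beta)))

    def simulate_responses(thetas, betas, rng):
        """Simulate a persons-by-items 0/1 response matrix under the Rasch model."""
        p = rasch_prob(thetas[:, None], betas[None, :])
        return (rng.random(p.shape) < p).astype(int)

    rng = np.random.default_rng(42)
    thetas = rng.normal(0.0, 1.0, size=500)          # person abilities
    betas = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])    # item difficulties
    data = simulate_responses(thetas, betas, rng)
    prop = data.mean(axis=0)  # easier items should be solved more often
    ```

    Repeating such simulations at several sample sizes and counting rejections of a model test gives the empirical Type I/Type II error rates that the proposed approach trades off.
    
    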

  6. Centrifugal Model Tests on Railway Embankments of Expansive Soils

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Based on the centrifugal model tests on railway embankments of expansive soil in the Nanning-Kunming railway, the author studied several embankments under different physical conditions. The stress and strain states and settlement of the embankments were analyzed, and the obtained results can be used as a reference for field construction.

  7. Modelling, Construction, and Testing of a Simple HTS Machine Demonstrator

    DEFF Research Database (Denmark)

    Jensen, Bogi Bech; Abrahamsen, Asger Bech

    2011-01-01

    This paper describes the construction, modeling and experimental testing of a high temperature superconducting (HTS) machine prototype employing second generation (2G) coated conductors in the field winding. The prototype is constructed in a simple way, with the purpose of having an inexpensive w...

  8. Optical Test of Local Hidden-Variable Model

    Institute of Scientific and Technical Information of China (English)

    WU XiaoHua; ZONG HongShi; PANG HouRong

    2001-01-01

    An inequality is deduced from local realism and a supplementary assumption. This inequality defines an experiment that can actually be performed with present technology to test local hidden-variable models, and it is violated by quantum mechanics by a factor of 1.92, while it can be simplified into a form where just two measurements are required.

  9. A semianalytic model for the productivity testing of multiple wells

    NARCIS (Netherlands)

    Fokker, P.A.; Verga, F.

    2008-01-01

    We present a semianalytic method for modeling the productivity testing of vertical, horizontal, slanted, or multilateral wells. The method is applicable to both oil and gas reservoirs and automatically accounts for well interference. The use of analytic expressions ensures that short-time transient

  10. JTorX: Exploring Model-Based Testing

    NARCIS (Netherlands)

    Belinfante, Axel

    2014-01-01

    The overall goal of the work described in this thesis is: ``To design a flexible tool for state-of-the-art model-based derivation and automatic application of black-box tests for reactive systems, usable both for education and outside an academic context.'' From this goal, we derive functional and

  11. Façade fire tests – measurements and modeling

    Directory of Open Access Journals (Sweden)

    Anderson Johan

    2013-11-01

    Full Text Available In two recent papers [1, 2] the fire dynamics in a test rig for façade constructions according to the test method SP Brand 105 [3, 4] were investigated both experimentally and numerically. The experimental setup simulates a three-story apartment building (height 6.7 m, width 4 m and depth 1.6 m), with external wall-cladding and a “room fire” at the base. The numerical model was developed in the CFD program Fire Dynamics Simulator (FDS) [5] with analogous geometry and instrumentation. The general features of the fire test were well reproduced in the numerical model; however, temperatures close to the fire source could not be properly accounted for in the model. In this paper the bi-directional probe measurements are elaborated on and the test used in Ref. [1] is revisited using different heat release rates in the numerical model. The velocity of the hot gases along the façade was well reproduced by the simulations although some deviations were found.

  12. Testing and modeling of rockfill materials:A review

    Institute of Scientific and Technical Information of China (English)

    Yang Xiao; Hong Liu; Wengang Zhang; Hanlong Liu; Feng Yin; Youyu Wang

    2016-01-01

    The research and development of rockfill materials (RFM) was investigated through a series of large-scale triaxial tests. It is observed that confining pressure and particle breakage play important roles in the mechanical properties, dilatancy relation and constitutive modeling of RFM. In addition, it is observed that the conventional dilatancy relation and constitutive models are not suitable for RFM due to its complex mechanical behavior. Hence, a unified constitutive model of RFM, accounting for state-dependent and particle breakage behavior, needs to be proposed.

  13. Cognitive mechanisms of mindfulness: A test of current models.

    Science.gov (United States)

    Isbel, Ben; Mahar, Doug

    2015-12-15

    Existing models of mindfulness describe the self-regulation of attention as primary, leading to enhanced decentering and ability to access and override automatic cognitive processes. This study compared 23 experienced meditators and 21 non-meditators on tests of mindfulness, attention, decentering, and ability to override automatic cognitive processes, to test the cognitive mechanisms proposed to underlie mindfulness practice. Experienced meditators had significantly higher mindfulness and decentering than non-meditators. No significant difference between groups was found on measures of attention or ability to override automatic processes. These findings support the prediction that mindfulness leads to enhanced decentering, but do not support the cognitive mechanisms proposed to underlie such enhancement. Since mindfulness practice primarily involves internally directed attention, it may be the case that cognitive tests requiring externally directed attention and timed responses do not accurately assess mindfulness-induced cognitive changes. Implications for models of mindfulness and future research are discussed.

  14. Modeling and simulation of ultrasonic testing on miniature wheelset

    Science.gov (United States)

    Makino, Kazunari; Biwa, Shiro; Sakamoto, Hiroshi

    2012-05-01

    An ultrasonic test was carried out for a fatigue crack with a depth of 3.5 mm which was developed on the surface of the wheel seat of a miniature wheelset test piece by a rotating bending fatigue test. The decrease in flaw-echo height with the wheel mounted, relative to the case without the wheel, was 13.1 dB when using a 2 MHz probe. We applied a model referred to as the "acoustic impedance adjustment model" to the axle-wheel interface and performed a finite element analysis of the ultrasound propagation. The calculation result showed that the decrease was 10.6 dB, which differed slightly from the experimental result. We discussed the difference between these results.

  15. Material model calibration through indentation test and stochastic inverse analysis

    CERN Document Server

    Buljak, Vladimir

    2015-01-01

    The indentation test is used with growing popularity for the characterization of various materials on different scales. Developed methods combine the test with computer simulation and inverse analyses to assess material parameters entering into constitutive models. The outputs of such procedures are expressed as evaluations of the sought parameters in a deterministic sense, while for engineering practice it is desirable to also assess the uncertainty which affects the final estimates resulting from various sources of errors within the identification procedure. In this paper an experimental-numerical method is presented, centered on inverse analysis built upon data collected from the indentation test in the form of a force-penetration relationship (the so-called indentation curve). Recursive simulations are made computationally economical by an a priori model reduction procedure. The resulting inverse problem is solved in a stochastic context using Monte Carlo simulations and a non-sequential Extended Kalman filter. Obtained re...

  16. A Coupled THMC model of FEBEX mock-up test

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Liange; Samper, Javier

    2008-09-15

    FEBEX (Full-scale Engineered Barrier EXperiment) is a demonstration and research project for the engineered barrier system (EBS) of a radioactive waste repository in granite. It includes two full-scale heating and hydration tests: the in situ test performed at Grimsel (Switzerland) and a mock-up test operating at CIEMAT facilities in Madrid (Spain). The mock-up test provides valuable insight into the thermal, hydrodynamic, mechanical and chemical (THMC) behavior of the EBS because its hydration is controlled better than that of the in situ test, in which the buffer is saturated with water from the surrounding granitic rock. Here we present a coupled THMC model of the mock-up test which accounts for thermal and chemical osmosis and bentonite swelling with a state-surface approach. The THMC model reproduces measured temperature and cumulative water inflow data. It also fits relative humidity data at the outer part of the buffer, but underestimates relative humidities near the heater. Dilution due to hydration and evaporation near the heater are the main processes controlling the concentration of conservative species, while surface complexation, mineral dissolution/precipitation and cation exchange also significantly affect reactive species. Results of sensitivity analyses to chemical processes show that pH is mostly controlled by surface complexation while dissolved cation concentrations are controlled by cation exchange reactions.

  17. Performance verification tests of JT-60SA CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Obana, Tetsuhiro, E-mail: obana.tetsuhiro@LHD.nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Murakami, Haruyuki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan)

    2015-11-15

    Highlights: • The performance of the JT-60SA CS model coil was verified. • The CS model coil comprised a quad-pancake wound with a Nb3Sn CIC conductor. • The CS model coil met the design requirements. - Abstract: As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb3Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of −0.62% for the Nb3Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  18. Parametric Thermal Models of the Transient Reactor Test Facility (TREAT)

    Energy Technology Data Exchange (ETDEWEB)

    Bradley K. Heath

    2014-03-01

    This work supports the restart of transient testing in the United States using the Department of Energy’s Transient Reactor Test Facility at the Idaho National Laboratory. It also supports the Global Threat Reduction Initiative by reducing the proliferation risk of high enriched uranium fuel. The work involves the creation of a nuclear fuel assembly model using the fuel performance code known as BISON. The model simulates the thermal behavior of a nuclear fuel assembly during steady state and transient operational modes. Additional models of the same geometry but differing material properties are created to perform parametric studies. The results show that fuel and cladding thermal conductivity have the greatest effect on fuel temperature under the steady state operational mode. Fuel density and fuel specific heat have the greatest effect for the transient operational mode. When considering a new fuel type it is recommended to use materials that decrease the specific heat of the fuel and the thermal conductivity of the fuel’s cladding in order to deal with the higher density fuels that accompany the LEU conversion process. Data on the latest operating conditions of TREAT need to be obtained in order to validate BISON’s results. BISON’s models for TREAT (material models, boundary convection models) are modest and need additional work to ensure accuracy and confidence in results.

  19. Tests of scanning model observers for myocardial SPECT imaging

    Science.gov (United States)

    Gifford, H. C.; Pretorius, P. H.; Brankov, J. G.

    2009-02-01

    Many researchers have tested and applied human-model observers as part of their evaluations of reconstruction methods for SPECT perfusion imaging. However, these model observers have generally been limited to signal-known-exactly (SKE) detection tasks. Our objective is to formulate and test scanning model observers that emulate humans in detection-localization tasks involving perfusion defects. Herein, we compare several models based on the channelized nonprewhitening (CNPW) observer. Simulated Tc-99m images of the heart with and without defects were created using a mathematical anthropomorphic phantom. Reconstructions were performed with an iterative algorithm and postsmoothed with a 3D Gaussian filter. Human and model-observer studies were conducted to assess the optimal number of iterations and the smoothing level of the filter. The human-observer study was a multiple-alternative forced-choice (MAFC) study with five defects. The CNPW observer performed the MAFC study, but also performed an SKE-but-variable (SKEV) study and a localization ROC (LROC) study. A separate LROC study applied an observer based on models of human search in mammograms. The amount of prior knowledge about the possible defects differed for these four model-observer studies. The trend was towards improved agreement with the human observers as prior knowledge decreased.
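
    The CNPW test statistic reduces to correlating the channelized expected signal with the channelized image. A sketch assuming simple radial Gaussian channels and a Gaussian "defect" profile (common illustrative choices, not necessarily the authors' channels or phantom):

    ```python
    import numpy as np

    def make_channels(size, widths):
        """Radially symmetric Gaussian channels, one per width (a simple choice)."""
        y, x = np.mgrid[:size, :size] - size // 2
        r2 = x**2 + y**2
        U = np.stack([np.exp(-r2 / (2.0 * w**2)).ravel() for w in widths], axis=1)
        return U / np.linalg.norm(U, axis=0)

    def cnpw_statistic(image, signal, U):
        """CNPW: correlate the channelized expected signal with the channelized data."""
        return float((U.T @ signal.ravel()) @ (U.T @ image.ravel()))

    size = 32
    y, x = np.mgrid[:size, :size] - size // 2
    signal = np.exp(-(x**2 + y**2) / (2.0 * 3.0**2))  # Gaussian "defect" profile
    U = make_channels(size, [1.0, 2.0, 4.0, 8.0])

    rng = np.random.default_rng(0)
    noise = rng.standard_normal((size, size))
    t_absent = cnpw_statistic(noise, signal, U)
    t_present = cnpw_statistic(noise + signal, signal, U)
    ```

    A scanning observer of the kind the paper studies evaluates this statistic at every candidate defect location and reports the maximizing location, which is what enables the detection-localization (LROC) tasks.
    
    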

  20. Cyber-Physical Energy Systems Modeling, Test Specification, and Co-Simulation Based Testing

    DEFF Research Database (Denmark)

    van der Meer, A. A.; Palensky, P.; Heussen, Kai

    2017-01-01

    The gradual deployment of intelligent and coordinated devices in the electrical power system needs careful investigation of the interactions between the various domains involved. Especially due to the coupling between ICT and power systems, a holistic approach for testing and validating is required. Taking existing (quasi-)standardised smart grid system and test specification methods as a starting point, we are developing a holistic testing and validation approach that allows a very flexible way of assessing the system level aspects by various types of experiments (including virtual, real...). The presented method addresses most modeling and specification challenges in cyber-physical energy systems and is extensible for future additions such as uncertainty quantification.

  1. Testing for gene-gene interaction with AMMI models.

    Science.gov (United States)

    Barhdadi, Amina; Dubé, Marie-Pierre

    2010-01-01

    Studies have shown that many common diseases are influenced by multiple genes and their interactions. There is currently a strong interest in testing for association between combinations of these genes and disease, in particular because genes that affect the risk of disease only in the presence of another genetic variant may not be detected in marginal analysis. In this paper we propose the use of additive main effect and multiplicative interaction (AMMI) models to detect and to quantify gene-gene interaction effects for a quantitative trait. The objective of the present research is to demonstrate the practical advantages of these models for describing complex interaction between two unlinked loci. Although gene-gene interactions have often been defined as a deviation from additive genetic effects, the residual term has generally not been appropriately treated. The AMMI models allow for the analysis of a two-way factorial data structure and combine the analysis of variance of the two main genotype effects with a principal component analysis of the residual multiplicative interaction. The AMMI models for gene-gene interaction presented here allow for the testing of non-additivity between the two loci, and also describe how their interaction structure fits the existing non-additivity. Moreover, these models can be used to identify the specific genotype combinations that contribute to a significant gene-gene interaction. We describe the use of the biplot to display the structure of the interaction and evaluate the performance of the AMMI models, and of the special cases previously described by Tukey and Mandel, with simulated data sets. Our simulation study showed that the AMMI model is as powerful as general linear models when the interaction is not modeled in the presence of marginal effects. However, in the presence of pure epistasis, i.e. in the absence of marginal effects, the AMMI method was not found to be superior to the other tested regression methods.
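
    The AMMI decomposition itself is compact: remove the additive main effects of the two loci from a genotype-by-genotype table of trait means, then take an SVD of the residual interaction. A sketch on made-up 3x3 tables:

    ```python
    import numpy as np

    def ammi_decompose(Y):
        """AMMI: additive main effects plus an SVD of the interaction residuals.

        Y[i, j] is the mean trait value for genotype i at locus A and
        genotype j at locus B.
        """
        mu = Y.mean()
        a = Y.mean(axis=1) - mu                    # locus-A main effects
        b = Y.mean(axis=0) - mu                    # locus-B main effects
        resid = Y - mu - a[:, None] - b[None, :]   # non-additive (interaction) part
        s = np.linalg.svd(resid, compute_uv=False) # singular values = interaction strength
        return mu, a, b, s

    # a purely additive 3x3 genotype table: interaction singular values vanish
    add = np.add.outer([0.0, 1.0, 2.0], [0.0, 0.5, 1.0])
    *_, s_add = ammi_decompose(add)

    # perturb one genotype combination to mimic epistasis
    epi = add.copy()
    epi[2, 2] += 1.5
    *_, s_epi = ammi_decompose(epi)
    ```

    The left/right singular vectors (omitted here) are what the biplot displays, and they point to the genotype combinations driving the interaction.
    
    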

  2. Development, testing, and numerical modeling of a foam sandwich biocomposite

    Science.gov (United States)

    Chachra, Ricky

    This study develops a novel sandwich composite material from plant-based materials for potential use in nonstructural building applications. The face sheets comprise woven hemp fabric and a sap-based epoxy, while the core comprises a castor-oil-based foam with waste rice hulls as reinforcement. Mechanical properties of the individual materials are tested in uniaxial compression and tension for the foam and hemp, respectively. The sandwich composite is tested in three-point bending. Flexural results are compared to a finite element model developed in the commercial software Abaqus, and the validated model is then used to investigate alternative sandwich geometries. Sandwich model responses are compared to existing standards for nonstructural building panels, showing that the novel material is roughly half the strength of equally thick drywall. When space limitations are not an issue, a double-thickness sandwich biocomposite is found to be a structurally acceptable replacement for standard gypsum drywall.

  3. Simple Numerical Model to Simulate Penetration Testing in Unsaturated Soils

    Directory of Open Access Journals (Sweden)

    Jarast S. Pegah

    2016-01-01

    Full Text Available Cone penetration test in unsaturated sand is modelled numerically using the Finite Element Method. A simple elastic-perfectly plastic Mohr-Coulomb constitutive model is modified with an apparent cohesion to incorporate the effect of suction on cone resistance. The Arbitrary Lagrangian-Eulerian (ALE) remeshing algorithm is also implemented to avoid the mesh distortion problem caused by the large deformation in the soil around the cone tip. The simulated models indicate that cone resistance increases consistently under higher suction or lower degree of saturation. A sensitivity analysis investigating the effect of input soil parameters on cone tip resistance shows that the unsaturated soil condition can be adequately modelled by incorporating the apparent cohesion concept. However, updating the soil stiffness by including a suction-dependent effective stress formula in the Mohr-Coulomb material model does not influence the cone resistance significantly.
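The apparent-cohesion idea can be sketched with a Bishop-type effective-stress term; the formula and all parameter values below are common textbook assumptions, not necessarily the exact formulation used in the paper:

```python
import math

def apparent_cohesion(c_eff, suction_kpa, chi, phi_deg):
    """Apparent (total) cohesion of an unsaturated soil in a Mohr-Coulomb
    model, using a Bishop-type effective-stress contribution:
        c_app = c' + chi * s * tan(phi')
    where chi is the effective-stress parameter (often taken roughly equal
    to the degree of saturation) and s is the matric suction."""
    return c_eff + chi * suction_kpa * math.tan(math.radians(phi_deg))

# Higher suction -> higher apparent cohesion -> higher simulated cone resistance
for s in (0.0, 5.0, 20.0):
    print(round(apparent_cohesion(0.0, s, chi=0.6, phi_deg=33.0), 2))
```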

  4. Model Driven Mutation Applied to Adaptative Systems Testing

    CERN Document Server

    Bartel, Alexandre; Munoz, Freddy; Klein, Jacques; Mouelhi, Tejeddine; Traon, Yves Le

    2012-01-01

    Dynamically Adaptive Systems modify their behavior and structure in response to changes in their surrounding environment and according to an adaptation logic. Critical systems increasingly incorporate dynamic adaptation capabilities; examples include disaster relief and space exploration systems. In this paper, we focus on mutation testing of the adaptation logic. We propose a fault model for adaptation logics that classifies faults into environmental completeness and adaptation correctness. Since there are several adaptation logic languages relying on the same underlying concepts, the fault model is expressed independently from specific adaptation languages. Taking benefit from model-driven engineering technology, we express these common concepts in a metamodel and define the operational semantics of mutation operators at this level. Mutation is applied on model elements and model transformations are used to propagate these changes to a given adaptation policy in the chosen formalism. Preliminary resul...

  5. Vertimill™ pilot scale tests simulated by perfect mixing model

    Directory of Open Access Journals (Sweden)

    Douglas Batista Mazzinghy

    2014-07-01

    Full Text Available The Minas-Rio Project, an Anglo American property located in Brazil, uses Vertimill™ grinding to make the particle size distribution adequate for feeding the slurry pipeline. A pilot test campaign was carried out at Metso's pilot plant facility in York, Pennsylvania, USA, to provide information for scaling up the industrial grinding circuit. The perfect mixing model, normally used to simulate ball mills, was used to compare the direct and reverse circuit configurations. The simulations were based on the appearance function determined from laboratory tests using a batch tube mill. The combined breakage rate/discharge rate function (r/d) was determined from Vertimill™ feed and product particle size distributions obtained from the pilot tests. The residence time was estimated from the mill hold-up and solids flow rate. The simulation results show that there are no significant differences between direct and reverse circuits for the sample tested.
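The perfect mixing model mentioned above can be sketched, under simplifying assumptions (three size classes, all broken material leaving its class, a made-up appearance matrix), as a mass balance solved sequentially from the coarsest class down:

```python
import numpy as np

def perfect_mixing_product(feed, rd, a):
    """Simplified Whiten perfect mixing mill model (a sketch).

    feed : feed mass flow per size class, coarsest first
    rd   : combined breakage-rate/discharge-rate function (r/d) per class
    a    : lower-triangular appearance matrix, a[i, j] = fraction of
           material broken out of class j that reports to class i (i > j)

    Mass balance for class i:
        feed[i] - p[i] + sum_{j<i} a[i, j] * rd[j] * p[j] - rd[i] * p[i] = 0
    """
    n = len(feed)
    p = np.zeros(n)
    for i in range(n):
        broken_in = sum(a[i, j] * rd[j] * p[j] for j in range(i))
        p[i] = (feed[i] + broken_in) / (1.0 + rd[i])
    return p

feed = np.array([40.0, 35.0, 25.0])   # t/h in three size classes
rd = np.array([2.0, 1.0, 0.0])        # finest class is not broken further
a = np.array([[0.0, 0.0, 0.0],
              [0.6, 0.0, 0.0],        # 60% of broken coarse -> middle class
              [0.4, 1.0, 0.0]])       # the rest -> fines
p = perfect_mixing_product(feed, rd, a)
```

Because each column of `a` sums to one for the classes that are broken, total mass is conserved between feed and product.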

  6. Review of the ATLAS B0 model coil test program

    CERN Document Server

    Dolgetta, N; Acerbi, E; Berriaud, C; Boxman, H; Broggi, F; Cataneo, F; Daël, A; Delruelle, N; Dudarev, A; Foussat, A; Haug, F; ten Kate, H H J; Mayri, C; Paccalini, A; Pengo, R; Rivoltella, G; Sbrissa, E

    2004-01-01

    The ATLAS B0 model coil has been extensively tested, reproducing the operational conditions of the final ATLAS Barrel Toroid (BT) coils. Two test campaigns have taken place on B0 at the CERN facility where the individual BT coils are about to be tested. The first campaign aimed to test the cool-down and warm-up phases and to commission the coil up to its nominal current of 20.5 kA, reproducing Lorentz forces similar to those on the BT coil. The second campaign aimed to evaluate the margins above nominal conditions. The B0 was tested up to 24 kA, and specific tests were performed to assess: the coil temperature margin with respect to the design value, the performance of the double-pancake internal joints, static and dynamic heat loads, and the behavior of the coil under quench conditions. The paper reviews the overall test program with emphasis on second-campaign results not covered before.

  7. Model year 2010 Ford Fusion Level-1 testing report.

    Energy Technology Data Exchange (ETDEWEB)

    Rask, E.; Bocci, D.; Duoba, M.; Lohse-Busch, H.; Energy Systems

    2010-11-23

    As a part of the US Department of Energy's Advanced Vehicle Testing Activity (AVTA), a model year 2010 Ford Fusion was procured by eTec (Phoenix, AZ) and sent to ANL's Advanced Powertrain Research Facility for the purposes of vehicle-level testing in support of the Advanced Vehicle Testing Activity. Data was acquired during testing using non-intrusive sensors, vehicle network information, and facilities equipment (emissions and dynamometer). Standard drive cycles, performance cycles, steady-state cycles, and A/C usage cycles were conducted. Much of this data is openly available for download in ANL's Downloadable Dynamometer Database. The major results are shown in this report. Given the benchmark nature of this assessment, the majority of the testing was done over standard regulatory cycles and sought to obtain a general overview of how the vehicle performs. These cycles include the US FTP cycle (Urban) and Highway Fuel Economy Test cycle as well as the US06, a more aggressive supplemental regulatory cycle. Data collection for this testing was kept at a fairly high level and includes emissions and fuel measurements from an exhaust emissions bench, high-voltage and accessory current/voltage from a DC power analyzer, and CAN bus data such as engine speed, engine load, and electric machine operation. The following sections will seek to explain some of the basic operating characteristics of the MY2010 Fusion and provide insight into unique features of its operation and design.

  8. An approach to model based testing of multiagent systems.

    Science.gov (United States)

    Ur Rehman, Shafiq; Nadeem, Aamer

    2015-01-01

    Autonomous agents perform on behalf of the user to achieve defined goals or objectives. They are situated in a dynamic environment and are able to operate autonomously to achieve their goals. In a multiagent system, agents cooperate with each other to achieve a common goal. Testing of multiagent systems is a challenging task due to the autonomous and proactive behavior of agents. However, testing is required to build confidence in the working of a multiagent system. The Prometheus methodology is a commonly used approach to designing multiagent systems. Systematic and thorough testing of each interaction is necessary. This paper proposes a novel approach to testing of multiagent systems based on Prometheus design artifacts. In the proposed approach, different interactions between the agent and actors are considered to test the multiagent system. These interactions include percepts and actions, along with messages between the agents, which can be modeled in a protocol diagram. The protocol diagram is converted into a protocol graph, on which different coverage criteria are applied to generate test paths that cover interactions between the agents. A prototype tool has been developed to generate test paths from the protocol graph according to the specified coverage criterion.

  9. An Approach to Model Based Testing of Multiagent Systems

    Directory of Open Access Journals (Sweden)

    Shafiq Ur Rehman

    2015-01-01

    Full Text Available Autonomous agents perform on behalf of the user to achieve defined goals or objectives. They are situated in a dynamic environment and are able to operate autonomously to achieve their goals. In a multiagent system, agents cooperate with each other to achieve a common goal. Testing of multiagent systems is a challenging task due to the autonomous and proactive behavior of agents. However, testing is required to build confidence in the working of a multiagent system. The Prometheus methodology is a commonly used approach to designing multiagent systems. Systematic and thorough testing of each interaction is necessary. This paper proposes a novel approach to testing of multiagent systems based on Prometheus design artifacts. In the proposed approach, different interactions between the agent and actors are considered to test the multiagent system. These interactions include percepts and actions, along with messages between the agents, which can be modeled in a protocol diagram. The protocol diagram is converted into a protocol graph, on which different coverage criteria are applied to generate test paths that cover interactions between the agents. A prototype tool has been developed to generate test paths from the protocol graph according to the specified coverage criterion.
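The step from protocol graph to test paths can be illustrated with a greedy all-edges coverage sketch; the graph, node names, and strategy below are illustrative assumptions, not the paper's prototype tool:

```python
from collections import defaultdict

def edge_coverage_paths(edges, start, end):
    """Generate test paths through a protocol graph so that every edge
    (interaction) is covered at least once (all-edges criterion).
    Greedy sketch: for each uncovered edge (u, v), take a shortest path
    start -> u, the edge itself, then a shortest path v -> end."""
    graph = defaultdict(list)
    for u, v in edges:
        graph[u].append(v)

    def shortest(a, b):                       # BFS shortest path a -> b
        frontier, seen = [[a]], {a}
        while frontier:
            path = frontier.pop(0)
            if path[-1] == b:
                return path
            for nxt in graph[path[-1]]:
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(path + [nxt])
        return None

    paths, done = [], set()
    for u, v in edges:
        if (u, v) in done:
            continue
        head, tail = shortest(start, u), shortest(v, end)
        if head is None or tail is None:
            continue                          # edge unreachable in this sketch
        path = head + tail
        paths.append(path)
        done.update(zip(path, path[1:]))
    return paths

# Hypothetical protocol graph of percepts, messages, and actions
edges = [("start", "percept"), ("percept", "msgA"), ("percept", "msgB"),
         ("msgA", "action"), ("msgB", "action"), ("action", "end")]
paths = edge_coverage_paths(edges, "start", "end")
```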

  10. Testing galaxy formation models with galaxy stellar mass functions

    CERN Document Server

    Lim, Seunghwan; Lan, Ting-Wen; Ménard, Brice

    2016-01-01

    We compare predictions of a number of empirical models and numerical simulations of galaxy formation to the conditional stellar mass functions (CSMF) of galaxies in groups of different masses obtained recently by Lan et al. to test how well different models accommodate the data. Among all the models considered, only the model of Lu et al. can match the observational data; all other models fail to reproduce the faint-end upturn seen in the observation. The CSMFs are used to update the halo-based empirical model of Lu et al., and the model parameters obtained are very similar to those inferred by Lu et al. from a completely different set of observational constraints. The observational data clearly prefer a model in which star formation in low-mass halos changes behavior at a characteristic redshift $z_c \\sim 2$. There is also tentative evidence that this characteristic redshift depends on environments, becoming $z_c \\sim 4$ in regions that eventually evolve into rich clusters of galaxies. The constrained model ...

  11. A Modeling Language Based on UML for Modeling Simulation Testing System of Avionic Software

    Institute of Scientific and Technical Information of China (English)

    WANG Lize; LIU Bin; LU Minyan

    2011-01-01

    With direct expression of individual application domain patterns and ideas, domain-specific modeling languages (DSMLs) are more and more frequently used to build models instead of a combination of one or more general constructs. Based on the profile mechanism of the unified modeling language (UML) 2.2, a DSML is presented to model simulation testing systems of avionic software (STSAS). To define the syntax, semantics and notations of the DSML, the domain model of the STSAS, from which we generalize the domain concepts and the relationships among these concepts, is given; the domain model is then mapped into a UML meta-model, named the UML-STSAS profile. Taking a flight control system (FCS) as the system under test (SUT), we design the relevant STSAS. The results indicate that extending UML to the simulation testing domain can effectively and precisely model STSAS.

  12. The Latent Class Model as a Measurement Model for Situational Judgment Tests

    Directory of Open Access Journals (Sweden)

    Frank Rijmen

    2011-11-01

    Full Text Available In a situational judgment test, it is often debatable what constitutes a correct answer to a situation. There is currently a multitude of scoring procedures. Establishing a measurement model can guide the selection of a scoring rule. It is argued that the latent class model is a good candidate for a measurement model. Two latent class models are applied to the Managing Emotions subtest of the Mayer, Salovey, Caruso Emotional Intelligence Test: a plain-vanilla latent class model, and a second-order latent class model that takes into account the clustering of several possible reactions within each hypothetical scenario of the situational judgment test. The results for both models indicated that there were three subgroups characterised by the degree to which differentiation occurred between possible reactions in terms of perceived effectiveness. Furthermore, the results for the second-order model indicated a moderate cluster effect.
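A plain-vanilla latent class model of the kind described can be fitted with a short EM loop; the binary-item formulation and the toy data below are assumptions for illustration, not the Managing Emotions data or the authors' estimation software:

```python
import numpy as np

def latent_class_em(X, n_classes=2, n_iter=200, seed=0):
    """EM for a plain-vanilla latent class model with binary items.
    X: (n_persons, n_items) 0/1 response matrix.
    Returns class weights, per-class item-endorsement probabilities,
    and posterior class memberships."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)          # class sizes
    theta = rng.uniform(0.25, 0.75, (n_classes, m))   # P(item = 1 | class)
    for _ in range(n_iter):
        # E-step: posterior P(class | responses); items independent given class
        log_lik = (X @ np.log(theta).T
                   + (1 - X) @ np.log(1 - theta).T + np.log(pi))
        log_lik -= log_lik.max(axis=1, keepdims=True)
        post = np.exp(log_lik)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: update class sizes and item probabilities
        pi = post.mean(axis=0)
        theta = np.clip((post.T @ X) / post.sum(axis=0)[:, None],
                        1e-6, 1 - 1e-6)
    return pi, theta, post

# Toy data: one subgroup endorses most items, the other endorses few,
# mimicking groups that differentiate between response options differently
rng = np.random.default_rng(1)
X = np.vstack([rng.binomial(1, 0.85, (60, 8)),
               rng.binomial(1, 0.20, (40, 8))])
pi, theta, post = latent_class_em(X)
```

A second-order extension, as in the paper, would add a clustering level over the reactions nested within each scenario; this sketch covers only the plain model.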

  13. Correlation Results for a Mass Loaded Vehicle Panel Test Article Finite Element Models and Modal Survey Tests

    Science.gov (United States)

    Maasha, Rumaasha; Towner, Robert L.

    2012-01-01

    High-fidelity Finite Element Models (FEMs) were developed to support a recent test program at Marshall Space Flight Center (MSFC). The FEMs correspond to test articles used for a series of acoustic tests. Modal survey tests were used to validate the FEMs for five acoustic tests (a bare panel and four different mass-loaded panel configurations). An additional modal survey test was performed on the empty test fixture (the orthogrid panel mounting fixture between the reverb and anechoic chambers). Modal survey testing and subsequent model correlation validated the natural frequencies and mode shapes of the FEMs used for acoustic test excitation. The modal survey test results provide a basis for the analysis models used for acoustic loading response test and analysis comparisons.

  14. Mouse models for pre-clinical drug testing in leukemia.

    Science.gov (United States)

    Bhatia, Sanil; Daschkey, Svenja; Lang, Franziska; Borkhardt, Arndt; Hauer, Julia

    2016-11-01

    The development of novel drugs which specifically target leukemic cells, with the overall aim to increase complete remission and to reduce toxicity and morbidity, is the most important prerequisite for modern leukemia treatment. In this regard, the current transition rate of potential novel drugs from bench to bedside is remarkably low. Although many novel drugs show promising data in vitro and in vivo, testing of these medications in clinical phase I trials is often sobering with intolerable toxic side effects leading to failure in FDA approval. Areas covered: In this review, the authors discuss the development of murine model generation in the context of targeted therapy development for the treatment of childhood leukemia, aiming to decrease the attrition rate of progressively complex targeted therapies ranging from small molecules to cell therapy. As more complex therapeutic approaches develop, more complex murine models are needed, to recapitulate closely the human phenotype. Expert opinion: Combining xenograft models for efficacy testing and GEMMs for toxicity testing will be a global approach for pre-clinical testing of complex therapeutics and will contribute to the clinical approval of novel compounds. Finally, this approach is likely to increase clinical approval of novel compounds.

  15. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.

    2014-11-10

    Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class of events (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field (‘model’) and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to results of a benchmark inversion exercise with known solution. We find the SPCT to be sensitive to different spatial correlation lengths and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking the slip models with respect to a reference model.

  16. A Method to Test Model Calibration Techniques: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, Ron; Polly, Ben; Neymark, Joel

    2016-09-01

    This paper describes a method for testing model calibration techniques. Calibration is commonly used in conjunction with energy retrofit audit models. An audit is conducted to gather information about the building needed to assemble an input file for a building energy modeling tool. A calibration technique is used to reconcile model predictions with utility data, and then the 'calibrated model' is used to predict energy savings from a variety of retrofit measures and combinations thereof. Current standards and guidelines such as BPI-2400 and ASHRAE-14 set criteria for 'goodness of fit' and assume that if the criteria are met, then the calibration technique is acceptable. While it is logical to use the actual performance data of the building to tune the model, it is not certain that a good fit will result in a model that better predicts post-retrofit energy savings. Therefore, the basic idea here is that the simulation program (intended for use with the calibration technique) is used to generate surrogate utility bill data and retrofit energy savings data against which the calibration technique can be tested. This provides three figures of merit for testing a calibration technique, 1) accuracy of the post-retrofit energy savings prediction, 2) closure on the 'true' input parameter values, and 3) goodness of fit to the utility bill data. The paper will also discuss the pros and cons of using this synthetic surrogate data approach versus trying to use real data sets of actual buildings.

  17. System model of a natural circulation integral test facility

    Science.gov (United States)

    Galvin, Mark R.

    The Department of Nuclear Engineering and Radiation Health Physics (NE/RHP) at Oregon State University (OSU) has been developing an innovative modular reactor plant concept since being initiated with a Department of Energy (DoE) grant in 1999. This concept, the Multi-Application Small Light Water Reactor (MASLWR), is an integral pressurized water reactor (PWR) plant that utilizes natural circulation flow in the primary and employs advanced passive safety features. The OSU MASLWR test facility is an electrically heated integral effects facility, scaled from the MASLWR concept design, that has been previously used to assess the feasibility of the concept design safety approach. To assist in evaluating operational scenarios, a simulation tool that models the test facility and is based on both test facility experimental data and analytical methods has been developed. The tool models both the test facility electric core and a simulated nuclear core, allowing evaluation of a broad spectrum of operational scenarios to identify those scenarios that should be explored experimentally using the test facility or design-quality multi-physics tools. Using the simulation tool, the total cost of experimentation and analysis can be reduced by directing time and resources towards the operational scenarios of interest.

  18. Modelling of LOCA Tests with the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L [Idaho National Laboratory; Pastore, Giovanni [Idaho National Laboratory; Novascone, Stephen Rhead [Idaho National Laboratory; Spencer, Benjamin Whiting [Idaho National Laboratory; Hales, Jason Dean [Idaho National Laboratory

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.

  19. Testing exclusion restrictions and additive separability in sample selection models

    DEFF Research Database (Denmark)

    Huber, Martin; Mellace, Giovanni

    2014-01-01

    Standard sample selection models with non-randomly censored outcomes assume (i) an exclusion restriction (i.e., a variable affecting selection, but not the outcome) and (ii) additive separability of the errors in the selection process. This paper proposes tests for the joint satisfaction of these assumptions by applying the approach of Huber and Mellace (Testing instrument validity for LATE identification based on inequality moment constraints, 2011) (for testing instrument validity under treatment endogeneity) to the sample selection framework. We show that the exclusion restriction and additive separability imply two testable inequality constraints that come from both point identifying and bounding the outcome distribution of the subpopulation that is always selected/observed. We apply the tests to two variables for which the exclusion restriction is frequently invoked in female wage regressions: non...

  20. A catastrophe theory model of the conflict helix, with tests.

    Science.gov (United States)

    Rummel, R J

    1987-10-01

    Macro social field theory has undergone extensive development and testing since the 1960s. One of these developments has been the articulation of an appropriate conceptual micro model, called the conflict helix, for understanding the process from conflict to cooperation and vice versa. Conflict and cooperation are viewed as distinct equilibria of forces in a social field; the movement between these equilibria is a jump, energized by a gap between social expectations and power, and triggered by some minor event. Quite independently, there has also been much recent application of catastrophe theory to social behavior, but usually without a clear substantive theory and lacking empirical testing. This paper uses catastrophe theory, namely the butterfly model, to structure the conflict helix mathematically. The social field framework and helix provide the substantive interpretation for the catastrophe theory, and catastrophe theory provides a suitable mathematical model for the conflict helix. The model is tested on the annual conflict and cooperation between India and Pakistan, 1948 to 1973. The results are generally positive and encouraging.
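The butterfly catastrophe has the canonical potential V(x) = x^6 + a x^4 + b x^3 + c x^2 + d x, whose equilibria are the real roots of dV/dx = 0. The sketch below (with invented parameter values) shows the bistability that makes a sudden jump between a conflict equilibrium and a cooperation equilibrium possible:

```python
import numpy as np

def butterfly_equilibria(a, b, c, d):
    """Equilibria of the butterfly catastrophe potential
        V(x) = x**6 + a*x**4 + b*x**3 + c*x**2 + d*x,
    i.e. the real roots of dV/dx = 6x^5 + 4a x^3 + 3b x^2 + 2c x + d.
    In the conflict-helix reading, x would be the state on the
    conflict-cooperation dimension and a..d the control factors
    (e.g. the expectations-power gap and trigger events)."""
    roots = np.roots([6.0, 0.0, 4.0 * a, 3.0 * b, 2.0 * c, d])
    real = roots[np.abs(roots.imag) < 1e-8].real
    return np.sort(real)

# With c < 0 the system is bistable: two attracting equilibria separated
# by a repelling one, so a small change in d can flip the state.
eq = butterfly_equilibria(a=0.0, b=0.0, c=-1.0, d=0.0)
```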

  1. A Simplified Analytical Modeling of the Hole Erosion Test

    Directory of Open Access Journals (Sweden)

    Mohammed Bezzazi

    2010-01-01

    Full Text Available Problem statement: Internal erosion occurs in soils containing fine particles under the action of high pressure gradients that can result from water discharge. In its final stage, this phenomenon can lead to the formation of piping, which constitutes a real threat to hydraulic infrastructures as it can precipitate their complete rupture in a very short time. To mitigate this insidious hazard, it is important to characterize piping dynamics. In this context, the Hole Erosion Test was introduced to assess the erosive features of soils by means of two parameters: the erosion rate and the critical shear stress indicating the beginning of erosion. Modeling this test helps in understanding the piping phenomenology more comprehensively. Approach: A simplified analytical modeling of the Hole Erosion Test was considered in this study. A closed-form solution of the erosion taking place during piping was derived without resorting to the cumbersome developments habitually needed to achieve a complete solution of the rational equations describing this highly coupled problem. This was achieved by assuming a formal analogy between the erosive shear stress and the friction shear that develops at a cylindrical piping wall under an axial viscous flow. The flow was assumed to be uniform along the tube. Results: A closed-form analytical formula describing the erosion dynamics associated with piping was derived. Theoretical predictions were compared with experimental results, and the simplified model was found to predict accurately the increase of flow rate that results from piping erosion. Conclusion/Recommendations: The one-dimensional modeling proposed for the Hole Erosion Test under strong simplifying assumptions was found to yield the same features as those obtained in the literature using other approaches. It furthermore gives the dynamics as a function of the fluid regime existing inside the tube. In order to get further insight
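The piping dynamics described can be sketched with the widely used linear erosion law; the shear-stress relation, erosion law, and parameter values below are standard assumptions for illustration, not the paper's exact closed-form solution:

```python
def hole_erosion(radius0, dp, length, tau_c, k_er, rho_d, dt=1.0, t_end=600.0):
    """Simplified Hole Erosion Test dynamics (explicit Euler sketch):
        wall shear stress:  tau = dp * R / (2 * L)   (uniform axial flow)
        erosion law:        dR/dt = (k_er / rho_d) * (tau - tau_c), tau > tau_c
    Hole enlargement accelerates because tau itself grows with R."""
    r, t, history = radius0, 0.0, []
    while t < t_end:
        history.append((t, r))
        tau = dp * r / (2.0 * length)          # wall shear stress [Pa]
        if tau > tau_c:                        # erode only above the threshold
            r += (k_er / rho_d) * (tau - tau_c) * dt
        t += dt
    return history

# Hypothetical values: 3 mm hole radius, 150 mm long soil specimen,
# 5 kPa driving pressure drop, clay-like erosion parameters
hist = hole_erosion(radius0=0.003, dp=5000.0, length=0.15,
                    tau_c=10.0, k_er=1e-4, rho_d=1800.0)
```

Because the radius feeds back into the shear stress, the radius growth is roughly exponential once the critical shear stress is exceeded, which is the accelerating flow-rate increase the test measures.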

  2. Model calibration and validation of an impact test simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, F. M. (François M.); Wilson, A. C. (Amanda C.); Havrilla, G. N. (George N.)

    2001-01-01

    This paper illustrates the methodology being developed at Los Alamos National Laboratory for the validation of numerical simulations for engineering structural dynamics. The application involves the transmission of a shock wave through an assembly that consists of a steel cylinder and a layer of elastomeric (hyper-foam) material. The assembly is mounted on an impact table to generate the shock wave. The input acceleration and three output accelerations are measured. The main objective of the experiment is to develop a finite element representation of the system capable of reproducing the test data with acceptable accuracy. Foam layers of various thicknesses and several drop heights are considered during impact testing. Each experiment is replicated several times to estimate the experimental variability. Instead of focusing on the calibration of input parameters for a single configuration, the numerical model is validated for its ability to predict the response of three different configurations (various combinations of foam thickness and drop height). Design of Experiments is implemented to perform parametric and statistical variance studies. Surrogate models are developed to replace the computationally expensive numerical simulation. Variables of the finite element model are separated into calibration variables and control variables. The models are calibrated to provide numerical simulations that correctly reproduce the statistical variation of the test configurations. The calibration step also provides inference for the parameters of a high strain-rate dependent material model of the hyper-foam. After calibration, the validity of the numerical simulation is assessed through its ability to predict the response of a fourth test setup.

  3. ITER CS Model Coil and CS Insert Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Martovetsky, N; Michael, P; Minervina, J; Radovinsky, A; Takayasu, M; Thome, R; Ando, T; Isono, T; Kato, T; Nakajima, H; Nishijima, G; Nunoya, Y; Sugimoto, M; Takahashi, Y; Tsuji, H; Bessette, D; Okuno, K; Ricci, M

    2000-09-07

    The Inner and Outer modules of the Central Solenoid Model Coil (CSMC) were built by US and Japanese home teams in collaboration with European and Russian teams to demonstrate the feasibility of a superconducting Central Solenoid for ITER and other large tokamak reactors. The CSMC mass is about 120 t, OD is about 3.6 m and the stored energy is 640 MJ at 46 kA and peak field of 13 T. Testing of the CSMC and the CS Insert took place at Japan Atomic Energy Research Institute (JAERI) from mid March until mid August 2000. This paper presents the main results of the tests performed.

  4. Fault evolution-test dependency modeling for mechanical systems

    Institute of Scientific and Technical Information of China (English)

    Xiao-dong TAN; Jian-lu LUO; Qing LI; Bing LU; Jing QIU

    2015-01-01

    Tracking the process of fault growth in mechanical systems using a range of tests is important to avoid catastrophic failures, so it is necessary to study design for testability (DFT). In this paper, to improve the testability performance of mechanical systems for tracking fault growth, a fault evolution-test dependency model (FETDM) is proposed to implement DFT. A testability analysis method that considers fault trackability and predictability is developed to quantify the testability performance of mechanical systems. Results from experiments on a centrifugal pump show that the proposed FETDM and testability analysis method can provide guidance to engineers to improve the testability level of mechanical systems.

  5. X-33 Metal Model Testing In Low Turbulence Pressure Tunnel

    Science.gov (United States)

    1997-01-01

    The country's next generation of space transportation, a reusable launch vehicle (RLV), continues to undergo wind tunnel testing at NASA Langley Research Center, Hampton, Va. All four photos show a metal model of the X-33 reusable launch vehicle (about 15 inches long by 15 inches wide) being tested for Lockheed Martin Skunk Works in the Low Turbulence Pressure Tunnel (LTPT) at NASA Langley Research Center. Tests are being conducted by members of the Aerothermodynamics Branch. According to Kelly Murphy of Langley's Aerothermodynamics Branch, the aluminum and stainless steel model of the X-33 underwent aerodynamic testing in the tunnel. "The subsonic tests were conducted at the speed of Mach 0.25," she said. "Force and moment testing and measurement in this tunnel lasted about one week." Future testing of the metal model is scheduled for Langley's 16-Foot Transonic Tunnel, from the end of March to mid-April 1997, and the Unitary Wind Tunnel, from mid-April to the beginning of May. Other tunnel testing of X-33 models is scheduled from the present through June in the hypersonic tunnels, and in the 14- by 22-Foot Tunnel from about mid-June to mid-July. Since 1991 Marshall Space Flight Center in Huntsville, Ala., has been the lead center for coordinating the Agency's X-33 Reusable Launch Vehicle (RLV) Program, an industry-led effort which NASA Administrator Daniel S. Goldin has declared the agency's highest priority new program. The RLV Technology Program is a partnership among NASA, the United States Air Force and private industry to develop world leadership in low-cost space transportation. The goal of the program is to develop technologies and new operational concepts that can radically reduce the cost of access to space. The RLV program also hopes to speed the commercialization of space and improve U.S. economic competitiveness by making access to space as routine and reliable as today's airline industry, while reducing costs and enhancing safety and reliability. The RLV

  6. Tank Tests of Model 36 Flying Boat Hull

    Science.gov (United States)

    Allison, John

    1938-01-01

    N.A.C.A. Model 36, a hull form with parallel middle body for half the length of the forebody and designed particularly for use with stub wings, was tested according to the general fixed-trim method over the range of practical loads, trims, and speeds. It was also tested free to trim with the center of gravity at two different positions. The results are given in the form of nondimensional coefficients. The resistance at the hump was exceptionally low but, at high planing speeds, afterbody interference made the performance only mediocre.

  7. Numerical Modeling and Test Data Comparison of Propulsion Test Article Helium Pressurization System

    Science.gov (United States)

    Holt, Kimberly; Majumdar, Alok; Steadman, Todd; Hedayat, Ali; Fogle, Frank R. (Technical Monitor)

    2000-01-01

    A transient model of the propulsion test article (PTA) helium pressurization system was developed using the generalized fluid system simulation program (GFSSP). The model included pressurization lines from the facility interface to the engine purge interface and liquid oxygen (lox) and rocket propellant-1 (RP-1) tanks, the propellant tanks themselves including ullage space, and propellant feed lines to their respective pump interfaces. GFSSP's capability was extended to model a control valve to maintain ullage pressure within a specified limit and pressurization processes such as heat transfer between ullage gas, propellant, and the tank wall as well as conduction in the tank wall. The purpose of the model is to predict the flow system characteristics in the entire pressurization system during 80 sec of lower feed system priming, 420 sec of fuel and lox pump priming, and 150 sec of engine firing.
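    The control-valve behavior described above (admit helium to hold ullage pressure within a specified band while draining propellant grows the ullage) can be sketched with a toy isothermal model. This is a minimal illustration of bang-bang pressure control, not the GFSSP model itself; every number below (volumes, flow rates, control band) is an invented assumption, not a PTA value.

```python
# Toy transient sketch of ullage pressurization with a control valve:
# propellant outflow grows the ullage volume (dropping pressure) while
# a valve admits helium whenever pressure falls below the lower band
# limit. Isothermal ideal gas; all numbers are illustrative assumptions.
R, T, M = 8.314, 250.0, 0.004      # gas constant (J/mol-K), temp (K), He molar mass (kg/mol)
V, P = 1.0, 5.0e5                  # initial ullage volume (m^3) and pressure (Pa)
n = P * V / (R * T)                # moles of helium in the ullage
band = (4.8e5, 5.2e5)              # valve control band (Pa)
q_out = 0.005                      # propellant outflow rate (m^3/s)
mdot_valve = 0.02                  # helium mass flow when valve is open (kg/s)
dt, valve_open = 0.1, False
history = []
for _ in range(1000):              # 100 s of simulated time
    V += q_out * dt                # ullage grows as propellant drains
    if P < band[0]:                # bang-bang valve logic
        valve_open = True
    elif P > band[1]:
        valve_open = False
    if valve_open:
        n += mdot_valve / M * dt   # helium admitted through the valve
    P = n * R * T / V              # isothermal ideal-gas update
    history.append(P)
print(min(history), max(history))  # pressure stays near the control band
```

The real GFSSP model additionally tracks heat transfer between ullage gas, propellant, and tank wall; this sketch only captures the valve/ullage coupling.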

  8. Testing the gravity p-median model empirically

    Directory of Open Access Journals (Sweden)

    Kenneth Carling

    2015-12-01

    Full Text Available Regarding the location of a facility, the presumption in the widely used p-median model is that the customer opts for the shortest route to the nearest facility. However, this assumption is problematic in free markets, since the customer is presumed to gravitate to a facility according to both its distance and its attractiveness. The recently introduced gravity p-median model offers an extension to the p-median model that accounts for this. The model is therefore potentially interesting, although it has not yet been implemented and tested empirically. In this paper, we have implemented the model in an empirical problem of locating vehicle inspections, locksmiths, and retail stores of vehicle spare parts, for the purpose of investigating its superiority to the p-median model. We found, however, the gravity p-median model to be of limited use for the problem of locating facilities, as it either gives solutions similar to the p-median model or gives unstable solutions due to a non-concave objective function.
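    The difference between the two objectives can be sketched on toy data: the p-median objective assigns each customer to the nearest open facility, while the gravity p-median objective spreads each customer's demand over facilities by a distance-decay attraction rule. The coordinates, attractiveness values, and decay parameter below are hypothetical illustration values, not data from the study.

```python
# Brute-force comparison of p-median vs. gravity p-median site choices.
from itertools import combinations
from math import exp, hypot

customers = [(0, 0), (2, 1), (5, 0), (6, 3), (1, 4)]
candidates = [(1, 1), (5, 1), (3, 3)]
attract = {(1, 1): 1.0, (5, 1): 2.0, (3, 3): 1.5}  # facility attractiveness (assumed)
beta = 0.8                                         # distance-decay parameter (assumed)
p = 2                                              # number of facilities to open

def dist(a, b):
    return hypot(a[0] - b[0], a[1] - b[1])

def p_median_cost(sites):
    # Classical objective: every customer patronizes the nearest facility.
    return sum(min(dist(c, s) for s in sites) for c in customers)

def gravity_cost(sites):
    # Gravity objective: demand splits in proportion to A_j * exp(-beta * d).
    total = 0.0
    for c in customers:
        w = [attract[s] * exp(-beta * dist(c, s)) for s in sites]
        probs = [x / sum(w) for x in w]
        total += sum(pr * dist(c, s) for pr, s in zip(probs, sites))
    return total

best_pm = min(combinations(candidates, p), key=p_median_cost)
best_gr = min(combinations(candidates, p), key=gravity_cost)
print("p-median choice:", best_pm, "gravity choice:", best_gr)
```

Enumerating all site combinations is only feasible for tiny instances; it serves here to make the two objective functions concrete.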

  9. Modeling and Testing Landslide Hazard Using Decision Tree

    Directory of Open Access Journals (Sweden)

    Mutasem Sh. Alkhasawneh

    2014-01-01

    Full Text Available This paper proposes a decision tree model for specifying the importance of 21 factors causing the landslides in a wide area of Penang Island, Malaysia. These factors are vegetation cover, distance from the fault line, slope angle, cross curvature, slope aspect, distance from road, geology, diagonal length, longitude curvature, rugosity, plan curvature, elevation, rain perception, soil texture, surface area, distance from drainage, roughness, land cover, general curvature, tangent curvature, and profile curvature. Decision tree models are used for prediction, classification, and factor importance and are usually represented by an easy-to-interpret tree-like structure. Four models were created using the Chi-square Automatic Interaction Detector (CHAID), Exhaustive CHAID, Classification and Regression Tree (CRT), and Quick-Unbiased-Efficient Statistical Tree (QUEST) methods. The 21 factors were extracted using digital elevation models (DEMs) and then used as input variables for the models. A data set of 137,570 samples was selected for each variable in the analysis, where 68,786 samples represent landslides and 68,786 samples represent no landslides. 10-fold cross-validation was employed for testing the models. The highest accuracy was achieved using the Exhaustive CHAID model (82.0%), compared to the CHAID (81.9%), CRT (75.6%), and QUEST (74.0%) models. Across the four models, five factors were identified as most important: slope angle, distance from drainage, surface area, slope aspect, and cross curvature.
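    The way a tree-based model ranks causative factors can be illustrated with the split criterion at its core: for each factor, find the single threshold that most reduces the impurity of the landslide/no-landslide labels. The tiny data set below is invented for illustration (it is not the Penang Island data), and Gini impurity stands in for the various criteria the four models use.

```python
# Rank factors by the impurity reduction of their best single-threshold split.
def gini(labels):
    # Gini impurity of a binary label list; empty lists count as pure.
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split_gain(values, labels):
    # Best achievable Gini reduction over all thresholds on one factor.
    base = gini(labels)
    best = 0.0
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        best = max(best, base - w)
    return best

labels = [1, 1, 1, 0, 0, 0, 1, 0]   # 1 = landslide, 0 = no landslide (toy data)
factors = {
    "slope_angle":   [35, 40, 38, 10, 12, 8, 33, 15],
    "dist_drainage": [5, 8, 4, 50, 60, 45, 7, 55],
    "elevation":     [200, 500, 300, 250, 480, 220, 510, 260],
}
ranking = sorted(factors, key=lambda f: best_split_gain(factors[f], labels),
                 reverse=True)
print(ranking)   # factors ordered from most to least separating
```

In the toy data, slope angle and drainage distance separate the classes perfectly while elevation does not, mirroring (in miniature) how the study's models single out the dominant factors.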

  10. Test of modified BCS model at finite temperature

    CERN Document Server

    Ponomarev, V Yu

    2005-01-01

    A recently suggested modified BCS (MBCS) model has been studied at finite temperature. We show that this approach does not allow the existence of the normal (non-superfluid) phase at any finite temperature. Other MBCS predictions, such as a negative pairing gap, pairing induced by heating in closed-shell nuclei, and a "superfluid - super-superfluid" phase transition, are also discussed. The MBCS model is tested by comparison with exact solutions for the picket fence model. Here, severe violation of the internal symmetry of the problem is detected. The MBCS equations are found to be inconsistent. The limit of MBCS applicability is determined to be far below the "superfluid - normal" phase transition of the conventional FT-BCS, where the model performs worse than the FT-BCS.

  11. Testing of a one dimensional model for Field II calibration

    DEFF Research Database (Denmark)

    Bæk, David; Jensen, Jørgen Arendt; Willatzen, Morten

    2008-01-01

    Field II is a program for simulating ultrasound transducer fields. It is capable of calculating the emitted and pulse-echoed fields for both pulsed and continuous wave transducers. To make it fully calibrated, a model of the transducer's electro-mechanical impulse response must be included. We examine an adapted one-dimensional transducer model, originally proposed by Willatzen [9], to calibrate Field II. This model is modified to calculate the required impulse responses needed by Field II for a calibrated field pressure and external circuit current calculation. The testing has been performed ... to the calibrated Field II program for 1, 4, and 10 cycle excitations. Two parameter sets were applied for modeling: one real-valued Pz27 parameter set, manufacturer supplied, and one complex-valued parameter set found in the literature, Algueró et al. [11]. The latter implicitly accounts for attenuation. Results show ...

  12. Perceived game realism: a test of three alternative models.

    Science.gov (United States)

    Ribbens, Wannes

    2013-01-01

    Perceived realism is considered a key concept in explaining the mental processing of media messages and the societal impact of media. Despite its importance, little is known about its conceptualization and dimensional structure, especially with regard to digital games. The aim of this study was to test a six-factor model of perceived game realism comprised of simulational realism, freedom of choice, perceptual pervasiveness, social realism, authenticity, and character involvement and to assess it against an alternative single- and five-factor model. Data were collected from 380 male digital game users who judged the realism of the first-person shooter Half-Life 2 based upon their previous experience with the game. Confirmatory factor analysis was applied to investigate which model fits the data best. The results support the six-factor model over the single- and five-factor solutions. The study contributes to our knowledge of perceived game realism by further developing its conceptualization and measurement.

  13. Stationarity test with a direct test for heteroskedasticity in exchange rate forecasting models

    Science.gov (United States)

    Khin, Aye Aye; Chau, Wong Hong; Seong, Lim Chee; Bin, Raymond Ling Leh; Teng, Kevin Low Lock

    2017-05-01

    The global economy has weakened in recent years, manifested by greater exchange rate volatility on the international commodity market. This study analyzes some prominent exchange rate forecasting models for Malaysian commodity trading: univariate ARIMA, ARCH, and GARCH models, in conjunction with a stationarity test on the residual diagnosis and direct testing of heteroskedasticity. All forecasting models utilized monthly data from 1990 to 2015, a total of 312 observations, used to forecast both short-term and long-term exchange rates. The forecasting power statistics suggested that the forecasting performance of the ARIMA (1, 1, 1) model is more efficient than that of the ARCH (1) and GARCH (1, 1) models. For the ex-post forecast, the exchange rate increased from RM 3.50 per USD in January 2015 to RM 4.47 per USD in December 2015 based on the baseline data. For the short-term ex-ante forecast, the analysis results indicate a decrease in the exchange rate in June 2016 (RM 4.27 per USD) compared with December 2015. A more appropriate forecasting method for exchange rates is vital to aid decision-making and planning for sustainable commodity production in the world economy.
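    The "forecasting power statistics" used to compare such models are typically out-of-sample error measures like RMSE and MAPE. The sketch below computes both for two competing forecast series; the actual/forecast RM-per-USD values are invented illustration numbers, not the study's data, and the model names are labels only.

```python
# Compare forecast series against actuals with RMSE and MAPE.
from math import sqrt

actual = [3.50, 3.62, 3.71, 3.90, 4.10, 4.47]          # RM per USD (illustrative)
forecast = {
    "ARIMA(1,1,1)": [3.52, 3.60, 3.74, 3.88, 4.05, 4.40],
    "GARCH(1,1)":   [3.45, 3.55, 3.80, 4.00, 4.25, 4.60],
}

def rmse(a, f):
    # Root mean squared error, in the units of the series.
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, f)) / len(a))

def mape(a, f):
    # Mean absolute percentage error, in percent.
    return 100 * sum(abs((x - y) / x) for x, y in zip(a, f)) / len(a)

scores = {m: (rmse(actual, f), mape(actual, f)) for m, f in forecast.items()}
best = min(scores, key=lambda m: scores[m][0])   # lowest RMSE wins
print(best, scores[best])
```

With these toy numbers the first series tracks the actuals more closely, so it wins on both measures; the study's conclusion in favor of ARIMA (1, 1, 1) rests on the same kind of comparison.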

  14. Development and Testing of a Groundwater Management Model for the Faultless Underground Nuclear Test, Central Nevada Test Area

    Energy Technology Data Exchange (ETDEWEB)

    Douglas P. Boyle; Gregg Lamorey; Scott Bassett; Greg Pohll; Jenny Chapman

    2006-01-25

    This document describes the development and application of a user-friendly and efficient groundwater management model of the Central Nevada Test Area (CNTA) and surrounding areas that will allow the U.S. Department of Energy and state personnel to evaluate the impact of future proposed scenarios. The management model consists of a simple hydrologic model within an interactive groundwater management framework. This framework is based on an object user interface that was developed by the U.S. Geological Survey and has been used by the Desert Research Institute researchers and others to couple disparate environmental resource models, manage the necessary temporal and spatial data, and evaluate model results for management decision making. This framework was modified and applied to the CNTA and surrounding Hot Creek Valley. The utility of the management model was demonstrated through the application of hypothetical future scenarios including mineral mining, regional expansion of agriculture, geothermal energy production, and export of water to large urban areas outside the region. While the results from some of the scenarios indicated potential impacts to the region near CNTA and others did not, together they demonstrate the usefulness of the management tool for managers who need to evaluate the impact proposed changes in groundwater use in or near CNTA may have on radionuclide migration.

  15. Testing Geyser Models using Down-vent Data

    Science.gov (United States)

    Wang, C.; Munoz, C.; Ingebritsen, S.; King, E.

    2013-12-01

    Geysers are often studied as an analogue to magmatic volcanoes because both involve the transfer of mass and energy that leads to eruption. Several conceptual models have been proposed to explain geyser eruption, but no definitive test has been performed, largely due to the scarcity of down-vent data. In this study we compare simulated time histories of pressure and temperature against published data for the Old Faithful geyser in Yellowstone National Park and new down-vent measurements from geysers in the El Tatio geyser field of northern Chile. We test two major types of geyser models by comparing simulated and field results. In the chamber model, the geyser system is approximated as a fissure-like conduit connected to a subsurface chamber of water and steam. Heat supplied to the chamber causes water to boil and drives geyser eruptions. Here the Navier-Stokes equation is used to simulate the flow of water and steam. In the fracture-zone model, the geyser system is approximated as a saturated fracture zone of high permeability and compressibility, surrounded by rock matrix of relatively low permeability and compressibility. Heat supply from below causes pore water to boil and drives geyser eruption. Here a two-phase form of Darcy's law is assumed to describe the flow of water and steam (Ingebritsen and Rojstaczer, 1993). Both models can produce P-T time histories qualitatively similar to field results, but the simulations are sensitive to the assumed parameters. Results from the chamber model are sensitive to the heat supplied to the system and to the width of the conduit, while results from the fracture-zone model are most sensitive to the permeability of the fracture zone and the adjacent wall rocks. Detailed comparison between field and simulated results, such as the phase lag between changes of pressure and temperature, may help to resolve which model is more realistic.

  16. Testing Geological Models with Terrestrial Antineutrino Flux Measurements

    CERN Document Server

    Dye, Steve

    2009-01-01

    Uranium and thorium are the main heat producing elements in the earth. Their quantities and distributions, which specify the flux of detectable antineutrinos generated by the beta decay of their daughter isotopes, remain unmeasured. Geological models of the continental crust and the mantle predict different quantities and distributions of uranium and thorium. Many of these differences are resolvable with precision measurements of the terrestrial antineutrino flux. This precision depends on both statistical and systematic uncertainties. An unavoidable background of antineutrinos from nuclear reactors typically dominates the systematic uncertainty. This report explores in detail the capability of various operating and proposed geo-neutrino detectors for testing geological models.

  17. Facility for cold flow testing of solid rocket motor models

    Science.gov (United States)

    Bacchus, D. L.; Hill, O. E.; Whitesides, R. Harold

    1992-02-01

    A new cold flow test facility was designed and constructed at NASA Marshall Space Flight Center for the purpose of characterizing the flow field in the port and nozzle of solid propellant rocket motors (SRM's). A National Advisory Committee was established, with representatives from industry, government agencies, and universities, to guide the establishment of design and instrumentation requirements for the new facility. The facility design includes the basic components of air storage tanks, heater, submicron filter, quiet control valve, venturi, model inlet plenum chamber, solid rocket motor (SRM) model, exhaust diffuser, and exhaust silencer. The facility was designed to accommodate a wide range of motor types and sizes, from small tactical motors to large space launch boosters, and has the unique capability of testing ten percent scale models of large boosters, such as the new Advanced Solid Rocket Motor (ASRM), at full scale motor Reynolds numbers. Previous investigators have established the validity of studying basic features of solid rocket motor flow fields with cold flow models; the development programs include the acquisition of data to (1) directly evaluate and optimize the design configuration of the propellant grain, insulation, and nozzle; and (2) provide data for validation of the computational fluid dynamics (CFD) analysis codes and the performance analysis codes. A facility checkout model was designed, constructed, and utilized to evaluate the performance characteristics of the new facility. This model consists of a cylindrical chamber and a converging/diverging nozzle, with appropriate manifolding to connect it to the facility air supply. It was designed using chamber and nozzle dimensions that simulate the flow in a 10 percent scale model of the ASRM. The checkout model was recently tested over the entire range of facility flow conditions, which include flow rates from 9.07 to 145 kg/sec (20 to 320 lbm/sec) and supply pressures from 5.17 x 10^5 to 8.27 x 10^6 Pa. The

  18. A review of experiments testing the shoving model

    DEFF Research Database (Denmark)

    Hecksher, Tina; Dyre, J. C.

    2015-01-01

    According to the shoving model, the non-Arrhenius temperature dependence of supercooled liquids' relaxation time (or viscosity) derives from the fact that the high-frequency shear modulus is temperature dependent in the supercooled phase, often increasing a factor of three or four in the temperature interval over which the relaxation time increases by ten to fifteen decades. In this paper we have compiled all tests of the shoving model known to us. These involve rheological data obtained by different techniques, high-frequency sound-wave data, neutron scattering data for the vibrational mean ...
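    The mechanism described here is captured by the standard shoving-model expression tau(T) = tau0 * exp(Vc * G_inf(T) / (kB * T)): a high-frequency plateau shear modulus G_inf that grows on cooling makes the apparent activation energy itself temperature dependent, hence non-Arrhenius. The numerical sketch below uses an illustrative linear G_inf(T) and assumed parameter values, not data fitted to any particular liquid.

```python
# Shoving-model relaxation time with a temperature-dependent shear modulus.
from math import exp, log10

kB = 1.381e-23            # Boltzmann constant (J/K)
tau0 = 1e-14              # microscopic attempt time (s)
Vc = 5.0e-29              # characteristic volume (m^3, assumed)

def G_inf(T):
    # Illustrative high-frequency shear modulus, growing ~4x on cooling (Pa).
    return 4.0e9 - 1.0e7 * (T - 200.0)

def tau(T):
    # Shoving-model relaxation time: activation energy = Vc * G_inf(T).
    return tau0 * exp(Vc * G_inf(T) / (kB * T))

for T in (500.0, 350.0, 250.0):
    print(T, "K -> log10(tau/s) =", round(log10(tau(T)), 1))
```

Because G_inf(T) increases on cooling, log tau grows faster than 1/T, spanning well over ten decades across the toy cooling range, which is the signature the compiled experiments test for.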

  19. Testing and Modeling of Machine Properties in Resistance Welding

    DEFF Research Database (Denmark)

    Wu, Pei

    The objective of this work has been to test and model the machine properties, including the mechanical properties and the electrical properties, in resistance welding. The results are used to simulate the welding process more accurately. The state of the art in testing and modeling machine properties in resistance welding has been described based on a comprehensive literature study. The present thesis has been subdivided into two parts: Part I: Mechanical properties of resistance welding machines. Part II: Electrical properties of resistance welding machines. In Part I, the electrode force in the squeeze ... electrode force, and the time of stabilizing does not depend on the level of the force. An additional spring mounted in the welding head improves the machine touching behavior due to a soft electrode application, but this results in a longer time of oscillation of the electrode force, especially when ...

  20. Classical tests of general relativity in brane world models

    Energy Technology Data Exchange (ETDEWEB)

    Boehmer, Christian G [Department of Mathematics and Institute of Origins, University College London, Gower Street, London WC1E 6BT (United Kingdom); De Risi, Giuseppe [Dipartimento di Fisica, Universita degli studi di Bari and Istituto Nazionale di Fisica Nucleare, sez. di Bari, Via G. Amendola 173, 70126 Bari (Italy); Harko, Tiberiu [Department of Physics and Center for Theoretical and Computational Physics, University of Hong Kong, Pok Fu Lam Road (Hong Kong); Lobo, Francisco S N, E-mail: c.boehmer@ucl.ac.u, E-mail: giuseppe.derisi@ba.infn.i, E-mail: harko@hkucc.hku.h, E-mail: flobo@cii.fc.ul.p [Centro de Astronomia e Astrofisica da Universidade de Lisboa, Campo Grande, Ed. C8 1749-016 Lisboa (Portugal)

    2010-09-21

    The classical tests of general relativity (perihelion precession, deflection of light and the radar echo delay) are considered for several spherically symmetric static vacuum solutions in brane world models. Generally, the spherically symmetric vacuum solutions of the brane gravitational field equations have properties quite distinct as compared to the standard black hole solutions of general relativity. As a first step a general formalism that facilitates the analysis of general relativistic Solar System tests for any given spherically symmetric metric is developed. It is shown that the existing observational Solar System data on the perihelion shift of Mercury, on the light bending around the Sun (obtained using long-baseline radio interferometry), and ranging to Mars using the Viking lander constrain the numerical values of the parameters of the specific models.
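    The first of the classical tests mentioned above has a compact back-of-envelope form: general relativity predicts a perihelion advance of 6*pi*G*M / (c^2 * a * (1 - e^2)) per orbit, which for Mercury accumulates to the observed ~43 arcseconds per century (the brane-world solutions in this record predict small, parameter-dependent corrections to this standard value). The constants below are rounded textbook values.

```python
# GR perihelion advance of Mercury, accumulated over a Julian century.
from math import pi

GM_sun = 1.327e20        # G * M_sun (m^3/s^2)
c = 2.998e8              # speed of light (m/s)
a = 5.791e10             # Mercury's semi-major axis (m)
e = 0.2056               # Mercury's orbital eccentricity
period_days = 87.969     # Mercury's orbital period (days)

# Advance per orbit (radians), then summed over a century of orbits.
dphi_orbit = 6 * pi * GM_sun / (c**2 * a * (1 - e**2))
orbits_per_century = 100 * 365.25 / period_days
arcsec = dphi_orbit * orbits_per_century * (180 / pi) * 3600
print(round(arcsec, 1), "arcsec per century")  # close to the observed ~43
```

Constraining a modified metric then amounts to requiring that its extra terms shift this number by less than the observational uncertainty, which is how the record's Solar System bounds are obtained.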

  1. Model year 2010 Honda insight level-1 testing report.

    Energy Technology Data Exchange (ETDEWEB)

    Rask, E.; Bocci, D.; Duoba, M.; Lohse-Busch, H. (Energy Systems)

    2011-03-22

    As a part of the US Department of Energy's Advanced Vehicle Testing Activity (AVTA), a model year 2010 Honda Insight was procured by eTec (Phoenix, AZ) and sent to ANL's Advanced Powertrain Research Facility for the purposes of vehicle-level testing in support of the Advanced Vehicle Testing Activity (AVTA). Data was acquired during testing using non-intrusive sensors, vehicle network information, and facilities equipment (emissions and dynamometer data). Standard drive cycles, performance cycles, steady-state cycles and A/C usage cycles were tested. Much of this data is openly available for download in ANL's Downloadable Dynamometer Database (D3). The major results are shown here in this report. Given the preliminary nature of this assessment, the majority of the testing was done over standard regulatory cycles and seeks to obtain a general overview of how the vehicle performs. These cycles include the US FTP cycle (Urban) and Highway Fuel Economy Test cycle as well as the US06, a more aggressive supplemental regulatory cycle. Data collection for this testing was kept at a fairly high level and includes emissions and fuel measurements from an exhaust emissions bench, high-voltage and accessory current and voltage from a DC power analyzer, and CAN bus data such as engine speed, engine load, and electric machine operation when available. The following sections will seek to explain some of the basic operating characteristics of the MY2010 Insight and provide insight into unique features of its operation and design.

  2. The Standard-Model Extension and Gravitational Tests

    CERN Document Server

    Tasson, Jay D

    2016-01-01

    The Standard-Model Extension (SME) provides a comprehensive effective field-theory framework for the study of CPT and Lorentz symmetry. This work reviews the structure and philosophy of the SME and provides some intuitive examples of symmetry violation. The results of recent gravitational tests performed within the SME are summarized including analysis of results from the Laser Interferometer Gravitational-Wave Observatory (LIGO), sensitivities achieved in short-range gravity experiments, constraints from cosmic-ray data, and results achieved by studying planetary ephemerides. Some proposals and ongoing efforts will also be considered including gravimeter tests, tests of the Weak Equivalence Principle, and antimatter experiments. Our review of the above topics is augmented by several original extensions of the relevant work. We present new examples of symmetry violation in the SME and use the cosmic-ray analysis to place first-ever constraints on 81 additional operators.

  3. Thermal Environmental Testing of NSTAR Engineering Model Ion Thrusters

    Science.gov (United States)

    Rawlin, Vincent K.; Patterson, Michael J.; Becker, Raymond A.

    1999-01-01

    NASA's New Millennium program will fly a xenon ion propulsion system on the Deep Space 1 mission. Tests were conducted under NASA's Solar Electric Propulsion Technology Applications Readiness (NSTAR) program with three different engineering model ion thrusters to determine thruster thermal characteristics over the NSTAR operating range in a variety of thermal environments. A liquid-nitrogen-cooled shroud was used to cold-soak the thruster to -120 C. Initial tests were performed prior to a mature spacecraft design. Those results, and the final, severe requirements mandated by the spacecraft, led to several changes to the basic thermal design. These changes were incorporated into a final design and tested over a wide range of environmental conditions.

  4. Dipole model test with one superconducting coil; results analysed

    CERN Document Server

    Durante, M; Ferracin, P; Fessia, P; Gauthier, R; Giloux, C; Guinchard, M; Kircher, F; Manil, P; Milanese, A; Millot, J-F; Muñoz Garcia, J-E; Oberli, L; Perez, J-C; Pietrowicz, S; Rifflet, J-M; de Rijk, G; Rondeaux, F; Todesco, E; Viret, P; Ziemianski, D

    2013-01-01

    This report is the deliverable report 7.3.1, “Dipole model test with one superconducting coil; results analysed”. The report has four parts: “Design report for the dipole magnet”, “Dipole magnet structure tested in LN2”, “Nb3Sn strand procured for one dipole magnet” and “One test double pancake copper coil made”. The four report parts show that, although the magnet construction will only be completed by the end of 2014, all elements are present for a successful completion. Given the importance of the project for the future of the participants and the significant investments they have made, there is a full commitment to finish the project.

  5. The Standard-Model Extension and Gravitational Tests

    Directory of Open Access Journals (Sweden)

    Jay D. Tasson

    2016-10-01

    Full Text Available The Standard-Model Extension (SME) provides a comprehensive effective field-theory framework for the study of CPT and Lorentz symmetry. This work reviews the structure and philosophy of the SME and provides some intuitive examples of symmetry violation. The results of recent gravitational tests performed within the SME are summarized including analysis of results from the Laser Interferometer Gravitational-Wave Observatory (LIGO), sensitivities achieved in short-range gravity experiments, constraints from cosmic-ray data, and results achieved by studying planetary ephemerides. Some proposals and ongoing efforts will also be considered including gravimeter tests, tests of the Weak Equivalence Principle, and antimatter experiments. Our review of the above topics is augmented by several original extensions of the relevant work. We present new examples of symmetry violation in the SME and use the cosmic-ray analysis to place first-ever constraints on 81 additional operators.

  6. Tests of the Standard Electroweak Model at the Energy Frontier

    CERN Document Server

    Hobbs, John D; Willenbrock, Scott

    2010-01-01

    In this review, we summarize tests of standard electroweak (EW) theory at the highest available energies as a precursor to the Large Hadron Collider (LHC) era. Our primary focus is on the published results from proton-antiproton collisions at $\sqrt{s}=1.96$ TeV at the Fermilab Tevatron, collected using the CDF and D0 detectors. This review is very timely since the LHC scientific program is nearly underway, with the first high-energy ($\sqrt{s}=7$ TeV) collisions about to begin. After presenting an overview of the EW sector of the standard model, we provide a summary of current experimental tests of EW theory. These include gauge boson properties and self-couplings, tests of EW physics from the top quark sector, and searches for the Higgs boson.

  7. Dipole model test with one superconducting coil: results analysed

    CERN Document Server

    Bajas, H; Benda, V; Berriaud, C; Bajko, M; Bottura, L; Caspi, S; Charrondiere, M; Clément, S; Datskov, V; Devaux, M; Durante, M; Fazilleau, P; Ferracin, P; Fessia, P; Gauthier, R; Giloux, C; Guinchard, M; Kircher, F; Manil, P; Milanese, A; Millot, J-F; Muñoz Garcia, J-E; Oberli, L; Perez, J-C; Pietrowicz, S; Rifflet, J-M; de Rijk, G; Rondeaux, F; Todesco, E; Viret, P; Ziemianski, D

    2013-01-01

    This report is the deliverable report 7.3.1, “Dipole model test with one superconducting coil; results analysed”. The report has four parts: “Design report for the dipole magnet”, “Dipole magnet structure tested in LN2”, “Nb3Sn strand procured for one dipole magnet” and “One test double pancake copper coil made”. The four report parts show that, although the magnet construction will only be completed by the end of 2014, all elements are present for a successful completion. Given the importance of the project for the future of the participants and the significant investments they have made, there is a full commitment to finish the project.

  8. Tests of the improved Weiland ion temperature gradient transport model

    Energy Technology Data Exchange (ETDEWEB)

    Kinsey, J.E.; Bateman, G.; Kritz, A.H. [Lehigh Univ., Bethlehem, PA (United States)] [and others]

    1996-12-31

    The Weiland theoretically derived transport model for ion temperature gradient and trapped electron modes has been improved to include the effects of parallel ion motion, finite beta, and collisionality. The model also includes the effects of impurities, fast ions, unequal ion and electron temperatures, and finite Larmor radius. This new model has been implemented in our time-dependent transport code and is used in conjunction with pressure-driven modes and neoclassical theory to predict the radial particle and thermal transport in tokamak plasmas. Simulations of TFTR, DIII-D, and JET L-mode plasmas have been conducted to test how the new effects change the predicted density and temperature profiles. Comparisons are made with results obtained using the previous version of the model which was successful in reproducing experimental data from a wide variety of tokamak plasmas. Specifically, the older model has been benchmarked against over 50 discharges from at least 7 different tokamaks including L-mode scans in current, heating power, density, and dimensionless scans in normalized gyro-radius, collisionality, and beta. We have also investigated the non-diffusive elements included in the Weiland model, particularly the particle pinch in order to characterize its behavior. This is partly motivated by recent simulations of ITER. In those simulations, the older Weiland model predicted a particle pinch and ignition was more easily obtained.

  9. Testing galaxy formation models with galaxy stellar mass functions

    Science.gov (United States)

    Lim, S. H.; Mo, H. J.; Lan, Ting-Wen; Ménard, Brice

    2016-10-01

    We compare predictions of a number of empirical models and numerical simulations of galaxy formation to the conditional stellar mass functions (CSMF) of galaxies in groups of different masses obtained recently by Lan et al. to test how well different models accommodate the data. The observational data clearly prefer a model in which star formation in low-mass halos changes behavior at a characteristic redshift zc ˜ 2. There is also tentative evidence that this characteristic redshift depends on environment, becoming zc ˜ 4 in regions that eventually evolve into rich clusters of galaxies. The constrained model is used to understand how galaxies form and evolve in dark matter halos, and to make predictions for other statistical properties of the galaxy population, such as the stellar mass functions of galaxies at high z, the star formation and stellar mass assembly histories in dark matter halos. A comparison of our model predictions with those of other empirical models shows that different models can make vastly different predictions, even though all of them are tuned to match the observed stellar mass functions of galaxies.

  10. Testing galaxy formation models with galaxy stellar mass functions

    Science.gov (United States)

    Lim, S. H.; Mo, H. J.; Lan, T.-W.; Ménard, B.

    2017-01-01

    We compare predictions of a number of empirical models and numerical simulations of galaxy formation to the conditional stellar mass functions of galaxies in groups of different masses obtained recently by Lan et al. to test how well different models accommodate the data. The observational data clearly prefer a model in which star formation in low-mass haloes changes behaviour at a characteristic redshift zc ˜ 2. There is also tentative evidence that this characteristic redshift depends on environment, becoming zc ˜ 4 in regions that eventually evolve into rich clusters of galaxies. The constrained model is used to understand how galaxies form and evolve in dark matter haloes, and to make predictions for other statistical properties of the galaxy population, such as the stellar mass functions of galaxies at high z, the star formation, and stellar mass assembly histories in dark matter haloes. A comparison of our model predictions with those of other empirical models shows that different models can make vastly different predictions, even though all of them are tuned to match the observed stellar mass functions of galaxies.

  11. Rigorously testing multialternative decision field theory against random utility models.

    Science.gov (United States)

    Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg

    2014-06-01

Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly chose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions.
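The random utility benchmarks mentioned above reduce, in the logit case, to a softmax over option utilities. A minimal sketch; the function name, temperature parameter, and the utility values are illustrative assumptions, not taken from the paper:

```python
import math

def logit_choice_probs(utilities, temperature=1.0):
    """Multinomial logit choice rule: P(i) = exp(u_i/t) / sum_j exp(u_j/t)."""
    exps = [math.exp(u / temperature) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Three consumer products scored on summed attribute utilities (toy numbers).
probs = logit_choice_probs([2.0, 1.0, 0.5])
```

Options with higher utility receive higher choice probability, but every option retains some probability mass, which is what lets such models be fit to noisy repeated choices.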

  12. Xenograft model for therapeutic drug testing in recurrent respiratory papillomatosis.

    Science.gov (United States)

    Ahn, Julie; Bishop, Justin A; Akpeng, Belinda; Pai, Sara I; Best, Simon R A

    2015-02-01

    Identifying effective treatment for papillomatosis is limited by a lack of animal models, and there is currently no preclinical model for testing potential therapeutic agents. We hypothesized that xenografting of papilloma may facilitate in vivo drug testing to identify novel treatment options. A biopsy of fresh tracheal papilloma was xenografted into a NOD-scid-IL2Rgamma(null) (NSG) mouse. The xenograft began growing after 5 weeks and was serially passaged over multiple generations. Each generation showed a consistent log-growth pattern, and in all xenografts, the presence of the human papillomavirus (HPV) genome was confirmed by polymerase chain reaction (PCR). Histopathologic analysis demonstrated that the squamous architecture of the original papilloma was maintained in each generation. In vivo drug testing with bevacizumab (5 mg/kg i.p. twice weekly for 3 weeks) showed a dramatic therapeutic response compared to saline control. We report here the first successful case of serial xenografting of a tracheal papilloma in vivo with a therapeutic response observed with drug testing. In severely immunocompromised mice, the HPV genome and squamous differentiation of the papilloma can be maintained for multiple generations. This is a feasible approach to identify therapeutic agents in the treatment of recurrent respiratory papillomatosis. © The Author(s) 2014.

  13. Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.

    Science.gov (United States)

    Johnson, Shane D; Groff, Elizabeth R

    2014-07-01

The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs, though not without its own issues, may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification.

  14. Mathematical modelling of the intravenous glucose tolerance test.

    Science.gov (United States)

    De Gaetano, A; Arino, O

    2000-02-01

Several attempts at building a satisfactory model of the glucose-insulin system are recorded in the literature. The minimal model, which is currently the model most widely used in physiological research on the metabolism of glucose, was proposed in the early eighties for the interpretation of the glucose and insulin plasma concentrations following the intravenous glucose tolerance test. It is composed of two parts: the first consists of two differential equations and describes the glucose plasma concentration time-course treating insulin plasma concentration as a known forcing function; the second consists of a single equation and describes the time course of plasma insulin concentration treating glucose plasma concentration as a known forcing function. The two parts are to be separately estimated on the available data. In order to study glucose-insulin homeostasis as a single dynamical system, a unified model would be desirable. To this end, the simple coupling of the original two parts of the minimal model is not appropriate, since it can be shown that, for commonly observed combinations of parameter values, the coupled model would not admit an equilibrium and the concentration of active insulin in the "distant" compartment would be predicted to increase without bounds. For comparison, a simple delay-differential model is introduced, is demonstrated to be globally asymptotically stable around a unique equilibrium point corresponding to the pre-bolus conditions, and is shown to have positive and bounded solutions for all times. The results of fitting the delay-differential model to experimental data from ten healthy volunteers are also shown. It is concluded that a global unified model is both theoretically desirable and practically usable, and that any such model ought to undergo formal analysis to establish its appropriateness and to exclude conflicts with accepted physiological notions.
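The glucose part of the minimal model described above consists of two coupled ODEs with insulin as a forcing function, and can be sketched with a simple Euler integration. The parameter values and the decaying-exponential insulin forcing below are illustrative assumptions, not fitted values from the paper:

```python
import math

def minimal_model(G0, Gb=90.0, Ib=10.0, Sg=0.03, p2=0.02, p3=1e-5,
                  I0=100.0, k=0.05, dt=0.1, t_end=180.0):
    """Euler integration of the glucose part of the minimal model, with plasma
    insulin treated as a known decaying forcing function after a bolus."""
    G, X, t = G0, 0.0, 0.0          # glucose (mg/dl), remote insulin action, time (min)
    while t < t_end:
        I = Ib + I0 * math.exp(-k * t)      # forcing insulin concentration
        dG = -(Sg + X) * G + Sg * Gb        # glucose kinetics
        dX = -p2 * X + p3 * (I - Ib)        # remote-compartment insulin action
        G, X, t = G + dt * dG, X + dt * dX, t + dt
    return G

# Glucose 180 min after a bolus raising it to 300 mg/dl relaxes back toward basal.
G_final = minimal_model(G0=300.0)
```

With these toy parameters the elevated glucose decays back to the neighbourhood of the basal level Gb, illustrating the forcing-function structure that the unified, coupled model must reproduce with a proper equilibrium.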

  15. Testing biomechanical models of human lumbar lordosis variability.

    Science.gov (United States)

    Castillo, Eric R; Hsu, Connie; Mair, Ross W; Lieberman, Daniel E

    2017-05-01

Lumbar lordosis (LL) is a key adaptation for bipedalism, but factors underlying curvature variations remain unclear. This study tests three biomechanical models to explain LL variability. Thirty adults (15 male, 15 female) were scanned using magnetic resonance imaging (MRI), a standing posture analysis was conducted, and lumbar range of motion (ROM) was assessed. Three measures of LL were compared. The trunk's center of mass was estimated from external markers to calculate hip moments (M_hip) and lumbar flexion moments. Cross-sectional areas of lumbar vertebral bodies and trunk muscles were measured from scans. Regression models tested associations between LL and the M_hip moment arm, a beam bending model, and an interaction between relative trunk strength (RTS) and ROM. Hip moments were not associated with LL. Beam bending was moderately predictive of standing but not supine LL (R^2 = 0.25). Stronger backs and increased ROM were associated with greater LL, especially when standing (R^2 = 0.65). The strength-flexibility model demonstrates the differential influence of RTS depending on ROM: individuals with high ROM exhibited the most LL variation with RTS, while those with low ROM showed reduced LL regardless of RTS. Hip moments appear constrained suggesting the possibility of selection, and the beam model explains some LL variability due to variations in trunk geometry. The strength-flexibility interaction best predicted LL, suggesting a tradeoff in which ROM limits the effects of back strength on LL. The strength-flexibility model may have clinical relevance for spinal alignment and pathology. This model may also suggest that straight-backed Neanderthals had reduced lumbar mobility. © 2017 Wiley Periodicals, Inc.

  16. Force Limited Random Vibration Test of TESS Camera Mass Model

    Science.gov (United States)

    Karlicek, Alexandra; Hwang, James Ho-Jin; Rey, Justin J.

    2015-01-01

The Transiting Exoplanet Survey Satellite (TESS) is a spaceborne instrument consisting of four wide field-of-view CCD cameras dedicated to the discovery of exoplanets around the brightest stars. As part of the environmental testing campaign, force limiting was used to simulate a realistic random vibration launch environment. While the force limit vibration test method is a standard approach used at multiple institutions including Jet Propulsion Laboratory (JPL), NASA Goddard Space Flight Center (GSFC), European Space Research and Technology Center (ESTEC), and Japan Aerospace Exploration Agency (JAXA), it is still difficult to find an actual implementation process in the literature. This paper describes the step-by-step process of how the force limit method was developed and applied on the TESS camera mass model. The process description includes the design of special fixtures to mount the test article for properly installing force transducers, development of the force spectral density using the semi-empirical method, estimation of the fuzzy factor (C2) based on the mass ratio between the supporting structure and the test article, subsequent validation of the C2 factor during the vibration test, and calculation of the C.G. accelerations using the Root Mean Square (RMS) reaction force in the spectral domain and the peak reaction force in the time domain.
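The semi-empirical method mentioned above is commonly written as a force spectral density proportional to the acceleration specification below a break frequency and rolled off above it. A sketch of that form; the C2 value, break frequency, and rolloff exponent below are illustrative assumptions, not the TESS test values:

```python
def force_spec(f, S_aa, M0, C2=4.0, f0=60.0, n=2):
    """Semi-empirical force spectral density limit (illustrative form):
    S_FF = C^2 * M0^2 * S_AA below the break frequency f0,
    rolled off as (f0/f)^(2n) above it.

    f     -- frequency (Hz)
    S_aa  -- acceleration spectral density at f (g^2/Hz)
    M0    -- total mass of the test article (kg-equivalent units)
    C2    -- the "fuzzy factor" C^2 (assumed value here)
    """
    base = C2 * M0**2 * S_aa
    return base if f <= f0 else base * (f0 / f) ** (2 * n)
```

During the test, the shaker input is notched wherever the measured interface force would exceed this limit, which is what prevents the overtest inherent in hard-mounted random vibration.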

  17. Testing Cosmological Models with Type Ic Super Luminous Supernovae

    CERN Document Server

    Wei, Jun-Jie; Melia, Fulvio

    2015-01-01

The use of type Ic Super Luminous Supernovae (SLSN Ic) to examine the cosmological expansion introduces a new standard ruler with which to test theoretical models. The sample suitable for this kind of work now includes 11 SLSNe Ic, which have thus far been used solely in tests involving ΛCDM. In this paper, we broaden the base of support for this new, important cosmic probe by using these observations to carry out a one-on-one comparison between the R_h = ct and ΛCDM cosmologies. We individually optimize the parameters in each cosmological model by minimizing the χ² statistic. We also carry out Monte Carlo simulations based on these current SLSN Ic measurements to estimate how large the sample would have to be in order to rule out either model at a ~99.7% confidence level. The currently available sample indicates a likelihood of ~70-80% that the R_h = ct Universe is the correct cosmology versus ~20-30% for the standard model. These results suggest...
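The per-model optimization described above amounts to minimizing a χ² statistic over each model's free parameters. A toy grid-search sketch; the data points and the one-parameter model are fabricated for illustration only and are not the SLSN Ic sample:

```python
def chi2(model_mu, obs_mu, sigma):
    """Chi-squared statistic between model and observed distance moduli."""
    return sum((m - o) ** 2 / s ** 2 for m, o, s in zip(model_mu, obs_mu, sigma))

# Toy observations (distance moduli and uncertainties) -- illustrative numbers.
obs = [35.1, 38.2, 40.9]
sig = [0.3, 0.3, 0.3]

def model_mu(p):
    """Hypothetical distance-modulus model with a single offset parameter p."""
    return [35.0 + p, 38.0 + p, 41.0 + p]

# Optimize the free parameter by minimizing chi-squared over a grid.
best_p = min((i / 100.0 for i in range(-50, 51)),
             key=lambda p: chi2(model_mu(p), obs, sig))
```

The same loop, run over each cosmology's parameter space, yields the minimum χ² values that the model comparison is based on.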

  18. Numerical modelling of sandstone uniaxial compression test using a mix-mode cohesive fracture model

    CERN Document Server

    Gui, Yilin; Kodikara, Jayantha

    2015-01-01

A mix-mode cohesive fracture model considering tension, compression and shear material behaviour is presented, which has wide applications to geotechnical problems. The model considers both elastic and inelastic displacements. Inelastic displacement comprises fracture and plastic displacements. The norm of inelastic displacement is used to control the fracture behaviour. In addition, a failure function describing the fracture strength is proposed. Using the internal programming language FISH, the cohesive fracture model is programmed into a hybrid distinct element algorithm as encoded in Universal Distinct Element Code (UDEC). The model is verified through uniaxial tension and direct shear tests. The developed model is then applied to model the behaviour of a uniaxial compression test on Gosford sandstone. The modelling results indicate that the proposed cohesive fracture model is capable of simulating combined failure behaviour applicable to rock.

  19. Testing simulation and structural models with applications to energy demand

    Science.gov (United States)

    Wolff, Hendrik

    2007-12-01

    This dissertation deals with energy demand and consists of two parts. Part one proposes a unified econometric framework for modeling energy demand and examples illustrate the benefits of the technique by estimating the elasticity of substitution between energy and capital. Part two assesses the energy conservation policy of Daylight Saving Time and empirically tests the performance of electricity simulation. In particular, the chapter "Imposing Monotonicity and Curvature on Flexible Functional Forms" proposes an estimator for inference using structural models derived from economic theory. This is motivated by the fact that in many areas of economic analysis theory restricts the shape as well as other characteristics of functions used to represent economic constructs. Specific contributions are (a) to increase the computational speed and tractability of imposing regularity conditions, (b) to provide regularity preserving point estimates, (c) to avoid biases existent in previous applications, and (d) to illustrate the benefits of our approach via numerical simulation results. The chapter "Can We Close the Gap between the Empirical Model and Economic Theory" discusses the more fundamental question of whether the imposition of a particular theory to a dataset is justified. I propose a hypothesis test to examine whether the estimated empirical model is consistent with the assumed economic theory. Although the proposed methodology could be applied to a wide set of economic models, this is particularly relevant for estimating policy parameters that affect energy markets. This is demonstrated by estimating the Slutsky matrix and the elasticity of substitution between energy and capital, which are crucial parameters used in computable general equilibrium models analyzing energy demand and the impacts of environmental regulations. Using the Berndt and Wood dataset, I find that capital and energy are complements and that the data are significantly consistent with duality

  20. Simulating microbial denitrification with EPIC: Model description and initial testing

    Energy Technology Data Exchange (ETDEWEB)

    Izaurralde, Roberto C.; Mcgill, William B.; Williams, Jimmy R.; Jones, Curtis D.; Link, Robert P.; Manowitz, D.; Schwab, D. E.; Zhang, Xuesong; Robertson, G. P.; Milar, Neville

    2017-09-01

    Microbial denitrification occurs in anaerobic soil microsites and aquatic environments leading to production of N2O and N2 gases, which eventually escape to the atmosphere. Atmospheric concentrations of N2O have been on the rise since the beginning of the industrial revolution due to large-scale manipulations of the N cycle in managed ecosystems, especially the use of synthetic nitrogenous fertilizer. Here we document and test a microbial denitrification model identified as IMWJ and implemented as a submodel in the EPIC terrestrial ecosystem model. The IMWJ model is resolved on an hourly time step using the concept that C oxidation releases electrons that drive a demand for electron acceptors such as O2 and oxides of N (NO3-, NO2-, and N2O). A spherical diffusion approach is used to describe O2 transport to microbial surfaces while a cylindrical diffusion method is employed to depict O2 transport to root surfaces. Oxygen uptake by microbes and roots is described with Michaelis-Menten kinetic equations. If insufficient O2 is present to accept all electrons generated, the deficit for electron acceptors may be met by oxides of nitrogen, if available. The movement of O2, CO2 and N2O through the soil profile is modeled using the gas transport equation solved on hourly or sub-hourly time steps. Bubbling equations also move N2O and N2 through the liquid phase to the soil surface under highly anaerobic conditions. We used results from a 2-yr field experiment conducted in 2007 and 2008 at a field site in southwest Michigan to test the ability of EPIC, with the IMWJ option, to capture the non-linear response of N2O fluxes as a function of increasing rates of N application to maize [Zea mays L.]. Nitrous oxide flux, soil inorganic N, and ancillary data from 2007 were used for EPIC calibration while 2008 data were used for independent model validation. Overall, EPIC reproduced well the timing and magnitude of N2O fluxes and NO3- mass in surficial soil layers after N
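The electron-acceptor accounting described above pairs Michaelis-Menten O2 uptake with a deficit that spills over to the oxides of nitrogen. A schematic sketch; the function names and the simple min-based split are assumptions for illustration, not EPIC/IMWJ code:

```python
def mm_uptake(conc, vmax, km):
    """Michaelis-Menten kinetic uptake rate, as used for O2 consumption
    by microbes and roots: v = vmax * C / (km + C)."""
    return vmax * conc / (km + conc)

def electron_acceptor_split(demand, o2_supply):
    """Split the electron-acceptor demand from C oxidation between O2 and
    oxides of N: electrons not met by O2 create a deficit that may be met
    by NO3-, NO2-, and N2O (schematic accounting).

    Returns (electrons accepted by O2, deficit passed to N oxides)."""
    met_by_o2 = min(demand, o2_supply)
    return met_by_o2, demand - met_by_o2
```

At half-saturation (C = km) the uptake rate is vmax/2, and any demand beyond the diffusive O2 supply is the term that drives denitrification in anaerobic microsites.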

  1. Rare B decays as tests of the Standard Model

    Science.gov (United States)

    Blake, Thomas; Lanfranchi, Gaia; Straub, David M.

    2017-01-01

One of the most interesting puzzles in particle physics today is that new physics is expected at the TeV energy scale to solve the hierarchy problem and stabilise the Higgs mass, but so far no unambiguous signal of new physics has been found. Strong constraints on the energy scale of new physics can be derived from precision tests of the electroweak theory and from flavour-changing or CP-violating processes in strange, charm and beauty hadron decays. Decays that proceed via flavour-changing-neutral-current processes are forbidden at the lowest perturbative order in the Standard Model and are, therefore, rare. Rare b hadron decays are playing a central role in the understanding of the underlying patterns of Standard Model physics and in setting up new directions in model building for new physics contributions. In this article the status and prospects of this field are reviewed.

  2. Relevant Criteria for Testing the Quality of Turbulence Models

    DEFF Research Database (Denmark)

    Frandsen, Sten; Jørgensen, Hans E.; Sørensen, John Dalsgaard

    2007-01-01

Seeking relevant criteria for testing the quality of turbulence models, the scale of turbulence and the gust factor have been estimated from data and compared with predictions from first-order models of these two quantities. It is found that the mean of the measured length scales is approx. 10% smaller than the IEC model, for wind turbine hub height levels. The mean is only marginally dependent on trends in time series. It is also found that the coefficient of variation of the measured length scales is about 50%. 3 sec and 10 sec pre-averaging of wind speed data are relevant for MW-size wind turbines when seeking wind characteristics that correspond to one blade and the entire rotor, respectively. For heights exceeding 50-60 m the gust factor increases with wind speed. For heights larger than 60-80 m, present assumptions on the value of the gust factor are significantly conservative, both for 3...

  3. Rare $B$ Decays as Tests of the Standard Model

    CERN Document Server

    Blake, Thomas; Straub, David M

    2016-01-01

    One of the most interesting puzzles in particle physics today is that new physics is expected at the TeV energy scale to solve the hierarchy problem, and stabilise the Higgs mass, but so far no unambiguous signal of new physics has been found. Strong constraints on the energy scale of new physics can be derived from precision tests of the electroweak theory and from flavour-changing or CP-violating processes in strange, charm and beauty hadron decays. Decays that proceed via flavour-changing-neutral-current processes are forbidden at the lowest perturbative order in the Standard Model and are, therefore, rare. Rare $b$ hadron decays are playing a central role in the understanding of the underlying patterns of Standard Model physics and in setting up new directions in model building for new physics contributions. In this article the status and prospects of this field are reviewed.

  4. Testing a Model of Work Performance in an Academic Environment

    Directory of Open Access Journals (Sweden)

    B. Charles Tatum

    2012-04-01

Full Text Available In modern society, people both work and study. The intersection between organizational and educational research suggests that a common model should apply to both academic and job performance. The purpose of this study was to apply a model of work and job performance (based on general expectancy theory) to a classroom setting, and test the predicted relationships using a causal/path model methodology. The findings revealed that motivation and ability predicted student expectations and self-efficacy, and that expectations and efficacy predicted class performance. Limitations, implications, and future research directions are discussed. This study showed how the research in industrial and organizational psychology is relevant to education. It was concluded that greater effort should be made to integrate knowledge across a wider set of domains.

  5. Large-Scale Tests of the DGP Model

    CERN Document Server

    Song, Y S; Hu, W; Song, Yong-Seon; Sawicki, Ignacy; Hu, Wayne

    2006-01-01

The self-accelerating braneworld model (DGP) can be tested from measurements of the expansion history of the universe and the formation of structure. Current constraints on the expansion history from supernova luminosity distances, the CMB, and the Hubble constant exclude the simplest flat DGP model at about 3σ. The best-fit open DGP model is, however, only a marginally poorer fit to the data than flat LCDM. Its substantially different expansion history raises structure formation challenges for the model. A dark-energy model with the same expansion history would predict a highly significant discrepancy with the baryon oscillation measurement due to the high Hubble constant required and a large enhancement of CMB anisotropies at the lowest multipoles due to the ISW effect. For the DGP model to satisfy these constraints new gravitational phenomena would have to appear at the non-linear and cross-over scales, respectively. A prediction of the DGP expansion history in a region where the phenomenology is well unde...
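The contrasting expansion histories discussed above can be sketched directly from the flat DGP and flat LCDM Friedmann relations; the choice Ωm = 0.3 is illustrative, not the paper's best fit:

```python
import math

def E_dgp(z, Om=0.3):
    """Flat self-accelerating DGP expansion rate H(z)/H0:
    E(z) = sqrt(O_rc) + sqrt(O_rc + Om*(1+z)^3), with the crossover
    density O_rc = ((1 - Om)/2)^2 fixed by flatness (E(0) = 1)."""
    Orc = ((1.0 - Om) / 2.0) ** 2
    return math.sqrt(Orc) + math.sqrt(Orc + Om * (1.0 + z) ** 3)

def E_lcdm(z, Om=0.3):
    """Flat LCDM expansion rate: E(z) = sqrt(Om*(1+z)^3 + 1 - Om)."""
    return math.sqrt(Om * (1.0 + z) ** 3 + 1.0 - Om)
```

At fixed Ωm both are normalized to E(0) = 1 but diverge at higher redshift, which is the handle that distance and baryon-oscillation data provide on the model.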

  6. Using Radiocarbon to Test Models of Ecosystem Carbon Cycling

    Science.gov (United States)

    Trumbore, S.; Lin, H.; Randerson, J.

    2007-05-01

The radiocarbon content of carbon stored in and respired by ecosystems provides a direct measure of ecosystem carbon dynamics that can be directly compared to model predictions. Because carbon cycles through ecosystems on a variety of timescales, the mean age of C in standing biomass and soil organic matter pools is older than the mean age of microbially respired carbon. In turn, each pathway for C transit through ecosystems may respond differently to edaphic conditions; for example, soil organic matter mean age is controlled by factors affecting stabilization of C on very long timescales, such as mineralogy, while a factor like litter quality that affects decomposition rates reflects vegetation and climate characteristics. We compare the radiocarbon signature of heterotrophically respired CO2 across a number of ecosystems with predictions from the CASA ecosystem model. The major controls of microbially respired CO2 from ecosystems include the residence time of C in living plant pools (i.e. the age of C in litter inputs to soil) and factors that control decomposition rates (litter quality and climate). Major differences between model and measured values at low latitudes are related to how woody debris pools are treated differently in models and measurements. The time lag between photosynthesis and respiration is a key ecosystem property that defines its potential to store or release carbon given variations in annual net primary production. Radiocarbon provides a rare case where models can be directly compared with measurements to provide a test of this parameter.

  7. Computational model for simulation small testing launcher, technical solution

    Energy Technology Data Exchange (ETDEWEB)

    Chelaru, Teodor-Viorel, E-mail: teodor.chelaru@upb.ro [University POLITEHNICA of Bucharest - Research Center for Aeronautics and Space, Str. Ghe Polizu, nr. 1, Bucharest, Sector 1 (Romania); Cristian, Barbu, E-mail: barbucr@mta.ro [Military Technical Academy, Romania, B-dul. George Coşbuc, nr. 81-83, Bucharest, Sector 5 (Romania); Chelaru, Adrian, E-mail: achelaru@incas.ro [INCAS -National Institute for Aerospace Research Elie Carafoli, B-dul Iuliu Maniu 220, 061126, Bucharest, Sector 6 (Romania)

    2014-12-10

The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT), used to test spatial equipment and scientific measurements. The computational model consists of a numerical simulation of the SLT evolution for different start conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical possibility of realizing a small multi-stage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. While classical suborbital sounding rockets are unguided, use solid-fuel motors, and follow an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing the creation of a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as the title shows, has two major objectives: first, a short-term objective, which consists in obtaining a suborbital launching system able to go into service in a predictable period of time; and second, a long-term objective consisting in the development and testing of some unconventional subsystems which will be integrated later into the satellite launcher as part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital

  8. Combinatorial QSAR modeling of chemical toxicants tested against Tetrahymena pyriformis.

    Science.gov (United States)

    Zhu, Hao; Tropsha, Alexander; Fourches, Denis; Varnek, Alexandre; Papa, Ester; Gramatica, Paola; Oberg, Tomas; Dao, Phuong; Cherkasov, Artem; Tetko, Igor V

    2008-04-01

Selecting the most rigorous quantitative structure-activity relationship (QSAR) approaches is of great importance in the development of robust and predictive models of chemical toxicity. To address this issue in a systematic way, we have formed an international virtual collaboratory consisting of six independent groups with shared interests in computational chemical toxicology. We have compiled an aqueous toxicity data set containing 983 unique compounds tested in the same laboratory over a decade against Tetrahymena pyriformis. A modeling set including 644 compounds was selected randomly from the original set and distributed to all groups that used their own QSAR tools for model development. The remaining 339 compounds in the original set (external set I) as well as 110 additional compounds (external set II) published recently by the same laboratory (after this computational study was already in progress) were used as two independent validation sets to assess the external predictive power of individual models. In total, our virtual collaboratory has developed 15 different types of QSAR models of aquatic toxicity for the training set. The internal prediction accuracy for the modeling set ranged from 0.76 to 0.93 as measured by the leave-one-out cross-validation correlation coefficient (Q^2_abs). The prediction accuracy for the external validation sets I and II ranged from 0.71 to 0.85 (linear regression coefficient R^2_abs,I) and from 0.38 to 0.83 (linear regression coefficient R^2_abs,II), respectively. The use of an applicability domain threshold implemented in most models generally improved the external prediction accuracy but at the same time led to a decrease in chemical space coverage. Finally, several consensus models were developed by averaging the predicted aquatic toxicity for every compound using all 15 models, with or without taking into account their respective applicability domains. We find that consensus models afford higher prediction accuracy for the
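The consensus scheme described above, averaging per-model predictions for each compound with an optional applicability-domain filter, can be sketched as follows (function and argument names are assumptions for illustration):

```python
def consensus_prediction(predictions, in_domain=None):
    """Consensus QSAR prediction for one compound: average the per-model
    predictions, optionally restricted to models whose applicability
    domain covers the compound.

    predictions -- predicted toxicity from each model
    in_domain   -- optional list of booleans, one per model
    """
    if in_domain is not None:
        predictions = [p for p, ok in zip(predictions, in_domain) if ok]
    if not predictions:
        return None  # compound falls outside every model's domain
    return sum(predictions) / len(predictions)
```

The tradeoff noted in the abstract is visible here: applying the domain filter can improve accuracy, but compounds covered by no model receive no prediction at all, shrinking chemical space coverage.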

  9. Atmospheric resuspension of radionuclides. Model testing using Chernobyl data

    Energy Technology Data Exchange (ETDEWEB)

    Garger, E.; Lev, T.; Talerko, N. [Inst. of Radioecology UAAS, Kiev (Ukraine); Galeriu, D. [Institute of Atomic Physics, Bucharest (Romania); Garland, J. [Consultant (United Kingdom); Hoffman, O.; Nair, S.; Thiessen, K. [SENES, Oak Ridge, TN (United States); Miller, C. [Centre for Disease Control, Atlanta, GA (United States); Mueller, H. [GSF - Inst. fuer Strahlenschultz, Neuherberg (Germany); Kryshev, A. [Moscow State Univ. (Russian Federation)

    1996-10-01

Resuspension can be an important secondary source of contamination after a release has stopped, as well as a source of contamination for people and areas not exposed to the original release. The inhalation of resuspended radionuclides contributes to the overall dose received by exposed individuals. Based on measurements collected after the Chernobyl accident, Scenario R was developed to provide an opportunity to test existing mathematical models of contamination resuspension. In particular, this scenario provided the opportunity to examine data and test models for atmospheric resuspension of radionuclides at several different locations from the release, to investigate resuspension processes on both local and regional scales, and to investigate the importance of seasonal variations of these processes. Participants in the test exercise were provided with information for three different types of locations: (1) within the 30-km zone, where local resuspension processes are expected to dominate; (2) a large urban location (Kiev) 120 km from the release site, where vehicular traffic is expected to be the dominant mechanism for resuspension; and (3) an agricultural area 40-60 km from the release site, where highly contaminated upwind 'hot spots' are expected to be important. Input information included characteristics of the ground contamination around specific sites, climatological data for the sites, characteristics of the terrain and topography, and locations of the sampling sites. Participants were requested to predict the average (quarterly and yearly) concentrations of 137Cs in air at specified locations due to resuspension of Chernobyl fallout; predictions for 90Sr and 239+240Pu were also requested for one location and time point. Predictions for specified resuspension factors and rates were also requested. Most participants used empirical models for the resuspension factor as a function of time K(t), as opposed to process-based models. While many of
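Empirical resuspension-factor models of the kind used by most participants typically express K(t) as an initial value decaying toward a long-term residual. The functional form and the constants below are illustrative assumptions, not the scenario's fitted values:

```python
import math

def resuspension_factor(t_days, K0=1e-6, lam=0.01, K_inf=1e-9):
    """Illustrative empirical resuspension factor K(t) in m^-1: an initial
    value K0 decaying exponentially (rate lam per day) toward a small
    long-term residual K_inf."""
    return (K0 - K_inf) * math.exp(-lam * t_days) + K_inf

def air_concentration(deposition, t_days):
    """Resuspended air concentration (Bq/m^3) from ground deposition (Bq/m^2):
    C_air = K(t) * deposition."""
    return resuspension_factor(t_days) * deposition
```

Multiplying K(t) by the measured ground deposition is exactly the step used to predict the quarterly and yearly 137Cs air concentrations requested of the participants.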

  10. An improved cognitive model of the Iowa and Soochow Gambling Tasks with regard to model fitting performance and tests of parameter consistency

    Science.gov (United States)

    Dai, Junyi; Kerestes, Rebecca; Upton, Daniel J.; Busemeyer, Jerome R.; Stout, Julie C.

    2015-01-01

    The Iowa Gambling Task (IGT) and the Soochow Gambling Task (SGT) are two experience-based risky decision-making tasks for examining decision-making deficits in clinical populations. Several cognitive models, including the expectancy-valence learning (EVL) model and the prospect valence learning (PVL) model, have been developed to disentangle the motivational, cognitive, and response processes underlying the explicit choices in these tasks. The purpose of the current study was to develop an improved model that can fit empirical data better than the EVL and PVL models and, in addition, produce more consistent parameter estimates across the IGT and SGT. Twenty-six opiate users (mean age 34.23; SD 8.79) and 27 control participants (mean age 35; SD 10.44) completed both tasks. Eighteen cognitive models varying in evaluation, updating, and choice rules were fit to individual data and their performances were compared to that of a statistical baseline model to find a best fitting model. The results showed that the model combining the prospect utility function treating gains and losses separately, the decay-reinforcement updating rule, and the trial-independent choice rule performed the best in both tasks. Furthermore, the winning model produced more consistent individual parameter estimates across the two tasks than any of the other models. PMID:25814963
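The winning model's three components, a prospect-type utility treating gains and losses separately, decay-reinforcement updating, and a trial-independent (softmax) choice rule, can be sketched as follows; the parameter values are illustrative, not the fitted estimates from the study:

```python
import math

def prospect_utility(x, alpha=0.5, lam=2.0):
    """Prospect-type utility: gains and losses valued separately, with
    loss aversion lam and curvature alpha."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

def decay_reinforcement(expectancies, chosen, utility, decay=0.8):
    """Decay-reinforcement updating: all deck expectancies decay each trial,
    and the chosen deck additionally absorbs the new outcome's utility."""
    new = [decay * e for e in expectancies]
    new[chosen] += utility
    return new

def softmax_choice(expectancies, theta=1.0):
    """Trial-independent softmax choice rule over deck expectancies."""
    exps = [math.exp(theta * e) for e in expectancies]
    total = sum(exps)
    return [e / total for e in exps]
```

One simulated trial chains the three pieces: evaluate the net payoff, update the four deck expectancies, then convert them into next-trial choice probabilities.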

  11. An Improved Cognitive Model of the Iowa and Soochow Gambling Tasks With Regard to Model Fitting Performance and Tests of Parameter Consistency

    Directory of Open Access Journals (Sweden)

    Junyi eDai

    2015-03-01

    Full Text Available The Iowa Gambling Task (IGT) and the Soochow Gambling Task (SGT) are two experience-based risky decision-making tasks for examining decision-making deficits in clinical populations. Several cognitive models, including the expectancy-valence learning model (EVL) and the prospect valence learning model (PVL), have been developed to disentangle the motivational, cognitive, and response processes underlying the explicit choices in these tasks. The purpose of the current study was to develop an improved model that can fit empirical data better than the EVL and PVL models and, in addition, produce more consistent parameter estimates across the IGT and SGT. Twenty-six opiate users (mean age 34.23; SD 8.79) and 27 control participants (mean age 35; SD 10.44) completed both tasks. Eighteen cognitive models varying in evaluation, updating, and choice rules were fit to individual data and their performances were compared to that of a statistical baseline model to find a best fitting model. The results showed that the model combining the prospect utility function treating gains and losses separately, the decay-reinforcement updating rule, and the trial-independent choice rule performed the best in both tasks. Furthermore, the winning model produced more consistent individual parameter estimates across the two tasks than any of the other models.
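The three rules of the winning model can be sketched as a log-likelihood computation over one subject's deck choices. The structure follows the abstract's description (separate gain/loss utility, decay-reinforcement updating, constant-sensitivity choice rule), but the parameter values and the shared gain/loss exponent below are illustrative assumptions, not the fitted estimates from the study.

```python
import math

def pvl_decay_loglik(choices, gains, losses,
                     alpha=0.5, lam=1.5, decay=0.8, c=1.0):
    """Log-likelihood of one subject's deck choices (decks indexed 0-3).

    Illustrative sketch of the winning model's rules:
      - prospect utility with gains and losses entering separately
      - decay-reinforcement updating of deck expectancies
      - trial-independent (constant-sensitivity) softmax choice rule
    """
    expectancies = [0.0] * 4
    theta = 3.0 ** c - 1.0          # sensitivity, constant across trials
    loglik = 0.0
    for deck, gain, loss in zip(choices, gains, losses):
        # probability of the observed choice under current expectancies
        weights = [math.exp(theta * e) for e in expectancies]
        loglik += math.log(weights[deck] / sum(weights))
        # prospect utility: gains and losses treated as separate terms
        utility = gain ** alpha - lam * (loss ** alpha)
        # decay-reinforcement: all expectancies decay, chosen deck reinforced
        expectancies = [decay * e for e in expectancies]
        expectancies[deck] += utility
    return loglik
```

Maximizing this quantity over (alpha, lam, decay, c) for each participant is the model-fitting step the study performed for each of its eighteen candidate rule combinations.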

  12. Potential Worst-case System for Testing EMI Filters Tested on Simple Filter Models

    Directory of Open Access Journals (Sweden)

    Z. Raida

    2008-09-01

    Full Text Available This paper deals with an approximate worst-case test method for testing the insertion loss of EMI filters. Systems with 0.1 Ω and 100 Ω impedances are usually used for this testing, as required by the international CISPR 17 standard. The main disadvantage of such a system is the use of two impedance transformers; the impedance transformer with 0.1 Ω output impedance in particular is not easy to produce, and these transformers usually have a narrow bandwidth. This paper discusses an alternative system with 1 Ω and 100 Ω impedances. The performance of these systems was tested on several filter models and the obtained data are presented, together with a performance comparison of several filters in several systems. The performance of the alternative worst-case system is discussed in the conclusion.

  13. Numerical modeling of Thermal Response Tests in Energy Piles

    Science.gov (United States)

    Franco, A.; Toledo, M.; Moffat, R.; Herrera, P. A.

    2013-05-01

    Nowadays, thermal response tests (TRT) are used as the main tools for the evaluation of low-enthalpy geothermal systems such as heat exchangers. The results of TRT are used for estimating thermal conductivity and thermal resistance values of those systems. We present results of synthetic TRT simulations that model the behavior observed in an experimental energy pile system, which was installed at the new building of the Faculty of Engineering of Universidad de Chile. Moreover, we also present a parametric study to identify the most influential parameters in the performance of this type of test. The modeling was developed using the finite element software COMSOL Multiphysics, which allows the incorporation of flow and heat transport processes. The modeled system consists of a concrete pile with 1 m diameter and 28 m depth, which contains a 28 mm diameter PEX pipe arranged in a closed circuit. Three configurations were analyzed: a U pipe, a triple U, and the helicoid shape implemented at the experimental site. All simulations were run considering transient response in a three-dimensional domain. The simulation results provided the temperature distribution on the pile for a set of different geometries and physical properties of the materials. These results were compared with analytical solutions which are commonly used to interpret TRT data. This analysis demonstrated that several parameters affect the system response in a synthetic TRT. For example, the diameter of the simulated pile affects the estimated effective thermal conductivity of the system. Moreover, the simulation results show that the estimated thermal conductivity for a 1 m diameter pile did not stabilize even 100 hours after the beginning of the test, when it reached a value 30% below the value used to set up the material properties in the simulation. Furthermore, we observed different behaviors depending on the thermal properties of concrete and soil. According to the simulations, the thermal

  14. Air injection test on a Kaplan turbine: prototype - model comparison

    Science.gov (United States)

    Angulo, M.; Rivetti, A.; Díaz, L.; Liscia, S.

    2016-11-01

    Air injection is a well-known resource for reducing pressure pulsation magnitude in turbines, especially of the Francis type. In the case of large Kaplan designs, even if less usual, it could be a solution to mitigate vibrations arising when the tip vortex cavitation phenomenon becomes erosive and induces structural vibrations. In order to study this alternative, aeration tests were performed on a Kaplan turbine at model and prototype scales. The research focused on the efficiency of different injected air flow rates in reducing vibrations, especially at the draft tube and the discharge ring, and also on the magnitude of the efficiency drop. It was found that results at both scales present the same trend, in particular for vibration levels at the discharge ring. The efficiency drop was overestimated in model tests, while on the prototype it was less than 0.2% for all power outputs. On the prototype, air has a beneficial effect in reducing pressure fluctuations up to an air flow rate of 0.2 ‰. On the model, high-speed image processing helped to quantify the volume of tip vortex cavitation, which is strongly correlated with the vibration level. The hydrophone measurements did not capture the cavitation intensity when air was injected; on the prototype, however, it was detected by a sonometer installed at the draft tube access gallery.

  15. Non-linear model for compression tests on articular cartilage.

    Science.gov (United States)

    Grillo, Alfio; Guaily, Amr; Giverso, Chiara; Federico, Salvatore

    2015-07-01

    Hydrated soft tissues, such as articular cartilage, are often modeled as biphasic systems with individually incompressible solid and fluid phases, and biphasic models are employed to fit experimental data in order to determine the mechanical and hydraulic properties of the tissues. Two of the most common experimental setups are confined and unconfined compression. Analytical solutions exist for the unconfined case with the linear, isotropic, homogeneous model of articular cartilage, and for the confined case with the non-linear, isotropic, homogeneous model. The aim of this contribution is to provide an easily implementable numerical tool to determine a solution to the governing differential equations of (homogeneous and isotropic) unconfined and (inhomogeneous and isotropic) confined compression under large deformations. The large-deformation governing equations are reduced to equivalent diffusive equations, which are then solved by means of finite difference (FD) methods. The solution strategy proposed here could be used to generate benchmark tests for validating complex user-defined material models within finite element (FE) implementations, and for determining the tissue's mechanical and hydraulic properties from experimental data.

  16. Applications of abduction: hypothesis testing of neuroendocrinological qualitative compartmental models.

    Science.gov (United States)

    Menzies, T; Compton, P

    1997-06-01

    It is difficult to assess hypothetical models in poorly measured domains such as neuroendocrinology. Without a large library of observations to constrain inference, the execution of such incomplete models implies making assumptions. Mutually exclusive assumptions must be kept in separate worlds. We define a general abductive multiple-worlds engine that assesses such models by (i) generating the worlds and (ii) testing whether these worlds contain known behaviour. World generation is constrained via the use of relevant envisionment. We describe QCM, a modeling language for compartmental models that can be processed by this inference engine. This tool has been used to find faults in theories published in international refereed journals; i.e., QCM can detect faults that are invisible to other methods. The generality and computational limits of this approach are discussed. In short, this approach is applicable to any representation that can be compiled into an and-or graph, provided the graphs are not too big or too intricate (fanout < 7).

  17. Test characteristics from latent-class models of the California Mastitis Test.

    Science.gov (United States)

    Sanford, C J; Keefe, G P; Sanchez, J; Dingwell, R T; Barkema, H W; Leslie, K E; Dohoo, I R

    2006-11-17

    We evaluated (using latent-class models) the ability of the California Mastitis Test (CMT) to identify cows with intramammary infections on the day of dry-off. The positive and negative predictive values of this test for identifying cows requiring dry-cow antibiotics (i.e. infected) were also assessed. We used 752 Holstein-Friesian cows from 11 herds for this investigation. Milk samples were collected for bacteriology, and the CMT was performed cow-side, prior to milking on the day of dry-off. At the cow-level, the sensitivity and specificity of the CMT (using the four quarter results interpreted in parallel) for identifying all pathogens were estimated at 70 and 48%, respectively. If only major pathogens were considered the sensitivity of the CMT increased to 86%. The negative predictive value of the CMT was >95% for herds with major-pathogen intramammary-infection prevalence CMT.

  18. Boron-10 ABUNCL Prototype Models And Initial Active Testing

    Energy Technology Data Exchange (ETDEWEB)

    Kouzes, Richard T.; Ely, James H.; Lintereur, Azaree T.; Siciliano, Edward R.

    2013-04-23

    The Department of Energy Office of Nuclear Safeguards and Security (NA-241) is supporting the project Coincidence Counting With Boron-Based Alternative Neutron Detection Technology at Pacific Northwest National Laboratory (PNNL) for the development of an alternative to the 3He-proportional-counter-based neutron coincidence counter. The goal of this project is to design, build and demonstrate a system based upon 10B-lined proportional tubes in a configuration typical for 3He-based coincidence counter applications. This report provides results from MCNPX model simulations and initial testing of the active-mode variation of the Alternative Boron-Based Uranium Neutron Coincidence Collar (ABUNCL) design built by General Electric Reuter-Stokes. Initial experimental testing of the as-delivered passive ABUNCL was previously reported.

  19. A Monte Carlo Simulation Framework for Testing Cosmological Models

    Directory of Open Access Journals (Sweden)

    Heymann Y.

    2014-10-01

    Full Text Available We tested alternative cosmologies using Monte Carlo simulations based on the sampling method of the zCosmos galactic survey. The survey encompasses a collection of observable galaxies with respective redshifts that have been obtained for a given spectroscopic area of the sky. Using a cosmological model, we can convert the redshifts into light-travel times and, by slicing the survey into small redshift buckets, compute a curve of galactic density over time. Because foreground galaxies obstruct the images of more distant galaxies, we simulated the theoretical galactic density curve using an average galactic radius. By comparing the galactic density curves of the simulations with that of the survey, we could assess the cosmologies. We applied the test to the expanding-universe cosmology of de Sitter and to a dichotomous cosmology.

  20. Tests of Parameters Instability: Theoretical Study and Empirical Applications on Two Types of Models (ARMA Model and Market Model)

    Directory of Open Access Journals (Sweden)

    Sahbi FARHANI

    2012-01-01

    Full Text Available This paper considers tests of parameter instability and structural change with known, unknown, or multiple breakpoints. The results apply to a wide class of parametric models that are suitable for estimation by strong rules for detecting the number of breaks in a time series. For that, we use the Chow, CUSUM, CUSUM of squares, Wald, likelihood ratio, and Lagrange multiplier tests. Each test implicitly uses an estimate of a change point. We conclude with an empirical analysis of two different models (an ARMA model and a simple linear regression model).
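Of the tests listed in this record, the Chow test for a break at a known point is the simplest to sketch. The implementation below is a generic illustration of that test (pooled versus split OLS fits), not code from the paper.

```python
import numpy as np

def chow_test(y, X, split):
    """F statistic for a structural break at a known index `split`.

    Compares the pooled OLS residual sum of squares against the sum of
    RSS from fitting the two sub-samples separately; a large F indicates
    parameter instability at the breakpoint.
    """
    def rss(y_part, X_part):
        beta, *_ = np.linalg.lstsq(X_part, y_part, rcond=None)
        resid = y_part - X_part @ beta
        return float(resid @ resid)

    n, k = X.shape
    rss_pooled = rss(y, X)
    rss_split = rss(y[:split], X[:split]) + rss(y[split:], X[split:])
    return ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))
```

Under the null of no break, the statistic follows an F(k, n - 2k) distribution; a level shift at the midpoint of a simulated series yields an F far in the rejection region, while a stable series does not.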

  1. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The modeling of the Amchitka underground nuclear tests conducted in 2002 is verified, and uncertainty in model input parameters, as well as predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to map the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment: instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between the new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
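The backward-then-forward propagation described above can be illustrated with a minimal Metropolis sampler. The Gaussian prior and the five toy observations below are assumptions for illustration only, not the Amchitka groundwater model.

```python
import math, random

def metropolis(logpost, x0, n_samples, step=0.5, seed=1):
    """Minimal Metropolis sampler: draws converge to the posterior, so
    uncertainty propagates backward from data onto the parameter."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    draws = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        lp_prop = logpost(proposal)
        if math.log(rng.random()) < lp_prop - lp:   # accept/reject
            x, lp = proposal, lp_prop
        draws.append(x)
    return draws

# Toy conditioning problem (assumed): N(0, 1) prior on a parameter,
# five observations with unit measurement variance.
data = [1.8, 2.1, 2.3, 1.9, 2.2]

def log_posterior(mu):
    log_prior = -0.5 * mu * mu
    log_lik = -0.5 * sum((d - mu) ** 2 for d in data)
    return log_prior + log_lik

draws = metropolis(log_posterior, 0.0, 20000)
posterior_mean = sum(draws[5000:]) / len(draws[5000:])   # discard burn-in
```

For this conjugate toy case the stationary posterior mean has a closed form, sum(data) / (1 + len(data)), which the chain's post-burn-in average should approach; in the Amchitka application the same machinery is applied to parameters without closed-form posteriors.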

  2. Testing and Modeling of Mechanical Characteristics of Resistance Welding Machines

    DEFF Research Database (Denmark)

    Wu, Pei; Zhang, Wenqi; Bay, Niels;

    2003-01-01

    The dynamic mechanical response of a resistance welding machine is very important to the weld quality in resistance welding, especially in projection welding when collapse or deformation of the work piece occurs. It is mainly governed by the mechanical parameters of the machine. In this paper, a mathematical...... for both upper and lower electrode systems. This has laid a foundation for modeling the welding process and selecting the welding parameters considering the machine factors. The method is straightforward and easy to apply in industry, since the whole procedure is based on tests with no requirements

  3. Model-based optimization of phased array ultrasonic testing

    Institute of Scientific and Technical Information of China (English)

    Sung-Jin Song; Hak-Joon Kim; Suk-Chull Kang; Sung-Sik Kang; Kyungcho Kim; Myung-Ho Song

    2010-01-01

    Simulation of phased array beams in dovetail and austenitic welds is conducted to optimize the setup of phased array ultrasonic testing (PAUT). To simulate the beam in such materials, with complex geometry or with the characteristics of anisotropy and inhomogeneity, firstly, linear phased multi-Gaussian beam (LPMGB) models are introduced and discussed. Then, in the case of the dovetail, a wedge is designed to maximize the stable amplitude of the beam along the steering path; in the case of the austenitic weld, a modified focal law...

  4. Testing the hadro-quarkonium model on the lattice

    CERN Document Server

    Knechtli, Francesco; Bali, Gunnar S; Collins, Sara; Moir, Graham; Söldner, Wolfgang

    2016-01-01

    Recently the LHCb experiment found evidence for the existence of two exotic resonances consisting of $c\\bar{c}uud$ quarks. Among the possible interpretations is the hadro-charmonium model, in which charmonium is bound "within" a light hadron. We test this idea on CLS $N_f$=2+1 lattices using the static formulation for the heavy quarks. We find that the static potential is modified by the presence of a hadron such that it becomes more attractive. The effect is of the order of a few MeV.

  5. A gravitational test of wave reinforcement versus fluid density models

    Science.gov (United States)

    Johnson, Jacqueline Umstead

    1990-10-01

    Spermatozoa, protozoa, and algae form macroscopic patterns somewhat analogous to thermally driven convection cells. These bioconvective patterns have attracted interest in the fluid dynamics community, but whether these patterns were in all cases gravity driven was unknown. There are two conflicting theories, one gravity dependent (the fluid density model), the other gravity independent (the wave reinforcement theory). The primary objectives of the summer faculty fellows were to: (1) assist in sample collection (spermatozoa) and preparation for the KC-135 research airplane experiment; and (2) collaborate on ground testing of bioconvective variables such as motility, concentration, morphology, etc., in relation to their macroscopic patterns. Results are given very briefly.

  6. Shipboard Medical Backpack: Preproduction Model Test and Evaluation.

    Science.gov (United States)

    1982-06-01

    Technical Report 737 (NOSC/TR-737), Naval Ocean Systems Center, San Diego, CA; R. W. Kataoka and F. R. Borkat, June 1982. Shipboard Medical Backpack: Preproduction Model Test and Evaluation. Evaluation of preproduction shipboard medical backpack units by the medical departments of 29 ships during a two-month period from June to August 1981. The

  7. Computer modeling of test particle acceleration at oblique shocks

    Science.gov (United States)

    Decker, Robert B.

    1988-01-01

    The present evaluation of the basic techniques and illustrative results of charged-particle-modeling numerical codes suitable for particle acceleration at oblique, fast-mode collisionless shocks emphasizes the treatment of ions as test particles, calculating particle dynamics through numerical integration along exact phase-space orbits. Attention is given to the acceleration of particles at planar, infinitesimally thin shocks, as well as to plasma simulations in which low-energy ions are injected and accelerated at quasi-perpendicular shocks with internal structure.

  8. Generative Temporal Modelling of Neuroimaging - Decomposition and Nonparametric Testing

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff

    The goal of this thesis is to explore two improvements for functional magnetic resonance imaging (fMRI) analysis; namely our proposed decomposition method and an extension to the non-parametric testing framework. Analysis of fMRI allows researchers to investigate the functional processes...... of the brain, and provides insight into neuronal coupling during mental processes or tasks. The decomposition method is a Gaussian process-based independent components analysis (GPICA), which incorporates a temporal dependency in the sources. A hierarchical model specification is used, featuring both...

  9. Computerized Adaptive Testing: A Comparison of the Nominal Response Model and the Three Parameter Logistic Model.

    Science.gov (United States)

    DeAyala, R. J.; Koch, William R.

    A nominal response model-based computerized adaptive testing procedure (nominal CAT) was implemented using simulated data. Ability estimates from the nominal CAT were compared to those from a CAT based upon the three-parameter logistic model (3PL CAT). Furthermore, estimates from both CAT procedures were compared with the known true abilities used…

  10. Modeling Pacing Behavior and Test Speededness Using Latent Growth Curve Models

    Science.gov (United States)

    Kahraman, Nilufer; Cuddy, Monica M.; Clauser, Brian E.

    2013-01-01

    This research explores the usefulness of latent growth curve modeling in the study of pacing behavior and test speededness. Examinee response times from a high-stakes, computerized examination, collected before and after the examination was subjected to a timing change, were analyzed using a series of latent growth curve models to detect…
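A two-stage approximation to the growth-curve idea (per-examinee OLS slopes of response time over item position, then averaging across examinees) can be sketched as follows. The simulated data, the downward drift, and all numeric values are assumptions for illustration, not the examination data from the study, and this is a simplification of a full latent growth curve model.

```python
import random

def ols_slope(ys):
    """OLS slope of a sequence against its 0-based index (item position)."""
    n = len(ys)
    xbar = (n - 1) / 2.0
    ybar = sum(ys) / n
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(ys))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

rng = random.Random(0)
# Hypothetical data: 200 examinees x 40 items; per-item response time (s)
# drifts downward across the test (pacing), with person-specific slopes.
per_person_slopes = []
for _ in range(200):
    true_slope = rng.gauss(-1.2, 0.6)      # seconds per item position
    times = [60.0 + true_slope * i + rng.gauss(0.0, 2.0) for i in range(40)]
    per_person_slopes.append(ols_slope(times))
mean_slope = sum(per_person_slopes) / len(per_person_slopes)
```

A systematically negative mean slope, or a shift in the slope distribution after a timing change, is the kind of pacing/speededness signal the latent growth curve analysis formalizes with random intercepts and slopes.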

  11. A field test of a simple stochastic radiative transfer model

    Energy Technology Data Exchange (ETDEWEB)

    Byrne, N. [Science Applications International Corp., San Diego, CA (United States)

    1995-09-01

    The problem of determining the effect of clouds on the radiative energy balance of the globe is of well-recognized importance. One can in principle solve the problem for any given configuration of clouds using numerical techniques. This knowledge is not useful however, because of the amount of input data and computer resources required. Besides, we need only the average of the resulting solution over the grid scale of a general circulation model (GCM). Therefore, we are interested in estimating the average of the solutions of such fine-grained problems using only coarse grained data, a science or art called stochastic radiation transfer. Results of the described field test indicate that the stochastic description is a somewhat better fit to the data than is a fractional cloud cover model, but more data are needed. 1 ref., 3 figs.

  12. Azimuthal anisotropies as stringent test for nuclear transport models

    Science.gov (United States)

    Crochet, P.; Rami, F.; Donà, R.; Coffin, J. P.; Fintz, P.; Guillaume, G.; Jundt, F.; Kuhn, C.; Roy, C.; de Schauenburg, B.; Tizniti, L.; Wagner, P.; Alard, J. P.; Andronic, A.; Basrak, Z.; Bastid, N.; Belyaev, I.; Bendarag, A.; Berek, G.; Best, D.; Biegansky, J.; Buta, A.; Čaplar, R.; Cindro, N.; Dupieux, P.; Dželalija, M.; Fan, Z. G.; Fodor, Z.; Fraysse, L.; Freifelder, R. P.; Gobbi, A.; Herrmann, N.; Hildenbrand, K. D.; Hong, B.; Jeong, S. C.; Kecskemeti, J.; Kirejczyk, M.; Koncz, P.; Korolija, M.; Kotte, R.; Lebedev, A.; Leifels, Y.; Manko, V.; Moisa, D.; Mösner, J.; Neubert, W.; Pelte, D.; Petrovici, M.; Pinkenburg, C.; Reisdorf, W.; Ritman, J. L.; Sadchikov, A. G.; Schüll, D.; Seres, Z.; Sikora, B.; Simion, V.; Siwek-Wilczyńska, K.; Sodan, U.; Teh, K. M.; Trzaska, M.; Wang, G. S.; Wessels, J. P.; Wienold, T.; Wisniewski, K.; Wohlfarth, D.; Zhilin, A.; Hartnack, C.; FOPI Collaboration

    1997-02-01

    Azimuthal distributions of charged particles and intermediate-mass fragments emitted in Au+Au collisions at 600 A MeV have been measured using the FOPI facility at GSI-Darmstadt. The data show a strong increase of the in-plane azimuthal anisotropy ratio with the charge of the detected fragment. Intermediate-mass fragments are found to exhibit a strong momentum-space alignment with respect to the reaction plane. The experimental results are presented as a function of the polar centre-of-mass angle and over a broad range of impact parameters. They are compared to the predictions of the Isospin Quantum Molecular Dynamics model using three different parametrisations of the equation of state. We show that such highly accurate data provide a stringent test for microscopic transport models and can potentially constrain separately the stiffness of the nuclear equation of state and the momentum dependence of the nuclear interaction.

  13. Stereovision vibration measurement test of a masonry building model

    Science.gov (United States)

    Shan, Baohua; Gao, Yunli; Shen, Yu

    2016-04-01

    To monitor 3D deformations of structural vibration response, a stereovision-based 3D deformation measurement method is proposed in this paper. The world coordinate system is established on the structural surface, and 3D displacement equations of structural vibration response are acquired through coordinate transformation. Algorithms for edge detection, center fitting, and matching constraints are developed for the circular targets. A shaking table test of a masonry building model under the Taft and El Centro earthquakes at different peak accelerations was performed in the laboratory, and 3D displacement time histories of the model were acquired by the integrated stereovision measurement system. In-plane displacement curves obtained by the two methods show good agreement, which suggests that the proposed method is reliable for monitoring structural vibration response. Out-of-plane displacement curves indicate that the proposed method is feasible and useful for monitoring 3D deformations of vibration response.

  14. Empirical testing of earthquake recurrence models at source and site

    Science.gov (United States)

    Albarello, D.; Mucciarelli, M.

    2012-04-01

    Several probabilistic procedures are presently available for probabilistic seismic hazard assessment (PSHA), based on time-dependent or time-independent models. The result is a number of different outcomes (hazard maps); to take into account the inherent (epistemic) uncertainty, the outcomes of alternative procedures are combined in the frame of logic-tree approaches by scoring each procedure as a function of its reliability. This is deduced by evaluating ex-ante (by expert judgement) each element concurring in the relevant PSH computational procedure. This approach appears unsatisfactory, also because the value of each procedure depends both on the reliability of each concurring element and on that of their combination: thus, checking the correctness of single elements does not allow evaluating the correctness of the procedure as a whole. Alternative approaches should be based (1) on the ex-post empirical testing of the considered PSH computational models and (2) on the validation of the assumptions underlying concurrent models. The first goal can be achieved by comparing the probabilistic forecasts provided by each model with empirical evidence on seismic occurrences (e.g., strong-motion data or macroseismic intensity evaluations) during selected control periods of duration comparable with the relevant exposure time. Regarding the validation of assumptions, critical issues are the size of the minimum data set necessary to distinguish processes with or without memory, the reliability of mixed data on seismic sources (i.e., historical and palaeoseismological), and the completeness of fault catalogues. Some results obtained by the application of these testing procedures in Italy will be shortly outlined.

  15. Testing a fall risk model for injection drug users.

    Science.gov (United States)

    Pieper, Barbara; Templin, Thomas N; Goldberg, Allon

    2012-01-01

    Fall risk is a critical component of clinical assessment and has not been examined for persons who have injected illicit drugs and are aging. The aim of this study was to test and develop the Fall Risk Model for Injection Drug Users by examining the relationships among injection drug use, chronic venous insufficiency, lower extremity impairments (i.e., decreased ankle range of motion, reduced calf muscle endurance, and leg pain), age and other covariates, and the Tinetti balance and gait total score as a measure of fall risk. A cross-sectional comparative design was used with four crossed factors. Standardized instruments were used to assess the variables. Moderated multiple regression with linear and quadratic trends in age was used to examine the nature of the relationship between the Tinetti balance and gait total score and age, and the potential moderating role of injection drug use. A prespecified series of models was tested. Participants (n = 713) were men (46.9%) and women with a mean age of 46.26 years, primarily African American (61.7%), in methadone treatment centers. The fall risk of a 48-year-old leg injector was comparable with the fall risk of a 69-year-old who had not injected drugs. Variables were added to the model sequentially, with some losing significance once they were explained by subsequently added variables. The final significant variables in the model were employment status, number of comorbidities, ankle range of motion, leg pain, and calf muscle endurance. Fall risk was associated with route of drug use. Lower extremity impairments accounted for the effects of injection drug use and chronic venous insufficiency on risk for falls. Further understanding of fall risk in injection users is necessary as they age, attempt to work, and participate in activities.
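A moderated regression of this general shape can be sketched as below. All variable names, coefficients, and simulated data are hypothetical, chosen only to show how an injection-by-age interaction term captures the moderation effect, alongside linear and quadratic age trends.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
age = rng.uniform(30.0, 65.0, n)
injector = rng.integers(0, 2, n).astype(float)
# Hypothetical generating model: a balance/gait score declines with age,
# and declines faster for injectors (the moderation effect of interest).
score = (28.0 - 0.10 * age
         - 0.15 * injector * (age - 30.0)
         + rng.normal(0.0, 1.0, n))

age_c = age - age.mean()          # center age to reduce collinearity
X = np.column_stack([
    np.ones(n),                   # intercept
    age_c,                        # linear age trend
    age_c ** 2,                   # quadratic age trend
    injector,                     # injection-drug-use indicator
    injector * age_c,             # moderation: use-by-age interaction
])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
interaction = beta[4]             # negative => steeper decline for injectors
```

A reliably negative interaction coefficient is the regression analogue of the abstract's headline comparison: an injector reaches a given fall-risk level at a younger age than a non-injector.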

  16. Propeller aircraft interior noise model. II - Scale-model and flight-test comparisons

    Science.gov (United States)

    Willis, C. M.; Mayes, W. H.

    1987-01-01

    A program for predicting the sound levels inside propeller driven aircraft arising from sidewall transmission of airborne exterior noise is validated through comparisons of predictions with both scale-model test results and measurements obtained in flight tests on a turboprop aircraft. The program produced unbiased predictions for the case of the scale-model tests, with a standard deviation of errors of about 4 dB. For the case of the flight tests, the predictions revealed a bias of 2.62-4.28 dB (depending upon whether or not the data for the fourth harmonic were included) and the standard deviation of the errors ranged between 2.43 and 4.12 dB. The analytical model is shown to be capable of taking changes in the flight environment into account.

  17. Testing flow diversion in animal models: a systematic review.

    Science.gov (United States)

    Fahed, Robert; Raymond, Jean; Ducroux, Célina; Gentric, Jean-Christophe; Salazkin, Igor; Ziegler, Daniela; Gevry, Guylaine; Darsaut, Tim E

    2016-04-01

    Flow diversion (FD) is increasingly used to treat intracranial aneurysms. We sought to systematically review published studies to assess the quality of reporting and summarize the results of FD in various animal models. Databases were searched to retrieve all animal studies on FD from 2000 to 2015. Extracted data included species and aneurysm models, aneurysm and neck dimensions, type of flow diverter, occlusion rates, and complications. Articles were evaluated using a checklist derived from the Animal Research: Reporting of In Vivo Experiments (ARRIVE) guidelines. Forty-two articles reporting the results of FD in nine different aneurysm models were included. The rabbit elastase-induced aneurysm model was the most commonly used, with 3-month occlusion rates of 73.5%, (95%CI [61.9-82.6%]). FD of surgical sidewall aneurysms, constructed in rabbits or canines, resulted in high occlusion rates (100% [65.5-100%]). FD resulted in modest occlusion rates (15.4% [8.9-25.1%]) when tested in six complex canine aneurysm models designed to reproduce more difficult clinical contexts (large necks, bifurcation, or fusiform aneurysms). Adverse events, including branch occlusion, were rarely reported. There were no hemorrhagic complications. Articles complied with 20.8 ± 3.9 of 41 ARRIVE items; only a small number used randomization (3/42 articles [7.1%]) or a control group (13/42 articles [30.9%]). Preclinical studies on FD have shown various results. Occlusion of elastase-induced aneurysms was common after FD. The model is not challenging but standardized in many laboratories. Failures of FD can be reproduced in less standardized but more challenging surgical canine constructions. The quality of reporting could be improved.

  18. On modeling approach for embedded real-time software simulation testing

    Institute of Scientific and Technical Information of China (English)

    Yin Yongfeng; Liu Bin; Zhong Deming; Jiang Tongrain

    2009-01-01

Modeling technology has been introduced into the software testing field. However, how to carry out testing modeling effectively remains a difficulty. Based on a combination of simulation modeling technology and embedded real-time software testing methods, the process of simulation testing modeling is studied first. Then, the supporting environment for simulation testing modeling is put forward. Furthermore, an approach to embedded real-time software simulation testing modeling is presented, covering the modeling of cross-linked equipment of the system under test (SUT), test cases, testing scheduling, and testing system services. Finally, the formalized description and execution system of the testing models are given, with which real-time, closed-loop, and automated system testing for embedded real-time software can be realized.

  19. Global test of seismic static stress triggering model

    Institute of Scientific and Technical Information of China (English)

    万永革; 吴忠良; 周公威; 黄静; 秦立新

    2002-01-01

The seismic static stress triggering model is tested using the Harvard centroid moment tensor (CMT) solution catalogue for 1976-2000 and the concept of the "earthquake doublet". The result shows that a seismic static stress triggering effect does exist in the view of global earthquakes, but the effect is very weak. Dividing the earthquakes into thrust, normal, and strike-slip focal mechanisms, we find that earthquakes with non-strike-slip focal mechanisms show a significant triggering effect, whereas the triggering effect for strike-slip earthquakes is not obvious. Dividing the delay times of the subsequent events of the "earthquake doublets" into five classes, t ≥ 1, t < 1, t ≥ 10, t < 10, and 1 ≤ t ≤ 10 (t in units of days), shows that the seismic static stress triggering effect does not change with delay time in the short period after earthquakes. Research on seismic static stress triggering in different regions of the world indicates that the triggering effect is significant in subduction belts. The seismic static stress triggering model is also tested using "earthquake doublets" in China and its adjacent region. The result indicates that the triggering effect cannot easily be observed in China and its adjacent region because of the prevailing seismic focal mechanism type (most of the earthquakes are strike-slip events).

  20. Modeling transient streaming potentials in falling-head permeameter tests.

    Science.gov (United States)

    Malama, Bwalya; Revil, André

    2014-01-01

We present transient streaming potential data collected during falling-head permeameter tests performed on samples of two sands with different physical and chemical properties. The objective of the work is to estimate the hydraulic conductivity (K) and the electrokinetic coupling coefficient (Cl) of the sand samples. A semi-empirical model based on the falling-head permeameter flow model and electrokinetic coupling is used to analyze the streaming potential data and to estimate K and Cl. The values of K estimated from head data are used to validate the streaming potential method. Estimates of K from streaming potential data closely match those obtained from the associated head data, with less than 10% deviation. The electrokinetic coupling coefficient was estimated from streaming potential vs. (1) time and (2) head data for both sands. The results indicate that, within limits of experimental error, the values of Cl estimated by the two methods are essentially the same. The results of this work demonstrate that a temporal record of the streaming potential response in falling-head permeameter tests can be used to estimate both K and Cl. They further indicate the potential for using transient streaming potential data as a proxy for hydraulic head in hydrogeology applications.
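The falling-head analysis behind this record rests on the standard solution h(t) = h0·exp(−KAt/(aL)), for standpipe area a, sample cross-section A, and sample length L; because the streaming potential is proportional to head through the coupling coefficient, the same fit can in principle be applied to a voltage record. A minimal sketch with hypothetical geometry (not the paper's apparatus):

```python
import math

# hypothetical permeameter geometry (SI units): standpipe area a,
# sample cross-section A, sample length L -- illustrative values only
a, A, L = 1.0e-4, 5.0e-3, 0.10

def head(t, K, h0=1.0):
    """Falling-head solution h(t) = h0 * exp(-K*A*t / (a*L))."""
    return h0 * math.exp(-K * A * t / (a * L))

def estimate_K(times, heads, h0=1.0):
    """Least-squares slope of ln(h0/h) vs t equals K*A/(a*L)."""
    y = [math.log(h0 / h) for h in heads]
    sxx = sum(t * t for t in times)
    sxy = sum(t * yi for t, yi in zip(times, y))
    slope = sxy / sxx              # regression through the origin
    return slope * a * L / A

K_true = 1.0e-4                    # m/s, a typical clean-sand value
times = [10.0 * i for i in range(1, 11)]
heads = [head(t, K_true) for t in times]
print(f"recovered K = {estimate_K(times, heads):.3e} m/s")
```

Replacing the head record with a streaming-potential record ψ(t) = Cl·h(t) leaves the slope, and hence the recovered K, unchanged, which is the basis of the proxy argument in the abstract.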

  1. Alcock-Paczynski Test with Model-independent BAO Data

    CERN Document Server

    Melia, Fulvio

    2015-01-01

    Cosmological tests based on the statistical analysis of galaxy distributions are usually dependent on the evolution of the sources. An exception is the Alcock-Paczynski (AP) test, which is based on the changing ratio of angular to spatial/redshift size of (presumed) spherically-symmetric source distributions with distance. Intrinsic redshift distortions due to gravitational effects may also have an influence, but there is now a way to overcome them: with the inclusion in the AP test of an observational signature with a sharp feature, such as the Baryonic Acoustic Oscillation (BAO) peak. Redshift distortions affect only the amplitude of the peak, not its position. As we will show here, the use of this diagnostic, with newly acquired data on the anisotropic distribution of the BAO peaks from SDSS-III/BOSS-DR11 at average redshifts 0.57 and 2.34, strongly disfavours the current concordance (LCDM) model, which is discarded at the 3-sigma level. A statistically acceptable fit to the AP data with wCDM (the version ...
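The AP observable has the convenient property that the Hubble constant cancels: F(z) = (1+z)·D_A(z)·H(z)/c = E(z)·∫₀ᶻ dz′/E(z′). A sketch evaluating it for flat ΛCDM at the two survey redshifts mentioned above (Ωm = 0.3 is an assumed illustrative value, not a fit from the paper):

```python
import math

def E(z, om=0.3):
    """Dimensionless Hubble rate E(z) = H(z)/H0 for flat LCDM."""
    return math.sqrt(om * (1 + z)**3 + (1 - om))

def F_AP(z, om=0.3, n=2000):
    """AP observable F(z) = (1+z) D_A(z) H(z) / c.

    For flat LCDM this reduces to E(z) * integral_0^z dz'/E(z'),
    so H0 drops out; trapezoidal integration is used here.
    """
    zs = [z * i / n for i in range(n + 1)]
    integ = sum((1 / E(zs[i], om) + 1 / E(zs[i + 1], om))
                * (zs[i + 1] - zs[i]) / 2 for i in range(n))
    return E(z, om) * integ

for z in (0.57, 2.34):              # BOSS-DR11 effective redshifts
    print(f"z = {z}: F_AP = {F_AP(z):.3f}")
```

Comparing such predicted F(z) values against the anisotropy of the measured BAO feature is the essence of the test described in the abstract.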

  2. Model Checking Vector Addition Systems with one zero-test

    CERN Document Server

Bonnet, Rémi; Leroux, Jérôme; Zeitoun, Marc

    2012-01-01

    We design a variation of the Karp-Miller algorithm to compute, in a forward manner, a finite representation of the cover (i.e., the downward closure of the reachability set) of a vector addition system with one zero-test. This algorithm yields decision procedures for several problems for these systems, open until now, such as place-boundedness or LTL model-checking. The proof techniques to handle the zero-test are based on two new notions of cover: the refined and the filtered cover. The refined cover is a hybrid between the reachability set and the classical cover. It inherits properties of the reachability set: equality of two refined covers is undecidable, even for usual Vector Addition Systems (with no zero-test), but the refined cover of a Vector Addition System is a recursive set. The second notion of cover, called the filtered cover, is the central tool of our algorithms. It inherits properties of the classical cover, and in particular, one can effectively compute a finite representation of this set, e...
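For background, the classical Karp-Miller construction that the paper adapts computes, for a plain vector addition system (no zero-test), a finite set of (possibly ω-)markings whose downward closure is the cover. A compact sketch, with a made-up two-counter example:

```python
from math import inf   # inf plays the role of the omega symbol

def leq(a, b):
    """Componentwise comparison of markings."""
    return all(x <= y for x, y in zip(a, b))

def karp_miller(init, transitions):
    """Coverability-tree markings for a plain vector addition system:
    forward exploration with omega-acceleration whenever a branch
    strictly dominates one of its ancestors."""
    seen = set()
    stack = [(init, [])]                      # (marking, branch ancestors)
    while stack:
        m, anc = stack.pop()
        for a in anc:                         # acceleration step
            if leq(a, m) and a != m:
                m = tuple(inf if x > y else x for x, y in zip(m, a))
        if m in seen:
            continue
        seen.add(m)
        for t in transitions:
            succ = tuple(x if x == inf else x + d for x, d in zip(m, t))
            if all(x >= 0 for x in succ):     # transition is firable
                stack.append((succ, anc + [m]))
    return seen

# hypothetical 2-counter system: t1 produces a token in counter 1,
# t2 moves a token from counter 1 to counter 2
cover = karp_miller((0, 0), [(1, 0), (-1, 1)])
print(sorted(cover))
print(any(leq((5, 5), m) for m in cover))    # is (5,5) coverable?
```

The zero-test is precisely what this simple acceleration cannot handle soundly, which is why the paper introduces the refined and filtered covers.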

  3. Evaluation of fracture models through pressurized-thermal-shock testing

    Energy Technology Data Exchange (ETDEWEB)

    Pugh, C.E.; Bryan, R.H.; Bass, B.R.; Nanstad, R.K.

    1988-01-01

Two multiple-transient pressurized-thermal-shock experiments (PTSEs) have been conducted under the NRC-sponsored Heavy-Section Steel Technology (HSST) program. The first test (PTSE-1) employed an SA-508 class 2 steel with a high Charpy upper-shelf energy level and a relatively high brittle-to-ductile transition temperature. The second test (PTSE-2) used a 2 1/4 Cr-1 Mo steel (SA-387 grade 22) that had been given a special heat treatment to yield a low Charpy upper-shelf energy level and attendant low tearing resistance. Each experiment included two combined thermal and pressure transients that gave rise to propagation and arrest of an initial long flaw extending about 10% through the thick wall of the test cylinder. Both materials exhibited the ability to inhibit crack propagation by warm prestressing, high initiation toughness values, and high crack-arrest toughness values. Cleavage initiation and arrest are modeled well by available fracture theories. However, calculations of ductile tearing based on resistance curves did not consistently predict the observed tearing.

  4. Test and Sensitivity Analysis of Hydrological Modeling in the Coupled WRF-Urban Modeling System

    Science.gov (United States)

    Wang, Z.; yang, J.

    2013-12-01

Rapid urbanization has emerged as the source of many adverse effects that challenge the environmental sustainability of cities under changing climatic patterns. One essential key to addressing these challenges is to physically resolve the dynamics of urban-land-atmosphere interactions. To investigate the impact of urbanization on regional climate, a physically based single-layer urban canopy model (SLUCM) has been developed and implemented in the Weather Research and Forecasting (WRF) platform. However, due to the lack of a realistic representation of urban hydrological processes, simulation of urban climatology by the current coupled WRF-SLUCM is inevitably inadequate. Aiming at improving the accuracy of simulations, we recently implemented urban hydrological processes into the model, including (1) anthropogenic latent heat, (2) urban irrigation, (3) evaporation over impervious surfaces, and (4) the urban oasis effect. In addition, we couple a green roof system into the model to evaluate its capacity to alleviate the urban heat island effect at regional scale. Driven by different meteorological forcings, offline tests show that the enhanced model is more accurate in predicting turbulent fluxes arising from built terrain. Though the coupled WRF-SLUCM has been extensively tested against various field measurement datasets, an accurate input parameter space needs to be specified for good model performance. As realistic measurements of all input parameters to the modeling framework are rarely possible, understanding the model sensitivity to individual parameters is essential to determine the relative importance of parameter uncertainty to model performance. Thus we further use an advanced Monte Carlo approach to quantify the relative sensitivity of the input parameters of the hydrological model, in particular the performance of two widely used soil hydraulic models, namely the van Genuchten model (based on generic soil physics) and the empirical CHC model currently adopted in WRF.
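The van Genuchten retention model mentioned above relates volumetric water content θ to pressure head h via θ(h) = θr + (θs − θr)·[1 + (α|h|)ⁿ]^(−m) with m = 1 − 1/n. A sketch with illustrative loam-like parameters (not the values used in the WRF study):

```python
def van_genuchten_theta(h, theta_r=0.05, theta_s=0.43, alpha=3.6, n=1.56):
    """Volumetric water content theta(h) for pressure head h in metres
    (negative in unsaturated soil), after van Genuchten (1980).
    Parameter values are illustrative, roughly a loam."""
    if h >= 0:                                    # saturated soil
        return theta_s
    m = 1.0 - 1.0 / n
    Se = (1.0 + (alpha * abs(h)) ** n) ** (-m)    # effective saturation
    return theta_r + (theta_s - theta_r) * Se

# water content falls monotonically as the soil dries (h more negative)
for h in (0.0, -0.1, -1.0, -10.0):
    print(f"h = {h:6.1f} m -> theta = {van_genuchten_theta(h):.3f}")
```

A Monte Carlo sensitivity study of the kind described in the abstract would perturb (θr, θs, α, n) and propagate the perturbations through curves like this one.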

  5. Carbon Back Sputter Modeling for Hall Thruster Testing

    Science.gov (United States)

    Gilland, James H.; Williams, George J.; Burt, Jonathan M.; Yim, John Tamin

    2016-01-01

Lifetime requirements for electric propulsion devices, including Hall effect thrusters, are continually increasing, driven in part by NASA's inclusion of this technology in its exploration architecture. NASA will demonstrate a high-power electric propulsion system on the Solar Electric Propulsion Technology Demonstration Mission (SEP TDM). The Asteroid Redirect Robotic Mission is one candidate SEP TDM, which is projected to require tens of thousands of hours of thruster life. As thruster life is increased, for example through the use of improved magnetic field designs, the relative influence of facility effects increases. One such effect is the sputtering and redeposition, or back sputter, of facility materials by the high-energy thruster plumes. In support of wear testing for the Hall Effect Rocket with Magnetic Shielding (HERMeS) project, the back sputter from a Hall effect thruster plume has been modeled for the NASA Glenn Research Center's Vacuum Facility 5. The predicted wear at a near-worst-case condition of 600 V, 12.5 kW was found to be on the order of 1 micron/kh in a fully carbon-lined chamber. A more detailed numerical Monte Carlo code was also modified to estimate back sputter for a detailed facility and pumping configuration. This code demonstrated similar back sputter rate distributions, but does not yet accurately model the magnitudes. The modeling has been benchmarked against recent HERMeS wear testing, using multiple microbalance measurements. These recent measurements have yielded values on the order of 1.5-2 micron/kh at 600 V and 12.5 kW.

  6. Meso-scale modeling of irradiated concrete in test reactor

    Energy Technology Data Exchange (ETDEWEB)

    Giorla, A. [Oak Ridge National Laboratory, One Bethel Valley Road, Oak Ridge, TN 37831 (United States); Vaitová, M. [Czech Technical University, Thakurova 7, 166 29 Praha 6 (Czech Republic); Le Pape, Y., E-mail: lepapeym@ornl.gov [Oak Ridge National Laboratory, One Bethel Valley Road, Oak Ridge, TN 37831 (United States); Štemberk, P. [Czech Technical University, Thakurova 7, 166 29 Praha 6 (Czech Republic)

    2015-12-15

Highlights: • A meso-scale finite element model for irradiated concrete is developed. • Neutron radiation-induced volumetric expansion is a predominant degradation mode. • Confrontation with expansion and damage obtained from experiments is successful. • Effects of paste shrinkage, creep and ductility are discussed. - Abstract: A numerical model accounting for the effects of neutron irradiation on concrete at the mesoscale is detailed in this paper. Irradiation experiments in a test reactor (Elleuch et al., 1972), i.e., in accelerated conditions, are simulated. Concrete is considered as a two-phase material made of elastic inclusions (aggregate) subjected to thermal and irradiation-induced swelling and embedded in a cementitious matrix subjected to shrinkage and thermal expansion. The role of the hardened cement paste in the post-peak regime (brittle-ductile transition with decreasing loading rate) and creep effects are investigated. Radiation-induced volumetric expansion (RIVE) of the aggregate causes the development and propagation of damage around the aggregate, which further develops into bridging cracks across the hardened cement paste between the individual aggregate particles. The development of damage is aggravated when shrinkage occurs simultaneously with RIVE during the irradiation experiment. The post-irradiation expansion derived from the simulation is well correlated with the experimental data, and the obtained damage levels are fully consistent with previous estimations based on a micromechanical interpretation of the experimental post-irradiation elastic properties (Le Pape et al., 2015). The proposed modeling opens new perspectives for the interpretation of test reactor experiments with regard to the actual operation of light water reactors.

  7. Model test and CFD calculation of a cavitating bulb turbine

    Energy Technology Data Exchange (ETDEWEB)

    Necker, J; Aschenbrenner, T, E-mail: joerg.necker@voith.co [Voith Hydro Holding GmbH and Co. KG Alexanderstrasse 11, 89522 Heidenheim (Germany)

    2010-08-15

The flow in a horizontal-shaft bulb turbine is calculated as a two-phase flow with a commercial computational fluid dynamics (CFD) code including a cavitation model. The results are compared with experimental results achieved at a closed-loop test rig for model turbines. On the model test rig, for a certain operating point (i.e. volume flow, net head, blade angle, guide vane opening), the pressure behind the turbine is lowered (i.e. the Thoma coefficient σ is lowered) and the efficiency of the turbine is recorded. The measured values can be depicted in a so-called σ-break curve or η-σ diagram. Usually, the efficiency is independent of the Thoma coefficient down to a certain value; when the Thoma coefficient is lowered below this value, the efficiency drops rapidly. Visual observations of the different cavitation conditions complete the experiment. In analogy, several calculations are done for different Thoma coefficients σ, and the corresponding hydraulic losses of the runner are evaluated quantitatively. For a low σ-value at which the experiment showed a significant efficiency loss, the change of volume flow in the experiment was simulated. In addition, the fraction of water vapour, as an indication of the size of the cavitation cavity, is analyzed qualitatively. The experimentally and numerically obtained results are compared and show good agreement. In particular, the drop in efficiency can be calculated with satisfying accuracy. This drop in efficiency is of high practical importance, since it is one criterion for determining the admissible cavitation in a bulb turbine. The visual impression of the cavitation in the CFD analysis is well in accordance with the observed cavitation bubbles recorded in sketches and/or photographs.
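One common convention defines the plant Thoma cavitation coefficient as σ = (h_atm − h_vap − h_s)/H, the suction-head margin available relative to the net head H. A sketch with hypothetical numbers (sign conventions and reference levels vary between sources):

```python
def thoma_sigma(h_atm, h_vap, h_s, head):
    """Plant Thoma coefficient sigma = (h_atm - h_vap - h_s) / H,
    with all quantities expressed as heads in metres."""
    return (h_atm - h_vap - h_s) / head

# hypothetical model-test values, for illustration only
h_atm, h_vap = 10.3, 0.24     # atmospheric and vapour-pressure heads (m)
head = 8.0                    # net head H (m)
for h_s in (-2.0, 0.0, 2.0):  # runner setting relative to tailwater (m)
    s = thoma_sigma(h_atm, h_vap, h_s, head)
    print(f"h_s = {h_s:+.1f} m -> sigma = {s:.2f}")
```

Raising the runner (larger h_s) lowers σ, which is exactly the knob the test rig turns when tracing the σ-break curve described above.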

  8. Large animal models for vaccine development and testing.

    Science.gov (United States)

    Gerdts, Volker; Wilson, Heather L; Meurens, Francois; van Drunen Littel-van den Hurk, Sylvia; Wilson, Don; Walker, Stewart; Wheler, Colette; Townsend, Hugh; Potter, Andrew A

    2015-01-01

The development of human vaccines continues to rely on the use of animals for research. Regulatory authorities require novel vaccine candidates to undergo preclinical assessment in animal models before being permitted to enter the clinical phase in human subjects. Substantial progress has been made in recent years in reducing and replacing the number of animals used for preclinical vaccine research through the use of bioinformatics and computational biology to design new vaccine candidates. However, the ultimate goal of a new vaccine is to instruct the immune system to elicit an effective immune response against the pathogen of interest, and no alternatives to live animal use currently exist for evaluation of this response. Studies identifying the mechanisms of immune protection; determining the optimal route and formulation of vaccines; establishing the duration and onset of immunity, as well as the safety and efficacy of new vaccines, must be performed in a living system. Importantly, no single animal model provides all the information required for advancing a new vaccine through the preclinical stage, and research over the last two decades has highlighted that large animals more accurately predict vaccine outcome in humans than do other models. Here we review the advantages and disadvantages of large animal models for human vaccine development and demonstrate that much of the success in bringing a new vaccine to market depends on choosing the most appropriate animal model for preclinical testing.

  9. Some observational tests of a minimal galaxy formation model

    Science.gov (United States)

    Cohn, J. D.

    2017-04-01

    Dark matter simulations can serve as a basis for creating galaxy histories via the galaxy-dark matter connection. Here, one such model by Becker is implemented with several variations on three different dark matter simulations. Stellar mass and star formation rates are assigned to all simulation subhaloes at all times, using subhalo mass gain to determine stellar mass gain. The observational properties of the resulting galaxy distributions are compared to each other and observations for a range of redshifts from 0 to 2. Although many of the galaxy distributions seem reasonable, there are noticeable differences as simulations, subhalo mass gain definitions or subhalo mass definitions are altered, suggesting that the model should change as these properties are varied. Agreement with observations may improve by including redshift dependence in the added-by-hand random contribution to star formation rate. There appears to be an excess of faint quiescent galaxies as well (perhaps due in part to differing definitions of quiescence). The ensemble of galaxy formation histories for these models tend to have more scatter around their average histories (for a fixed final stellar mass) than the two more predictive and elaborate semi-analytic models of Guo et al. and Henriques et al., and require more basis fluctuations (using principal component analysis) to capture 90 per cent of the scatter around their average histories. The codes to plot model predictions (in some cases alongside observational data) are publicly available to test other mock catalogues at https://github.com/jdcphysics/validation/. Information on how to use these codes is in Appendix A.

  10. Induction Heating Model of Cermet Fuel Element Environmental Test (CFEET)

    Science.gov (United States)

    Gomez, Carlos F.; Bradley, D. E.; Cavender, D. P.; Mireles, O. R.; Hickman, R. R.; Trent, D.; Stewart, E.

    2013-01-01

Deep space missions with large payloads require high specific impulse and relatively high thrust to achieve mission goals in reasonable time frames. Nuclear thermal rockets (NTR) are capable of producing a high specific impulse by employing heat produced by a fission reactor to heat, and therefore accelerate, hydrogen through a rocket nozzle, providing thrust. Fuel element temperatures are very high (up to 3000 K), and hydrogen is highly reactive with most materials at high temperatures. Data covering the effects of high-temperature hydrogen exposure on fuel elements are limited. The primary concern is the mechanical failure of fuel elements due to large thermal gradients; therefore, high-melting-point ceramic-metallic matrix composites (cermets) are one of the fuels under consideration as part of the Nuclear Cryogenic Propulsion Stage (NCPS) Advanced Exploration Systems (AES) technology project at the Marshall Space Flight Center. The purpose of testing and analytical modeling is to determine the ability of these fuels to survive and maintain thermal performance in a prototypical NTR reactor environment of exposure to hydrogen at very high temperatures, and to obtain data to assess the properties of the non-nuclear support materials. The fission process and the resulting heating performance are well known and do not require active fissile material to be integrated in this testing. A small-scale test bed, the Compact Fuel Element Environmental Tester (CFEET), designed to heat fuel element samples via induction heating and expose them to hydrogen, is being developed at MSFC to assist in optimal material and manufacturing process selection without utilizing fissile material. This paper details the analytical approach used to help design and optimize the test bed, using COMSOL Multiphysics to predict thermal gradients induced by electromagnetic (induction) heating and Thermal Desktop for radiation calculations.

  11. Assessing Statistical Aspects of Test Fairness with Structural Equation Modelling

    Science.gov (United States)

    Kline, Rex B.

    2013-01-01

    Test fairness and test bias are not synonymous concepts. Test bias refers to statistical evidence that the psychometrics or interpretation of test scores depend on group membership, such as gender or race, when such differences are not expected. A test that is grossly biased may be judged to be unfair, but test fairness concerns the broader, more…

  12. Experimental test of nuclear magnetization distribution and nuclear structure models

    Energy Technology Data Exchange (ETDEWEB)

Beiersdorfer, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Crespo López-Urrutia, J. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Utter, S. B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    1999-02-26

Models exist that ascribe the nuclear magnetic fields to the presence of a single nucleon whose spin is not neutralized by pairing it up with that of another nucleon; other models assume that the generation of the magnetic field is shared among some or all nucleons throughout the nucleus. All models predict the same magnetic field external to the nucleus, since this is an anchor provided by experiments. The models differ, however, in their predictions of the magnetic field arrangement within the nucleus, for which no data exist. The only way to distinguish which model gives the correct description of the nucleus would be to use a probe inserted into the nucleus. The goal of our project was to develop exactly such a probe and to use it to measure fundamental nuclear quantities that have eluded experimental scrutiny. The need for accurately knowing such quantities extends far beyond nuclear physics and has ramifications in parity violation experiments on atomic traps and the testing of the standard model in elementary particle physics. Unlike scattering experiments that employ streams of free particles, our technique to probe the internal magnetic field distribution of the nucleus rests on using a single bound electron. Quantum mechanics shows that an electron in the innermost orbital surrounding the nucleus constantly dives into the nucleus and thus samples the fields that exist inside. This sampling of the nucleus usually results in only minute shifts in the electron's average orbital, which would be difficult to detect. By studying two particular energy states of the electron, we can, however, dramatically enhance the effects of the distribution of the magnetic fields in the nucleus. In fact, about 2% of the energy difference between the two states, dubbed the hyperfine splitting, is determined by effects related to the distribution of magnetic fields in the nucleus. A precise measurement of this energy difference (better than 0.01%) would then allow us to

  13. A test of convection models for IMF Bz north

    Science.gov (United States)

    Maynard, N. C.; Sojka, J. J.; Schunk, R. W.; Heppner, J. P.; Brace, L. H.

    1990-01-01

    The Utah State University Ionospheric Model was run to obtain diurnally reproducible ionospheric densities and temperatures for summer and winter conditions using both distorted two-cell and three-cell convection patterns. Differences due to the different convection patterns manifest themselves in the depth and location of polar holes in the F-region electron density. While the total depth of the model holes is a characteristic of the diurnally reproducible pattern, the features appear and are recognizable within 0.5 h. Langmuir probe data from 41 DE-2 passes, during which the IMF Bz component was northward, have been qualitatively checked against the model predictions. The cross polar cap electron density profiles of a large majority of the passes more closely conform to the distorted two-cell runs for both polarities of the IMF By component. This test can be generalized to rule out proposed convection patterns based on the presence/absence and position of polar electron density holes.

  14. Testing the stability and reliability of starspot modelling.

    Science.gov (United States)

    Kovari, Zs.; Bartus, J.

    1997-07-01

Since the mid-1970s, different starspot modelling techniques have been used to describe the observed spot variability on active stars. Spot positions and temperatures are calculated by applying surface integration techniques or solving analytic equations on observed photometric data. Artificial spotted light curves were generated, using the analytic expressions of Budding (1977Ap&SS..48..207B), to test how different constraints, such as the intrinsic scatter of the observed data or the angle of inclination, affect the spot solutions. Interactions between the different parameters, such as inclination, latitude, and spot size, were also investigated. The results of re-modelling the generated data were scrutinized statistically. It was found that (1) 0.002-0.005 mag of photometric accuracy is required to recover geometrical spot parameters within an acceptable error box; (2) even a 0.03-0.05 mag error in the unspotted brightness substantially affects the recovery of the original spot distribution; (3) especially at low inclination, under- or overestimation of the inclination by 10° leads to an important systematic error in spot latitude and size; (4) when the angle of inclination i <~ 20°, photometric spot modelling is unable to provide satisfactory information on spot location and size.

  15. Community monitoring for youth violence surveillance: testing a prediction model.

    Science.gov (United States)

    Henry, David B; Dymnicki, Allison; Kane, Candice; Quintana, Elena; Cartland, Jenifer; Bromann, Kimberly; Bhatia, Shaun; Wisnieski, Elise

    2014-08-01

    Predictive epidemiology is an embryonic field that involves developing informative signatures for disorder and tracking them using surveillance methods. Through such efforts assistance can be provided to the planning and implementation of preventive interventions. Believing that certain minor crimes indicative of gang activity are informative signatures for the emergence of serious youth violence in communities, in this study we aim to predict outbreaks of violence in neighborhoods from pre-existing levels and changes in reports of minor offenses. We develop a prediction equation that uses publicly available neighborhood-level data on disorderly conduct, vandalism, and weapons violations to predict neighborhoods likely to have increases in serious violent crime. Data for this study were taken from the Chicago Police Department ClearMap reporting system, which provided data on index and non-index crimes for each of the 844 Chicago census tracts. Data were available in three month segments for a single year (fall 2009, winter, spring, and summer 2010). Predicted change in aggravated battery and overall violent crime correlated significantly with actual change. The model was evaluated by comparing alternative models using randomly selected training and test samples, producing favorable results with reference to overfitting, seasonal variation, and spatial autocorrelation. A prediction equation based on winter and spring levels of the predictors had area under the curve ranging from .65 to .71 for aggravated battery, and .58 to .69 for overall violent crime. We discuss future development of such a model and its potential usefulness in violence prevention and community policing.
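The area under the ROC curve reported above can be computed directly as a Mann-Whitney statistic: the probability that a randomly chosen neighborhood with a crime increase receives a higher predicted score than one without. A self-contained sketch on toy data (not the study's Chicago data):

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case outranks a
    randomly chosen negative case (ties count as 1/2)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# toy data: predicted change in violent crime vs whether it actually rose
scores = [0.9, 0.8, 0.55, 0.5, 0.4, 0.2, 0.1]
labels = [1,   1,   0,    1,   0,   0,   0]
print(f"AUC = {auc(scores, labels):.2f}")
```

An AUC of 0.65-0.71, as reported for aggravated battery, means the ranking is clearly better than chance (0.5) but far from perfect (1.0), which is consistent with the authors' framing of the model as a surveillance aid rather than a precise forecast.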

  16. Stress-testing the Standard Model at the LHC

    CERN Document Server

    2016-01-01

    With the high-energy run of the LHC now underway, and clear manifestations of beyond-Standard-Model physics not yet seen in data from the previous run, the search for new physics at the LHC may be a quest for small deviations with big consequences. If clear signals are present, precise predictions and measurements will again be crucial for extracting the maximum information from the data, as in the case of the Higgs boson. Precision will therefore remain a key theme for particle physics research in the coming years. The conference will provide a forum for experimentalists and theorists to identify the challenges and refine the tools for high-precision tests of the Standard Model and searches for signals of new physics at Run II of the LHC. Topics to be discussed include: pinning down Standard Model corrections to key LHC processes; combining fixed-order QCD calculations with all-order resummations and parton showers; new developments in jet physics concerning jet substructure, associated jets and boosted je...

  17. A First Test of the Framed Standard Model against Experiment

    CERN Document Server

    Bordes, J; Tsou, ST

    2014-01-01

The framed standard model (FSM) is obtained from the standard model by incorporating, as field variables, the frame vectors (vielbeins) in internal symmetry space. It gives the standard Higgs boson and 3 generations of quarks and leptons as immediate consequences. It gives moreover a fermion mass matrix of the form: $m = m_T \alpha \alpha^\dagger$, where $\alpha$ is a vector in generation space independent of the fermion species and rotating with changing scale, which has already been shown to lead, generically, to up-down mixing, neutrino oscillations and mass hierarchy. In this paper, pushing the FSM further, one first derives to 1-loop order the RGE for the rotation of $\alpha$, and then applies it to fit mass and mixing data as a first test of the model. With 7 real adjustable parameters, 18 measured quantities are fitted, most (12) to within experimental error or to better than 0.5 percent, and the rest (6) not far off. (A summary of this fit can be found in Table 2 in the text.) Two notable features, bo...

  18. Testing the inhibitory cascade model in Mesozoic and Cenozoic mammaliaforms

    Science.gov (United States)

    2013-01-01

    Background Much of the current research in the growing field of evolutionary development concerns relating developmental pathways to large-scale patterns of morphological evolution, with developmental constraints on variation, and hence diversity, a field of particular interest. Tooth morphology offers an excellent model system for such ‘evo-devo’ studies, because teeth are well preserved in the fossil record, and are commonly used in phylogenetic analyses and as ecological proxies. Moreover, tooth development is relatively well studied, and has provided several testable hypotheses of developmental influences on macroevolutionary patterns. The recently-described Inhibitory Cascade (IC) Model provides just such a hypothesis for mammalian lower molar evolution. Derived from experimental data, the IC Model suggests that a balance between mesenchymal activators and molar-derived inhibitors determines the size of the immediately posterior molar, predicting firstly that molars either decrease in size along the tooth row, or increase in size, or are all of equal size, and secondly that the second lower molar should occupy one third of lower molar area. Here, we tested the IC Model in a large selection of taxa from diverse extant and fossil mammalian groups, ranging from the Middle Jurassic (~176 to 161 Ma) to the Recent. Results Results show that most taxa (~65%) fell within the predicted areas of the Inhibitory Cascade Model. However, members of several extinct groups fell into the regions where m2 was largest, or rarely, smallest, including the majority of the polyphyletic “condylarths”. Most Mesozoic mammals fell near the centre of the space with equality of size in all three molars. The distribution of taxa was significantly clustered by diet and by phylogenetic group. Conclusions Overall, the IC Model was supported as a plesiomorphic developmental system for Mammalia, suggesting that mammal tooth size has been subjected to this developmental constraint at
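
The two IC Model predictions stated above (molar areas change monotonically along the row, and m2 occupies one third of the total lower molar area) can be checked mechanically. A small sketch of such a check, with an illustrative tolerance that is not the paper's actual criterion:

```python
def ic_model_check(m1, m2, m3, tol=0.05):
    """Check a lower-molar row against the Inhibitory Cascade predictions.

    The IC Model predicts that molar areas change linearly along the row,
    which implies m2 = (m1 + m3) / 2, i.e. m2 occupies one third of the
    total molar area. The tolerance is an illustrative choice, not the
    paper's criterion.
    """
    total = m1 + m2 + m3
    m2_share = m2 / total
    linear = abs(m2 - (m1 + m3) / 2) <= tol * total
    if m1 > m3:
        trend = "decreasing"
    elif m3 > m1:
        trend = "increasing"
    else:
        trend = "equal"
    return {"m2_share": m2_share,
            "fits_ic": linear and abs(m2_share - 1 / 3) <= tol,
            "trend": trend}

# Equal-sized molars, as found near the centre of the IC space for
# many Mesozoic mammals in the study:
res = ic_model_check(10.0, 10.0, 10.0)
print(res)
```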

  19. Model Checking and Model-based Testing in the Railway Domain

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

This chapter describes some approaches and emerging trends for verification and model-based testing of railway control systems. We describe state-of-the-art methods and associated tools for verifying interlocking systems and their configuration data, using bounded model checking and k-induction...... Using real-world models of novel Danish interlocking systems, it is exemplified how this method scales up and is suitable for industrial application. For verification of the integrated HW/SW system performing the interlocking control tasks, a model-based hardware-in-the-loop testing approach is presented...... with good test strength are explained. Interlocking systems represent just one class of many others, where concrete system instances are created from generic representations, using configuration data for determining the behaviour of the instances. We explain how the systematic transition from generic...
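
The bounded model checking idea referenced above can be illustrated in miniature: exhaustively explore all states reachable within a bound k and report the first invariant violation. This is only a toy sketch of the principle, not the chapter's actual tooling or railway models:

```python
def bmc(init, step, invariant, k):
    """Minimal bounded-model-checking sketch: breadth-first exploration of
    all states reachable within k steps, reporting the first violation."""
    frontier, seen = {init}, {init}
    for depth in range(k + 1):
        for s in frontier:
            if not invariant(s):
                return depth, s   # counterexample found at this depth
        frontier = {step(s) for s in frontier} - seen
        seen |= frontier
    return None                   # invariant holds within the bound

# Toy counter that wraps around: the invariant holds for every bound
result = bmc(0, lambda x: (x + 1) % 6, lambda x: x <= 5, k=20)
print(result)  # None

# Unbounded counter: the invariant x < 10 is violated after 10 steps
bad = bmc(0, lambda x: x + 1, lambda x: x < 10, k=20)
print(bad)  # (10, 10)
```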

  20. Mechanical modeling of porous oxide fuel pellet A Test Problem

    Energy Technology Data Exchange (ETDEWEB)

    Nukala, Phani K [ORNL; Barai, Pallab [ORNL; Simunovic, Srdjan [ORNL; Ott, Larry J [ORNL

    2009-10-01

    A poro-elasto-plastic material model has been developed to capture the response of oxide fuels inside the nuclear reactors under operating conditions. Behavior of the oxide fuel and variation in void volume fraction under mechanical loading as predicted by the developed model has been reported in this article. The significant effect of void volume fraction on the overall stress distribution of the fuel pellet has also been described. An important oxide fuel issue that can have significant impact on the fuel performance is the mechanical response of oxide fuel pellet and clad system. Specifically, modeling the thermo-mechanical response of the fuel pellet in terms of its thermal expansion, mechanical deformation, swelling due to void formation and evolution, and the eventual contact of the fuel with the clad is of significant interest in understanding the fuel-clad mechanical interaction (FCMI). These phenomena are nonlinear and coupled since reduction in the fuel-clad gap affects thermal conductivity of the gap, which in turn affects temperature distribution within the fuel and the material properties of the fuel. Consequently, in order to accurately capture fuel-clad gap closure, we need to account for fuel swelling due to generation, retention, and evolution of fission gas in addition to the usual thermal expansion and mechanical deformation. Both fuel chemistry and microstructure also have a significant effect on the nucleation and growth of fission gas bubbles. Fuel-clad gap closure leading to eventual contact of the fuel with the clad introduces significant stresses in the clad, which makes thermo-mechanical response of the clad even more relevant. The overall aim of this test problem is to incorporate the above features in order to accurately capture fuel-clad mechanical interaction. Because of the complex nature of the problem, a series of test problems with increasing multi-physics coupling features, modeling accuracy, and complexity are defined with the

  1. Leptonic Precision Test of Leptophilic Two-Higgs-Doublet Model

    CERN Document Server

    Chun, Eung Jin

    2016-01-01

    The type X (lepton-specific) two-Higgs-doublet model at large $\\tan\\beta$ becomes leptophilic and thus allows a light pseudoscalar $A$ accommodating the observed muon g-2 deviation without conflicting with various hadronic constraints. On the other hand, it is strongly constrained by leptonic precision observables such as lepton universality test in the neutral and charged currents. Treating all the lepton universality data in a consistent way, we show how the current data constrain the parameter space of $m_A$ and $\\tan\\beta$ for given degenerate masses of heavy Higgs bosons $H$ and $H^\\pm$. While no overlapping region is found at $1\\sigma$, a sizable region is still viable at $2\\sigma$ for $H/H^\\pm$ masses at around 200$\\sim$400 GeV.

  2. Effective Transparency: A Test of Atomistic Laser-Cluster Models

    CERN Document Server

    Pandit, Rishi; Teague, Thomas; Hartwick, Zachary; Bigaouette, Nicolas; Ramunno, Lora; Ackad, Edward

    2016-01-01

The effective transparency of rare-gas clusters, post-interaction with an extreme ultraviolet (XUV) pump pulse, is studied by using an atomistic hybrid quantum-classical molecular dynamics model. We find there is an intensity range in which an XUV probe pulse has no lasting effect on the average charge state of a cluster after being saturated by an XUV pump pulse: the cluster is effectively transparent to the probe pulse. The range of this phenomenon increases with the size of the cluster and thus provides an excellent candidate for an experimental test of the effective transparency effect. We present predictions for the clusters at the peak of the laser pulse as well as the expected experimental time-of-flight signal, along with trends against which measurements can be compared. Significant deviations from these predictions would provide evidence for enhanced photoionization mechanism(s).

  3. IMPLEMENTATION OF INERTIAL NAVIGATION SYSTEM MODEL DURING AIRCRAFT TESTING

    Directory of Open Access Journals (Sweden)

    2016-01-01

Full Text Available Monitoring of flight parameters is required during test flights of aviation equipment. For this purpose, a complex consisting of a strapdown inertial navigation system (SINS) and user equipment of satellite navigation systems (SNS) can be used. Such a combination is needed to correct the positioning error that accumulates in the SINS over time. This article presents the research results for the inertial navigation system (INS) model. The results of the positioning error calculation for various INS classes are given; each of the examined INS accumulates a different error over the same time interval. Methods of combining INS and SNS information are covered. The results obtained can be applied to upgrading aircraft flight and navigation complexes. In particular, they allow continuous determination of speed, coordinates, angular attitude, and the rate of change of the instrument frame axes.
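
The INS/SNS combination described above can be sketched in one dimension: the INS position drifts linearly because of an uncorrected sensor bias, while periodic satellite fixes are noisy but drift-free, and a simple blending gain corrects the drift. All parameters below are illustrative assumptions, not the article's actual filter:

```python
import numpy as np

rng = np.random.default_rng(0)

dt, n = 0.1, 600            # 60 s of flight at 10 Hz
true_vel = 50.0             # m/s, constant for simplicity

# INS integrates velocity with a small uncorrected bias, so its position
# error grows linearly with time.
ins_bias = 0.2              # m/s, illustrative
true_pos = np.cumsum(np.full(n, true_vel * dt))
ins_pos = np.cumsum(np.full(n, (true_vel + ins_bias) * dt))

# SNS fixes are noisy but drift-free, available once per second.
gnss_noise = 3.0            # m, illustrative
k = 0.3                     # blending gain, illustrative
fused = np.empty(n)
offset = 0.0                # running estimate of the INS drift
for i in range(n):
    fused[i] = ins_pos[i] - offset
    if i % 10 == 0:         # SNS fix available
        gnss = true_pos[i] + rng.normal(0.0, gnss_noise)
        offset += k * (fused[i] - gnss)

print(f"INS-only final error: {abs(ins_pos[-1] - true_pos[-1]):.1f} m")
print(f"fused final error:    {abs(fused[-1] - true_pos[-1]):.1f} m")
```

The fused estimate stays bounded while the uncorrected INS error keeps growing, which is the motivation for combining the two systems.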

  4. Mouse models of autism: testing hypotheses about molecular mechanisms.

    Science.gov (United States)

    Roullet, Florence I; Crawley, Jacqueline N

    2011-01-01

Autism is a neurodevelopmental disorder that is currently diagnosed by the presence of three behavioral criteria: (1) qualitative impairments in reciprocal social interactions, (2) deficits in communication, including delayed language and noninteractive conversation, and (3) motor stereotypies, repetitive behaviors, insistence on sameness, and restricted interests. This chapter describes analogous behavioral assays that have been developed for mice, including tests for social approach, reciprocal social interactions, olfactory communication, ultrasonic vocalizations, repetitive and perseverative behaviors, and motor stereotypies. Examples of assay applications to genetic mouse models of autism are provided. Robust endophenotypes that are highly relevant to the core symptoms of autism are enabling the search for the genetic and environmental causes of autism, and the discovery of effective treatments.

  5. Modeling Gravitational Waves to Test GR Dispersion and Polarization

    Science.gov (United States)

    Tso, Rhondale; Chen, Yanbei; Isi, Maximilliano

    2017-01-01

    Given continued observation runs from the Laser Interferometer Gravitational-Wave Observatory Scientific Collaboration, further gravitational wave (GW) events will provide added constraints on beyond-general relativity (b-GR) theories. One approach, independent of the GW generation mechanism at the source, is to look at modification to the GW dispersion and propagation, which can accumulate over vast distances. Generic modification of GW propagation can also, in certain b-GR theories, impact the polarization content of GWs. To this end, a comprehensive approach to testing the dispersion and polarization content is developed by modeling anisotropic deformations to the waveforms' phase, along with birefringence effects and corollary consequences for b-GR polarizations, i.e., breathing, vector, and longitudinal modes. Such an approach can be mapped to specific theories like Lorentz violation, amplitude birefringence in Chern-Simons, and provide hints at additional theories to be included. An overview of data analysis routines to be implemented will also be discussed.

  6. Modeling motive activation in the Operant Motives Test

    DEFF Research Database (Denmark)

    Runge, J. Malte; Lang, Jonas W. B.; Engeser, Stefan

    2016-01-01

    The Operant Motive Test (OMT) is a picture-based procedure that asks respondents to generate imaginative verbal behavior that is later coded for the presence of affiliation, power, and achievement-related motive content by trained coders. The OMT uses a larger number of pictures and asks...... respondents to provide more brief answers than earlier and more traditional picture-based implicit motive measures and has therefore become a frequently used measurement instrument in both research and practice. This article focuses on the psychometric response mechanism in the OMT and builds on recent...... measures (Lang, 2014) and reports the first analysis of which we are aware that applies this model to OMT data (N = 633) and studies dynamic motive activation in the OMT. Results of this analysis yielded evidence for dynamic motive activation in the OMT and showed that simulated IRT reliabilities based...

  7. Setup of IN VIVO Breast Cancer Models for Nanodrug Testing

    DEFF Research Database (Denmark)

    Schifter, Søren

    for detection of the primary tumor and metastasis and the efficacy of siRNA delivery is measured by reporter gene-targeting siRNAs and in vivo imaging. The use of a uniform siRNA not affecting cellular processes would allow for standardized assessment of siRNA delivery to cancer cells without interferences via......RNA/aptamer conjugates, or carriers such as liposome/chitosan/micelle spheres. As a first step towards testing of the efficacy of siRNA delivery in vivo via different conjugates and complexes, we aimed at developing a standardized breast cancer model system in mice. In this conception, a reporter gene is used...... differential knockdown efficacies and the readout can directly be performed by quantitative imaging using a Caliper IVIS system. In one line of experiments, we engineered non-metastatic MCF-7 breast cancer cells to express the luminescent reporter firefly luciferase (Luc2) along with a pro-metastatic micro...

  8. Air Conditioning Stall Phenomenon Testing, Model Development, and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Irminger, Philip [ORNL; Rizy, D Tom [ORNL; Li, Huijuan [ORNL; Smith, Travis [ORNL; Rice, C Keith [ORNL; Li, Fangxing [ORNL; Adhikari, Sarina [ORNL

    2012-01-01

    Electric distribution systems are experiencing power quality issues of extended reduced voltage due to fault-induced delayed voltage recovery (FIDVR). FIDVR occurs in part because modern air conditioner (A/C) and heat pump compressor motors are much more susceptible to stalling during a voltage sag or dip such as a sub-transmission fault. They are more susceptible than older A/C compressor motors due to the low inertia of these newer and more energy efficient motors. There is a concern that these local reduced voltage events on the distribution system will become more frequent and prevalent and will combine over larger areas and challenge transmission system voltage and ultimately power grid reliability. The Distributed Energy Communications and Controls (DECC) Laboratory at Oak Ridge National Laboratory (ORNL) has been employed to (1) test, (2) characterize and (3) model the A/C stall phenomenon.

  9. Testing the tidal alignment model of galaxy intrinsic alignment

    CERN Document Server

    Blazek, Jonathan; Seljak, Uros

    2011-01-01

    Weak gravitational lensing has become a powerful probe of large-scale structure and cosmological parameters. Precision weak lensing measurements require an understanding of the intrinsic alignment of galaxy ellipticities, which can in turn inform models of galaxy formation. It is hypothesized that elliptical galaxies align with the background tidal field and that this alignment mechanism dominates the correlation between ellipticities on cosmological scales (in the absence of lensing). We use recent large-scale structure measurements from the Sloan Digital Sky Survey to test this picture with several statistics: (1) the correlation between ellipticity and galaxy overdensity, w_{g+}; (2) the intrinsic alignment auto-correlation functions; (3) the correlation functions of curl-free, E, and divergence-free, B, modes (the latter of which is zero in the linear tidal alignment theory); (4) the alignment correlation function, w_g(r_p,theta), a recently developed statistic that generalizes the galaxy correlation func...

  10. Modelling and Testing of Blast Effect On the Structures

    Science.gov (United States)

    Figuli, Lucia; Jangl, Štefan; Papán, Daniel

    2016-10-01

One of the so-called new generation of explosives, which offer greater flexibility in range and application, has come into use as a blasting agent in blasting and mining engineering: ANFO, a type of explosive consisting of an oxidiser and a fuel (ammonium nitrate and fuel oil). One such ANFO explosive produced industrially in Slovakia is POLONIT. The explosive is a mixture of ammonium nitrate, methyl esters of higher fatty acids, vegetable oil and red dye. The paper deals with the analysis of a structure subjected to the blast load created by the explosion of a POLONIT charge. The first part of the paper describes the behaviour and characteristics of the blast wave generated by the explosion (detonation characteristics, physical characteristics, time-history diagram etc.), and the second part presents the behaviour of the loaded structures, because the dynamic analysis of such a loaded structure requires knowing the parameters of the blast wave, its effect on the structure, and the tools for the solution of the dynamic analysis. Real field tests with three different charge weights and two different structures were carried out. The explosive POLONIT was used together with 25 g of the ignition explosive PLNp10. Analytical and numerical models of the blast-loaded structure are compared with the results obtained from the field tests (the corresponding experimental accelerations). For the modelling, the structures were approximated as single-degree-of-freedom (SDOF) systems, where the blast wave was estimated with linear decay and with exponential decay using the positive and negative phases of the blast wave. A numerical solution of the steel beam's dynamic response was performed via FEM (Finite Element Method) using the standard software Visual FEA.
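
The SDOF approximation with an exponentially decaying blast pulse can be sketched directly. The pressure history below uses the classic Friedlander positive-phase form; all numerical parameters are illustrative assumptions, not the paper's field-test values:

```python
import numpy as np

# SDOF approximation of a blast-loaded beam: m x'' + c x' + k x = p(t) * A.
# All parameters are illustrative, not the paper's field-test values.
m = 500.0                        # kg, modal mass
k = 2.0e6                        # N/m, modal stiffness
c = 0.05 * 2.0 * np.sqrt(k * m)  # 5% of critical damping

p0, td, b = 50e3, 0.01, 1.5      # peak overpressure (Pa), duration (s), decay
area = 0.5                       # loaded area, m^2

def p(t):
    """Friedlander positive-phase overpressure (exponential decay)."""
    return p0 * (1.0 - t / td) * np.exp(-b * t / td) if 0.0 <= t <= td else 0.0

# Semi-implicit (symplectic) Euler time stepping over one natural period
dt, steps = 1e-5, 10_000
x = v = 0.0
xmax = 0.0
for i in range(steps):
    t = i * dt
    acc = (p(t) * area - c * v - k * x) / m
    v += acc * dt
    x += v * dt
    xmax = max(xmax, abs(x))

print(f"peak displacement: {xmax * 1000:.2f} mm")
```

Because the pulse duration here is much shorter than the natural period, the response is essentially impulsive: the peak displacement scales with the pulse impulse rather than the peak pressure.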

  11. Use of the EFPA Test Review Model by the UK and Issues Relating to the Internationalization of Test Standards

    Science.gov (United States)

    Lindley, Patricia A.; Bartram, Dave

    2012-01-01

    In this article, we present the background to the development of test reviewing by the British Psychological Society (BPS) in the United Kingdom. We also describe the role played by the BPS in the development of the EFPA test review model and its adaptation for use in test reviewing in the United Kingdom. We conclude with a discussion of lessons…

  12. PICASSO VISION instrument design, engineering model test results, and flight model development status

    Science.gov (United States)

    Näsilä, Antti; Holmlund, Christer; Mannila, Rami; Näkki, Ismo; Ojanen, Harri J.; Akujärvi, Altti; Saari, Heikki; Fussen, Didier; Pieroux, Didier; Demoulin, Philippe

    2016-10-01

    PICASSO - A PICo-satellite for Atmospheric and Space Science Observations is an ESA project led by the Belgian Institute for Space Aeronomy, in collaboration with VTT Technical Research Centre of Finland Ltd, Clyde Space Ltd. (UK) and Centre Spatial de Liège (BE). The test campaign for the engineering model of the PICASSO VISION instrument, a miniaturized nanosatellite spectral imager, has been successfully completed. The test results look very promising. The proto-flight model of VISION has also been successfully integrated and it is waiting for the final integration to the satellite platform.

  13. Fabrication, Testing and Modeling of the MICE Superconducting Spectrometer Solenoids

    Energy Technology Data Exchange (ETDEWEB)

    Virostek, S.P.; Green, M.A.; Trillaud, F.; Zisman, M.S.

    2010-05-16

    The Muon Ionization Cooling Experiment (MICE), an international collaboration sited at Rutherford Appleton Laboratory in the UK, will demonstrate ionization cooling in a section of realistic cooling channel using a muon beam. A five-coil superconducting spectrometer solenoid magnet will provide a 4 tesla uniform field region at each end of the cooling channel. Scintillating fiber trackers within the 400 mm diameter magnet bore tubes measure the emittance of the beam as it enters and exits the cooling channel. Each of the identical 3-meter long magnets incorporates a three-coil spectrometer magnet section and a two-coil section to match the solenoid uniform field into the other magnets of the MICE cooling channel. The cold mass, radiation shield and leads are currently kept cold by means of three two-stage cryocoolers and one single-stage cryocooler. Liquid helium within the cold mass is maintained by means of a re-condensation technique. After incorporating several design changes to improve the magnet cooling and reliability, the fabrication and acceptance testing of the spectrometer solenoids have proceeded. The key features of the spectrometer solenoid magnets, the development of a thermal model, the results of the recently completed tests, and the current status of the project are presented.

  14. Study of Semi-Span Model Testing Techniques

    Science.gov (United States)

    Gatlin, Gregory M.; McGhee, Robert J.

    1996-01-01

An investigation has been conducted in the NASA Langley 14- by 22-Foot Subsonic Tunnel in order to further the development of semi-span testing capabilities. A twin engine, energy efficient transport (EET) model with a four-element wing in a takeoff configuration was used for this investigation. Initially a full span configuration was tested and force and moment data, wing and fuselage surface pressure data, and fuselage boundary layer measurements were obtained as a baseline data set. The semi-span configurations were then mounted on the wind tunnel floor, and the effects of fuselage standoff height and shape as well as the effects of the tunnel floor boundary layer height were investigated. The effectiveness of tangential blowing at the standoff/floor juncture as an active boundary-layer control technique was also studied. Results indicate that the semi-span configuration was more sensitive to variations in standoff height than to variations in floor boundary layer height. A standoff height equivalent to 30 percent of the fuselage radius resulted in better correlation with full span data than no standoff or the larger standoff configurations investigated. Undercut standoff leading edges or the use of tangential blowing in the standoff/floor juncture improved correlation of semi-span data with full span data in the region of maximum lift coefficient.

  15. Testing the habituation assumption underlying models of parasitoid foraging behavior

    Science.gov (United States)

    Abram, Katrina; Colazza, Stefano; Peri, Ezio

    2017-01-01

    Background Habituation, a form of non-associative learning, has several well-defined characteristics that apply to a wide range of physiological and behavioral responses in many organisms. In classic patch time allocation models, habituation is considered to be a major mechanistic component of parasitoid behavioral strategies. However, parasitoid behavioral responses to host cues have not previously been tested for the known, specific characteristics of habituation. Methods In the laboratory, we tested whether the foraging behavior of the egg parasitoid Trissolcus basalis shows specific characteristics of habituation in response to consecutive encounters with patches of host (Nezara viridula) chemical contact cues (footprints), in particular: (i) a training interval-dependent decline in response intensity, and (ii) a training interval-dependent recovery of the response. Results As would be expected of a habituated response, wasps trained at higher frequencies decreased their behavioral response to host footprints more quickly and to a greater degree than those trained at low frequencies, and subsequently showed a more rapid, although partial, recovery of their behavioral response to host footprints. This putative habituation learning could not be blocked by cold anesthesia, ingestion of an ATPase inhibitor, or ingestion of a protein synthesis inhibitor. Discussion Our study provides support for the assumption that diminishing responses of parasitoids to chemical indicators of host presence constitutes habituation as opposed to sensory fatigue, and provides a preliminary basis for exploring the underlying mechanisms. PMID:28321365
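
The first characteristic tested above (a training interval-dependent decline in response intensity) can be modelled with a simple exponential decay whose rate depends on the inter-trial interval. This is an illustrative sketch of the qualitative pattern only, not the paper's fitted model or data:

```python
import numpy as np

def habituation_response(n_trials, interval, r0=1.0, floor=0.2):
    """Illustrative habituation curve (not the paper's fitted model):
    response declines exponentially over trials, and shorter inter-trial
    intervals (higher training frequency) give faster, deeper decline."""
    rate = 0.5 / interval             # decline rate grows as interval shrinks
    trials = np.arange(n_trials)
    return floor + (r0 - floor) * np.exp(-rate * trials)

short = habituation_response(10, interval=1.0)   # high-frequency training
long_ = habituation_response(10, interval=4.0)   # low-frequency training
print(short[-1] < long_[-1])  # True: high-frequency training habituates more
```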

  16. Testing two process models of religiosity and sexual behavior.

    Science.gov (United States)

    Vasilenko, Sara A; Duntzee, Christina I; Zheng, Yao; Lefkowitz, Eva S

    2013-08-01

Adolescents who are more religious are less likely to have sex, but the process by which religiosity impacts sexual behavior is not well established. We tested two potential processes, involving: (1) whether religiosity suppressed individuals' motivations to have sex for physical pleasure, and (2) whether individuals internalized their religions' teachings about sex for pleasure. College students (N = 610, 53.8% female, M age = 18.5, 26.1% Hispanic Latino [HL], 14.9% non-HL African American, 23.8% non-HL Asian American/Pacific Islander, 26.3% non-HL European American and 8.9% non-HL multiracial) completed web surveys during their first three semesters. Religiosity did not moderate the association between students' motivations for sex for pleasure and sexual behavior. Motivations mediated the association between religiosity and sexual behavior, suggesting that religion does not override adolescents' existing motivations, but instead, religious adolescents internalize norms about sexual behavior. Copyright © 2013 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
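
The mediation finding above (religiosity → motivations → behavior, with no direct effect once motivations are controlled) follows the standard product-of-coefficients logic. A sketch on simulated data consistent with that hypothesis; the generating coefficients are invented for illustration, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 610  # matches the study's sample size; the data here are simulated

# Hypothetical data-generating process consistent with the mediation
# hypothesis: religiosity lowers pleasure motivation, which drives behavior.
religiosity = rng.normal(0.0, 1.0, n)
motivation = -0.5 * religiosity + rng.normal(0.0, 1.0, n)
behavior = 0.6 * motivation + rng.normal(0.0, 1.0, n)

def ols_slope(x, y):
    """Simple bivariate OLS slope."""
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

a = ols_slope(religiosity, motivation)        # path X -> M
# Path M -> Y controlling for X (multiple regression via least squares)
X = np.column_stack([np.ones(n), motivation, religiosity])
coef, *_ = np.linalg.lstsq(X, behavior, rcond=None)
b, c_prime = coef[1], coef[2]                 # M -> Y and direct X -> Y
indirect = a * b                              # mediated (indirect) effect
total = ols_slope(religiosity, behavior)
print(f"a={a:.2f} b={b:.2f} indirect={indirect:.2f} "
      f"direct={c_prime:.2f} total={total:.2f}")
```

When mediation is (near-)complete, the indirect effect a*b approximates the total effect and the direct path c' is close to zero.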

  17. Spatiotemporal properties of microsaccades: Model predictions and experimental tests

    Science.gov (United States)

    Zhou, Jian-Fang; Yuan, Wu-Jie; Zhou, Zhao

    2016-10-01

Microsaccades are involuntary and very small eye movements during fixation. Recently, microsaccade-related neural dynamics have been extensively investigated both in experiments and by constructing neural network models. Experimentally, microsaccades also exhibit many behavioral properties. It is well known that behavioral properties reflect the underlying neural dynamical mechanisms and so are determined by neural dynamics. The behavioral properties resulting from neural responses to microsaccades, however, are not yet understood and have rarely been studied theoretically. Linking neural dynamics to behavior is one of the central goals of neuroscience. In this paper, we provide behavioral predictions on the spatiotemporal properties of microsaccades according to microsaccade-induced neural dynamics in a cascading network model, which includes both retinal adaptation and short-term depression (STD) at thalamocortical synapses. We also successfully provide experimental tests in the statistical sense. Our results provide the first behavioral description of microsaccades based on neural dynamics induced by microsaccadic activity, and thus are the first to link neural dynamics to microsaccade behavior. These results indicate strongly that the cascading adaptations play an important role in the study of microsaccades. Our work may be useful for further investigations of microsaccadic behavioral properties and of the underlying neural dynamical mechanisms responsible for them.

  18. Squeezing the halo bispectrum: a test of bias models

    CERN Document Server

    Dizgah, Azadeh Moradinezhad; Noreña, Jorge; Biagetti, Matteo; Desjacques, Vincent

    2015-01-01

We study the halo-matter cross bispectrum in the presence of primordial non-Gaussianity of the local type. We restrict ourselves to the squeezed limit, for which the calculations are straightforward, and perform the measurements in the initial conditions of N-body simulations to mitigate the contamination induced by nonlinear gravitational evolution. Interestingly, the halo-matter cross bispectrum is not trivial even in this simple limit, as it is strongly sensitive to the scale-dependence of the quadratic and third-order halo bias. Therefore, it can be used to test biasing prescriptions. We consider three different prescriptions for halo clustering: excursion set peaks (ESP), local bias and a model in which the halo bias parameters are explicitly derived from a peak-background split. In all cases, the model parameters are fully constrained with statistics other than the cross bispectrum. We measure the cross bispectrum involving one halo fluctuation field and two mass overdensity fields for various halo masses...

  19. Analytical Scenario of Software Testing Using Simplistic Cost Model

    OpenAIRE

    RAJENDER BATHLA; Dr. ANIL KAPIL

    2012-01-01

Software can be tested either manually or automatically. The two approaches are complementary: automated testing can perform a huge number of tests in a short time, whereas manual testing uses the knowledge of the testing engineer to target testing to the parts of the system that are assumed to be more error-prone. Despite this complementarity, tools for manual and automatic testing are usually different, leading to decreased productivity and reliability of the testing process. Auto Test is a te...

  20. Deterministic Modeling of the High Temperature Test Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Ortensi, J.; Cogliati, J. J.; Pope, M. A.; Ferrer, R. M.; Ougouag, A. M.

    2010-06-01

    Idaho National Laboratory (INL) is tasked with the development of reactor physics analysis capability of the Next Generation Nuclear Power (NGNP) project. In order to examine INL’s current prismatic reactor deterministic analysis tools, the project is conducting a benchmark exercise based on modeling the High Temperature Test Reactor (HTTR). This exercise entails the development of a model for the initial criticality, a 19 column thin annular core, and the fully loaded core critical condition with 30 columns. Special emphasis is devoted to the annular core modeling, which shares more characteristics with the NGNP base design. The DRAGON code is used in this study because it offers significant ease and versatility in modeling prismatic designs. Despite some geometric limitations, the code performs quite well compared to other lattice physics codes. DRAGON can generate transport solutions via collision probability (CP), method of characteristics (MOC), and discrete ordinates (Sn). A fine group cross section library based on the SHEM 281 energy structure is used in the DRAGON calculations. HEXPEDITE is the hexagonal z full core solver used in this study and is based on the Green’s Function solution of the transverse integrated equations. In addition, two Monte Carlo (MC) based codes, MCNP5 and PSG2/SERPENT, provide benchmarking capability for the DRAGON and the nodal diffusion solver codes. The results from this study show a consistent bias of 2–3% for the core multiplication factor. This systematic error has also been observed in other HTTR benchmark efforts and is well documented in the literature. The ENDF/B VII graphite and U235 cross sections appear to be the main source of the error. The isothermal temperature coefficients calculated with the fully loaded core configuration agree well with other benchmark participants but are 40% higher than the experimental values. This discrepancy with the measurement stems from the fact that during the experiments the

  1. Non-gaussian Test Models for Prediction and State Estimation with Model Errors

    Institute of Scientific and Technical Information of China (English)

    Michal BRANICKI; Nan CHEN; Andrew J.MAJDA

    2013-01-01

Turbulent dynamical systems involve dynamics with both a large dimensional phase space and a large number of positive Lyapunov exponents. Such systems are ubiquitous in applications in contemporary science and engineering, where statistical ensemble prediction and real-time filtering/state estimation are needed despite the underlying complexity of the system. Statistically exactly solvable test models have a crucial role to provide firm mathematical underpinning or new algorithms for vastly more complex scientific phenomena. Here, a class of statistically exactly solvable non-Gaussian test models is introduced, where a generalized Feynman-Kac formulation reduces the exact behavior of conditional statistical moments to the solution of inhomogeneous Fokker-Planck equations modified by linear lower-order coupling and source terms. This procedure is applied to a test model with hidden instabilities and is combined with information theory to address two important issues in the contemporary statistical prediction of turbulent dynamical systems: coarse-grained ensemble prediction in a perfect model and improving long-range forecasting in imperfect models. The models discussed here should be useful for many other applications and algorithms for real-time prediction and state estimation.
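
The role of an exactly solvable test model can be illustrated with the simplest example of the genre, the (Gaussian) Ornstein-Uhlenbeck process, whose conditional mean and variance are known in closed form and can be used to validate an ensemble simulation. This is a stand-in illustration, not one of the paper's non-Gaussian models:

```python
import numpy as np

rng = np.random.default_rng(7)

# Ornstein-Uhlenbeck test model dx = -a x dt + sigma dW: exact moments are
# known, so a Monte Carlo ensemble can be checked against them.
a, sigma, x0 = 1.0, 1.0, 2.0
T, dt = 1.0, 1e-3
n_paths, n_steps = 20_000, int(T / dt)

x = np.full(n_paths, x0)
sqdt = np.sqrt(dt)
for _ in range(n_steps):  # Euler-Maruyama ensemble integration
    x += -a * x * dt + sigma * sqdt * rng.standard_normal(n_paths)

# Exact conditional moments at time T given x(0) = x0
exact_mean = x0 * np.exp(-a * T)
exact_var = sigma**2 / (2 * a) * (1 - np.exp(-2 * a * T))
print(f"mean: sample {x.mean():.3f} vs exact {exact_mean:.3f}")
print(f"var:  sample {x.var():.3f} vs exact {exact_var:.3f}")
```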

  2. Physically-based landslide susceptibility modelling: geotechnical testing and model evaluation issues

    Science.gov (United States)

    Marchesini, Ivan; Mergili, Martin; Schneider-Muntau, Barbara; Alvioli, Massimiliano; Rossi, Mauro; Guzzetti, Fausto

    2015-04-01

    We used the software r.slope.stability for physically-based landslide susceptibility modelling in the 90 km² Collazzone area, Central Italy, exploiting a comprehensive set of lithological, geotechnical, and landslide inventory data. The model results were evaluated against the inventory. r.slope.stability is a GIS-supported tool for modelling shallow and deep-seated slope stability and slope failure probability at comparatively broad scales. Developed as a raster module of the GRASS GIS software, r.slope.stability evaluates the slope stability for a large number of randomly selected ellipsoidal potential sliding surfaces. The bottom of the soil (for shallow slope stability) or the bedding planes of lithological layers (for deep-seated slope stability) are taken as potential sliding surfaces by truncating the ellipsoids, allowing for the analysis of relatively complex geological structures. To account for the uncertain geotechnical and geometric parameters, r.slope.stability computes the slope failure probability by testing multiple parameter combinations sampled deterministically or stochastically, and evaluating the ratio between the number of parameter combinations yielding a factor of safety below 1 and the total number of tested combinations. Any single raster cell may be intersected by multiple sliding surfaces, each associated with a slope failure probability; the most critical sliding surface is relevant for each pixel. Intensive use of r.slope.stability in the Collazzone area has opened up two questions elaborated in the present work: (i) To what extent does a larger number of geotechnical tests help to better constrain the geotechnical characteristics of the study area and, consequently, to improve the model results? The ranges of values of cohesion and angle of internal friction obtained through 13 direct shear tests correspond remarkably well to the range of values suggested by a geotechnical textbook. We elaborate how far an increased number of
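The failure-probability computation described in this abstract reduces to a simple Monte Carlo loop: sample uncertain geotechnical parameters, evaluate a factor of safety for each sample, and report the fraction of samples falling below 1. The sketch below illustrates the idea using a dry infinite-slope factor of safety and hypothetical parameter ranges; it is not the r.slope.stability implementation, which operates on ellipsoidal sliding surfaces over raster cells.

```python
import math
import random

def failure_probability(slope_deg, depth_m, gamma=19.0, n_samples=10000,
                        c_range=(2.0, 12.0), phi_range=(18.0, 30.0), seed=42):
    """Estimate slope failure probability as the fraction of sampled
    (cohesion, friction angle) pairs whose infinite-slope factor of
    safety falls below 1. gamma is unit weight [kN/m^3], cohesion is
    in kPa, angles in degrees; all ranges here are illustrative."""
    rng = random.Random(seed)
    beta = math.radians(slope_deg)
    failures = 0
    for _ in range(n_samples):
        c = rng.uniform(*c_range)                     # cohesion [kPa]
        phi = math.radians(rng.uniform(*phi_range))   # friction angle
        # Dry infinite-slope factor of safety
        fs = (c / (gamma * depth_m * math.sin(beta) * math.cos(beta))
              + math.tan(phi) / math.tan(beta))
        if fs < 1.0:
            failures += 1
    return failures / n_samples

print(failure_probability(35.0, 2.0))  # steep slope: some combinations fail
print(failure_probability(15.0, 2.0))  # gentle slope: 0.0, no combination fails
```

The ratio returned is exactly the quantity the abstract describes: failing parameter combinations over total tested combinations.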

  3. Testing 40 Predictions from the Transtheoretical Model Again, with Confidence

    Science.gov (United States)

    Velicer, Wayne F.; Brick, Leslie Ann D.; Fava, Joseph L.; Prochaska, James O.

    2013-01-01

    Testing Theory-based Quantitative Predictions (TTQP) represents an alternative to traditional Null Hypothesis Significance Testing (NHST) procedures and is more appropriate for theory testing. The theory generates explicit effect size predictions and these effect size estimates, with related confidence intervals, are used to test the predictions.…

  4. Vibratory gyroscopes : identification of mathematical model from test data

    CSIR Research Space (South Africa)

    Shatalov, MY

    2007-05-01

    Full Text Available by an adaptive Runge-Kutta method with the same initial conditions as from the test data. Results of the comparison between the test data and the numerical integration of equations (6) are shown in Figs. 3-6, which plot the in-phase and quadrature components of the X- and Y-channels (solid lines: Runge-Kutta integration; dashed lines: test data).

  5. A NEW TEST FOR NORMALITY IN LINEAR AUTOREGRESSIVE MODELS

    Institute of Scientific and Technical Information of China (English)

    CHEN Min; WU Guofu; Gemai Chen

    2002-01-01

    A nonparametric test for normality of linear autoregressive time series is proposed in this paper. The test is based on the best one-step forecast in mean square with time reverse. Some asymptotic theory is developed for the test, and it is shown that the test is easy to use and has good power. The empirical percentage points to conduct the test in practice are provided, and three examples using real data are included.

  6. Wave basin model tests of technical-biological bank protection

    Science.gov (United States)

    Eisenmann, J.

    2012-04-01

    Sloped embankments of inland waterways are usually protected from erosion and other negative impacts of ship-induced hydraulic loads by technical revetments consisting of riprap. For the dimensioning of such bank protection several design rules are available, e.g. the "Principles for the Design of Bank and Bottom Protection for Inland Waterways" or the Code of Practice "Use of Standard Construction Methods for Bank and Bottom Protection on Waterways" issued by the BAW (Federal Waterways Engineering and Research Institute). Since the European Water Framework Directive was put into action, special emphasis has been placed on natural banks, so the application of technical-biological bank protection is favoured. Currently, design principles for technical-biological bank protection on inland waterways are missing; the existing experience mainly refers to flowing waters with no or low ship-induced hydraulic loads on the banks. Since 2004 the Federal Waterways Engineering and Research Institute has been pursuing the research and development project "Alternative Technical-Biological Bank Protection on Inland Waterways" together with the Federal Institute of Hydrology. The investigation to date includes the examination of waterway sections where technical-biological bank protection is applied locally. For the development of design rules for technical-biological bank protection, investigations shall be carried out in a next step, considering the mechanics and resilience of technical-biological bank protection with special attention to ship-induced hydraulic loads. The presentation gives a short introduction to hydraulic loads at inland waterways and their bank protection. In more detail, model tests of a willow brush mattress as a technical-biological bank protection in a wave basin are explained. Within the scope of these tests the brush mattresses were exposed to wave impacts to determine their resilience towards hydraulic loads. Since the

  7. Using ISOS consensus test protocols for development of quantitative life test models in ageing of organic solar cells

    DEFF Research Database (Denmark)

    Kettle, J.; Stoichkov, V.; Kumar, D.

    2017-01-01

    As Organic Photovoltaic (OPV) development matures, the demand grows for rapid characterisation of degradation and application of Quantitative Accelerated Life Test (QALT) models to predict and improve reliability. To date, most accelerated testing on OPVs has been conducted using ISOS consensus standards. This paper identifies some of the problems in using and interpreting the results for predicting ageing based upon ISOS consensus standard test data. Design of Experiments (DOE) in conjunction with data from ISOS consensus standards are used as the basis for developing life test models for OPV
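The QALT theme of this record can be made concrete with the workhorse of quantitative accelerated life testing, the Arrhenius model, in which lifetime scales as A·exp(Ea/kT). The sketch below fits ln(lifetime) against 1/kT by least squares and extrapolates to use conditions; the temperatures and lifetimes are hypothetical placeholders, not ISOS data, and real OPV ageing may involve additional stressors (light, humidity) beyond temperature.

```python
import math

K_B = 8.617e-5  # Boltzmann constant [eV/K]

def fit_arrhenius(temps_k, lifetimes_h):
    """Least-squares fit of ln(lifetime) = ln(A) + Ea/(kT).
    Returns (A, Ea) with Ea, the activation energy, in eV."""
    xs = [1.0 / (K_B * t) for t in temps_k]
    ys = [math.log(life) for life in lifetimes_h]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return math.exp(intercept), slope

# Hypothetical accelerated-ageing results at three elevated temperatures
temps = [358.0, 338.0, 318.0]        # 85, 65, 45 degC expressed in kelvin
lifetimes = [200.0, 800.0, 4000.0]   # hours to reach 80% of initial efficiency
A, ea = fit_arrhenius(temps, lifetimes)
t_use = A * math.exp(ea / (K_B * 298.0))  # extrapolated lifetime at 25 degC
print(f"Ea = {ea:.2f} eV, predicted use-condition lifetime = {t_use:.0f} h")
```

The extrapolation step is exactly where the paper's caution applies: a life test model is only as good as the acceleration assumption behind it.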

  8. Analytical Scenario of Software Testing Using Simplistic Cost Model

    Directory of Open Access Journals (Sweden)

    RAJENDER BATHLA

    2012-02-01

    Full Text Available Software can be tested either manually or automatically. The two approaches are complementary: automated testing can perform a huge number of tests in a short period, whereas manual testing uses the knowledge of the testing engineer to target testing to the parts of the system that are assumed to be more error-prone. Despite this complementarity, tools for manual and automatic testing are usually different, leading to decreased productivity and reliability of the testing process. AutoTest is a testing tool that provides a "best of both worlds" strategy: it integrates developers' test cases into an automated process of systematic contract-driven testing. This allows it to combine the benefits of both approaches while keeping a simple interface, and to treat the two types of tests in a unified fashion: evaluation of results is the same, coverage measures are added up, and both types of tests can be saved in the same format. The objective of this paper is to discuss the importance of automation tools in relation to software testing techniques in software engineering. In this paper we provide an introduction to software testing and describe CASE tools. The solution of this problem leads to a new approach to software development known as software testing in the IT world. Software test automation is the process of automating the steps of manual test cases using an automation tool or utility to shorten the testing life cycle with respect to time.

  9. Identification of a coupled flapping/inflow model for the PUMA helicopter from flight test data

    Science.gov (United States)

    Du Val, Ronald; Bruhis, Ofer; Green, John

    1989-01-01

    A model validation procedure is applied to a coupled flapping/inflow model of a PUMA helicopter blade. The structure of the baseline model is first established. Model structure and flight test data are checked for consistency. Parameters of the model are then identified from the flight test data.

  10. A test of the multiple connections model of reading acquisition.

    Science.gov (United States)

    Berninger, V W; Chen, A C; Abbott, R D

    1988-10-01

    Within the framework of Society of Mind Theory (Minsky, 1986), learning to read is conceptualized as a process of creating new communication links or neural connections between an existing visual society and an existing linguistic society. Four visual-linguistic connections may become functional: letter-phonemic code, whole word-semantic code, whole word-name code, letter sequence-aural syllabic code. The hypothesis was tested that more than one of these visual-linguistic connections must be taken into account in predicting reading achievement. Results showed that the combination of the composite letter-phoneme variable and the composite whole word-semantic code variable accounted for significantly more variance in oral reading than did either single variable at the end of the first grade. Groups with large absolute discrepancy (1 or more standard scores) or small absolute discrepancy (1/3 standard score or less) on corresponding visual and linguistic skills differed significantly in both oral (whole word-semantic code composite) and silent reading (whole word-semantic code and letter sequence-aural syllabic code composites). There was a relationship between the number of large discrepancies and reading achievement. Results are discussed in reference to neuropsychological models of connectionism (Rumelhart & McClelland, 1986) and working brain systems (Luria, 1973).

  11. Parameter estimation and hypothesis testing in linear models

    CERN Document Server

    Koch, Karl-Rudolf

    1999-01-01

    The necessity to publish the second edition of this book arose when its third German edition had just been published. This second English edition is therefore a translation of the third German edition of Parameter Estimation and Hypothesis Testing in Linear Models, published in 1997. It differs from the first English edition by the addition of a new chapter on robust estimation of parameters and the deletion of the section on discriminant analysis, which has been more completely dealt with by the author in the book Bayesian Inference with Geodetic Applications, Springer-Verlag, Berlin Heidelberg New York, 1990. Smaller additions and deletions have been incorporated to improve the text, to point out new developments or to eliminate errors which became apparent. A few examples have also been added. I thank Springer-Verlag for publishing this second edition and for the assistance in checking the translation, although the responsibility for errors remains with the author. I also want to express my thanks...

  12. Can atom-surface potential measurements test atomic structure models?

    Science.gov (United States)

    Lonij, Vincent P A; Klauss, Catherine E; Holmgren, William F; Cronin, Alexander D

    2011-06-30

    van der Waals (vdW) atom-surface potentials can be excellent benchmarks for atomic structure calculations. This is especially true if measurements are made with two different types of atoms interacting with the same surface sample. Here we show theoretically how ratios of vdW potential strengths (e.g., C₃(K)/C₃(Na)) depend sensitively on the properties of each atom, yet these ratios are relatively insensitive to properties of the surface. We discuss how C₃ ratios depend on atomic core electrons by using a two-oscillator model to represent the contribution from atomic valence electrons and core electrons separately. We explain why certain pairs of atoms are preferable to study for future experimental tests of atomic structure calculations. A well chosen pair of atoms (e.g., K and Na) will have a C₃ ratio that is insensitive to the permittivity of the surface, whereas a poorly chosen pair (e.g., K and He) will have a ratio of C₃ values that depends more strongly on the permittivity of the surface.

  13. Photogrammetric analysis of rubble mound breakwaters scale model tests

    Directory of Open Access Journals (Sweden)

    João Rodrigues

    2016-09-01

    Full Text Available The main goal of this paper is to develop a photogrammetric method in order to obtain a robust tool for damage assessment and quantification of rubble-mound armour layers during physical scale model tests. With the present work, an innovative approach based on a reduced number of digital photos is proposed to support the identification of affected areas. This work considers two simple digital photographs recording the instants before and after the completion of the physical test. Mathematical techniques were considered in the development of the procedures, enabling the tracking of image differences between photos. The procedures were developed using an open-source application, Scilab; nevertheless, they are not platform dependent. The procedures developed enable the location and identification of eroded areas in the breakwater armour layer, as well as the possibility of quantifying them. This ability is confirmed through the calculation of correlation coefficients in each step of the search for the most damaged area. It is also possible to make an assessment of the movement of armour layer units.
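The correlation-based damage search described in this record can be sketched as follows: split the before/after photographs into tiles, compute a correlation coefficient per tile, and flag tiles whose correlation drops below a threshold as candidate eroded areas. Images are plain nested lists of grey values here, and the tile size and threshold are illustrative choices, not those of the paper (which works in Scilab on full-resolution photographs).

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length pixel lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    if sa == 0.0 or sb == 0.0:        # one or both tiles are constant
        return 1.0 if a == b else 0.0
    return cov / (sa * sb)

def damaged_tiles(before, after, tile=4, threshold=0.9):
    """Return top-left corners of tiles whose before/after correlation
    falls below the threshold (candidate eroded areas)."""
    h, w = len(before), len(before[0])
    flagged = []
    for r0 in range(0, h, tile):
        for c0 in range(0, w, tile):
            cells = [(r, c) for r in range(r0, min(r0 + tile, h))
                            for c in range(c0, min(c0 + tile, w))]
            a = [before[r][c] for r, c in cells]
            b = [after[r][c] for r, c in cells]
            if pearson(a, b) < threshold:
                flagged.append((r0, c0))
    return flagged

# Synthetic 8x8 "photos": identical except one tile with inverted greys
before = [[r * 8 + c for c in range(8)] for r in range(8)]
after = [row[:] for row in before]
for r in range(4):
    for c in range(4):
        after[r][c] = 255 - before[r][c]   # simulate an eroded region
print(damaged_tiles(before, after))        # only the altered tile is flagged
```

Quantification then follows by counting flagged tiles or summing their areas, in the spirit of the damage measures the paper proposes.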

  14. Testing evolutionary models of senescence: traditional approaches and future directions.

    Science.gov (United States)

    Robins, Chloe; Conneely, Karen N

    2014-12-01

    From an evolutionary perspective, the existence of senescence is a paradox. Why has senescence not been more effectively selected against given its associated decreases in Darwinian fitness? Why does senescence exist and how has it evolved? Three major theories offer explanations: (1) the theory of mutation accumulation suggested by PB Medawar; (2) the theory of antagonistic pleiotropy suggested by GC Williams; and (3) the disposable soma theory suggested by TBL Kirkwood. These three theories differ in the underlying causes of aging that they propose but are not mutually exclusive. This paper compares the specific biological predictions of each theory and discusses the methods and results of previous empirical tests. Lifespan is found to be the most frequently used estimate of senescence in evolutionary investigations. This measurement acts as a proxy for an individual's rate of senescence, but provides no information on an individual's senescent state or "biological age" throughout life. In the future, use of alternative longitudinal measures of senescence may facilitate investigation of previously neglected aspects of evolutionary models, such as intra- and inter-individual heterogeneity in the process of aging. DNA methylation data are newly proposed to measure biological aging and are suggested to be particularly useful for such investigations.

  15. RADBALL TECHNOLOGY TESTING AND MCNP MODELING OF THE TUNGSTEN COLLIMATOR

    Energy Technology Data Exchange (ETDEWEB)

    Farfan, E.

    2010-07-08

    The United Kingdom's National Nuclear Laboratory (NNL) has developed a remote, non-electrical, radiation-mapping device known as RadBall™, which can locate and quantify radioactive hazards within contaminated areas of the nuclear industry. RadBall™ consists of a colander-like outer shell that houses a radiation-sensitive polymer sphere. The outer shell works to collimate radiation sources and those areas of the polymer sphere that are exposed react, becoming increasingly more opaque, in proportion to the absorbed dose. The polymer sphere is imaged in an optical-CT scanner, which produces a high resolution 3D map of optical attenuation coefficients. Subsequent analysis of the optical attenuation matrix provides information on the spatial distribution of sources in a given area forming a 3D characterization of the area of interest. RadBall™ has no power requirements and can be positioned in tight or hard-to-reach locations. The RadBall™ technology has been deployed in a number of technology trials in nuclear waste reprocessing plants at Sellafield in the United Kingdom and facilities of the Savannah River National Laboratory (SRNL). This study focuses on the RadBall™ testing and modeling accomplished at SRNL.

  16. Caries risk assessment in school children using a reduced Cariogram model without saliva tests

    DEFF Research Database (Denmark)

    Petersson, Gunnel Hänsel; Isberg, Per-Erik; Twetman, Svante

    2010-01-01

    To investigate the caries predictive ability of a reduced Cariogram model without salivary tests in schoolchildren.

  17. Development of Vehicle Model Test for Road Loading Analysis of Sedan Model

    Science.gov (United States)

    Mohd Nor, M. K.; Noordin, A.; Ruzali, M. F. S.; Hussen, M. H.

    2016-11-01

    Simple Structural Surfaces (SSS) method is offered as a means of organizing the process for rationalizing the basic vehicle body structure load paths. The application of this simplified approach is highly beneficial in the design development of modern passenger car structure, especially during the conceptual stage. In Malaysia, however, there is no real physical model of SSS available to gain considerable insight into and understanding of the function of each major subassembly in the whole vehicle structure. Based on this motivation, a physical model of SSS for a sedan model, with corresponding vehicle model tests of bending and torsion, is proposed in this work. The proposed approach is relatively easy to understand compared to the Finite Element Method (FEM). The results show that the proposed vehicle model test demonstrates that satisfactory load paths can provide sufficient structural stiffness within the vehicle structure. It is clearly observed that the global bending stiffness reduces significantly when more panels are removed from a complete SSS model. It is identified that the parcel shelf is an important subassembly for sustaining bending load. The results also match the theoretical hypothesis: the structure in an open-section condition is weak under torsion load compared to bending load. The proposed approach can potentially be integrated with FEM to speed up the design process of automotive vehicles.

  18. The Feasibility of a Diagnostic Media Test System Model.

    Science.gov (United States)

    Rapp, Alfred V.

    Research investigated the feasibility of a diagnostic media test system. Two distinct tests were developed for sixth grade and university populations, each having: 1) a main phase with three specific teaching sequences, one for each media form; 2) test items for each teaching sequence; and 3) a validation phase with one teaching sequence…

  19. A blast absorber test: measurement and model results

    NARCIS (Netherlands)

    Eerden, F.J.M. van der; Berg, F. van den; Hof, J. van 't; Arkel, E. van

    2006-01-01

    A blast absorber test was conducted at the Aberdeen Test Centre from 13 to 17 June 2005. The test was set up to determine the absorbing and shielding effect of a gravel pile, of 1.5 meters high and 15 by 15 meters wide, on blasts from large weapons: e.g. armor, artillery or demolition. The blast was

  20. The Global Modeling Initiative Assessment Model: Model Description, Integration and Testing of the Transport Shell

    Energy Technology Data Exchange (ETDEWEB)

    Rotman, D.A.; Tannahill, J.R.; Kinnison, D.E.; Connell, P.S.; Bergmann, D.; Proctor, D.; Rodriquez, J.M.; Lin, S.J.; Rood, R.B.; Prather, M.J.; Rasch, P.J.; Considine, D.B.; Ramaroson, R.; Kawa, S.R.

    2000-04-25

    We describe the three dimensional global stratospheric chemistry model developed under the NASA Global Modeling Initiative (GMI) to assess the possible environmental consequences from the emissions of a fleet of proposed high speed civil transport aircraft. This model was developed through a unique collaboration of the members of the GMI team. Team members provided computational modules representing various physical and chemical processes, and analysis of simulation results through extensive comparison to observation. The team members' modules were integrated within a computational framework that allowed transportability and simulations on massively parallel computers. A unique aspect of this model framework is the ability to interchange and intercompare different submodules to assess the sensitivity of simulation results to numerical algorithms and model assumptions. In this paper, we discuss the important attributes of the GMI effort, and describe the GMI model computational framework and the numerical modules representing physical and chemical processes. As an application of the concept, we illustrate an analysis of the impact of advection algorithms on the dispersion of a NOy-like source in the stratosphere which mimics that of a fleet of commercial supersonic transports (High-Speed Civil Transport (HSCT)) flying between 17 and 20 kilometers.

  1. Modelling QTL effect on BTA06 using random regression test day models.

    Science.gov (United States)

    Suchocki, T; Szyda, J; Zhang, Q

    2013-02-01

    In statistical models, a quantitative trait locus (QTL) effect has been incorporated either as a fixed or as a random term, but, up to now, it has been mainly considered as a time-independent variable. However, for traits recorded repeatedly, it is very interesting to investigate the variation of QTL over time. The major goal of this study was to estimate the position and effect of QTL for milk, fat, protein yields and for somatic cell score based on test day records, while testing whether the effects are constant or variable throughout lactation. The analysed data consisted of 23 paternal half-sib families (716 daughters of 23 sires) of Chinese Holstein-Friesian cattle genotyped at 14 microsatellites located in the area of the casein loci on BTA6. A sequence of three models was used: (i) a lactation model, (ii) a random regression model with a QTL constant in time and (iii) a random regression model with a QTL variable in time. The results showed that, for each production trait, at least one significant QTL exists. For milk and protein yields, the QTL effect was variable in time, while for fat yield, each of the three models resulted in a significant QTL effect. When a QTL is incorporated into a model as a constant over time, its effect is averaged over lactation stages and may, thereby, be difficult or even impossible to be detected. Our results showed that, in such a situation, only a longitudinal model is able to identify loci significantly influencing trait variation.

  2. Wave Star - Scale 1:40 model test, test report 2; Wave Star - Skala 1:40 modelforsoeg, forsoegsrapport 2

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, M.; Lykke Andersen, Thomas

    2005-01-01

    This report describes model tests with the wave energy converter Wave Star carried out at Aalborg University. It follows earlier reports presenting numerical calculations. The objective of the tests presented in this report is to determine and optimize the Wave Star concept's power uptake for different physical configurations of the converter. (BA)

  3. Test of the classic model for predicting endurance running performance.

    Science.gov (United States)

    McLaughlin, James E; Howley, Edward T; Bassett, David R; Thompson, Dixie L; Fitzhugh, Eugene C

    2010-05-01

    To compare the classic physiological variables linked to endurance performance (VO2max, %VO2max at lactate threshold (LT), and running economy (RE)) with peak treadmill velocity (PTV) as predictors of performance in a 16-km time trial. Seventeen healthy, well-trained distance runners (10 males and 7 females) underwent laboratory testing to determine maximal oxygen uptake (VO2max), RE, percentage of maximal oxygen uptake at the LT (%VO2max at LT), running velocity at LT, and PTV. Velocity at VO2max (vVO2max) was calculated from RE and VO2max. Three stepwise regression models were used to determine the best predictors (classic vs treadmill performance protocols) for the 16-km running time trial. Simple Pearson correlations of the variables with 16-km performance showed vVO2max to have the highest correlation (r = -0.972) and %VO2max at the LT the lowest (r = 0.136). The correlation coefficients for LT, VO2max, and PTV were very similar in magnitude (r = -0.903 to r = -0.892). When VO2max, %VO2max at LT, RE, and PTV were entered into SPSS stepwise analysis, VO2max explained 81.3% of the total variance, and RE accounted for an additional 10.7%. vVO2max was shown to be the best predictor of the 16-km performance, accounting for 94.4% of the total variance. The measured velocity at VO2max (PTV) was highly correlated with the estimated velocity at vVO2max (r = 0.8867). Among well-trained subjects heterogeneous in VO2max and running performance, vVO2max is the best predictor of running performance because it integrates both maximal aerobic power and the economy of running. The PTV is linked to the same physiological variables that determine vVO2max.
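The key quantity in this record, the velocity at VO2max, is simply maximal aerobic power divided by the oxygen cost of running (running economy expressed per metre). The sketch below computes it for a hypothetical runner; the numbers are illustrative, not the study's data.

```python
def velocity_at_vo2max(vo2max_ml_kg_min, oxygen_cost_ml_kg_m):
    """vVO2max in m/min: maximal oxygen uptake (ml/kg/min) divided by
    the oxygen cost of running (ml/kg/m)."""
    return vo2max_ml_kg_min / oxygen_cost_ml_kg_m

# Hypothetical well-trained runner: VO2max 60 ml/kg/min, economy 0.200 ml/kg/m
v = velocity_at_vo2max(60.0, 0.200)
print(v, "m/min =", v * 60 / 1000, "km/h")   # 300.0 m/min = 18.0 km/h
```

Because the quotient rises either when maximal aerobic power increases or when the oxygen cost per metre falls, it integrates both determinants of endurance performance, which is why the study finds it the strongest single predictor.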

  4. Modeling Wood Encroachment in Abandoned Grasslands in the Eifel National Park - Model Description and Testing.

    Directory of Open Access Journals (Sweden)

    Silvana Hudjetz

    Full Text Available The degradation of natural and semi-natural landscapes has become a matter of global concern. In Germany, semi-natural grasslands belong to the most species-rich habitat types but have suffered heavily from changes in land use. After abandonment, the course of succession at a specific site is often difficult to predict because many processes interact. In order to support decision making when managing semi-natural grasslands in the Eifel National Park, we built the WoodS-Model (Woodland Succession Model. A multimodeling approach was used to integrate vegetation dynamics in both the herbaceous and shrub/tree layer. The cover of grasses and herbs was simulated in a compartment model, whereas bushes and trees were modelled in an individual-based manner. Both models worked and interacted in a spatially explicit, raster-based landscape. We present here the model description, parameterization and testing. We show highly detailed projections of the succession of a semi-natural grassland including the influence of initial vegetation composition, neighborhood interactions and ungulate browsing. We carefully weighted the single processes against each other and their relevance for landscape development under different scenarios, while explicitly considering specific site conditions. Model evaluation revealed that the model is able to emulate successional patterns as observed in the field as well as plausible results for different population densities of red deer. Important neighborhood interactions such as seed dispersal, the protection of seedlings from browsing ungulates by thorny bushes, and the inhibition of wood encroachment by the herbaceous layer, have been successfully reproduced. Therefore, not only a detailed model but also detailed initialization turned out to be important for spatially explicit projections of a given site. The advantage of the WoodS-Model is that it integrates these many mutually interacting processes of succession.

  5. Penetration Testing Professional Ethics: a conceptual model and taxonomy

    OpenAIRE

    Justin Pierce; Ashley Jones; Matthew Warren

    2006-01-01

    In an environment where commercial software is continually patched to correct security flaws, penetration testing can provide organisations with a realistic assessment of their security posture. Penetration testing uses the same principles as criminal hackers to penetrate corporate networks and thereby verify the presence of software vulnerabilities. Network administrators can use the results of a penetration test to correct flaws and improve overall security. The use of hacking techniques...

  6. 2-D Model Test Study of the Breakwater at Porto de Dande, Angola

    DEFF Research Database (Denmark)

    Andersen, Thomas Lykke; Ramirez, Jorge Robert Rodriguez; Burcharth, Hans F.

    This report deals with a two-dimensional model test study of the new breakwater at Porto de Dande, Angola. One cross-section was tested for stability and overtopping in various sea conditions. The length scale used for the model tests was 1:32. Unless otherwise specified all values given in this ...

  7. Methods for Quantifying the Uncertainties of LSIT Test Parameters, Test Results, and Full-Scale Mixing Performance Using Models Developed from Scaled Test Data

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cooley, Scott K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kuhn, William L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rector, David R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Heredia-Langner, Alejandro [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Testing at a larger scale has been committed to by Bechtel National, Inc. and the U.S. Department of Energy (DOE) to “address uncertainties and increase confidence in the projected, full-scale mixing performance and operations” in the Waste Treatment and Immobilization Plant (WTP).

  8. Testing an aetiological model of visual hallucinations in Parkinson's disease.

    Science.gov (United States)

    Gallagher, David A; Parkkinen, Laura; O'Sullivan, Sean S; Spratt, Alexander; Shah, Ameet; Davey, Clare C; Bremner, Fion D; Revesz, Tamas; Williams, David R; Lees, Andrew J; Schrag, Anette

    2011-11-01

    .026), autonomic function (P = 0.004), frontal cognitive function (P = 0.020) and a test of visuoperceptive function (object decision; P = 0.031). In a separate study, post-mortem analysis was performed in 91 subjects (mean age at death 75.5 ± 8.0 years) and persistent visual hallucinations were documented in 63%. Patients in the visual hallucinations group had similar disease duration but had significantly higher Lewy body densities in the middle frontal (P = 0.002) and middle temporal gyri (P = 0.033) and transentorhinal (P = 0.005) and anterior cingulate (P = 0.020) cortices but not parietal cortex (P = 0.22). Using a comprehensive assessment of the clinical, demographic and ophthalmological correlates of visual hallucinations in Parkinson's disease, the combined data support the hypothesized model of impaired visual processing, sleep-wake dysregulation and brainstem dysfunction, and cognitive, particularly frontal, impairment all independently contributing to the pathogenesis of visual hallucinations in Parkinson's disease. These clinical data are supported by the pathological study, in which higher overall cortical Lewy body counts, and in particular areas implicated in visuoperception and executive function, were associated with visual hallucinations.

  9. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in propulsion test facility design and development by assessing risks, identifying failure modes and predicting...

  10. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in rocket engine test facility design and development by assessing risks, identifying failure modes and predicting...

  11. Some Useful Cost-Benefit Criteria for Evaluating Computer-Based Test Delivery Models and Systems

    Science.gov (United States)

    Luecht, Richard M.

    2005-01-01

    Computer-based testing (CBT) is typically implemented using one of three general test delivery models: (1) multiple fixed testing (MFT); (2) computer-adaptive testing (CAT); or (3) multistage testing (MST). This article reviews some of the real cost drivers associated with CBT implementation--focusing on item production costs, the costs…

  12. Modeling and Simulation of Microcode-based Built-In Self Test for Multi-Operation Memory Test Algorithms

    Directory of Open Access Journals (Sweden)

    R. K. Sharma

    2010-05-01

    Full Text Available As embedded memory area on-chip increases and memory density grows, newer test algorithms such as March SS have been defined to detect newly emerging faults. These new March algorithms contain multiple operations per March element. This paper presents a microcoded BIST architecture that can implement these new March tests, with any number of operations per element, to meet the growing needs of embedded memory testing. This is demonstrated by implementing the March SS test and testing for new faults, including the Write Disturb Fault (WDF), Transition Coupling Fault (Cft) and Deceptive Read Disturb Coupling Fault (Cfdrd), which established tests such as March C- are not capable of detecting. Verilog HDL code of this architecture was written and synthesized using Xilinx ISE 8.2i. Verification of the architecture was done by simulation in Mentor's ModelSim.
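
    To illustrate the structure of March-style memory tests the record describes, here is a minimal software sketch: it runs the classic March C- element sequence against a toy memory containing a single stuck-at-0 cell. The fault model, cell count, and Python realization are illustrative assumptions; a real microcoded BIST applies the same element sequences in hardware, and March SS adds further operations per element.

    ```python
    # A minimal software model of a March memory test. Each March element
    # is a traversal direction plus a sequence of read/write operations
    # applied to every address before moving to the next address.

    class FaultyMemory:
        """Bit-per-cell memory with one stuck-at-0 cell."""
        def __init__(self, size, stuck_addr):
            self.cells = [0] * size
            self.stuck = stuck_addr
        def write(self, addr, val):
            self.cells[addr] = 0 if addr == self.stuck else val
        def read(self, addr):
            return self.cells[addr]

    # March C-: {up(w0); up(r0,w1); up(r1,w0); down(r0,w1); down(r1,w0); up(r0)}
    MARCH_C_MINUS = [
        ("up",   [("w", 0)]),
        ("up",   [("r", 0), ("w", 1)]),
        ("up",   [("r", 1), ("w", 0)]),
        ("down", [("r", 0), ("w", 1)]),
        ("down", [("r", 1), ("w", 0)]),
        ("up",   [("r", 0)]),
    ]

    def run_march(mem, size, algorithm):
        """Run the March elements; return (address, expected, got) mismatches."""
        mismatches = []
        for direction, ops in algorithm:
            addrs = range(size) if direction == "up" else range(size - 1, -1, -1)
            for a in addrs:
                for kind, val in ops:
                    if kind == "w":
                        mem.write(a, val)
                    elif mem.read(a) != val:
                        mismatches.append((a, val, mem.read(a)))
        return mismatches

    mem = FaultyMemory(size=8, stuck_addr=3)
    faults = run_march(mem, 8, MARCH_C_MINUS)
    print(faults)  # cell 3 fails whenever a stored 1 is expected
    ```

    A fault-free memory passes with an empty mismatch list, while the stuck cell is flagged by both r1 elements.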

  13. Models of little Higgs and electroweak precision tests

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Mu-Chun; /Fermilab

    2006-01-01

    The little Higgs idea is an alternative to supersymmetry as a solution to the gauge hierarchy problem. In this note, the author reviews various little Higgs models and their phenomenology with emphasis on the precision electroweak constraints in these models.

  14. Model Test Based Soil Spring Model and Application in Pipeline Thermal Buckling Analysis

    Institute of Scientific and Technical Information of China (English)

    GAO Xi-feng; LIU Run; YAN Shu-wang

    2011-01-01

    The buckling of submarine pipelines may occur due to the action of axial soil frictional force caused by relative movement of soil and pipeline, which is induced by thermal and internal pressure. The likelihood of occurrence of this buckling phenomenon is largely determined by soil resistance. A series of large-scale model tests were carried out to establish a substantial database for a variety of buried pipeline relationships. Based on the test data, nonlinear soil springs can be adopted to simulate the soil behavior during pipeline movement. For uplift resistance, an ideal elastic-plastic model is recommended in the case of H/D (depth-to-diameter ratio) > 5, and an elastic-softened model is recommended in the case of H/D ≤ 5. The soil resistance along the pipeline axial direction can be simulated by an ideal elastic-plastic model. The numerical results show that the capacity of the pipeline against thermal buckling decreases as its initial imperfection grows and increases with burial depth.
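
    The two spring idealizations in the abstract can be sketched as simple force-displacement functions. The parameter values below (ultimate resistance, mobilization displacement, residual fraction) are hypothetical placeholders, not values derived from the model tests.

    ```python
    import numpy as np

    def elastic_plastic(u, f_ult, u_mob):
        """Ideal elastic-plastic spring (axial, or uplift for H/D > 5):
        resistance rises linearly to f_ult at the mobilization
        displacement u_mob, then stays constant."""
        return np.minimum(np.abs(u) / u_mob, 1.0) * f_ult * np.sign(u)

    def elastic_softened(u, f_peak, u_peak, f_res, u_res):
        """Elastic-softened uplift spring (H/D <= 5): linear to the peak
        resistance, then linear softening down to a residual value."""
        u = np.asarray(u, dtype=float)
        rising = np.abs(u) / u_peak * f_peak
        soften = f_peak + (np.abs(u) - u_peak) * (f_res - f_peak) / (u_res - u_peak)
        post = np.maximum(soften, f_res)
        return np.where(np.abs(u) <= u_peak, rising, post) * np.sign(u)

    u = np.linspace(0.0, 0.05, 6)                       # displacement, m
    f_ax = elastic_plastic(u, f_ult=10.0, u_mob=0.01)   # axial resistance, kN/m
    f_up = elastic_softened(u, f_peak=8.0, u_peak=0.01,
                            f_res=4.0, u_res=0.03)      # shallow uplift, kN/m
    print(f_ax, f_up)
    ```

    In a finite-element buckling analysis these functions would populate the force-displacement tables of the nonlinear spring elements attached along the pipe.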

  15. A Validation Process for the Groundwater Flow and Transport Model of the Faultless Nuclear Test at Central Nevada Test Area

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan

    2003-01-01

    Many sites of groundwater contamination rely heavily on complex numerical models of flow and transport to develop closure plans. This has created a need for tools and approaches that can be used to build confidence in model predictions and make it apparent to regulators, policy makers, and the public that these models are sufficient for decision making. This confidence building is a long-term iterative process, and it is this process that should be termed "model validation." Model validation is a process, not an end result. That is, the process of model validation cannot always assure acceptable prediction or quality of the model. Rather, it provides a safeguard against faulty models or inadequately developed and tested models. Therefore, development of a systematic approach for evaluating and validating subsurface predictive models and guiding field activities for data collection and long-term monitoring is strongly needed. This report presents a review of model validation studies that pertain to groundwater flow and transport modeling. Definitions, literature debates, previously proposed validation strategies, and conferences and symposia that focused on subsurface model validation are reviewed and discussed. The review is general in nature, but the focus of the discussion is on site-specific, predictive groundwater models that are used for making decisions regarding remediation activities and site closure. An attempt is made to compile most of the published studies on groundwater model validation and assemble what has been proposed or used for validating subsurface models. The aim is to provide a reasonable starting point to aid the development of the validation plan for the groundwater flow and transport model of the Faultless nuclear test conducted at the Central Nevada Test Area (CNTA). The review of previous studies on model validation shows that there does not exist a set of specific procedures and tests that can be easily adapted and

  16. Scalable Power-Component Models for Concept Testing

    Science.gov (United States)

    2011-08-16

    Technology: Permanent Magnet Brushless DC machine • Model: self-generating torque-speed-efficiency map • Future improvements: induction machine... Outline • Motivation and Scope • Integrated Starter Generator Model • Battery Model ...and systems engineering. • Scope: scalable, generic MATLAB/Simulink models in three areas: – Electromechanical machines (Integrated Starter

  17. Testing of a coupled model of the HBV model and a glacier retreat model on a Himalayan basin

    Science.gov (United States)

    LI, Hong; Xu, Chongyu; Beldring, Stein; Melvold, Kjetil; Jain, Sharad

    2014-05-01

    The Himalayan glaciers are the source of numerous large Asian river systems, including the Indus, Ganges and Brahmaputra, which provide water for 1.5 billion people. The region is among the areas most sensitive to climate change. Shrinking of the glaciers is expected to significantly affect the hydrologic responses of glaciated basins, and their retreat is predicted to cause severe water crises in these basins. However, glacier behaviour is not well represented in most current hydrological models. The objective of the present study is to test the performance of a coupled model consisting of a hydrological model and a glacier retreat model. The hydrological model is a distributed HBV model that simulates the runoff response to water input into the catchment. The glacier retreat model is a distributed, glacier-specific Δh-parameterization describing the ice redistribution caused by glacier movement. The Beas River basin in northern India is selected as the focus area because it is highly representative of Himalayan basins and has good data availability. This study will not only improve the HBV model for hydrological studies in glaciated catchments, but also contribute to improved understanding and modelling of glacier hydrology. The coupled model will be a useful tool for long-term water resources projection and hydropower planning in highly glaciated basins.
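
    The coupling described above can be sketched as a temperature-index melt routine of the kind used inside HBV-type models, feeding a crude Δh-style redistribution step across elevation bands. All parameter values and the toy elevation bands are illustrative assumptions, not calibrated for the Beas basin.

    ```python
    # Minimal degree-day melt plus Δh-style ice redistribution sketch.

    def degree_day_melt(temp_c, ddf=6.0, t_thresh=0.0):
        """Daily melt (mm w.e.) from air temperature via a degree-day
        factor; no melt below the threshold temperature."""
        return max(temp_c - t_thresh, 0.0) * ddf

    def retreat_step(ice_thickness, melt_by_band, dh_weights):
        """Distribute the total mass loss over bands by normalized Δh
        weights: lower bands (larger weights) thin faster, mimicking
        tongue retreat."""
        total = sum(melt_by_band)
        wsum = sum(dh_weights)
        return [max(h - total * w / wsum, 0.0)
                for h, w in zip(ice_thickness, dh_weights)]

    # Three elevation bands, low to high (ice thickness in m w.e.).
    ice = [20.0, 40.0, 60.0]
    temps = [5.0, 2.0, -1.0]                            # warmer at low elevation
    melt = [degree_day_melt(t) / 1000.0 for t in temps] # mm -> m
    ice_next = retreat_step(ice, melt, dh_weights=[0.6, 0.3, 0.1])
    print(ice_next)
    ```

    In the full coupled model the melt volume would also enter the HBV runoff routing, and the Δh weights would be derived from observed geometry changes rather than fixed by hand.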

  18. Comparative evaluation of forced swim test and tail suspension test as models of negative symptom of schizophrenia in rodents.

    Science.gov (United States)

    Chatterjee, Manavi; Jaiswal, Manoj; Palit, Gautam

    2012-01-01

    Previous studies have shown that administration of an NMDA antagonist can induce negative symptoms of schizophrenia, which can be tested through the enhanced immobility observed in the forced swim test (FST). In the present study, we compared the effects of acute and chronic administration of a noncompetitive NMDA receptor antagonist, ketamine, on the FST and on another behavioural despair model, the tail suspension test (TST). Our observations suggest that chronic ketamine administration induced a state of enhanced immobility in the FST, but this finding was not replicated in the TST model. Further, in the FST, treatment with clozapine reversed the ketamine-induced immobility in mice, whereas it enhanced the immobility duration in the TST model. Haloperidol, however, showed no protective effect in either model. The data suggest that although both tests share a common behavioural measure of despair, the underlying pathophysiology appears to differ. Hence, the forced swim test, but not the tail suspension test, can be used as a model of the negative symptoms of psychosis in mice.

  19. Testing and Inference in Nonlinear Cointegrating Vector Error Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbæk, Anders

    for linearity is of particular interest as parameters of non-linear components vanish under the null. To solve the latter type of testing, we use the so-called sup tests, which here requires development of new (uniform) weak convergence results. These results are potentially useful in general for analysis...

  20. Testing and inference in nonlinear cointegrating vector error correction models

    DEFF Research Database (Denmark)

    Kristensen, D.; Rahbek, A.

    2013-01-01

    the null of linearity, parameters of nonlinear components vanish, leading to a nonstandard testing problem. We apply so-called sup-tests to resolve this issue, which requires development of new(uniform) functional central limit theory and results for convergence of stochastic integrals. We provide a full...