WorldWideScience

Sample records for anova models application

  1. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA

  2. Application of one-way ANOVA in completely randomized experiments

    Science.gov (United States)

    Wahid, Zaharah; Izwan Latiff, Ahmad; Ahmad, Kartini

    2017-12-01

    This paper describes an application of the statistical technique one-way ANOVA in completely randomized experiments with three replicates. This technique was employed for a single factor with four levels and multiple observations at each level. The aim of this study is to investigate the relationship between chemical oxygen demand index and location on-sites. Two different approaches are employed for the analyses: the critical-value approach and the p-value approach. It also presents the key assumptions of the technique that must be satisfied by the data in order to obtain valid results. Pairwise comparisons by the Tukey method are also considered and discussed to determine where the significant differences among the means lie after the ANOVA has been performed. The results revealed a statistically significant relationship between the chemical oxygen demand index and the location on-sites.
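
    The one-way, completely randomized ANOVA described in this record reduces to a short computation. The sketch below (illustrative data, plain Python) forms the between- and within-group sums of squares and the F ratio; in the critical-value approach, the result is then compared against an F table (for (2, 6) degrees of freedom at alpha = 0.05 the tabled value is about 5.14).

```python
# Minimal one-way ANOVA F computation for a single factor with several
# levels, as in a completely randomized design. Data are made up.
from statistics import mean

def one_way_anova_f(groups):
    """Return (F, df_between, df_within) for a list of groups of observations."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    # Between-groups SS: variation of group means around the grand mean
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-groups SS: variation of observations around their group means
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

groups = [[1, 2, 3], [2, 3, 4], [3, 4, 5]]
f_stat, df_b, df_w = one_way_anova_f(groups)
print(f_stat, df_b, df_w)  # F = 3.0 on (2, 6) degrees of freedom
```

    With F = 3.0 below the tabled critical value of about 5.14, this toy data set would not reject the null hypothesis of equal group means.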

  3. ANOVA and ANCOVA A GLM Approach

    CERN Document Server

    Rutherford, Andrew

    2012-01-01

    Provides an in-depth treatment of ANOVA and ANCOVA techniques from a linear model perspective. ANOVA and ANCOVA: A GLM Approach provides a contemporary look at the general linear model (GLM) approach to the analysis of variance (ANOVA) of one- and two-factor psychological experiments. With its organized and comprehensive presentation, the book successfully guides readers through conventional statistical concepts and how to interpret them in GLM terms, treating the main single- and multi-factor designs as they relate to ANOVA and ANCOVA. The book begins with a brief history of the separate dev

  4. INFLUENCE OF TECHNOLOGICAL PARAMETERS ON AGROTEXTILES WATER ABSORBENCY USING ANOVA MODEL

    Directory of Open Access Journals (Sweden)

    LUPU Iuliana G.

    2016-05-01

    Full Text Available Agrotextiles are nowadays extensively used in horticulture, farming and other agricultural activities. Agriculture and textiles are the largest industries in the world, providing basic needs such as food and clothing. Agrotextiles play a significant role in helping to control the environment for crop protection, eliminate variations in climate and weather, and generate optimum conditions for plant growth. Water absorptive capacity is a very important property of needle-punched nonwovens used as irrigation substrate in horticulture. Nonwovens used as watering substrate distribute water uniformly and act as a slight water buffer owing to their absorbent capacity. The paper analyzes the influence of needling process parameters on the water absorptive capacity of needle-punched nonwovens by using an ANOVA model. The model allows the identification of optimal action parameters in a shorter time and with less material expense than experimental research. The needle board frequency and needle penetration depth were used as independent variables, and the water absorptive capacity as the dependent variable, for the ANOVA regression model. Based on the employed ANOVA model we established that the needling parameters have a significant influence on water absorptive capacity. The higher the needle penetration depth and needle board frequency, the higher the compactness of the fabric; a less porous structure has a lower water absorptive capacity.

  5. Visualizing Experimental Designs for Balanced ANOVA Models using Lisp-Stat

    Directory of Open Access Journals (Sweden)

    Philip W. Iversen

    2004-12-01

    Full Text Available The structure, or Hasse, diagram described by Taylor and Hilton (1981, The American Statistician) provides a visual display of the relationships between factors for balanced complete experimental designs. Using the Hasse diagram, rules exist for determining the appropriate linear model, ANOVA table, expected mean squares, and F-tests in the case of balanced designs. This procedure has been implemented in Lisp-Stat using a software representation of the experimental design. The user can interact with the Hasse diagram to add, change, or delete factors and see the effect on the proposed analysis. The system has potential uses in teaching and consulting.

  6. An adaptive ANOVA-based PCKF for high-dimensional nonlinear inverse modeling

    Science.gov (United States)

    Li, Weixuan; Lin, Guang; Zhang, Dongxiao

    2014-02-01

    The probabilistic collocation-based Kalman filter (PCKF) is a recently developed approach for solving inverse problems. It resembles the ensemble Kalman filter (EnKF) in every aspect, except that it represents and propagates model uncertainty by polynomial chaos expansion (PCE) instead of an ensemble of model realizations. Previous studies have shown PCKF to be a more efficient alternative to EnKF for many data assimilation problems. However, the accuracy and efficiency of PCKF depend on an appropriate truncation of the PCE series. Having more polynomial chaos basis functions in the expansion helps to capture uncertainty more accurately but increases computational cost. Selection of basis functions is particularly important for high-dimensional stochastic problems because the number of polynomial chaos basis functions required to represent model uncertainty grows dramatically as the number of input parameters (random dimensions) increases. In classic PCKF algorithms, the PCE basis functions are pre-set based on users' experience. Also, for sequential data assimilation problems, the basis functions kept in the PCE expression remain unchanged across different Kalman filter loops, which can limit the accuracy and computational efficiency of classic PCKF algorithms. To address this issue, we present a new algorithm that adaptively selects PCE basis functions for different problems and automatically adjusts the number of basis functions in different Kalman filter loops. The algorithm is based on adaptive functional ANOVA (analysis of variance) decomposition, which approximates a high-dimensional function with the summation of a set of low-dimensional functions. Thus, instead of expanding the original model into PCE, we implement the PCE expansion on these low-dimensional functions, which is much less costly. We also propose a new adaptive criterion for ANOVA that is more suited for solving inverse problems. The new algorithm was tested with different examples and demonstrated

  7. Biomarker Detection in Association Studies: Modeling SNPs Simultaneously via Logistic ANOVA

    KAUST Repository

    Jung, Yoonsuh

    2014-10-02

    In genome-wide association studies, the primary task is to detect biomarkers in the form of Single Nucleotide Polymorphisms (SNPs) that have nontrivial associations with a disease phenotype and some other important clinical/environmental factors. However, the extremely large number of SNPs compared to the sample size inhibits application of classical methods such as multiple logistic regression. Currently the most commonly used approach is still to analyze one SNP at a time. In this paper, we propose to consider the genotypes of the SNPs simultaneously via a logistic analysis of variance (ANOVA) model, which expresses the logit-transformed mean of SNP genotypes as the summation of the SNP effects, effects of the disease phenotype and/or other clinical variables, and the interaction effects. We use a reduced-rank representation of the interaction-effect matrix for dimensionality reduction, and employ the L1-penalty in a penalized likelihood framework to filter out the SNPs that have no associations. We develop a Majorization-Minimization algorithm for computational implementation. In addition, we propose a modified BIC criterion to select the penalty parameters and determine the rank number. The proposed method is applied to a Multiple Sclerosis data set and simulated data sets and shows promise in biomarker detection.

  8. Biomarker Detection in Association Studies: Modeling SNPs Simultaneously via Logistic ANOVA

    KAUST Repository

    Jung, Yoonsuh; Huang, Jianhua Z.; Hu, Jianhua

    2014-01-01

    In genome-wide association studies, the primary task is to detect biomarkers in the form of Single Nucleotide Polymorphisms (SNPs) that have nontrivial associations with a disease phenotype and some other important clinical/environmental factors. However, the extremely large number of SNPs compared to the sample size inhibits application of classical methods such as multiple logistic regression. Currently the most commonly used approach is still to analyze one SNP at a time. In this paper, we propose to consider the genotypes of the SNPs simultaneously via a logistic analysis of variance (ANOVA) model, which expresses the logit-transformed mean of SNP genotypes as the summation of the SNP effects, effects of the disease phenotype and/or other clinical variables, and the interaction effects. We use a reduced-rank representation of the interaction-effect matrix for dimensionality reduction, and employ the L1-penalty in a penalized likelihood framework to filter out the SNPs that have no associations. We develop a Majorization-Minimization algorithm for computational implementation. In addition, we propose a modified BIC criterion to select the penalty parameters and determine the rank number. The proposed method is applied to a Multiple Sclerosis data set and simulated data sets and shows promise in biomarker detection.

  9. Backfitting in Smoothing Spline Anova, with Application to Historical Global Temperature Data

    Science.gov (United States)

    Luo, Zhen

    In the attempt to estimate the temperature history of the earth using the surface observations, various biases can exist. An important source of bias is the incompleteness of sampling over both time and space. A few methods have been proposed to deal with this problem. Although they can correct some biases resulting from incomplete sampling, they have ignored some other significant biases. In this dissertation, a smoothing spline ANOVA approach, which is a multivariate function estimation method, is proposed to deal simultaneously with various biases resulting from incomplete sampling. In addition, an advantage of this method is that the various components of the estimated temperature history can be obtained with a limited amount of information stored. This method can also be used for detecting erroneous observations in the database. The method is illustrated through an example of modeling winter surface air temperature as a function of year and location. Extensions to more complicated models are discussed. The linear system associated with the smoothing spline ANOVA estimates is too large to be solved by full matrix decomposition methods. A computational procedure combining the backfitting (Gauss-Seidel) algorithm and the iterative imputation algorithm is proposed. This procedure takes advantage of the tensor product structure in the data to make the computation feasible in an environment of limited memory. Various related issues are discussed, e.g., the computation of confidence intervals and techniques to speed up the convergence of the backfitting algorithm, such as collapsing and successive over-relaxation.
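
    The backfitting (Gauss-Seidel) idea used in the dissertation can be sketched for the simplest additive case: a complete year-by-location grid fitted with plain group means as the smoother. This is an illustrative toy, not the smoothing spline ANOVA machinery itself; on a noise-free additive surface the fit is exact.

```python
from statistics import mean

def backfit(y, iters=20):
    """Backfitting for y[i][j] ~ mu + f1[i] + f2[j] on a complete grid,
    using plain group means as the smoother for each coordinate."""
    n_i, n_j = len(y), len(y[0])
    mu = mean(y[i][j] for i in range(n_i) for j in range(n_j))
    f1 = [0.0] * n_i
    f2 = [0.0] * n_j
    for _ in range(iters):
        # Update f1 holding f2 fixed (one Gauss-Seidel sweep), then center it
        f1 = [mean(y[i][j] - mu - f2[j] for j in range(n_j)) for i in range(n_i)]
        c = mean(f1); f1 = [v - c for v in f1]; mu += c
        # Update f2 holding the new f1 fixed, then center it
        f2 = [mean(y[i][j] - mu - f1[i] for i in range(n_i)) for j in range(n_j)]
        c = mean(f2); f2 = [v - c for v in f2]; mu += c
    return mu, f1, f2

# Noise-free additive surface: the fit should be (numerically) exact
y = [[a + b for b in (0.0, 1.0, 4.0)] for a in (0.0, 2.0, 3.0)]
mu, f1, f2 = backfit(y)
resid = max(abs(y[i][j] - (mu + f1[i] + f2[j]))
            for i in range(3) for j in range(3))
print(round(resid, 10))  # 0.0
```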

  10. Adaptive surrogate modeling by ANOVA and sparse polynomial dimensional decomposition for global sensitivity analysis in fluid simulation

    Energy Technology Data Exchange (ETDEWEB)

    Tang, Kunkun, E-mail: ktg@illinois.edu [The Center for Exascale Simulation of Plasma-Coupled Combustion (XPACC), University of Illinois at Urbana–Champaign, 1308 W Main St, Urbana, IL 61801 (United States); Inria Bordeaux – Sud-Ouest, Team Cardamom, 200 avenue de la Vieille Tour, 33405 Talence (France); Congedo, Pietro M. [Inria Bordeaux – Sud-Ouest, Team Cardamom, 200 avenue de la Vieille Tour, 33405 Talence (France); Abgrall, Rémi [Institut für Mathematik, Universität Zürich, Winterthurerstrasse 190, CH-8057 Zürich (Switzerland)

    2016-06-01

    The Polynomial Dimensional Decomposition (PDD) is employed in this work for the global sensitivity analysis and uncertainty quantification (UQ) of stochastic systems subject to a moderate to large number of input random variables. Due to the intimate connection between the PDD and the Analysis of Variance (ANOVA) approaches, PDD is able to provide a simpler and more direct evaluation of the Sobol' sensitivity indices than the Polynomial Chaos expansion (PC). Unfortunately, the number of PDD terms grows exponentially with the size of the input random vector, which makes the computational cost of standard methods unaffordable for real engineering applications. To address the curse of dimensionality, this work proposes essentially variance-based adaptive strategies aiming to build a cheap meta-model (i.e., surrogate model) by employing the sparse PDD approach with its coefficients computed by regression. Three levels of adaptivity are carried out in this paper: 1) the truncated dimensionality for ANOVA component functions, 2) the active dimension technique, especially for second- and higher-order parameter interactions, and 3) the stepwise regression approach designed to retain only the most influential polynomials in the PDD expansion. During this adaptive procedure featuring stepwise regressions, the surrogate model representation contains only a few terms, so that the cost of repeatedly solving the linear systems of the least-squares regression problem is negligible. The final sparse PDD representation is much smaller than the full expansion, since only significant terms are eventually retained. Consequently, a much smaller number of calls to the deterministic model is required to compute the final PDD coefficients.
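
    The ANOVA variance decomposition behind the Sobol' indices can be illustrated with a brute-force double-loop Monte Carlo estimator, a deliberately naive stand-in for the sparse PDD surrogate of the paper (toy additive function; all numerical settings below are arbitrary):

```python
import random

def first_order_sobol(f, dim, n_outer=400, n_inner=400, seed=0):
    """Brute-force first-order Sobol' indices S_i = Var(E[f|x_i]) / Var(f)
    for a function of independent Uniform(0,1) inputs. This is the plain
    ANOVA variance decomposition, not a PDD/PCE surrogate."""
    rng = random.Random(seed)
    # Total variance from one large sample
    ys = [f([rng.random() for _ in range(dim)]) for _ in range(n_outer * n_inner)]
    m = sum(ys) / len(ys)
    var_total = sum((y - m) ** 2 for y in ys) / (len(ys) - 1)
    indices = []
    for i in range(dim):
        cond_means = []
        for _ in range(n_outer):
            xi = rng.random()  # freeze coordinate i
            acc = 0.0
            for _ in range(n_inner):
                x = [rng.random() for _ in range(dim)]
                x[i] = xi
                acc += f(x)
            cond_means.append(acc / n_inner)  # estimate of E[f | x_i = xi]
        mc = sum(cond_means) / n_outer
        indices.append(sum((c - mc) ** 2 for c in cond_means)
                       / (n_outer - 1) / var_total)
    return indices

# Additive test function x1 + 2*x2: exact indices are 0.2 and 0.8
s1, s2 = first_order_sobol(lambda x: x[0] + 2.0 * x[1], dim=2)
print(round(s1, 2), round(s2, 2))  # close to 0.2 and 0.8
```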

  11. Adaptive surrogate modeling by ANOVA and sparse polynomial dimensional decomposition for global sensitivity analysis in fluid simulation

    International Nuclear Information System (INIS)

    Tang, Kunkun; Congedo, Pietro M.; Abgrall, Rémi

    2016-01-01

    The Polynomial Dimensional Decomposition (PDD) is employed in this work for the global sensitivity analysis and uncertainty quantification (UQ) of stochastic systems subject to a moderate to large number of input random variables. Due to the intimate connection between the PDD and the Analysis of Variance (ANOVA) approaches, PDD is able to provide a simpler and more direct evaluation of the Sobol' sensitivity indices than the Polynomial Chaos expansion (PC). Unfortunately, the number of PDD terms grows exponentially with the size of the input random vector, which makes the computational cost of standard methods unaffordable for real engineering applications. To address the curse of dimensionality, this work proposes essentially variance-based adaptive strategies aiming to build a cheap meta-model (i.e., surrogate model) by employing the sparse PDD approach with its coefficients computed by regression. Three levels of adaptivity are carried out in this paper: 1) the truncated dimensionality for ANOVA component functions, 2) the active dimension technique, especially for second- and higher-order parameter interactions, and 3) the stepwise regression approach designed to retain only the most influential polynomials in the PDD expansion. During this adaptive procedure featuring stepwise regressions, the surrogate model representation contains only a few terms, so that the cost of repeatedly solving the linear systems of the least-squares regression problem is negligible. The final sparse PDD representation is much smaller than the full expansion, since only significant terms are eventually retained. Consequently, a much smaller number of calls to the deterministic model is required to compute the final PDD coefficients.

  12. ANOVA-principal component analysis and ANOVA-simultaneous component analysis: a comparison.

    NARCIS (Netherlands)

    Zwanenburg, G.; Hoefsloot, H.C.J.; Westerhuis, J.A.; Jansen, J.J.; Smilde, A.K.

    2011-01-01

    ANOVA-simultaneous component analysis (ASCA) is a recently developed tool to analyze multivariate data. In this paper, we enhance the explorative capability of ASCA by introducing a projection of the observations on the principal component subspace to visualize the variation among the measurements.

  13. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Qifeng, E-mail: liaoqf@shanghaitech.edu.cn [School of Information Science and Technology, ShanghaiTech University, Shanghai 200031 (China); Lin, Guang, E-mail: guanglin@purdue.edu [Department of Mathematics & School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907 (United States)

    2016-07-15

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.

  14. ANOVA for the behavioral sciences researcher

    CERN Document Server

    Cardinal, Rudolf N

    2013-01-01

    This new book provides a theoretical and practical guide to analysis of variance (ANOVA) for those who have not had a formal course in this technique, but need to use this analysis as part of their research. From their experience in teaching this material and applying it to research problems, the authors have created a summary of the statistical theory underlying ANOVA, together with important issues, guidance, practical methods, references, and hints about using statistical software. These have been organized so that the student can learn the logic of the analytical techniques but also use the

  15. Sequential experimental design based generalised ANOVA

    Energy Technology Data Exchange (ETDEWEB)

    Chakraborty, Souvik, E-mail: csouvik41@gmail.com; Chowdhury, Rajib, E-mail: rajibfce@iitr.ac.in

    2016-07-15

    Over the last decade, surrogate modelling techniques have gained wide popularity in the fields of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and on regression/interpolation to generate the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting the probability of failure of three structural mechanics problems. It is observed that the proposed approach yields an accurate and computationally efficient estimate of the failure probability.

  16. ANOVA-HDMR structure of the higher order nodal diffusion solution

    International Nuclear Information System (INIS)

    Bokov, P. M.; Prinsloo, R. H.; Tomasevic, D. I.

    2013-01-01

    Nodal diffusion methods still represent a standard in global reactor calculations, but employ some ad-hoc approximations (such as the quadratic leakage approximation) which limit their accuracy in cases where reference quality solutions are sought. In this work we solve the nodal diffusion equations utilizing the so-called higher-order nodal methods to generate reference quality solutions and to decompose the obtained solutions via a technique known as High Dimensional Model Representation (HDMR). This representation and associated decomposition of the solution provides a new formulation of the transverse leakage term. The HDMR structure is investigated via the technique of Analysis of Variance (ANOVA), which indicates why the existing class of transversely-integrated nodal methods prove to be so successful. Furthermore, the analysis leads to a potential solution method for generating reference quality solutions at a much reduced calculational cost, by applying the ANOVA technique to the full higher order solution. (authors)

  17. Constrained statistical inference: sample-size tables for ANOVA and regression

    Directory of Open Access Journals (Sweden)

    Leonard eVanbrabant

    2015-01-01

    Full Text Available Researchers in the social and behavioral sciences often have clear expectations about the order/direction of the parameters in their statistical model. For example, a researcher might expect that regression coefficient beta1 is larger than beta2 and beta3. The corresponding hypothesis is H: beta1 > {beta2, beta3}, and this is known as an order-constrained hypothesis. A major advantage of testing such a hypothesis is that power can be gained, and hence a smaller sample size is needed. This article discusses this sample-size reduction as an increasing number of constraints is included in the hypothesis. The main goal is to present sample-size tables for constrained hypotheses. A sample-size table contains the necessary sample size at a prespecified power (say, 0.80) for an increasing number of constraints. To obtain the sample-size tables, two Monte Carlo simulations were performed, one for ANOVA and one for multiple regression. Three results are salient. First, in an ANOVA the needed sample size decreases by 30% to 50% when complete ordering of the parameters is taken into account. Second, small deviations from the imposed order have only a minor impact on the power. Third, at the maximum number of constraints, the linear regression results are comparable with the ANOVA results. However, in the case of fewer constraints, ordering the parameters (e.g., beta1 > beta2) results in a higher power than assigning a positive or a negative sign to the parameters (e.g., beta1 > 0).
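
    A Monte Carlo power computation of the kind used to build such sample-size tables can be sketched in a few lines: simulate the null distribution of the one-way F statistic to obtain an empirical critical value, then count rejections under the alternative. This is a bare-bones illustration with made-up effect sizes and an unconstrained test only, not the constrained-inference procedure of the article.

```python
import random
from statistics import mean

def f_stat(groups):
    """One-way ANOVA F statistic."""
    k = len(groups); n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ssw = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

def simulate_power(means, n, sims=2000, alpha=0.05, seed=1):
    """Estimate ANOVA power by simulation: draw the null distribution of F
    once to get an empirical critical value, then count rejections under
    the alternative defined by `means` (unit error variance)."""
    rng = random.Random(seed)
    draw = lambda ms: [[rng.gauss(m, 1.0) for _ in range(n)] for m in ms]
    null_fs = sorted(f_stat(draw([0.0] * len(means))) for _ in range(sims))
    crit = null_fs[int((1 - alpha) * sims)]
    hits = sum(f_stat(draw(means)) > crit for _ in range(sims))
    return hits / sims

power = simulate_power([0.0, 0.5, 1.0], n=20)
print(power)  # well above 0.05 for this fairly large effect
```

    Repeating this over a grid of sample sizes n and stopping at the first n whose estimated power exceeds 0.80 yields one cell of a sample-size table.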

  18. Permutation Tests for Stochastic Ordering and ANOVA

    CERN Document Server

    Basso, Dario; Salmaso, Luigi; Solari, Aldo

    2009-01-01

    Permutation testing for multivariate stochastic ordering and ANOVA designs is a fundamental issue in many scientific fields such as medicine, biology, pharmaceutical studies, engineering, economics, psychology, and social sciences. This book presents advanced methods and related R codes to perform complex multivariate analyses

  19. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…
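
    A randomization-based F-test of the kind used in such a classroom activity can be sketched as follows: reshuffle the group labels many times and ask how often the reshuffled F statistic matches or exceeds the observed one (illustrative data; the shuffle count is arbitrary).

```python
import random
from statistics import mean

def f_stat(groups):
    """One-way ANOVA F statistic."""
    k = len(groups); n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ssw = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

def randomization_p_value(groups, shuffles=999, seed=42):
    """Randomization-test p-value for the F statistic: reshuffle group
    labels and count how often the shuffled F reaches the observed F."""
    rng = random.Random(seed)
    observed = f_stat(groups)
    pooled = [x for g in groups for x in g]
    sizes = [len(g) for g in groups]
    extreme = 0
    for _ in range(shuffles):
        rng.shuffle(pooled)
        regrouped, start = [], 0
        for s in sizes:
            regrouped.append(pooled[start:start + s]); start += s
        extreme += f_stat(regrouped) >= observed
    return (extreme + 1) / (shuffles + 1)  # add-one correction

p = randomization_p_value([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print(p)  # small: the observed separation is rarely matched by chance
```

    The histogram of the shuffled F values is exactly the sampling distribution the activity asks students to build by hand.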

  20. Global testing under sparse alternatives: ANOVA, multiple comparisons and the higher criticism

    OpenAIRE

    Arias-Castro, Ery; Candès, Emmanuel J.; Plan, Yaniv

    2011-01-01

    Testing for the significance of a subset of regression coefficients in a linear model, a staple of statistical analysis, goes back at least to the work of Fisher, who introduced the analysis of variance (ANOVA). We study this problem under the assumption that the coefficient vector is sparse, a common situation in modern high-dimensional settings. Suppose we have $p$ covariates and that under the alternative, the response only depends upon the order of $p^{1-\alpha}$ of those, $0 \le \alpha \le 1$...

  1. Effect of fasting ramadan in diabetes control status - application of extensive diabetes education, serum creatinine with HbA1c statistical ANOVA and regression models to prevent hypoglycemia.

    Science.gov (United States)

    Aziz, Kamran M A

    2013-09-01

    Ramadan fasting is an obligatory duty for Muslims. Unique physiologic and metabolic changes occur during fasting, which require adjustments of diabetes medications. Although challenging, successful fasting can be accomplished if extensive pre-Ramadan education is provided to the patients. The current research was conducted to study effective Ramadan fasting with different OHAs/insulins without significant risk of hypoglycemia, in terms of HbA1c reductions after Ramadan. An ANOVA model was used to assess HbA1c levels among different education statuses. Serum creatinine was used to measure renal function. Pre-Ramadan diabetes education with alteration of therapy and dosage adjustments for OHAs/insulin was done. Regression models for HbA1c before Ramadan with FBS before sunset were also synthesized as a tool to prevent hypoglycemia and achieve successful Ramadan fasting in the future. Out of 1046 patients, 998 fasted successfully without any episodes of hypoglycemia; 48 patients (4.58%) experienced hypoglycemia. A χ² test for CRD/CKD with hypoglycemia was also significant (p-value […] Ramadan diabetes management. Some relevant patents are also outlined in this paper.

  2. An ANOVA approach for statistical comparisons of brain networks.

    Science.gov (United States)

    Fraiman, Daniel; Fraiman, Ricardo

    2018-03-16

    The study of brain networks has developed extensively over the last couple of decades. By contrast, techniques for the statistical analysis of these networks are less developed. In this paper, we focus on the statistical comparison of brain networks in a nonparametric framework and discuss the associated detection and identification problems. We tested network differences between groups with an analysis of variance (ANOVA) test we developed specifically for networks. We also propose and analyse the behaviour of a new statistical procedure designed to identify different subnetworks. As an example, we show the application of this tool to resting-state fMRI data obtained from the Human Connectome Project. We identify, among other variables, that the amount of sleep in the days before the scan is a relevant variable that must be controlled. Finally, we discuss the potential bias in neuroimaging findings that is generated by some behavioural and brain structure variables. Our method can also be applied to other kinds of networks, such as protein interaction networks, gene networks or social networks.

  3. Group-wise ANOVA simultaneous component analysis for designed omics experiments

    NARCIS (Netherlands)

    Saccenti, Edoardo; Smilde, Age K.; Camacho, José

    2018-01-01

    Introduction: Modern omics experiments pertain not only to the measurement of many variables but also follow complex experimental designs where many factors are manipulated at the same time. This data can be conveniently analyzed using multivariate tools like ANOVA-simultaneous component analysis

  4. Violation of the Sphericity Assumption and Its Effect on Type-I Error Rates in Repeated Measures ANOVA and Multi-Level Linear Models (MLM).

    Science.gov (United States)

    Haverkamp, Nicolas; Beauducel, André

    2017-01-01

    We investigated the effects of violations of the sphericity assumption on Type I error rates for different methodical approaches of repeated measures analysis using a simulation approach. In contrast to previous simulation studies on this topic, up to nine measurement occasions were considered. Effects of the level of inter-correlations between measurement occasions on Type I error rates were considered for the first time. Two populations with non-violation of the sphericity assumption, one with uncorrelated measurement occasions and one with moderately correlated measurement occasions, were generated. One population with violation of the sphericity assumption combines uncorrelated with highly correlated measurement occasions. A second population with violation of the sphericity assumption combines moderately correlated and highly correlated measurement occasions. From these four populations without any between-group effect or within-subject effect 5,000 random samples were drawn. Finally, the mean Type I error rates for multilevel linear models (MLM) with an unstructured covariance matrix (MLM-UN), MLM with compound symmetry (MLM-CS) and for repeated measures analysis of variance (rANOVA) models (without correction, with Greenhouse-Geisser correction, and Huynh-Feldt correction) were computed. To examine the effect of both the sample size and the number of measurement occasions, sample sizes of n = 20, 40, 60, 80, and 100 were considered as well as measurement occasions of m = 3, 6, and 9. With respect to rANOVA, the results support the use of rANOVA with Huynh-Feldt correction, especially when the sphericity assumption is violated, the sample size is rather small and the number of measurement occasions is large. For MLM-UN, the results illustrate a massive progressive bias for small sample sizes ( n = 20) and m = 6 or more measurement occasions. This effect could not be found in previous simulation studies with a smaller number of measurement occasions. The
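
    The Greenhouse-Geisser correction mentioned above rescales the rANOVA degrees of freedom by Box's epsilon-hat, which can be computed directly from the sample covariance matrix of the m measurement occasions. A minimal sketch, using the standard element-wise formula (the matrix below is illustrative):

```python
def gg_epsilon(S):
    """Greenhouse-Geisser epsilon (Box's epsilon-hat) from the m x m
    covariance matrix S of the repeated measurement occasions.
    epsilon = 1 means sphericity holds; the lower bound is 1/(m-1)."""
    m = len(S)
    grand = sum(sum(row) for row in S) / (m * m)       # grand mean of S
    diag = sum(S[i][i] for i in range(m)) / m          # mean of diagonal
    row_means = [sum(row) / m for row in S]
    num = (m * (diag - grand)) ** 2
    den = (m - 1) * (sum(S[i][j] ** 2 for i in range(m) for j in range(m))
                     - 2 * m * sum(r ** 2 for r in row_means)
                     + m * m * grand ** 2)
    return num / den

# Compound symmetry (equal variances, equal covariances) satisfies
# sphericity, so epsilon should equal 1 up to rounding
S = [[1.0, 0.5, 0.5],
     [0.5, 1.0, 0.5],
     [0.5, 0.5, 1.0]]
print(gg_epsilon(S))  # 1.0 (up to floating-point rounding)
```

    The corrected rANOVA test then uses epsilon * (m - 1) and epsilon * (m - 1) * (n - 1) degrees of freedom in place of the uncorrected ones.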

  5. Prediction and Control of Cutting Tool Vibration in Cnc Lathe with Anova and Ann

    Directory of Open Access Journals (Sweden)

    S. S. Abuthakeer

    2011-06-01

    Full Text Available Machining is a complex process in which many variables can degrade the desired results. Among them, cutting tool vibration is the most critical phenomenon influencing the dimensional precision of the machined components, the functional behavior of the machine tools and the life of the cutting tool. In a machining operation, the cutting tool vibrations are mainly influenced by cutting parameters like cutting speed, depth of cut and tool feed rate. In this work, the cutting tool vibrations are controlled using a damping pad made of neoprene. Experiments were conducted in a CNC lathe where the tool holder was supported with and without the damping pad. The cutting tool vibration signals were collected through a data acquisition system supported by LabVIEW software. To increase the reliability of the experiments, a full factorial experimental design was used. The experimental data collected were tested with analysis of variance (ANOVA) to understand the influences of the cutting parameters, and empirical models were developed using ANOVA. Experimental studies and data analysis have been performed to validate the proposed damping system. A multilayer perceptron neural network model was constructed with the feed-forward back-propagation algorithm using the acquired data. On completion of the experimental tests, the ANN was used to validate the results obtained and also to predict the behavior of the system under any cutting condition within the operating range. The onsite tests show that the proposed system reduces the vibration of the cutting tool to a great extent.

  6. Cautionary Note on Reporting Eta-Squared Values from Multifactor ANOVA Designs

    Science.gov (United States)

    Pierce, Charles A.; Block, Richard A.; Aguinis, Herman

    2004-01-01

    The authors provide a cautionary note on reporting accurate eta-squared values from multifactor analysis of variance (ANOVA) designs. They reinforce the distinction between classical and partial eta-squared as measures of strength of association. They provide examples from articles published in premier psychology journals in which the authors…
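
    The distinction the authors stress can be made concrete with a small computation. The balanced two-factor design and numbers below are invented for illustration: partial eta-squared removes the other effects from the denominator, so it can never be smaller than classical eta-squared, and reporting one as the other overstates or understates the strength of association.

```python
import numpy as np

rng = np.random.default_rng(7)
a_levels, b_levels, n = 2, 2, 10
# y = effect of factor A + effect of factor B + noise, balanced 2x2 design
y = np.empty((a_levels, b_levels, n))
for i in range(a_levels):
    for j in range(b_levels):
        y[i, j] = 1.0 * i + 0.5 * j + rng.normal(0.0, 1.0, n)

grand = y.mean()
ss_total = ((y - grand) ** 2).sum()
ss_a = b_levels * n * ((y.mean(axis=(1, 2)) - grand) ** 2).sum()
ss_b = a_levels * n * ((y.mean(axis=(0, 2)) - grand) ** 2).sum()
ss_cells = n * ((y.mean(axis=2) - grand) ** 2).sum()
ss_ab = ss_cells - ss_a - ss_b
ss_err = ss_total - ss_cells

eta2_a = ss_a / ss_total                  # classical: denominator is SS_total
partial_eta2_a = ss_a / (ss_a + ss_err)   # partial: other effects removed
```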

  7. Use of "t"-Test and ANOVA in Career-Technical Education Research

    Science.gov (United States)

    Rojewski, Jay W.; Lee, In Heok; Gemici, Sinan

    2012-01-01

    Use of t-tests and analysis of variance (ANOVA) procedures in published research from three scholarly journals in career and technical education (CTE) during a recent 5-year period was examined. Information on post hoc analyses, reporting of effect size, alpha adjustments to account for multiple tests, power, and examination of assumptions…
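
    A short check makes the connection between the two procedures concrete: for exactly two groups, the pooled-variance t-test and one-way ANOVA are the same test (F = t², identical p-values). The data below are synthetic.

```python
import numpy as np
from scipy.stats import ttest_ind, f_oneway

rng = np.random.default_rng(0)
g1 = rng.normal(10.0, 2.0, 25)
g2 = rng.normal(11.0, 2.0, 25)

t, p_t = ttest_ind(g1, g2)   # pooled-variance t-test (scipy default)
F, p_f = f_oneway(g1, g2)    # one-way ANOVA on the same two groups
# F equals t squared, and the two p-values coincide
```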

  8. Comparative study between EDXRF and ASTM E572 methods using two-way ANOVA

    Science.gov (United States)

    Krummenauer, A.; Veit, H. M.; Zoppas-Ferreira, J.

    2018-03-01

    Comparison with a reference method is one of the necessary requirements for the validation of non-standard methods. This comparison was made using a design-of-experiments technique with two-way ANOVA. In the ANOVA, the results obtained using the EDXRF method to be validated were compared with the results obtained using the ASTM E572-13 standard test method. Fisher's tests (F-tests) were used for the comparative study of the elements molybdenum, niobium, copper, nickel, manganese, chromium and vanadium. The F-tests for all elements indicate that the null hypothesis (H0) is not rejected, so there is no significant difference between the methods compared. Therefore, according to this study, it is concluded that the EDXRF method satisfies this method-comparison requirement for validation.
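
    The layout described above, two methods each measured once per element, can be analysed as a two-way ANOVA without replication, with elements as blocks. The concentrations below are invented placeholders, not the study's data; the point is the F-test for the method effect, where a large p-value means H0 (no method difference) is not rejected.

```python
import numpy as np
from scipy.stats import f

# rows: elements (blocks); columns: reference method A, candidate method B
x = np.array([[10.0, 10.1],
              [ 5.0,  4.9],
              [ 2.0,  2.1],
              [ 8.0,  7.9],
              [ 3.0,  3.1],
              [ 6.0,  5.9],
              [ 4.0,  4.1]])
b, k = x.shape                 # 7 blocks, 2 treatments (methods)
grand = x.mean()
ss_method = b * ((x.mean(axis=0) - grand) ** 2).sum()
ss_element = k * ((x.mean(axis=1) - grand) ** 2).sum()
ss_error = ((x - grand) ** 2).sum() - ss_method - ss_element

F_method = (ss_method / (k - 1)) / (ss_error / ((b - 1) * (k - 1)))
p_method = f.sf(F_method, k - 1, (b - 1) * (k - 1))
# a large p_method means the two methods are statistically indistinguishable
```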

  9. ANOVA-Based Approach for Efficient Customer Recognition: Dealing with Common Names

    OpenAIRE

    Saberi , Morteza; Saberi , Zahra

    2015-01-01

    Part 2: Artificial Intelligence for Knowledge Management; International audience; This study proposes an Analysis of Variance (ANOVA) technique that focuses on the efficient recognition of customers with common names. The continuous improvement of information and communications technologies (ICT) has led customers to have new expectations of, and concerns about, their related organizations. These new expectations create various difficulties for organizations' help desks in meeting their customers' needs....

  10. Foliar Sprays of Citric Acid and Malic Acid Modify Growth, Flowering, and Root to Shoot Ratio of Gazania (Gazania rigens L.): A Comparative Analysis by ANOVA and Structural Equations Modeling

    Directory of Open Access Journals (Sweden)

    Majid Talebi

    2014-01-01

    Full Text Available Foliar application of two levels of citric acid and malic acid (100 or 300 mg L−1) was investigated on flower stem height, plant height, flower performance and yield indices (fresh yield, dry yield and root to shoot ratio) of Gazania. Distilled water was applied as the control treatment. Multivariate analysis revealed that while the experimental treatments had no significant effect on fresh weight and the flower count, plant dry weight was significantly increased by 300 mg L−1 malic acid. Citric acid at 100 and 300 mg L−1 and malic acid at 300 mg L−1 increased the root fresh weight significantly. Both the plant height and the peduncle length were significantly increased at all applied levels of citric acid and malic acid. The display time of flowers on the plant increased in all treatments compared to the control. The root to shoot ratio was increased significantly by 300 mg L−1 citric acid compared to all other treatments. These findings confirm earlier reports that citric acid and malic acid, as environmentally sound chemicals, affect various aspects of the growth and development of crops. Structural equations modeling was used in parallel with ANOVA to assess the factor effects and the possible paths of those effects.

  11. The Influencing Factor Analysis on the Performance Evaluation of Assembly Line Balancing Problem Level 1 (SALBP-1) Based on ANOVA Method

    Science.gov (United States)

    Chen, Jie; Hu, Jiangnan

    2017-06-01

    Industry 4.0 and lean production have become the focus of manufacturing. A current issue is to analyse the performance of assembly line balancing. This study focuses on distinguishing the factors that influence assembly line balancing. The one-way ANOVA method is applied to explore the significance of the distinguished factors, and a regression model is built to identify the key ones. The maximal task time (tmax), the quantity of tasks (n), and the degree of convergence of the precedence graph (conv) are critical for the performance of assembly line balancing. These conclusions can support lean production in manufacturing.
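
    The one-way ANOVA underlying such a factor screening reduces to a between-groups/within-groups sum-of-squares ratio. A minimal sketch on synthetic data (the group labels stand in for levels of a balancing factor), checked against SciPy's implementation:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(42)
# three factor levels, 12 observations each, with different true means
groups = [rng.normal(mu, 1.0, 12) for mu in (5.0, 5.5, 6.5)]

k = len(groups)
N = sum(len(g) for g in groups)
grand = np.concatenate(groups).mean()
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
F_manual = (ss_between / (k - 1)) / (ss_within / (N - k))

F_scipy, p = f_oneway(*groups)   # should match the hand computation
```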

  12. Development of a Mobile Application for Building Energy Prediction Using Performance Prediction Model

    Directory of Open Access Journals (Sweden)

    Yu-Ri Kim

    2016-03-01

    Full Text Available Recently, the Korean government has enforced disclosure of building energy performance, so that such information can help owners and prospective buyers to make suitable investment plans. Such a building energy performance policy of the government makes it mandatory for the building owners to obtain engineering audits and thereby evaluate the energy performance levels of their buildings. However, to calculate energy performance levels (i.e., asset rating methodology), a qualified expert needs to have access to at least the full project documentation and/or conduct an on-site inspection of the buildings. Energy performance certification costs a lot of time and money. Moreover, the database of certified buildings is still actually quite small. A need, therefore, is increasing for a simplified and user-friendly energy performance prediction tool for non-specialists. Also, a database which allows building owners and users to compare best practices is required. In this regard, the current study developed a simplified performance prediction model through experimental design, energy simulations and ANOVA (analysis of variance). Furthermore, using the new prediction model, a related mobile application was also developed.

  13. Optimum Combination and Effect Analysis of Piezoresistor Dimensions in Micro Piezoresistive Pressure Sensor Using Design of Experiments and ANOVA: a Taguchi Approach

    Directory of Open Access Journals (Sweden)

    Kirankumar B. Balavalad

    2017-04-01

    Full Text Available Piezoresistive (PZR) pressure sensors have gained importance because of their robust construction, high sensitivity and good linearity. The conventional PZR pressure sensor consists of four piezoresistors placed on a diaphragm and connected in the form of a Wheatstone bridge. These sensors convert the stress applied to them into a change in resistance, which is quantified into voltage using the Wheatstone bridge mechanism. It is observed from the literature that the dimensions of the piezoresistors are crucial to the performance of the piezoresistive pressure sensor. This paper presents a novel mechanism for finding the best combinations and the effect of the individual piezoresistor dimensions, viz. length, width and thickness, using DoE and the ANOVA (analysis of variance) method, following a Taguchi experimentation approach. The paper presents a unique method to find the optimum combination of piezoresistor dimensions and also clearly illustrates the effect of the dimensions on the output of the sensor. The optimum combinations and the output response of the sensor are predicted using DoE, and a validation simulation is performed. The result of the validation simulation is compared with the predicted value of the sensor response, V: the predicted value of V is 1.074 V and the validation simulation gave 1.19 V. This validates that the model (DoE and ANOVA) is adequate in describing V in terms of the variables defined.

  14. Why we should use simpler models if the data allow this: relevance for ANOVA designs in experimental biology

    Directory of Open Access Journals (Sweden)

    Lazic Stanley E

    2008-07-01

    Full Text Available Abstract Background Analysis of variance (ANOVA) is a common statistical technique in physiological research, and often one or more of the independent/predictor variables such as dose, time, or age can be treated as a continuous, rather than a categorical, variable during analysis – even if subjects were randomly assigned to treatment groups. While this is not common, such an approach has a number of advantages, including greater statistical power due to increased precision, a simpler and more informative interpretation of the results, greater parsimony, and the possibility of transforming the predictor variable. Results An example is given from an experiment where rats were randomly assigned to receive either 0, 60, 180, or 240 mg/L of fluoxetine in their drinking water, with performance on the forced swim test as the outcome measure. Dose was treated as either a categorical or continuous variable during analysis, with the latter analysis leading to a more powerful test (p = 0.021 vs. p = 0.159). This will be true in general, and the reasons for this are discussed. Conclusion There are many advantages to treating variables as continuous numeric variables if the data allow this, and this should be employed more often in experimental biology. Failure to use the optimal analysis runs the risk of missing significant effects or relationships.
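
    The paper's central point can be sketched numerically. Using the same dose levels but simulated responses (not the study's data), a slope test on dose as a continuous predictor spends 1 df on the trend, while the categorical ANOVA spreads it over 3 df; when the trend is roughly linear, the continuous analysis is typically the more powerful of the two.

```python
import numpy as np
from scipy.stats import f_oneway, linregress

rng = np.random.default_rng(3)
doses = np.repeat([0, 60, 180, 240], 10)          # 10 subjects per dose group
y = 0.01 * doses + rng.normal(0.0, 1.0, doses.size)  # linear dose effect + noise

# categorical analysis: one-way ANOVA across the four dose groups (3 df)
_, p_categorical = f_oneway(*(y[doses == d] for d in (0, 60, 180, 240)))
# continuous analysis: t-test on the regression slope (1 df for the trend)
p_continuous = linregress(doses, y).pvalue
```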

  15. A Hybrid One-Way ANOVA Approach for the Robust and Efficient Estimation of Differential Gene Expression with Multiple Patterns.

    Directory of Open Access Journals (Sweden)

    Mohammad Manir Hossain Mollah

    Full Text Available Identifying genes that are differentially expressed (DE) between two or more conditions with multiple patterns of expression is one of the primary objectives of gene expression data analysis. Several statistical approaches, including one-way analysis of variance (ANOVA), are used to identify DE genes. However, most of these methods provide misleading results for two or more conditions with multiple patterns of expression in the presence of outlying genes. In this paper, an attempt is made to develop a hybrid one-way ANOVA approach that unifies the robustness and efficiency of estimation using the minimum β-divergence method to overcome some problems that arise in the existing robust methods for both small- and large-sample cases with multiple patterns of expression. The proposed method relies on a β-weight function, which produces values between 0 and 1. The β-weight function with β = 0.2 is used as a measure of outlier detection. It assigns smaller weights (≥ 0) to outlying expressions and larger weights (≤ 1) to typical expressions. The distribution of the β-weights is used to calculate the cut-off point, which is compared to the observed β-weight of an expression to determine whether that gene expression is an outlier. This weight function plays a key role in unifying the robustness and efficiency of estimation in one-way ANOVA. Analyses of simulated gene expression profiles revealed that all eight methods (ANOVA, SAM, LIMMA, EBarrays, eLNN, KW, robust BetaEB and the proposed method) perform almost identically for m = 2 conditions in the absence of outliers. However, the robust BetaEB method and the proposed method exhibited considerably better performance than the other six methods in the presence of outliers. In this case, the BetaEB method exhibited slightly better performance than the proposed method for the small-sample cases, but the proposed method exhibited much better performance than the BetaEB method for both the small- and large-sample cases.

  16. Using the multiple regression analysis with respect to ANOVA and 3D mapping to model the actual performance of PEM (proton exchange membrane) fuel cell at various operating conditions

    International Nuclear Information System (INIS)

    Al-Hadeethi, Farqad; Al-Nimr, Moh'd; Al-Safadi, Mohammad

    2015-01-01

    The performance of a PEM (proton exchange membrane) fuel cell was experimentally investigated at three temperatures (30, 50 and 70 °C), four flow rates (5, 10, 15 and 20 ml/min) and two flow patterns (co-current and counter-current) in order to generate two correlations using multiple regression analysis with respect to ANOVA. Results revealed that increasing the temperature for the co-current and counter-current flow patterns increases both hydrogen and oxygen diffusivities, water management and membrane conductivity. The derived mathematical correlations and three-dimensional mapping (i.e. surface response) for the co-current and counter-current flow patterns showed that there is a clear interaction among the variables (temperatures and flow rates). - Highlights: • Generating mathematical correlations using multiple regression analysis with respect to ANOVA for the performance of the PEM fuel cell. • Using the 3D mapping to diagnose the optimum performance of the PEM fuel cell at the given operating conditions. • Results revealed that increasing the flow rate had a direct influence on the consumption of oxygen. • Results assured that increasing the temperature in co-current and counter-current flow patterns increases the performance of the PEM fuel cell.
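
    A hedged sketch of this kind of correlation fitting: a linear model with an interaction term, V = b0 + b1·T + b2·Q + b3·T·Q, fitted by ordinary least squares. All data and coefficients below are invented for illustration, not the measured fuel-cell values.

```python
import numpy as np

rng = np.random.default_rng(5)
T = rng.uniform(30.0, 70.0, 60)    # temperature, deg C (assumed range)
Q = rng.uniform(5.0, 20.0, 60)     # flow rate, ml/min (assumed range)

# invented "true" coefficients: intercept, T, Q, T*Q interaction
true_beta = np.array([0.5, 0.004, 0.01, -0.0001])
X = np.column_stack([np.ones_like(T), T, Q, T * Q])   # design matrix
V = X @ true_beta + rng.normal(0.0, 0.01, T.size)     # response + noise

beta_hat, *_ = np.linalg.lstsq(X, V, rcond=None)      # least-squares fit
resid = V - X @ beta_hat
r2 = 1 - (resid ** 2).sum() / ((V - V.mean()) ** 2).sum()
```

    The fitted surface (V as a function of T and Q) is exactly what a 3D response-surface map visualises; the interaction column is what makes the "clear interaction among the variables" testable.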

  17. The influence of speed abilities and technical skills in early adolescence on adult success in soccer: A long-term prospective analysis using ANOVA and SEM approaches

    Science.gov (United States)

    2017-01-01

    Several talent development programs in youth soccer have implemented motor diagnostics measuring performance factors. However, the predictive value of such tests for adult success is a controversial topic in talent research. This prospective cohort study evaluated the long-term predictive value of 1) motor tests and 2) players’ speed abilities (SA) and technical skills (TS) in early adolescence. The sample consisted of 14,178 U12 players from the German talent development program. Five tests (sprint, agility, dribbling, ball control, shooting) were conducted and players’ height, weight as well as relative age were assessed at nationwide diagnostics between 2004 and 2006. In the 2014/15 season, the players were then categorized as professional (n = 89), semi-professional (n = 913), or non-professional players (n = 13,176), indicating their adult performance level (APL). The motor tests’ prognostic relevance was determined using ANOVAs. Players’ future success was predicted by a logistic regression threshold model. This structural equation model comprised a measurement model with the motor tests and two correlated latent factors, SA and TS, with simultaneous consideration for the manifest covariates height, weight and relative age. Each motor predictor and anthropometric characteristic discriminated significantly between the APL (p < .001; η2 ≤ .02). The threshold model significantly predicted the APL (R2 = 24.8%), and in early adolescence the factor TS (p < .001) seems to have a stronger effect on adult performance than SA (p < .05). Both approaches (ANOVA, SEM) verified the diagnostics’ predictive validity over a long-term period (≈ 9 years). However, because of the limited effect sizes, the motor tests’ prognostic relevance remains ambiguous. A challenge for future research lies in the integration of different (e.g., person-oriented or multilevel) multivariate approaches that expand beyond the “traditional” topic of single tests’ predictive

  18. ANOVA IN MARKETING RESEARCH OF CONSUMER BEHAVIOR OF DIFFERENT CATEGORIES IN GEORGIAN MARKET

    Directory of Open Access Journals (Sweden)

    NUGZAR TODUA

    2015-03-01

    Full Text Available Consumer behavior research was conducted on bank services and (non-alcoholic) soft drinks. Based on four different currencies and ten services, analyses were made of bank clients' distribution by bank services and currencies, their percentage distribution by bank services, and the percentage distribution of bank services by currencies. Similar results were also obtained for ten soft drinks with five characteristics: consumer quantities split by types of soft drinks and attributes; attribute percentages split by types of soft drinks; and soft drink type percentages split by attributes. Using ANOVA on the marketing research outcomes, it is concluded that the populations' unknown mean scores for bank clients' total quantities do not differ from each other, whereas in the soft drinks case the populations' unknown mean scores vary by characteristic.

  19. Teaching renewable energy using online PBL in investigating its effect on behaviour towards energy conservation among Malaysian students: ANOVA repeated measures approach

    Science.gov (United States)

    Nordin, Norfarah; Samsudin, Mohd Ali; Hadi Harun, Abdul

    2017-01-01

    This research aimed to investigate whether an online problem-based learning (PBL) approach to teaching the renewable energy topic improves students' behaviour towards energy conservation. A renewable energy online problem-based learning (REePBaL) instruction package was developed based on the theory of constructivism and an adaptation of the online learning model. This study employed a single-group quasi-experimental design to ascertain the change in students' behaviour towards energy conservation after the intervention. The study involved 48 secondary school students in a Malaysian public school. A repeated-measures ANOVA was employed to compare scores of students' behaviour towards energy conservation before and after the intervention. Based on the findings, students' behaviour towards energy conservation improved after the intervention.

  20. Optimal covariate designs theory and applications

    CERN Document Server

    Das, Premadhis; Mandal, Nripes Kumar; Sinha, Bikas Kumar

    2015-01-01

    This book primarily addresses the optimality aspects of covariate designs. A covariate model is a combination of ANOVA and regression models. Optimal estimation of the parameters of the model using a suitable choice of designs is of great importance; as such choices allow experimenters to extract maximum information for the unknown model parameters. The main emphasis of this monograph is to start with an assumed covariate model in combination with some standard ANOVA set-ups such as CRD, RBD, BIBD, GDD, BTIBD, BPEBD, cross-over, multi-factor, split-plot and strip-plot designs, treatment control designs, etc. and discuss the nature and availability of optimal covariate designs. In some situations, optimal estimations of both ANOVA and the regression parameters are provided. Global optimality and D-optimality criteria are mainly used in selecting the design. The standard optimality results of both discrete and continuous set-ups have been adapted, and several novel combinatorial techniques have been applied for...

  1. The impact of sample non-normality on ANOVA and alternative methods.

    Science.gov (United States)

    Lantz, Björn

    2013-05-01

    In this journal, Zimmerman (2004, 2011) has discussed preliminary tests that researchers often use to choose an appropriate method for comparing locations when the assumption of normality is doubtful. The conceptual problem with this approach is that such a two-stage process makes both the power and the significance of the entire procedure uncertain, as type I and type II errors are possible at both stages. A type I error at the first stage, for example, will obviously increase the probability of a type II error at the second stage. Based on the idea of Schmider et al. (2010), which proposes that simulated sets of sample data be ranked with respect to their degree of normality, this paper investigates the relationship between population non-normality and sample non-normality with respect to the performance of the ANOVA, Brown-Forsythe test, Welch test, and Kruskal-Wallis test when used with different distributions, sample sizes, and effect sizes. The overall conclusion is that the Kruskal-Wallis test is considerably less sensitive to the degree of sample normality when populations are distinctly non-normal and should therefore be the primary tool used to compare locations when it is known that populations are not at least approximately normal. © 2012 The British Psychological Society.
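
    The paper's conclusion can be illustrated with a small power simulation under assumed settings (log-normal populations, one group shifted in location, 400 replicates): on strongly skewed data, the rank-based Kruskal-Wallis test tends to reject a false null more often than one-way ANOVA.

```python
import numpy as np
from scipy.stats import f_oneway, kruskal

def power(test, shift, reps=400, n=20, alpha=0.05, seed=11):
    """Rejection rate of `test` when the third group is location-shifted."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        a = rng.lognormal(0.0, 1.0, n)
        b = rng.lognormal(0.0, 1.0, n)
        c = rng.lognormal(0.0, 1.0, n) + shift   # true location difference
        if test(a, b, c).pvalue < alpha:
            hits += 1
    return hits / reps

anova_power = power(f_oneway, shift=1.0)
kw_power = power(kruskal, shift=1.0)
# on heavy-tailed data, the rank-based test is markedly more sensitive
```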

  2. Optimization of Parameters for Manufacture Nanopowder Bioceramics at Machine Pulverisette 6 by Taguchi and ANOVA Method

    Science.gov (United States)

    Van Hoten, Hendri; Gunawarman; Mulyadi, Ismet Hari; Kurniawan Mainil, Afdhal; Putra, Bismantoloa dan

    2018-02-01

    This research concerns the manufacture of nanopowder bioceramics from local materials using ball milling, for biomedical applications. Source materials for the manufacture of medicines are plants, animal tissues, microbial structures and engineered biomaterials, and the raw material takes powder form before mixing. The aim of the research is to find sources of biomedical materials that, as nanoscale powders, can be used as raw material for medicine. One bioceramic material that can serve as such a raw material is chicken eggshell. This research develops methods for manufacturing nanopowder material from chicken eggshells by ball milling, using the Taguchi method and ANOVA. Eggshells were milled at rates of 150, 200 and 250 rpm, for 1, 2 and 3 hours, and at grinding-ball-to-eggshell-powder weight ratios (BPR) of 1:6, 1:8 and 1:10. Before milling, the eggshells were crushed and calcined at 900 °C. After milling, the fine eggshell powder was characterized using SEM to determine its particle size. The Taguchi design analysis gave optimum parameters of a 250 rpm milling rate, 3 hours milling time and a BPR of 1:6, with an average eggshell powder size of 1.305 μm. Milling speed, milling time and ball-to-powder weight ratio contributed 60.82%, 30.76% and 6.64%, respectively, with an error of 1.78%.
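
    Percent contributions of the kind reported above come from partitioning the total sum of squares over the columns of the orthogonal array. A sketch with a standard L9 array and invented response values (not the eggshell data):

```python
import numpy as np

# Standard L9 orthogonal array, three 3-level factors (levels coded 0..2);
# columns stand in for milling speed, milling time, BPR.
L9 = np.array([[0, 0, 0], [0, 1, 1], [0, 2, 2],
               [1, 0, 1], [1, 1, 2], [1, 2, 0],
               [2, 0, 2], [2, 1, 0], [2, 2, 1]])
y = np.array([5.1, 4.2, 3.6, 4.4, 3.5, 4.0, 3.9, 3.3, 3.0])  # invented responses

grand = y.mean()
ss_total = ((y - grand) ** 2).sum()
ss = []
for col in range(L9.shape[1]):
    # each level occurs 3 times per column in an L9 array
    level_means = [y[L9[:, col] == lev].mean() for lev in range(3)]
    ss.append(3 * sum((m - grand) ** 2 for m in level_means))
ss_error = ss_total - sum(ss)            # residual (pooled) sum of squares

contrib = [100 * s / ss_total for s in ss]   # percent contribution per factor
```

    The factor and error percentages sum to 100% by construction, which is how contribution tables like "60.82% + 30.76% + 6.64% + 1.78% error" are built.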

  3. Application of regression model on stream water quality parameters

    International Nuclear Information System (INIS)

    Suleman, M.; Maqbool, F.; Malik, A.H.; Bhatti, Z.A.

    2012-01-01

    Statistical analysis was conducted to evaluate the effect of solid waste leachate from the open solid waste dumping site of Salhad on stream water quality. Five sites were selected along the stream: two prior to the mixing of leachate with the surface water, one of the leachate itself, and two affected by the leachate. Samples were analyzed for pH, water temperature, electrical conductivity (EC), total dissolved solids (TDS), biological oxygen demand (BOD), chemical oxygen demand (COD), dissolved oxygen (DO) and total bacterial load (TBL). Correlation coefficients r among the water quality parameters at the various sites were calculated using the Pearson model, and the average of each correlation between two parameters was also calculated. TDS-EC and pH-BOD showed significantly positive r values, while temperature-TDS, temperature-EC, DO-TBL and DO-COD showed negative r values. Single-factor ANOVA at the 5% level of significance showed that EC, TDS, TBL and COD differed significantly among the sites. By the application of these two statistical approaches, TDS and EC show a strongly positive correlation because the ions from the dissolved solids in water influence the ability of that water to conduct an electrical current. These two parameters vary significantly among the five sites, which was further confirmed using linear regression. (author)
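
    The TDS-EC relationship cited above is the textbook case for the Pearson model: near-proportional readings give r close to +1. The values below are illustrative, not the Salhad measurements.

```python
import numpy as np

tds = np.array([100.0, 150.0, 200.0, 250.0, 300.0])   # TDS, mg/L (invented)
ec = np.array([160.0, 235.0, 310.0, 390.0, 465.0])    # EC, uS/cm (invented)

# Pearson correlation coefficient between the two parameters
r = np.corrcoef(tds, ec)[0, 1]
```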

  4. Absolute variation of the mechanical characteristics of halloysite reinforced polyurethane nanocomposites complemented by Taguchi and ANOVA approaches

    Science.gov (United States)

    Gaaz, Tayser Sumer; Sulong, Abu Bakar; Kadhum, Abdul Amir H.; Nassir, Mohamed H.; Al-Amiery, Ahmed A.

    The variation of the results of the mechanical properties of halloysite nanotubes (HNTs) reinforced thermoplastic polyurethane (TPU) at different HNTs loadings was implemented as a tool for analysis. The preparation of HNTs-TPU nanocomposites was performed under four controlled parameters of mixing temperature, mixing speed, mixing time, and HNTs loading at three levels each to satisfy Taguchi method orthogonal array L9 aiming to optimize these parameters for the best measurements of tensile strength, Young's modulus, and tensile strain (known as responses). The maximum variation of the experimental results for each response was determined and analysed based on the optimized results predicted by Taguchi method and ANOVA. It was found that the maximum absolute variations of the three mentioned responses are 69%, 352%, and 126%, respectively. The analysis has shown that the preparation of the optimized tensile strength requires 1 wt.% HNTs loading (excluding 2 wt.% and 3 wt.%), mixing temperature of 190 °C (excluding 200 °C and 210 °C), and mixing speed of 30 rpm (excluding 40 rpm and 50 rpm). In addition, the analysis has determined that the mixing time at 20 min has no effect on the preparation. The mentioned analysis was fortified by ANOVA, images of FESEM, and DSC results. Seemingly, the agglomeration and distribution of HNTs in the nanocomposite play an important role in the process. The outcome of the analysis could be considered as a very important step towards the reliability of Taguchi method.

  5. Mathematical modeling with multidisciplinary applications

    CERN Document Server

    Yang, Xin-She

    2013-01-01

    Features mathematical modeling techniques and real-world processes with applications in diverse fields Mathematical Modeling with Multidisciplinary Applications details the interdisciplinary nature of mathematical modeling and numerical algorithms. The book combines a variety of applications from diverse fields to illustrate how the methods can be used to model physical processes, design new products, find solutions to challenging problems, and increase competitiveness in international markets. Written by leading scholars and international experts in the field, the

  6. Absolute variation of the mechanical characteristics of halloysite reinforced polyurethane nanocomposites complemented by Taguchi and ANOVA approaches

    Directory of Open Access Journals (Sweden)

    Tayser Sumer Gaaz

    Full Text Available The variation of the results of the mechanical properties of halloysite nanotubes (HNTs) reinforced thermoplastic polyurethane (TPU) at different HNTs loadings was implemented as a tool for analysis. The preparation of HNTs-TPU nanocomposites was performed under four controlled parameters of mixing temperature, mixing speed, mixing time, and HNTs loading at three levels each to satisfy Taguchi method orthogonal array L9 aiming to optimize these parameters for the best measurements of tensile strength, Young’s modulus, and tensile strain (known as responses). The maximum variation of the experimental results for each response was determined and analysed based on the optimized results predicted by Taguchi method and ANOVA. It was found that the maximum absolute variations of the three mentioned responses are 69%, 352%, and 126%, respectively. The analysis has shown that the preparation of the optimized tensile strength requires 1 wt.% HNTs loading (excluding 2 wt.% and 3 wt.%), mixing temperature of 190 °C (excluding 200 °C and 210 °C), and mixing speed of 30 rpm (excluding 40 rpm and 50 rpm). In addition, the analysis has determined that the mixing time at 20 min has no effect on the preparation. The mentioned analysis was fortified by ANOVA, images of FESEM, and DSC results. Seemingly, the agglomeration and distribution of HNTs in the nanocomposite play an important role in the process. The outcome of the analysis could be considered as a very important step towards the reliability of Taguchi method. Keywords: Nanocomposite, Design-of-experiment, Taguchi optimization method, Mechanical properties

  7. Finite mathematics models and applications

    CERN Document Server

    Morris, Carla C

    2015-01-01

    Features step-by-step examples based on actual data and connects fundamental mathematical modeling skills and decision making concepts to everyday applicability Featuring key linear programming, matrix, and probability concepts, Finite Mathematics: Models and Applications emphasizes cross-disciplinary applications that relate mathematics to everyday life. The book provides a unique combination of practical mathematical applications to illustrate the wide use of mathematics in fields ranging from business, economics, finance, management, operations research, and the life and social sciences.

  8. Extensions and applications of degradation modeling

    International Nuclear Information System (INIS)

    Hsu, F.; Subudhi, M.; Samanta, P.K.; Vesely, W.E.

    1991-01-01

    Component degradation modeling being developed to understand the aging process can have many applications with potential advantages. Previous work has focused on developing the basic concepts and mathematical development of a simple degradation model. Using this simple model, times of degradations and failures occurrences were analyzed for standby components to detect indications of aging and to infer the effectiveness of maintenance in preventing age-related degradations from transforming to failures. Degradation modeling approaches can have broader applications in aging studies and in this paper, the authors discuss some of the extensions and applications of degradation modeling. The extensions and applications of the degradation modeling approaches discussed are: (a) theoretical developments to study reliability effects of different maintenance strategies and policies, (b) relating aging-failure rate to degradation rate, and (c) application to a continuously operating component

  9. A Robust Design Applicability Model

    DEFF Research Database (Denmark)

    Ebro, Martin; Lars, Krogstie; Howard, Thomas J.

    2015-01-01

    This paper introduces a model for assessing the applicability of Robust Design (RD) in a project or organisation. The intention of the Robust Design Applicability Model (RDAM) is to provide support for decisions by engineering management considering the relevant level of RD activities to be applicable in organisations assigning a high importance to one or more factors that are known to be impacted by RD, while also experiencing a high level of occurrence of this factor. The RDAM supplements existing maturity models and metrics to provide a comprehensive set of data to support management...

  10. HTGR Application Economic Model Users' Manual

    International Nuclear Information System (INIS)

    Gandrik, A.M.

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Application Economic Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Application Economic Model calculates either the required selling price of power and/or heat for a given internal rate of return (IRR) or the IRR for power and/or heat being sold at the market price. The user can generate these economic results for a range of reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first-of-a-kind, or nth-of-a-kind project phases; for up to 16 reactor modules; and for module ratings of 200, 350, or 600 MWt. This user's manual contains the mathematical models and operating instructions for the HTGR Application Economic Model. Instructions, screenshots, and examples are provided to guide the user through the model. This model was designed for users who are familiar with the HTGR design, Excel, and engineering economics. Modification of the HTGR Application Economic Model should only be performed by users familiar with the HTGR and its applications, Excel, and Visual Basic.

  11. Engine Modelling for Control Applications

    DEFF Research Database (Denmark)

    Hendricks, Elbert

    1997-01-01

    In earlier work published by the author and co-authors, a dynamic engine model called a Mean Value Engine Model (MVEM) was developed. This model is physically based and is intended mainly for control applications. In its newer form, it is easy to fit to many different engines and requires little...... engine data for this purpose. It is especially well suited to embedded model applications in engine controllers, such as nonlinear observer based air/fuel ratio and advanced idle speed control. After a brief review of this model, it will be compared with other similar models which can be found...

  12. Applications and extensions of degradation modeling

    International Nuclear Information System (INIS)

    Hsu, F.; Subudhi, M.; Samanta, P.K.; Vesely, W.E.

    1991-01-01

Component degradation modeling, being developed to understand the aging process, can have many applications with potential advantages. Previous work focused on developing the basic concepts and mathematical formulation of a simple degradation model. Using this simple model, the times of degradation and failure occurrences were analyzed for standby components to detect indications of aging and to infer the effectiveness of maintenance in preventing age-related degradations from transforming into failures. Degradation modeling approaches can have broader applications in aging studies, and in this paper we discuss some of their extensions and applications. The applications and extensions presented in this paper cover two aspects: (1) application to a continuously operating component, and (2) extension of the approach to analyze the degradation-failure rate relationship. Applying the modeling approach to a continuously operating component (namely, air compressors) shows its usefulness in studying aging effects and the role of maintenance for this type of component. In this case, aging effects in air compressors are demonstrated by increases in both the degradation rate and the failure rate, and the faster increase in the failure rate compared to the degradation rate shows the ineffectiveness of the existing maintenance practices. The degradation-failure rate relationship was analyzed using data from residual heat removal system pumps; a simple linear model with a time lag between the two parameters was studied. The application in this case showed a time lag of 2 years for degradations to affect failure occurrences. 2 refs

  15. Multilevel models applications using SAS

    CERN Document Server

    Wang, Jichuan; Fisher, James F

    2011-01-01

    This book covers a broad range of topics about multilevel modeling. The goal is to help readers to understand the basic concepts, theoretical frameworks, and application methods of multilevel modeling. It is at a level also accessible to non-mathematicians, focusing on the methods and applications of various multilevel models and using the widely used statistical software SAS®. Examples are drawn from analysis of real-world research data.

  16. Modelling Foundations and Applications

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 8th European Conference on Modelling Foundations and Applications, held in Kgs. Lyngby, Denmark, in July 2012. The 20 revised full foundations track papers and 10 revised full applications track papers presented were carefully reviewed...

  17. The use of the barbell cluster ANOVA design for the assessment of Environmental Pollution (1987): a case study, Wigierski National Park, NE Poland

    Energy Technology Data Exchange (ETDEWEB)

    Migaszewski, Zdzislaw M. [Pedagogical University, Institute of Chemistry, Geochemistry and the Environment Div., ul. Checinska 5, 25-020 Kielce (Poland)]. E-mail: zmig@pu.kielce.pl; Galuszka, Agnieszka [Pedagogical University, Institute of Chemistry, Geochemistry and the Environment Div., ul. Checinska 5, 25-020 Kielce (Poland); Paslaski, Piotr [Central Chemical Laboratory of the Polish Geological Institute, ul. Rakowiecka 4, 00-975 Warsaw (Poland)

    2005-01-01

This report presents an assessment of chemical variability in natural ecosystems of Wigierski National Park (NE Poland), derived from the calculation of geochemical baselines using a barbell cluster ANOVA design. This method enabled us to obtain statistically valid information with a minimum number of samples collected. Summary statistics are presented for elemental concentrations in the O (Ol + Ofh), A, and B soil horizons, 1- and 2-year-old Pinus sylvestris L. (Scots pine) needles, pine bark and Hypogymnia physodes (L.) Nyl. (lichen) thalli, as well as for pH and TOC. The scope of this study also encompassed S and C stable isotope determinations and SEM examinations of Scots pine needles. The variability of S and trace metals in soils and plant bioindicators is primarily governed by parent material lithology and to a lesser extent by anthropogenic factors. This fact enabled us to study concentrations that are close to regional background levels. - The barbell cluster ANOVA design allowed the number of samples collected to be reduced to a minimum.
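The barbell cluster design itself is hierarchical, but its core operation is still an analysis of variance across groups. As a much simplified sketch (a plain one-way layout with invented concentration values, not the study's data or its nested design), a one-way ANOVA across three sampling sites might look like:

```python
# One-way ANOVA sketch: do mean Zn concentrations differ between sites?
# All numbers are invented for illustration; a real barbell cluster design
# would add nested hierarchical sampling levels on top of this.
from scipy.stats import f_oneway

site_a = [42.1, 39.8, 44.0, 41.5, 40.2]  # Zn, mg/kg (hypothetical)
site_b = [55.3, 58.1, 54.7, 57.9, 56.0]
site_c = [43.0, 41.2, 44.8, 42.6, 40.9]

f_stat, p_value = f_oneway(site_a, site_b, site_c)
print(f"F = {f_stat:.1f}, p = {p_value:.2e}")
```

A small p-value indicates that between-site variability exceeds within-site variability, which is the comparison the baseline calculations build on.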

  18. How Participatory Should Environmental Governance Be? Testing the Applicability of the Vroom-Yetton-Jago Model in Public Environmental Decision-Making

    Science.gov (United States)

    Lührs, Nikolas; Jager, Nicolas W.; Challies, Edward; Newig, Jens

    2018-02-01

    Public participation is potentially useful to improve public environmental decision-making and management processes. In corporate management, the Vroom-Yetton-Jago normative decision-making model has served as a tool to help managers choose appropriate degrees of subordinate participation for effective decision-making given varying decision-making contexts. But does the model recommend participatory mechanisms that would actually benefit environmental management? This study empirically tests the improved Vroom-Jago version of the model in the public environmental decision-making context. To this end, the key variables of the Vroom-Jago model are operationalized and adapted to a public environmental governance context. The model is tested using data from a meta-analysis of 241 published cases of public environmental decision-making, yielding three main sets of findings: (1) The Vroom-Jago model proves limited in its applicability to public environmental governance due to limited variance in its recommendations. We show that adjustments to key model equations make it more likely to produce meaningful recommendations. (2) We find that in most of the studied cases, public environmental managers (implicitly) employ levels of participation close to those that would have been recommended by the model. (3) An ANOVA revealed that such cases, which conform to model recommendations, generally perform better on stakeholder acceptance and environmental standards of outputs than those that diverge from the model. Public environmental management thus benefits from carefully selected and context-sensitive modes of participation.

  20. Markov chains models, algorithms and applications

    CERN Document Server

    Ching, Wai-Ki; Ng, Michael K; Siu, Tak-Kuen

    2013-01-01

    This new edition of Markov Chains: Models, Algorithms and Applications has been completely reformatted as a text, complete with end-of-chapter exercises, a new focus on management science, new applications of the models, and new examples with applications in financial risk management and modeling of financial data.This book consists of eight chapters.  Chapter 1 gives a brief introduction to the classical theory on both discrete and continuous time Markov chains. The relationship between Markov chains of finite states and matrix theory will also be highlighted. Some classical iterative methods
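The relationship between finite-state Markov chains and matrix theory highlighted in Chapter 1 can be sketched in a few lines: the stationary distribution is a left eigenvector of the transition matrix for eigenvalue 1. The 3-state matrix below is invented purely for illustration:

```python
# Stationary distribution of a toy 3-state Markov chain: solve pi P = pi.
import numpy as np

P = np.array([[0.7, 0.2, 0.1],   # state 0 -> 0 / 1 / 2
              [0.3, 0.4, 0.3],   # state 1 -> ...
              [0.2, 0.4, 0.4]])  # state 2 -> ...

# Left eigenvector of P for eigenvalue 1 = right eigenvector of P.T
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()                   # normalize to a probability vector

assert np.allclose(pi @ P, pi)   # pi is invariant under one step
print(pi)
```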

  1. Investigation of flood pattern using ANOVA statistic and remote sensing in Malaysia

    International Nuclear Information System (INIS)

    Ya'acob, Norsuzila; Ismail, Nor Syazwani; Mustafa, Norfazira; Yusof, Azita Laily

    2014-01-01

A flood is an overflow or inundation that comes from a river or another body of water and causes or threatens damage. In Malaysia there is no formal categorization of floods, but they are often broadly categorized as monsoonal, flash or tidal floods. This project focuses on floods caused by the monsoon. Over the last few years a number of extreme floods occurred and brought great economic impact; extreme weather patterns are the main contributor to this phenomenon. In 2010, several districts in the state of Kedah and neighbouring states were hit by floods caused by extreme weather patterns. During this tragedy, the ratio of rainfall volume was not fixed for every region, and flooding happened when the amount of water increased rapidly and started to overflow. This is the main reason this project was carried out, and the data were analysed from August until October 2010. The investigation sought possible correlation patterns among parameters related to the flood. The ANOVA statistic was used to calculate the percentage contribution of the parameters involved, regression and correlation measured the strength of association among flood-related parameters, and a remote sensing image was used to validate the calculation accuracy. According to the results, the prediction is successful, as the correlation coefficient for the flood event is 0.912, confirmed by a Terra-SAR image from 4th November 2010. The rate of change in weather patterns contributes to the flooding.
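The regression-and-correlation step described above can be sketched as follows. The rainfall and river-level values are synthetic placeholders; the reported 0.912 coefficient comes from the study's own data:

```python
# Correlation + simple linear regression between two flood-related
# parameters. Synthetic data, for illustration only.
import numpy as np

rainfall = np.array([12.0, 30.5, 45.2, 60.1, 75.8, 90.3])  # mm/day
level = np.array([0.8, 1.1, 1.6, 2.0, 2.7, 3.1])           # river level, m

r = np.corrcoef(rainfall, level)[0, 1]           # Pearson correlation
slope, intercept = np.polyfit(rainfall, level, 1)  # least-squares line
print(f"r = {r:.3f}, level ~ {slope:.3f}*rainfall + {intercept:.3f}")
```

A coefficient near 1 indicates a strong linear association, which is the kind of evidence the study then cross-checks against the satellite image.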

  2. Structural equation modeling methods and applications

    CERN Document Server

    Wang, Jichuan

    2012-01-01

    A reference guide for applications of SEM using Mplus Structural Equation Modeling: Applications Using Mplus is intended as both a teaching resource and a reference guide. Written in non-mathematical terms, this book focuses on the conceptual and practical aspects of Structural Equation Modeling (SEM). Basic concepts and examples of various SEM models are demonstrated along with recently developed advanced methods, such as mixture modeling and model-based power analysis and sample size estimate for SEM. The statistical modeling program, Mplus, is also featured and provides researchers with a

  3. Acceptance of health information technology in health professionals: an application of the revised technology acceptance model.

    Science.gov (United States)

    Ketikidis, Panayiotis; Dimitrovski, Tomislav; Lazuras, Lambros; Bath, Peter A

    2012-06-01

    The response of health professionals to the use of health information technology (HIT) is an important research topic that can partly explain the success or failure of any HIT application. The present study applied a modified version of the revised technology acceptance model (TAM) to assess the relevant beliefs and acceptance of HIT systems in a sample of health professionals (n = 133). Structured anonymous questionnaires were used and a cross-sectional design was employed. The main outcome measure was the intention to use HIT systems. ANOVA was employed to examine differences in TAM-related variables between nurses and medical doctors, and no significant differences were found. Multiple linear regression analysis was used to assess the predictors of HIT usage intentions. The findings showed that perceived ease of use, but not usefulness, relevance and subjective norms directly predicted HIT usage intentions. The present findings suggest that a modification of the original TAM approach is needed to better understand health professionals' support and endorsement of HIT. Perceived ease of use, relevance of HIT to the medical and nursing professions, as well as social influences, should be tapped by information campaigns aiming to enhance support for HIT in healthcare settings.
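The multiple linear regression step can be sketched with ordinary least squares. The predictor names follow the abstract, but all scores below are fabricated toy data, not the study's (n = 133) sample:

```python
# OLS sketch: predict HIT usage intentions from TAM predictors.
# Fabricated Likert-style scores for 8 hypothetical respondents.
import numpy as np

# columns: perceived ease of use, perceived usefulness, subjective norms
X = np.array([[4, 5, 3], [2, 3, 2], [5, 4, 4], [3, 3, 3],
              [4, 2, 5], [1, 2, 1], [5, 5, 5], [2, 4, 2]], dtype=float)
y = np.array([4.5, 2.0, 4.8, 3.0, 3.8, 1.2, 5.0, 2.4])  # intention to use

X1 = np.column_stack([np.ones(len(X)), X])       # add intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)    # least-squares fit
print("intercept:", coef[0], "slopes:", coef[1:])
```

The sign and size of each slope correspond to the "direct prediction" language in the abstract; a real analysis would also report standard errors and p-values.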

  4. Human mobility: Models and applications

    Science.gov (United States)

    Barbosa, Hugo; Barthelemy, Marc; Ghoshal, Gourab; James, Charlotte R.; Lenormand, Maxime; Louail, Thomas; Menezes, Ronaldo; Ramasco, José J.; Simini, Filippo; Tomasini, Marcello

    2018-03-01

    Recent years have witnessed an explosion of extensive geolocated datasets related to human movement, enabling scientists to quantitatively study individual and collective mobility patterns, and to generate models that can capture and reproduce the spatiotemporal structures and regularities in human trajectories. The study of human mobility is especially important for applications such as estimating migratory flows, traffic forecasting, urban planning, and epidemic modeling. In this survey, we review the approaches developed to reproduce various mobility patterns, with the main focus on recent developments. This review can be used both as an introduction to the fundamental modeling principles of human mobility, and as a collection of technical methods applicable to specific mobility-related problems. The review organizes the subject by differentiating between individual and population mobility and also between short-range and long-range mobility. Throughout the text the description of the theory is intertwined with real-world applications.

  5. Integration of design applications with building models

    DEFF Research Database (Denmark)

    Eastman, C. M.; Jeng, T. S.; Chowdbury, R.

    1997-01-01

This paper reviews various issues in the integration of applications with a building model... (Truncated.)

  6. Chemistry Teachers' Knowledge and Application of Models

    Science.gov (United States)

    Wang, Zuhao; Chi, Shaohui; Hu, Kaiyan; Chen, Wenting

    2014-01-01

Teachers' knowledge and application of models play an important role in students' development of modeling ability and scientific literacy. In this study, we investigated Chinese chemistry teachers' knowledge and application of models. Data were collected through a test questionnaire and analyzed quantitatively and qualitatively. The result indicated…

  7. Effects of measurement errors on psychometric measurements in ergonomics studies: Implications for correlations, ANOVA, linear regression, factor analysis, and linear discriminant analysis.

    Science.gov (United States)

    Liu, Yan; Salvendy, Gavriel

    2009-05-01

This paper aims to demonstrate the effects of measurement errors on psychometric measurements in ergonomics studies. A variety of sources can cause random measurement errors in ergonomics studies, and these errors can distort virtually every statistic computed and lead investigators to erroneous conclusions. The effects of measurement errors on the five most widely used statistical analysis tools are discussed and illustrated: correlation; ANOVA; linear regression; factor analysis; and linear discriminant analysis. It is shown that measurement errors can greatly attenuate correlations between variables, reduce the statistical power of ANOVA, distort (overestimate, underestimate or even change the sign of) regression coefficients, underrate the explanatory contributions of the most important factors in factor analysis, and depreciate the significance of the discriminant function and the discrimination abilities of individual variables in discriminant analysis. The discussion is restricted to subjective scales and survey methods and their reliability estimates. Other methods applied in ergonomics research, such as physical and electrophysiological measurements and chemical and biomedical analysis methods, also have issues of measurement error, but they are beyond the scope of this paper. As there has been increasing interest in the development and testing of theories in ergonomics research, it has become very important for ergonomics researchers to understand the effects of measurement errors on their experimental results, which the authors believe is critical to research progress in theory development and cumulative knowledge in the ergonomics field.
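The attenuation effect on correlations can be demonstrated with a short simulation, assuming Spearman's classical model of independent random measurement error (all numbers below are illustrative, not from the paper):

```python
# Random measurement error shrinks an observed correlation below the true one:
# r_obs ~= r_true * sqrt(rel_x * rel_y) (Spearman's attenuation formula).
import numpy as np

rng = np.random.default_rng(0)
n, r_true = 100_000, 0.8

# latent "true" scores with correlation r_true
x = rng.standard_normal(n)
y = r_true * x + np.sqrt(1 - r_true**2) * rng.standard_normal(n)

# observed scores = true score + independent measurement error
rel = 0.6                                # reliability of each scale
err = np.sqrt((1 - rel) / rel)           # error SD yielding that reliability
x_obs = x + err * rng.standard_normal(n)
y_obs = y + err * rng.standard_normal(n)

r_obs = np.corrcoef(x_obs, y_obs)[0, 1]
print(r_obs)   # close to r_true * rel = 0.48, well below the true 0.8
```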

  8. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

    Survival analysis concerns sequential occurrences of events governed by probabilistic laws.  Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading onto some of the most advanced topics in survival analysis.Assumes only a minimal knowledge of SAS whilst enablin

  9. Markov and mixed models with applications

    DEFF Research Database (Denmark)

    Mortensen, Stig Bousgaard

This thesis deals with mathematical and statistical models with focus on applications in pharmacokinetic and pharmacodynamic (PK/PD) modelling. These models are today an important aspect of drug development in the pharmaceutical industry, and continued research in statistical methodology within...... or uncontrollable factors in an individual. Modelling using SDEs also provides new tools for estimation of unknown inputs to a system and is illustrated with an application to estimation of insulin secretion rates in diabetic patients. Models for the effect of a drug are a broader area, since drugs may affect...... for non-parametric estimation of Markov processes are proposed to give a detailed description of the sleep process during the night. Statistically, the Markov models considered for sleep states are closely related to the PK models based on SDEs, as both models share the Markov property. When the models...

  10. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
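The Erlangian queueing analysis the abstract describes can be approximated, at its simplest, by the Erlang C formula for an M/M/c queue. The arrival and service rates and server count below are hypothetical, not the paper's parameters:

```python
# Erlang C sketch for an M/M/c queue: probability a request waits, and
# its mean queueing delay. Rates are made up for illustration.
import math

def erlang_c(c, lam, mu):
    """Return (P(wait), mean queueing delay) for M/M/c; requires lam < c*mu."""
    a = lam / mu                              # offered load in Erlangs
    rho = a / c                               # per-server utilization
    s = sum(a**k / math.factorial(k) for k in range(c))
    top = a**c / (math.factorial(c) * (1 - rho))
    p_wait = top / (s + top)                  # Erlang C formula
    w_q = p_wait / (c * mu - lam)             # mean time spent queueing
    return p_wait, w_q

# 30 requests/s arriving at 4 worker threads, each serving 10 requests/s
p_wait, w_q = erlang_c(c=4, lam=30.0, mu=10.0)
print(f"P(wait) = {p_wait:.3f}, mean queueing delay = {w_q * 1000:.1f} ms")
```

Sweeping `c` (the service-desk resources) against throughput and delay targets mirrors the resource-allocation search the paper performs, here in its most stripped-down single-layer form.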

  12. Insertion Modeling and Its Applications

    OpenAIRE

    Alexander Letichevsky; Oleksandr Letychevskyi; Vladimir Peschanenko

    2016-01-01

The paper relates to the theoretical and practical aspects of insertion modeling. Insertion modeling is a theory of agent and environment interaction, where an environment is considered as an agent with a special insertion function. The main notions of insertion modeling are presented. The Insertion Modeling System is described as a tool for development of different kinds of insertion machines. Research and industrial applications of the Insertion Modeling System are presented.

  13. Creep analysis of silicone for podiatry applications.

    Science.gov (United States)

    Janeiro-Arocas, Julia; Tarrío-Saavedra, Javier; López-Beceiro, Jorge; Naya, Salvador; López-Canosa, Adrián; Heredia-García, Nicolás; Artiaga, Ramón

    2016-10-01

This work shows an effective methodology to characterize the creep-recovery behavior of silicones before their application in podiatry. The aim is to characterize, model and compare the creep-recovery properties of different types of silicone used in podiatry orthotics. Creep-recovery phenomena of silicones used in podiatry orthotics are characterized by dynamic mechanical analysis (DMA). Silicones provided by Herbitas are compared by observing their viscoelastic properties through Functional Data Analysis (FDA) and nonlinear regression. The relationship between strain and time is modeled by fixed and mixed effects nonlinear regression to compare podiatry silicones easily and intuitively. Functional ANOVA and the Kohlrausch-Williams-Watts (KWW) model with fixed and mixed effects allow us to compare different silicones by observing the values of the fitting parameters and their physical meaning. The differences between silicones are related to variations in the breadth of the creep-recovery time distribution and in instantaneous deformation-permanent strain. Nevertheless, the mean creep-relaxation time is the same for all the studied silicones. Silicones used in palliative orthoses have higher instantaneous deformation-permanent strain and a narrower creep-recovery distribution. The proposed methodology based on DMA, FDA and nonlinear regression is a useful tool to characterize and choose the proper silicone for each podiatry application according to its viscoelastic properties.
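The KWW (stretched-exponential) fit mentioned above can be sketched with nonlinear least squares. The functional form and every parameter value below are illustrative assumptions, not taken from the paper:

```python
# Fit a KWW-style recovery curve eps(t) = eps_p + (eps0 - eps_p)*exp(-(t/tau)**beta)
# to synthetic strain data. All values are made up for illustration.
import numpy as np
from scipy.optimize import curve_fit

def kww(t, eps0, eps_p, tau, beta):
    """Stretched-exponential decay from eps0 toward permanent strain eps_p."""
    return eps_p + (eps0 - eps_p) * np.exp(-(t / tau) ** beta)

t = np.linspace(0.1, 100, 60)                 # time, s
truth = kww(t, 5.0, 0.8, 12.0, 0.55)          # "true" strain curve, %
rng = np.random.default_rng(1)
data = truth + rng.normal(0, 0.02, t.size)    # add small measurement noise

popt, _ = curve_fit(kww, t, data, p0=[4.0, 1.0, 10.0, 0.5])
print("eps0, eps_p, tau, beta =", popt)
```

The fitted `beta` (breadth of the relaxation-time distribution) and the permanent-strain term are exactly the kinds of parameters the abstract says distinguish the silicones.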

  14. Dimensions for hearing-impaired mobile application usability model

    Science.gov (United States)

    Nathan, Shelena Soosay; Hussain, Azham; Hashim, Nor Laily; Omar, Mohd Adan

    2017-10-01

This paper discusses the dimensions that have been derived for the hearing-impaired mobile application usability model. General usability models provide generic dimensions for evaluating mobile applications; however, the requirements of the hearing-impaired are overlooked and often scanted. This has led to mobile applications developed for the hearing-impaired being left unused. It is also apparent that these usability models do not consider accessibility dimensions according to the requirements of special users. This complicates the work of usability practitioners, as well as of academicians who conduct usability research, when applications are developed for specific user needs. To overcome this issue, the dimensions chosen for the hearing-impaired were checked for alignment with the real needs of hearing-impaired mobile applications. Besides literature studies, requirements for hearing-impaired mobile applications were identified through interviews conducted with hearing-impaired mobile application users, recorded as video outputs and analyzed using NVivo. Finally, a total of 6 of the 15 dimensions gathered were chosen for the proposed model and presented.

  15. A Classification of PLC Models and Applications

    NARCIS (Netherlands)

    Mader, Angelika H.; Boel, R.; Stremersch, G.

In recent years there has been increasing interest in analysing PLC applications with formal methods. The first step to this end is to obtain formal models of PLC applications. Meanwhile, various models for PLCs have already been introduced in the literature. In our paper we discuss several

  16. Generating Performance Models for Irregular Applications

    Energy Technology Data Exchange (ETDEWEB)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav; Kerbyson, Darren J.; Hoisie, Adolfy

    2017-05-30

Many applications have irregular behavior (non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches) that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; PageRank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.

  17. Vacation queueing models theory and applications

    CERN Document Server

    Tian, Naishuo

    2006-01-01

    A classical queueing model consists of three parts - arrival process, service process, and queue discipline. However, a vacation queueing model has an additional part - the vacation process, which is governed by a vacation policy - that can be characterized by three aspects: 1) vacation start-up rule; 2) vacation termination rule, and 3) vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. Allowing servers to take vacations makes the queueing models more realistic and flexible in studying real-world waiting line systems. Integrated in the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...

  18. Validation of nuclear models used in space radiation shielding applications

    International Nuclear Information System (INIS)

    Norman, Ryan B.; Blattnig, Steve R.

    2013-01-01

    A program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as these models are developed over time. In this work, simple validation metrics applicable to testing both model accuracy and consistency with experimental data are developed. The developed metrics treat experimental measurement uncertainty as an interval and are therefore applicable to cases in which epistemic uncertainty dominates the experimental data. To demonstrate the applicability of the metrics, nuclear physics models used by NASA for space radiation shielding applications are compared to an experimental database consisting of over 3600 experimental cross sections. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by examining subsets of the model parameter space.

  19. Crop modeling applications in agricultural water management

    Science.gov (United States)

    Kisekka, Isaya; DeJonge, Kendall C.; Ma, Liwang; Paz, Joel; Douglas-Mankin, Kyle R.

    2017-01-01

    This article introduces the fourteen articles that comprise the “Crop Modeling and Decision Support for Optimizing Use of Limited Water” collection. This collection was developed from a special session on crop modeling applications in agricultural water management held at the 2016 ASABE Annual International Meeting (AIM) in Orlando, Florida. In addition, other authors who were not able to attend the 2016 ASABE AIM were also invited to submit papers. The articles summarized in this introductory article demonstrate a wide array of applications in which crop models can be used to optimize agricultural water management. The following section titles indicate the topics covered in this collection: (1) evapotranspiration modeling (one article), (2) model development and parameterization (two articles), (3) application of crop models for irrigation scheduling (five articles), (4) coordinated water and nutrient management (one article), (5) soil water management (two articles), (6) risk assessment of water-limited irrigation management (one article), and (7) regional assessments of climate impact (two articles). Changing weather and climate, increasing population, and groundwater depletion will continue to stimulate innovations in agricultural water management, and crop models will play an important role in helping to optimize water use in agriculture.

  20. DSC, FT-IR, NIR, NIR-PCA and NIR-ANOVA for determination of chemical stability of diuretic drugs: impact of excipients

    Directory of Open Access Journals (Sweden)

    Gumieniczek Anna

    2018-03-01

    Full Text Available It is well known that drugs can directly react with excipients. In addition, excipients can be a source of impurities that either directly react with drugs or catalyze their degradation. Thus, binary mixtures of three diuretics, torasemide, furosemide and amiloride, with different excipients, i.e. citric acid anhydrous, povidone K25 (PVP), magnesium stearate (Mg stearate), lactose, D-mannitol, glycine, calcium hydrogen phosphate anhydrous (CaHPO4) and starch, were examined to detect interactions. High temperature and humidity or UV/VIS irradiation were applied as stressing conditions. Differential scanning calorimetry (DSC), FT-IR and NIR were used to adequately collect information. In addition, chemometric assessments of NIR signals with principal component analysis (PCA) and ANOVA were applied.
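
    The PCA step in such chemometric workflows reduces each NIR spectrum to a few scores that expose batch-level differences. As a minimal illustration (not the authors' procedure; the spectra below are invented), PCA can be computed directly from the SVD of the mean-centered data matrix:

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD: returns sample scores and principal-component loadings."""
    Xc = X - X.mean(axis=0)            # mean-center each wavelength channel
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]     # loadings, one row per component
    scores = Xc @ components.T         # sample coordinates in PC space
    return scores, components

# Hypothetical NIR-like spectra: 6 samples x 4 wavelength channels,
# where the last three samples carry a systematic baseline shift
# (e.g. a stressed batch). Values are invented for illustration.
spectra = np.array([
    [0.10, 0.20, 0.30, 0.40],
    [0.11, 0.21, 0.31, 0.41],
    [0.09, 0.19, 0.29, 0.39],
    [0.30, 0.40, 0.50, 0.60],
    [0.31, 0.41, 0.51, 0.61],
    [0.29, 0.39, 0.49, 0.59],
])
scores, comps = pca(spectra, n_components=2)
# The two groups of samples separate along the first principal component.
```

    In a real study the score plot (and an ANOVA on the scores) would then be inspected for systematic excipient- or stress-related clustering.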

  1. Mathematical Ship Modeling for Control Applications

    DEFF Research Database (Denmark)

    Perez, Tristan; Blanke, Mogens

    2002-01-01

    In this report, we review the models for describing the motion of a ship in four degrees of freedom suitable for control applications. We present the hydrodynamic models of two ships: a container and a multi-role naval vessel. The models are based on experimental results in the four degrees...

  2. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  3. Computer-aided modelling template: Concept and application

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2015-01-01

    Modelling is an important enabling technology in modern chemical engineering applications. A template-based approach is presented in this work to facilitate the construction and documentation of the models and enable their maintenance for reuse in a wider application range. Based on a model decomposition technique which identifies generic steps and workflow involved, the computer-aided template concept has been developed. This concept is implemented as a software tool, which provides a user-friendly interface for following the workflow steps and guidance through the steps providing additional...

  4. Modeling Answer Change Behavior: An Application of a Generalized Item Response Tree Model

    Science.gov (United States)

    Jeon, Minjeong; De Boeck, Paul; van der Linden, Wim

    2017-01-01

    We present a novel application of a generalized item response tree model to investigate test takers' answer change behavior. The model allows us to simultaneously model the observed patterns of the initial and final responses after an answer change as a function of a set of latent traits and item parameters. The proposed application is illustrated…

  5. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  6. lmerTest Package: Tests in Linear Mixed Effects Models

    DEFF Research Database (Denmark)

    Kuznetsova, Alexandra; Brockhoff, Per B.; Christensen, Rune Haubo Bojesen

    2017-01-01

    One of the frequent questions by users of the mixed model function lmer of the lme4 package has been: How can I get p values for the F and t tests for objects returned by lmer? The lmerTest package extends the 'lmerMod' class of the lme4 package, by overloading the anova and summary functions...... by providing p values for tests for fixed effects. We have implemented Satterthwaite's method for approximating degrees of freedom for the t and F tests. We have also implemented the construction of Type I - III ANOVA tables. Furthermore, one may also obtain the summary as well as the anova table using...

  7. Mobile Application Identification based on Hidden Markov Model

    Directory of Open Access Journals (Sweden)

    Yang Xinyan

    2018-01-01

    Full Text Available With the increasing number of mobile applications, there are more challenging network management tasks to resolve. Users also face security issues with mobile Internet applications when enjoying mobile network resources. Identifying the applications that correspond to network traffic can help network operators perform network management effectively. Existing mobile application recognition technology faces two problems: it cannot recognize applications that use encryption protocols, and its scalability is poor. In this paper, a mobile application identification method based on the Hidden Markov Model (HMM) is proposed. It extracts defined statistical characteristics from the different network flows generated when each application starts, uses the timing information of the different network flows to build the corresponding time series, and then establishes a separate HMM for each application to be identified. We test the proposed method on 10 common applications. The test results show that it achieves high accuracy and good generalization ability.
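
    The core of such an HMM-based identifier is likelihood scoring: each candidate application's trained model assigns a likelihood to an observed flow sequence, and the flow is attributed to the highest-scoring model. The sketch below illustrates this with the scaled forward algorithm over quantized packet-size symbols; all model parameters are invented for illustration, whereas the paper's models would be trained on real flow statistics:

```python
import numpy as np

def hmm_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm.
    pi: initial state probs (n,), A: transitions (n, n), B: emissions (n, m)."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()                 # scaling factor avoids numeric underflow
    log_p = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        log_p += np.log(c)
        alpha = alpha / c
    return log_p

def identify_app(obs, models):
    """Pick the application whose HMM best explains the flow sequence."""
    return max(models, key=lambda name: hmm_log_likelihood(obs, *models[name]))

# Toy two-state models over packet-size symbols {0: small, 1: large}.
# These numbers are made up; in practice each model is trained (e.g. with
# Baum-Welch) on flows captured when the application starts.
models = {
    "chat":  (np.array([0.9, 0.1]),
              np.array([[0.8, 0.2], [0.3, 0.7]]),
              np.array([[0.9, 0.1], [0.6, 0.4]])),   # mostly small packets
    "video": (np.array([0.2, 0.8]),
              np.array([[0.5, 0.5], [0.1, 0.9]]),
              np.array([[0.3, 0.7], [0.1, 0.9]])),   # mostly large packets
}

print(identify_app([0, 0, 1, 0, 0], models))   # small-packet flow -> chat
print(identify_app([1, 1, 1, 0, 1], models))   # large-packet flow -> video
```

    New applications extend the classifier simply by adding a trained model to the dictionary, which is the extensibility argument made in the abstract.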

  8. Business model framework applications in health care: A systematic review.

    Science.gov (United States)

    Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl

    2017-11-01

    It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.

  9. Handbook of mixed membership models and their applications

    CERN Document Server

    Airoldi, Edoardo M; Erosheva, Elena A; Fienberg, Stephen E

    2014-01-01

    In response to scientific needs for more diverse and structured explanations of statistical data, researchers have discovered how to model individual data points as belonging to multiple groups. Handbook of Mixed Membership Models and Their Applications shows you how to use these flexible modeling tools to uncover hidden patterns in modern high-dimensional multivariate data. It explores the use of the models in various application settings, including survey data, population genetics, text analysis, image processing and annotation, and molecular biology.Through examples using real data sets, yo

  10. The MVP Model: Overview and Application

    Science.gov (United States)

    Keller, John M.

    2017-01-01

    This chapter contains an overview of the MVP model that is used as a basis for the other chapters in this issue. It also contains a description of key steps in the ARCS-V design process that is derived from the MVP model and a summary of a design-based research study illustrating the application of the ARCS-V model.

  11. Building adaptable and reusable XML applications with model transformations

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas

    2005-01-01

    We present an approach in which the semantics of an XML language is defined by means of a transformation from an XML document model (an XML schema) to an application specific model. The application specific model implements the intended behavior of documents written in the language. A transformation

  12. Optimizing a gap conductance model applicable to VVER-1000 thermal–hydraulic model

    International Nuclear Information System (INIS)

    Rahgoshay, M.; Hashemi-Tilehnoee, M.

    2012-01-01

    Highlights: ► Two known conductance models for application in a VVER-1000 thermal–hydraulic code are examined. ► An optimized gap conductance model is developed which can predict the gap conductance in good agreement with FSAR data. ► The licensed thermal–hydraulic code is coupled with the gap conductance model predictor externally. -- Abstract: The modeling of gap conductance for application in VVER-1000 thermal–hydraulic codes is addressed. Two known models, namely the CALZA-BINI and RELAP5 gap conductance models, are examined. By externally linking the gap conductance models to the COBRA-EN thermal–hydraulic code, the acceptable range of each model is specified. The result of each gap conductance model versus linear heat rate has been compared with FSAR data. A linear heat rate of about 9 kW/m is the boundary for the optimization process. Since each gap conductance model has its advantages and limitations, the optimized gap conductance model can predict the gap conductance better than either of the two other models individually.

  13. Application of Improved Radiation Modeling to General Circulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Michael J Iacono

    2011-04-07

    This research has accomplished its primary objectives of developing accurate and efficient radiation codes, validating them with measurements and higher resolution models, and providing these advancements to the global modeling community to enhance the treatment of cloud and radiative processes in weather and climate prediction models. A critical component of this research has been the development of the longwave and shortwave broadband radiative transfer code for general circulation model (GCM) applications, RRTMG, which is based on the single-column reference code, RRTM, also developed at AER. RRTMG is a rigorously tested radiation model that retains a considerable level of accuracy relative to higher resolution models and measurements despite the performance enhancements that have made it possible to apply this radiation code successfully to global dynamical models. This model includes the radiative effects of all significant atmospheric gases, and it treats the absorption and scattering from liquid and ice clouds and aerosols. RRTMG also includes a statistical technique for representing small-scale cloud variability, such as cloud fraction and the vertical overlap of clouds, which has been shown to improve cloud radiative forcing in global models. This development approach has provided a direct link from observations to the enhanced radiative transfer provided by RRTMG for application to GCMs. Recent comparison of existing climate model radiation codes with high resolution models has documented the improved radiative forcing capability provided by RRTMG, especially at the surface, relative to other GCM radiation models. Due to its high accuracy, its connection to observations, and its computational efficiency, RRTMG has been implemented operationally in many national and international dynamical models to provide validated radiative transfer for improving weather forecasts and enhancing the prediction of global climate change.

  14. Association models for petroleum applications

    DEFF Research Database (Denmark)

    Kontogeorgis, Georgios

    2013-01-01

    Thermodynamics plays an important role in many applications in the petroleum industry, both upstream and downstream, ranging from flow assurance, (enhanced) oil recovery and control of chemicals to meet production and environmental regulations. There are many different applications in the oil & gas industry, thus thermodynamic data (phase behaviour, densities, speed of sound, etc.) are needed to study a very diverse range of compounds in addition to the petroleum ones (CO2, H2S, water, alcohols, glycols, mercaptans, mercury, asphaltenes, waxes, polymers, electrolytes, biofuels, etc.) within a very... Such association models have, especially over the last 20 years, proved to be very successful in predicting many thermodynamic properties in the oil & gas industry. They have not so far replaced cubic equations of state, but the results obtained by using these models are very impressive in many cases, e...

  15. Stochastic biomathematical models with applications to neuronal modeling

    CERN Document Server

    Batzel, Jerry; Ditlevsen, Susanne

    2013-01-01

    Stochastic biomathematical models are becoming increasingly important as new light is shed on the role of noise in living systems. In certain biological systems, stochastic effects may even enhance a signal, thus providing a biological motivation for the noise observed in living systems. Recent advances in stochastic analysis and increasing computing power facilitate the analysis of more biophysically realistic models, and this book provides researchers in computational neuroscience and stochastic systems with an overview of recent developments. Key concepts are developed in chapters written by experts in their respective fields. Topics include: one-dimensional homogeneous diffusions and their boundary behavior, large deviation theory and its application in stochastic neurobiological models, a review of mathematical methods for stochastic neuronal integrate-and-fire models, stochastic partial differential equation models in neurobiology, and stochastic modeling of spreading cortical depression.

  16. Model-Driven Approach for Body Area Network Application Development.

    Science.gov (United States)

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-05-12

    This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain the adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application.

  17. How to Apply the Bivariate Parametric Tests Student's t and ANOVA in SPSS: A Practical Case

    Directory of Open Access Journals (Sweden)

    María-José Rubio-Hurtado

    2012-07-01

    Full Text Available Parametric tests are a type of statistical significance test that quantifies the association or independence between a quantitative variable and a categorical one. Parametric tests impose certain prerequisites for their application: a Normal distribution of the quantitative variable in the groups being compared, homogeneity of variances in the populations from which the groups are drawn, and a sample size n of no fewer than 30. When these conditions are not met, nonparametric statistical tests must be used instead. Parametric tests fall into two classes: the t test (for one sample, or for two related or independent samples) and ANOVA (for more than two independent samples).
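
    Although the article carries out these tests in SPSS, the underlying statistics are simple to state. As a sketch (invented data, pure NumPy rather than SPSS), the one-way ANOVA F statistic and the pooled two-sample t statistic can be computed directly from their definitions:

```python
import numpy as np

def one_way_anova_F(*groups):
    """One-way ANOVA F statistic with its degrees of freedom."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_b, df_w = k - 1, n - k
    F = (ss_between / df_b) / (ss_within / df_w)
    return F, df_b, df_w

def two_sample_t(a, b):
    """Student's t statistic for two independent samples (pooled variance)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(sp2 * (1 / na + 1 / nb))

# Tiny invented samples, far below the n >= 30 the article recommends,
# used only to show the arithmetic:
F, df_b, df_w = one_way_anova_F([1, 2, 3], [2, 3, 4], [3, 4, 5])
# F = 3.0 with (2, 6) degrees of freedom
```

    The p value would then be read from the F distribution with (df_b, df_w) degrees of freedom, exactly as SPSS reports it.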

  18. Modeling Medical Services with Mobile Health Applications

    Directory of Open Access Journals (Sweden)

    Zhenfei Wang

    2018-01-01

    Full Text Available The rapid development of mobile health technology (m-Health) provides unprecedented opportunities for improving health services. As the bridge between doctors and patients, mobile health applications enable patients to communicate with doctors through their smartphones, which is becoming more and more popular. To evaluate the influence of m-Health applications on the medical service market, we propose a medical service equilibrium model. The model can balance the supply of doctors and the demand of patients and reflect possible options for both doctors and patients, with or without m-Health applications, in the medical service market. In the meantime, we analyze the behavior of patients and the activities of doctors to minimize patients' full costs of healthcare and doctors' futility. Then, we provide a resolution algorithm through mathematical reasoning. Lastly, based on an artificially generated dataset, experiments are conducted to evaluate the medical services of m-Health applications.

  19. ANOVA parameters influence in LCF experimental data and simulation results

    Directory of Open Access Journals (Sweden)

    Vercelli A.

    2010-06-01

    Full Text Available The virtual design of components undergoing thermo-mechanical fatigue (TMF) and plastic strains is usually run in many phases. The numerical finite element method gives a useful instrument which becomes increasingly effective as the geometrical and numerical modelling gets more accurate. The constitutive model definition plays an important role in the effectiveness of the numerical simulation [1, 2], as, for example, shown in Figure 1. This picture shows how a good cyclic plasticity constitutive model can simulate a cyclic load experiment. The component life estimation is the subsequent phase, and it needs complex damage and life estimation models [3-5] which take into account several parameters and phenomena contributing to damage and life duration. The calibration of these constitutive and damage models requires an accurate testing activity. In the present paper the main topic of the research activity is to investigate whether the parameters that prove influential in the experimental activity also influence the numerical simulations, thus defining the effectiveness of the models in accounting for all the phenomena actually influencing the life of the component. To this aim, a procedure to tune the parameters needed to estimate the life of mechanical components undergoing TMF and plastic strains is presented for a commercial steel. This procedure aims to be easy and to allow calibrating both the material constitutive model (for the numerical structural simulation) and the damage and life model (for life assessment). The procedure has been applied to specimens. The experimental activity has been developed on three sets of tests run at several temperatures: static tests, high cycle fatigue (HCF) tests, and low cycle fatigue (LCF) tests. The numerical structural FEM simulations have been run on a commercial non-linear solver, ABAQUS® 6.8. The simulations replicated the experimental tests.
The stress, strain, thermal results from the thermo
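
    The abstract does not specify which damage and life models were calibrated; as one common example of a strain-life model used in LCF assessment, the combined Basquin and Coffin-Manson relation can be solved numerically for the cycles to failure. All parameter values below are invented, order-of-magnitude figures for a generic steel:

```python
import math

def coffin_manson_life(strain_amp, E, sigma_f, b, eps_f, c):
    """Solve the Basquin + Coffin-Manson relation for cycles to failure N:
        strain_amp = (sigma_f / E) * (2N)**b + eps_f * (2N)**c
    With b, c < 0 the right-hand side decreases in 2N, so a bisection on
    log(2N) converges to the unique root."""
    f = lambda two_n: (sigma_f / E) * two_n**b + eps_f * two_n**c - strain_amp
    lo, hi = 1.0, 1e12
    for _ in range(200):
        mid = math.sqrt(lo * hi)      # geometric midpoint (bisection in log space)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return lo / 2.0                   # reversals 2N -> cycles N

# Illustrative parameters (not measured values): E in MPa, sigma_f in MPa.
N = coffin_manson_life(strain_amp=0.004, E=200e3, sigma_f=900.0, b=-0.09,
                       eps_f=0.6, c=-0.6)
```

    A calibration procedure like the one in the paper would fit sigma_f, b, eps_f and c to the measured strain-life pairs before using the relation predictively.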

  20. Deformation Models Tracking, Animation and Applications

    CERN Document Server

    Torres, Arnau; Gómez, Javier

    2013-01-01

    The computational modelling of deformations has been actively studied for the last thirty years. This is mainly due to its large range of applications that include computer animation, medical imaging, shape estimation, face deformation as well as other parts of the human body, and object tracking. In addition, these advances have been supported by the evolution of computer processing capabilities, enabling realism in a more sophisticated way. This book encompasses relevant works of expert researchers in the field of deformation models and their applications.  The book is divided into two main parts. The first part presents recent object deformation techniques from the point of view of computer graphics and computer animation. The second part of this book presents six works that study deformations from a computer vision point of view with a common characteristic: deformations are applied in real world applications. The primary audience for this work are researchers from different multidisciplinary fields, s...

  1. Models in Science Education: Applications of Models in Learning and Teaching Science

    Science.gov (United States)

    Ornek, Funda

    2008-01-01

    In this paper, I discuss different types of models in science education and applications of them in learning and teaching science, in particular physics. Based on the literature, I categorize models as conceptual and mental models according to their characteristics. In addition to these models, there is another model called "physics model" by the…

  2. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with applications in several fields of engineering, such as automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, in a multidisciplinary approach. Authors from five countries and 16 different research centers contribute their expertise in both the fundamentals and real-problem applications, based upon their strong background in modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation, where tools such as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  3. A review of thermoelectric cooling: Materials, modeling and applications

    International Nuclear Information System (INIS)

    Zhao, Dongliang; Tan, Gang

    2014-01-01

    This study reviews the recent advances of thermoelectric materials, modeling approaches, and applications. Thermoelectric cooling systems have advantages over conventional cooling devices, including compact in size, light in weight, high reliability, no mechanical moving parts, no working fluid, being powered by direct current, and easily switching between cooling and heating modes. In this study, historical development of thermoelectric cooling has been briefly introduced first. Next, the development of thermoelectric materials has been given and the achievements in past decade have been summarized. To improve thermoelectric cooling system's performance, the modeling techniques have been described for both the thermoelement modeling and thermoelectric cooler (TEC) modeling including standard simplified energy equilibrium model, one-dimensional and three-dimensional models, and numerical compact model. Finally, the thermoelectric cooling applications have been reviewed in aspects of domestic refrigeration, electronic cooling, scientific application, and automobile air conditioning and seat temperature control, with summaries for the commercially available thermoelectric modules and thermoelectric refrigerators. It is expected that this study will be beneficial to thermoelectric cooling system design, simulation, and analysis. - Highlights: •Thermoelectric cooling has great prospects with thermoelectric material's advances. •Modeling techniques for both thermoelement and TEC have been reviewed. •Principle thermoelectric cooling applications have been reviewed and summarized
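
    The "standard simplified energy equilibrium model" mentioned in the review reduces a thermoelectric cooler to three terms: Peltier cooling at the cold junction, Joule heating (half of which returns to the cold side), and conductive heat back-flow across the module. A minimal sketch of that textbook model, with invented module parameters:

```python
def tec_performance(alpha, R, K, I, T_c, T_h):
    """Standard simplified energy-equilibrium model of a thermoelectric cooler.
    alpha: Seebeck coefficient [V/K], R: electrical resistance [ohm],
    K: thermal conductance [W/K], I: current [A], T_c/T_h: junction temps [K]."""
    dT = T_h - T_c
    Q_c = alpha * I * T_c - 0.5 * I**2 * R - K * dT   # cooling power at cold side
    W = alpha * I * dT + I**2 * R                     # electrical power input
    return Q_c, W, Q_c / W                            # COP = Q_c / W

# Made-up module parameters, for illustration only:
Q_c, W, cop = tec_performance(alpha=0.05, R=1.5, K=0.5, I=2.0,
                              T_c=280.0, T_h=300.0)
# Q_c = 15.0 W, W = 8.0 W, COP = 1.875
```

    The one-dimensional, three-dimensional and compact models surveyed in the review refine this balance by resolving spatial temperature profiles and temperature-dependent material properties.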

  4. Behavior Modeling -- Foundations and Applications

    DEFF Research Database (Denmark)

    This book constitutes revised selected papers from the six International Workshops on Behavior Modelling - Foundations and Applications, BM-FA, which took place annually between 2009 and 2014. The 9 papers presented in this volume were carefully reviewed and selected from a total of 58 papers...

  5. Model-Driven Approach for Body Area Network Application Development

    Science.gov (United States)

    Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata

    2016-01-01

    This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain the adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application. PMID:27187394

  6. Model-Driven Approach for Body Area Network Application Development

    Directory of Open Access Journals (Sweden)

    Algimantas Venčkauskas

    2016-05-01

    Full Text Available This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This enables an adequate measure of QoS to be obtained efficiently through interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application.

  7. Initiating Events Modeling for On-Line Risk Monitoring Application

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.

    1998-01-01

    In order to make the on-line risk monitoring application of Probabilistic Risk Assessment more complete and realistic, special attention needs to be dedicated to initiating events modeling. Two different issues are of special importance: one is how to model initiating events frequency according to the current plant configuration (equipment alignment and out-of-service status) and operating conditions (weather and various activities), and the second is how to preserve dependencies between the initiating events model and the rest of the PRA model. First, the paper will discuss how initiating events can be treated in an on-line risk monitoring application. Second, a practical example of initiating events modeling in EPRI's Equipment Out of Service on-line monitoring tool will be presented. Gains from the application and possible improvements will be discussed in the conclusion. (author)

  8. Sparse Multivariate Modeling: Priors and Applications

    DEFF Research Database (Denmark)

    Henao, Ricardo

    This thesis presents a collection of statistical models that attempt to take advantage of every piece of prior knowledge available to provide the models with as much structure as possible. The main motivation for introducing these models is interpretability since in practice we want to be able...... a general yet self-contained description of every model in terms of generative assumptions, interpretability goals, probabilistic formulation and target applications. Case studies, benchmark results and practical details are also provided as appendices published elsewhere, containing reprints of peer...

  9. Technical note: Application of the Box-Cox data transformation to animal science experiments.

    Science.gov (United States)

    Peltier, M R; Wilcox, C J; Sharp, D C

    1998-03-01

    In the use of ANOVA for hypothesis testing in animal science experiments, the assumption of homogeneity of errors often is violated because of scale effects and the nature of the measurements. We demonstrate a method for transforming data so that the assumptions of ANOVA are met (or violated to a lesser degree) and apply it in analysis of data from a physiology experiment. Our study examined whether melatonin implantation would affect progesterone secretion in cycling pony mares. Overall treatment variances were greater in the melatonin-treated group, and several common transformation procedures failed. Application of the Box-Cox transformation algorithm reduced the heterogeneity of error and permitted the assumption of equal variance to be met.
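    The Box-Cox family and the likelihood-based choice of its exponent λ can be sketched in a few lines. This is a toy illustration, not the authors' procedure: the grid of λ values, the function names and the example data below are invented here; only the transform and the profile log-likelihood are standard.

```python
import math

def boxcox(x, lam):
    """Box-Cox transform of a single positive value x for parameter lam."""
    return math.log(x) if lam == 0 else (x ** lam - 1) / lam

def best_lambda(data, grid=None):
    """Pick lambda by maximizing the Box-Cox profile log-likelihood over a coarse grid."""
    if grid is None:
        grid = [i / 10 for i in range(-20, 21)]  # lambda from -2.0 to 2.0 in steps of 0.1
    n = len(data)
    log_sum = sum(math.log(x) for x in data)  # Jacobian term of the transform

    def loglik(lam):
        y = [boxcox(x, lam) for x in data]
        mean = sum(y) / n
        var = sum((v - mean) ** 2 for v in y) / n
        return -n / 2 * math.log(var) + (lam - 1) * log_sum

    return max(grid, key=loglik)
```

    For multiplicative data spanning orders of magnitude (such as hormone concentrations), the selected λ tends to fall near 0, i.e., a log transform, after which the usual ANOVA homogeneity-of-variance assumption is easier to satisfy.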

  10. Structural Equation Modeling with Mplus Basic Concepts, Applications, and Programming

    CERN Document Server

    Byrne, Barbara M

    2011-01-01

    Modeled after Barbara Byrne's other best-selling structural equation modeling (SEM) books, this practical guide reviews the basic concepts and applications of SEM using Mplus Versions 5 & 6. The author reviews SEM applications based on actual data taken from her own research. Using non-mathematical language, it is written for the novice SEM user. With each application chapter, the author "walks" the reader through all steps involved in testing the SEM model including: an explanation of the issues addressed illustrated and annotated testing of the hypothesized and post hoc models expl

  11. Modelling for Bio-,Agro- and Pharma-Applications

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Singh, Ravendra; Cameron, Ian

    2011-01-01

    This chapter considers a range of modelling applications drawn from biological, agrochemical and pharma fields. Microcapsule controlled release of an active ingredient is considered through a time dependent model. Burst-time and lag-time effects are considered and the model adopts a multiscale...... of a milling process within pharmaceutical production as well as a dynamic model representing a fluidised granulation bed for pharma products. The final model considers the tablet pressing process....

  12. An investigation of modelling and design for software service applications.

    Science.gov (United States)

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  13. Model selection criteria : how to evaluate order restrictions

    NARCIS (Netherlands)

    Kuiper, R.M.

    2012-01-01

    Researchers often have ideas about the ordering of model parameters. They frequently have one or more theories about the ordering of the group means, in analysis of variance (ANOVA) models, or about the ordering of coefficients corresponding to the predictors, in regression models. A researcher might

  14. Application of postured human model for SAR measurements

    Science.gov (United States)

    Vuchkovikj, M.; Munteanu, I.; Weiland, T.

    2013-07-01

    In the last two decades, the increasing number of electronic devices used in day-to-day life has led to a growing interest in the study of electromagnetic field interaction with biological tissues. The design of medical devices and of wireless communication devices such as mobile phones benefits greatly from bio-electromagnetic simulations in which digital human models are used. The digital human models currently available have an upright position, which limits research activities in realistic scenarios, where postured human bodies must be considered. For this reason, a software application called "BodyFlex for CST STUDIO SUITE" was developed. In its current version, this application can deform the voxel-based human model named HUGO (Dipp GmbH, 2010) to generate common postures that people adopt in normal life, ensuring the continuity of tissues and conserving mass to an acceptable level. This paper describes an enhancement of the "BodyFlex" application related to the movements of the forearm and the wrist of a digital human model. One electromagnetic application in which the forearm and wrist movement of a voxel-based human model is significant is the measurement of the specific absorption rate (SAR) when a model is exposed to a radio frequency electromagnetic field produced by a mobile phone. Current SAR measurements of the exposure from mobile phones are performed with the SAM (Specific Anthropomorphic Mannequin) phantom, which is filled with a dispersive but homogeneous material. We are interested in what happens to the SAR values if a realistic inhomogeneous human model is used. To this aim, two human models, a homogeneous and an inhomogeneous one, are used in two simulation scenarios, in order to examine and observe the differences in the results for the SAR values.

  15. Application Note: Power Grid Modeling With Xyce.

    Energy Technology Data Exchange (ETDEWEB)

    Sholander, Peter E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    This application note describes how to model steady-state power flows and transient events in electric power grids with the SPICE-compatible Xyce™ Parallel Electronic Simulator developed at Sandia National Labs. This application note provides a brief tutorial on the basic devices (branches, bus shunts, transformers and generators) found in power grids. The focus is on the features supported and assumptions made by the Xyce models for power grid elements. It then provides a detailed explanation, including working Xyce netlists, for simulating some simple power grid examples such as the IEEE 14-bus test case.

  16. Kinetic models and parameters estimation study of biomass and ...

    African Journals Online (AJOL)

    compaq

    2017-01-11

    Jan 11, 2017 ... Unstructured models were proposed using the logistic equation for growth, the ... analysis of variance (ANOVA) was also used to validate the proposed models. ... production but their choice depends on the cost and the.

  17. Artificial Immune Networks: Models and Applications

    Directory of Open Access Journals (Sweden)

    Xian Shen

    2008-06-01

    Full Text Available Artificial Immune Systems (AIS), which are inspired by the natural immune system, have been applied to solving complex computational problems in classification, pattern recognition, and optimization. In this paper, the theory of the natural immune system is first briefly introduced. Next, we compare some well-known AIS and their applications. Several representative artificial immune network models are also discussed. Moreover, we demonstrate the applications of artificial immune networks in various engineering fields.

  18. A model for assessment of telemedicine applications

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Ekeland, Anne Granstrøm; Jensen, Lise Kvistgaard

    2012-01-01

    the European Commission initiated the development of a framework for assessing telemedicine applications, based on the users' need for information for decision making. This article presents the Model for ASsessment of Telemedicine applications (MAST) developed in this study.......Telemedicine applications could potentially solve many of the challenges faced by the healthcare sectors in Europe. However, a framework for assessment of these technologies is needed by decision makers to assist them in choosing the most efficient and cost-effective technologies. Therefore in 2009...

  19. Beginning SQL Server Modeling Model-driven Application Development in SQL Server

    CERN Document Server

    Weller, Bart

    2010-01-01

    Get ready for model-driven application development with SQL Server Modeling! This book covers Microsoft's SQL Server Modeling (formerly known under the code name "Oslo") in detail and contains the information you need to be successful with designing and implementing workflow modeling. Beginning SQL Server Modeling will help you gain a comprehensive understanding of how to apply DSLs and other modeling components in the development of SQL Server implementations. Most importantly, after reading the book and working through the examples, you will have considerable experience using SQL M

  20. Some Issues of Biological Shape Modelling with Applications

    DEFF Research Database (Denmark)

    Larsen, Rasmus; Hilger, Klaus Baggesen; Skoglund, Karl

    2003-01-01

    This paper illustrates current research at Informatics and Mathematical Modelling at the Technical University of Denmark within biological shape modelling. We illustrate a series of generalizations to, modifications to, and applications of the elements of constructing models of shape or appearance...

  1. Development and application of air quality models at the US ...

    Science.gov (United States)

    Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computation Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation will provide a simple overview of air quality model development and application geared toward a non-technical student audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  2. Handbook of EOQ inventory problems stochastic and deterministic models and applications

    CERN Document Server

    Choi, Tsan-Ming

    2013-01-01

    This book explores deterministic and stochastic EOQ-model based problems and applications, presenting technical analyses of single-echelon EOQ model based inventory problems, and applications of the EOQ model for multi-echelon supply chain inventory analysis.

  3. An overview of topic modeling and its current applications in bioinformatics.

    Science.gov (United States)

    Liu, Lin; Tang, Lin; Dong, Wen; Yao, Shaowen; Zhou, Wei

    2016-01-01

    With the rapid accumulation of biological datasets, machine learning methods designed to automate data analysis are urgently needed. In recent years, so-called topic models that originated from the field of natural language processing have been receiving much attention in bioinformatics because of their interpretability. Our aim was to review the application and development of topic models for bioinformatics. This paper starts with the description of a topic model, with a focus on the understanding of topic modeling. A general outline is provided on how to build an application in a topic model and how to develop a topic model. Meanwhile, the literature on application of topic models to biological data was searched and analyzed in depth. According to the types of models and the analogy between the concept of document-topic-word and a biological object (as well as the tasks of a topic model), we categorized the related studies and provided an outlook on the use of topic models for the development of bioinformatics applications. Topic modeling is a useful method (in contrast to the traditional means of data reduction in bioinformatics) and enhances researchers' ability to interpret biological information. Nevertheless, due to the lack of topic models optimized for specific biological data, the studies on topic modeling in biological data still have a long and challenging road ahead. We believe that topic models are a promising method for various applications in bioinformatics research.
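    To make the document-topic-word analogy concrete, here is a toy collapsed Gibbs sampler for latent Dirichlet allocation in plain Python. It is a sketch under simplifying assumptions (symmetric priors, tiny corpus, fixed iteration count, no convergence checks); the function name and hyperparameters are invented here, and real bioinformatics analyses would use an optimized library.

```python
import random

def lda_gibbs(docs, n_topics, n_iter=50, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for LDA on a list of tokenized documents."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    V = len(vocab)
    wid = {w: i for i, w in enumerate(vocab)}
    # Count tables: ndk[d][k] = tokens of doc d in topic k; nkw[k][v] = word v in topic k.
    ndk = [[0] * n_topics for _ in docs]
    nkw = [[0] * V for _ in range(n_topics)]
    nk = [0] * n_topics
    z = []  # z[d][i] = current topic of token i of doc d
    for d, doc in enumerate(docs):            # random initialization
        zs = []
        for w in doc:
            k = rng.randrange(n_topics)
            zs.append(k)
            ndk[d][k] += 1; nkw[k][wid[w]] += 1; nk[k] += 1
        z.append(zs)
    for _ in range(n_iter):                   # Gibbs sweeps
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                   # remove token from the counts
                ndk[d][k] -= 1; nkw[k][wid[w]] -= 1; nk[k] -= 1
                # Resample topic in proportion to doc-topic and topic-word affinity.
                weights = [(ndk[d][t] + alpha) * (nkw[t][wid[w]] + beta) / (nk[t] + V * beta)
                           for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = k                   # put token back under the new topic
                ndk[d][k] += 1; nkw[k][wid[w]] += 1; nk[k] += 1
    return vocab, ndk, nkw
```

    Each sweep reassigns every token's topic in proportion to how strongly its word and its document already favour that topic; the count tables ndk (document-topic) and nkw (topic-word) summarize the fitted model, mirroring the document-topic-word decomposition described above.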

  4. Applications of system dynamics modelling to support health policy.

    Science.gov (United States)

    Atkinson, Jo-An M; Wells, Robert; Page, Andrew; Dominello, Amanda; Haines, Mary; Wilson, Andrew

    2015-07-09

    The value of systems science modelling methods in the health sector is increasingly being recognised. Of particular promise is the potential of these methods to improve operational aspects of healthcare capacity and delivery, analyse policy options for health system reform and guide investments to address complex public health problems. Because it lends itself to a participatory approach, system dynamics modelling has been a particularly appealing method that aims to align stakeholder understanding of the underlying causes of a problem and achieve consensus for action. The aim of this review is to determine the effectiveness of system dynamics modelling for health policy, and explore the range and nature of its application. A systematic search was conducted to identify articles published up to April 2015 from the PubMed, Web of Knowledge, Embase, ScienceDirect and Google Scholar databases. The grey literature was also searched. Papers eligible for inclusion were those that described applications of system dynamics modelling to support health policy at any level of government. Six papers were identified, comprising eight case studies of the application of system dynamics modelling to support health policy. No analytic studies were found that examined the effectiveness of this type of modelling. Only three examples engaged multidisciplinary stakeholders in collective model building. Stakeholder participation in model building reportedly facilitated development of a common 'mental map' of the health problem, resulting in consensus about optimal policy strategy and garnering support for collaborative action. The paucity of relevant papers indicates that, although the volume of descriptive literature advocating the value of system dynamics modelling is considerable, its practical application to inform health policy making is yet to be routinely applied and rigorously evaluated. Advances in software are allowing the participatory model building approach to be extended to

  5. The influence of topical application of grapeseed extract gel on enamel surface hardness after demineralization

    Science.gov (United States)

    Saragih, D. A.; Herda, E.; Triaminingsih, S.

    2017-08-01

    The aim of this study was to analyze the influence of topical application of 6.5% and 12.5% grapeseed extract gels, for application durations of 16 and 32 minutes, on the enamel surface hardness following tooth demineralization by an energy drink. The samples were 21 bovine teeth that underwent demineralization by immersion in the energy drink for 5 minutes in an incubator at 37°C. The demineralized specimens were randomly divided into a control group and 2 treatment groups. The control group was immersed in artificial saliva for 6 hours at 37°C, whereas the treatment groups were treated with topical 6.5% and 12.5% grapeseed extract gels for durations of 16 and 32 minutes and then immersed in artificial saliva for 6 hours at 37°C. The hardness was measured with a Knoop hardness tester. Statistical analysis by repeated-measures ANOVA and one-way ANOVA revealed a significant increase in the enamel hardness value (p < 0.05).
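    For reference, the one-way ANOVA F statistic underlying group comparisons like the one above can be computed directly from the between-group and within-group mean squares. The sketch below uses made-up groups, not the study's data.

```python
def one_way_anova_f(groups):
    """F statistic for a one-way (completely randomized) ANOVA over a list of groups."""
    k = len(groups)                               # number of treatment levels
    n = sum(len(g) for g in groups)               # total number of observations
    grand = sum(sum(g) for g in groups) / n       # grand mean
    # Between-group sum of squares: group sizes times squared mean deviations.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group's own mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)             # between-group mean square
    ms_within = ss_within / (n - k)               # within-group (error) mean square
    return ms_between / ms_within
```

    The resulting F is compared against the F distribution with (k − 1, n − k) degrees of freedom to obtain the p-value reported in analyses such as this one.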

  6. Medical applications of model-based dynamic thermography

    Science.gov (United States)

    Nowakowski, Antoni; Kaczmarek, Mariusz; Ruminski, Jacek; Hryciuk, Marcin; Renkielska, Alicja; Grudzinski, Jacek; Siebert, Janusz; Jagielak, Dariusz; Rogowski, Jan; Roszak, Krzysztof; Stojek, Wojciech

    2001-03-01

    The proposal to use active thermography in medical diagnostics is promising in some applications concerning investigation of directly accessible parts of the human body. The combination of dynamic thermograms with thermal models of the investigated structures offers the attractive possibility of reconstructing internal structure based on the different thermal properties of biological tissues. Measurements of temperature distribution synchronized with external light excitation allow registration of dynamic changes of local temperature dependent on heat exchange conditions. Preliminary results of active thermography applications in medicine are discussed. For skin and under-skin tissues an equivalent thermal model may be determined. For the assumed model, its effective parameters may be reconstructed based on the results of transient thermal processes. For known thermal diffusivity and conductivity of specific tissues, the local thickness of a two- or three-layer structure may be calculated. Results of some medical cases as well as reference data from an in vivo study on animals are presented. The method was also applied to evaluate the state of the human heart during open-chest cardio-surgical interventions. Reference studies of induced heart infarction in pigs are also reported. We see the proposed technique, new to medical applications, as a promising diagnostic tool. It is a fully non-invasive, clean, handy, fast and affordable method, giving not only a qualitative view of the investigated surfaces but also objective quantitative measurement results, accurate enough for many applications including fast screening of affected tissues.

  7. An investigation of modelling and design for software service applications

    Science.gov (United States)

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905

  8. Application of Bayesian Model Selection for Metal Yield Models using ALEGRA and Dakota.

    Energy Technology Data Exchange (ETDEWEB)

    Portone, Teresa; Niederhaus, John Henry; Sanchez, Jason James; Swiler, Laura Painton

    2018-02-01

    This report introduces the concepts of Bayesian model selection, which provides a systematic means of calibrating and selecting an optimal model to represent a phenomenon. This has many potential applications, including for comparing constitutive models. The ideas described herein are applied to a model selection problem between different yield models for hardened steel under extreme loading conditions.

  9. Modelling and applications in mathematics education the 14th ICMI study

    CERN Document Server

    Galbraith, Peter L; Niss, Mogens

    2007-01-01

    The book aims at showing the state-of-the-art in the field of modeling and applications in mathematics education. This is the first volume to do this. The book deals with the question of how key competencies of applications and modeling at the heart of mathematical literacy may be developed; with the roles that applications and modeling may play in mathematics teaching, making mathematics more relevant for students.

  10. A primer on linear models

    CERN Document Server

    Monahan, John F

    2008-01-01

    Preface Examples of the General Linear Model Introduction One-Sample Problem Simple Linear Regression Multiple Regression One-Way ANOVA First Discussion The Two-Way Nested Model Two-Way Crossed Model Analysis of Covariance Autoregression Discussion The Linear Least Squares Problem The Normal Equations The Geometry of Least Squares Reparameterization Gram-Schmidt Orthonormalization Estimability and Least Squares Estimators Assumptions for the Linear Mean Model Confounding, Identifiability, and Estimability Estimability and Least Squares Estimators F

  11. Selected Tether Applications Cost Model

    Science.gov (United States)

    Keeley, Michael G.

    1988-01-01

    Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.

  12. Modelling of Tape Casting for Ceramic Applications

    DEFF Research Database (Denmark)

    Jabbari, Masoud

    was increased by improving the steady state model with a quasi-steady state analytical model. In order to control the most important process parameter, tape thickness, the two-doctor blade configuration was also modeled analytically. The model was developed to control the tape thickness based on the machine...... for magnetic refrigeration applications. Numerical models were developed to track the migration of the particles inside the ceramic slurry. The results showed the presence of some areas inside the ceramic in which the concentration of the particles is higher compared to other parts, creating the resulting...

  13. Applications of computer modeling to fusion research

    International Nuclear Information System (INIS)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: Development and application of gyrokinetic particle codes to tokamak transport, development of techniques to take advantage of parallel computers; model dynamo and bootstrap current drive; and in general maintain our broad-based program in basic plasma physics and computer modeling

  14. Modelling Safe Interface Interactions in Web Applications

    Science.gov (United States)

    Brambilla, Marco; Cabot, Jordi; Grossniklaus, Michael

    Current Web applications embed sophisticated user interfaces and business logic. The original interaction paradigm of the Web based on static content pages that are browsed by hyperlinks is, therefore, not valid anymore. In this paper, we advocate a paradigm shift for browsers and Web applications, that improves the management of user interaction and browsing history. Pages are replaced by States as basic navigation nodes, and Back/Forward navigation along the browsing history is replaced by a full-fledged interactive application paradigm, supporting transactions at the interface level and featuring Undo/Redo capabilities. This new paradigm offers a safer and more precise interaction model, protecting the user from unexpected behaviours of the applications and the browser.

  15. GSTARS computer models and their applications, Part II: Applications

    Science.gov (United States)

    Simoes, F.J.M.; Yang, C.T.

    2008-01-01

    In part 1 of this two-paper series, a brief summary of the basic concepts and theories used in developing the Generalized Stream Tube model for Alluvial River Simulation (GSTARS) computer models was presented. Part 2 provides examples that illustrate some of the capabilities of the GSTARS models and how they can be applied to solve a wide range of river and reservoir sedimentation problems. Laboratory and field case studies are used and the examples show representative applications of the earlier and of the more recent versions of GSTARS. Some of the more recent capabilities implemented in GSTARS3, one of the latest versions of the series, are also discussed here with more detail. ?? 2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.

  16. AUTOMOTIVE APPLICATIONS OF EVOLVING TAKAGI-SUGENO-KANG FUZZY MODELS

    Directory of Open Access Journals (Sweden)

    Radu-Emil Precup

    2017-08-01

    Full Text Available This paper presents theoretical and application results concerning the development of evolving Takagi-Sugeno-Kang fuzzy models for two dynamic systems, which will be viewed as controlled processes, in the field of automotive applications. The two dynamic systems are the nonlinear dynamics of the longitudinal slip in Anti-lock Braking Systems (ABS) and of the vehicle speed in vehicles with Continuously Variable Transmission (CVT) systems. The evolving Takagi-Sugeno-Kang fuzzy models are obtained as discrete-time fuzzy models by incremental online identification algorithms. The fuzzy models are validated against experimental results in the case of the ABS and first-principles simulation results in the case of the vehicle with the CVT.

  17. Syntheses of the current model applications for managing water and needs for experimental data and model improvements to enhance these applications

    Science.gov (United States)

    This volume of the Advances in Agricultural Systems Modeling series presents 14 different case studies of model applications to help make the best use of limited water in agriculture. These examples show that models have tremendous potential and value in enhancing site-specific water management for ...

  18. [Watershed water environment pollution models and their applications: a review].

    Science.gov (United States)

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

    Watershed water environment pollution models are an important tool for studying watershed environmental problems. Through the quantitative description of the complicated pollution processes of the whole watershed system and its parts, a model can identify the main sources and migration pathways of pollutants, estimate the pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied at home and abroad, with a focus on the models of pollutant loading (GWLF and PLOAD), water quality of receiving water bodies (QUAL2E and WASP), and the watershed models integrating pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced the structures, principles, and main characteristics as well as the limitations in practical applications of these models. The other models of water quality (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analysis on the applications of single models and integrated models, the development trend and application prospects of the watershed water environment pollution models were discussed.

  19. Surface effects in solid mechanics models, simulations and applications

    CERN Document Server

    Altenbach, Holm

    2013-01-01

    This book reviews current understanding, and future trends, of surface effects in solid mechanics. Covers elasticity, plasticity and viscoelasticity, modeling based on continuum theories and molecular modeling and applications of different modeling approaches.

  20. Solutions manual to accompany finite mathematics models and applications

    CERN Document Server

    Morris, Carla C

    2015-01-01

    A solutions manual to accompany Finite Mathematics: Models and Applications In order to emphasize the main concepts of each chapter, Finite Mathematics: Models and Applications features plentiful pedagogical elements throughout such as special exercises, end notes, hints, select solutions, biographies of key mathematicians, boxed key principles, a glossary of important terms and topics, and an overview of use of technology. The book encourages the modeling of linear programs and their solutions and uses common computer software programs such as LINDO. In addition to extensive chapters on pr

  1. Multilevel modelling: Beyond the basic applications.

    Science.gov (United States)

    Wright, Daniel B; London, Kamala

    2009-05-01

    Over the last 30 years statistical algorithms have been developed to analyse datasets that have a hierarchical/multilevel structure. Particularly within developmental and educational psychology these techniques have become common where the sample has an obvious hierarchical structure, like pupils nested within a classroom. We describe two areas beyond the basic applications of multilevel modelling that are important to psychology: modelling the covariance structure in longitudinal designs and using generalized linear multilevel modelling as an alternative to methods from signal detection theory (SDT). Detailed code for all analyses is described using packages for the freeware R.
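    The SDT connection mentioned above can be sketched numerically. The authors work with R packages; this minimal Python sketch with invented response counts only illustrates the probit link behind the equal-variance SDT measures that a generalized linear multilevel model extends:

```python
# Equal-variance SDT from a 2x2 response table: d' is the difference of
# z-transformed hit and false-alarm rates, the quantity a probit GLM
# assigns to the signal/noise predictor. All counts below are invented.
from scipy.stats import norm

hits, misses = 75, 25   # responses on signal trials (hypothetical)
fas, crs = 30, 70       # responses on noise trials (hypothetical)

hit_rate = hits / (hits + misses)
fa_rate = fas / (fas + crs)

d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)            # sensitivity
criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))  # response bias
print(round(d_prime, 3), round(criterion, 3))
```

A multilevel probit model would additionally let d' vary across participants nested in groups, which is the point of the SDT-as-GLMM reformulation.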

  2. Registry of EPA Applications, Models, and Databases

    Data.gov (United States)

    U.S. Environmental Protection Agency — READ is EPA's authoritative source for information about Agency information resources, including applications/systems, datasets and models. READ is one component of...

  3. Model oriented application generation for industrial control systems

    International Nuclear Information System (INIS)

    Copy, B.; Barillere, R.; Blanco, E.; Fernandez Adiego, B.; Nogueira Fernandes, R.; Prieto Barreiro, I.

    2012-01-01

    The CERN Unified Industrial Control Systems framework (UNICOS) is a software generation methodology and a collection of development tools that standardizes the design of industrial control applications. A Software Factory, named the UNICOS Application Builder (UAB), was introduced to ease extensibility and maintenance of the framework, introducing a stable meta-model, a set of platform-independent models and platform-specific configurations against which code generation plug-ins and configuration generation plug-ins can be written. Such plug-ins currently target PLC programming environments (Schneider and SIEMENS PLCs) as well as SIEMENS WinCC Open Architecture SCADA (previously known as ETM PVSS) but are being expanded to cover more and more aspects of process control systems. We present what constitutes the UNICOS meta-model and the models in use, how these models can be used to capture knowledge about industrial control systems and how this knowledge can be used to generate both code and configuration for a variety of target usages. (authors)

  4. Application of multidimensional IRT models to longitudinal data

    NARCIS (Netherlands)

    te Marvelde, J.M.; Glas, Cornelis A.W.; Van Landeghem, Georges; Van Damme, Jan

    2006-01-01

    The application of multidimensional item response theory (IRT) models to longitudinal educational surveys where students are repeatedly measured is discussed and exemplified. A marginal maximum likelihood (MML) method to estimate the parameters of a multidimensional generalized partial credit model

  5. A survey on the modeling and applications of cellular automata theory

    Science.gov (United States)

    Gong, Yimin

    2017-09-01

    Cellular automata theory describes a discrete model that is now widely used in scientific research and simulation. The model is composed of cells that change state over time according to a specific rule. This paper provides a survey of the modeling and applications of cellular automata theory, focusing on the program realization of the theory and its application in fields such as road traffic, land use, and cutting machines. Each application is further explained, and several related main models are briefly introduced. This research aims to help decision-makers formulate appropriate development plans.

  6. Instruction-level performance modeling and characterization of multimedia applications

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Y. [Los Alamos National Lab., NM (United States). Scientific Computing Group; Cameron, K.W. [Louisiana State Univ., Baton Rouge, LA (United States). Dept. of Computer Science

    1999-06-01

    One of the challenges in characterizing and modeling realistic multimedia applications is the lack of access to source code. On-chip performance counters effectively resolve this problem by monitoring run-time behavior at the instruction level. This paper presents a novel technique for characterizing and modeling workloads at the instruction level for realistic multimedia applications using hardware performance counters. A variety of instruction counts are collected from multimedia applications such as RealPlayer, GSM Vocoder, MPEG encoder/decoder, and a speech synthesizer. These instruction counts can be used to form a set of abstract characteristic parameters directly related to a processor's architectural features. Based on microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. Moreover, the bottleneck estimation can suggest viable architectural/functional improvements for certain workloads. The biggest advantage of this new characterization technique is a better understanding of processor utilization efficiency and the architectural bottleneck for each application. The technique also provides predictive insight into future architectural enhancements and their effect on current codes. The authors also model the architectural effect on processor utilization without memory influence: they derive formulas for calculating CPI0 (CPI without the memory effect) and quantify the utilization of architectural parameters. These equations are architecturally diagnostic and predictive in nature. Results show promise in code characterization and empirical/analytical modeling.
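    As a rough illustration of the kind of CPI0 bookkeeping described above (the paper derives its formulas from hardware counter data; the instruction mix and per-class base CPIs below are invented):

```python
# Illustrative sketch only: estimate CPI0 (CPI without memory stalls) as a
# mix-weighted sum of per-class base CPIs. Both the instruction mix and the
# base CPI values are hypothetical, not taken from the paper.
mix = {"int_alu": 0.45, "fp": 0.15, "branch": 0.20, "load_store": 0.20}
base_cpi = {"int_alu": 1.0, "fp": 3.0, "branch": 1.5, "load_store": 1.0}

assert abs(sum(mix.values()) - 1.0) < 1e-9  # fractions must cover all retired instructions

cpi0 = sum(mix[c] * base_cpi[c] for c in mix)  # memory effects excluded
ipc0 = 1.0 / cpi0                              # upper bound on achievable IPC
print(f"CPI0 = {cpi0:.2f}, ideal IPC = {ipc0:.2f}")
```

Comparing such a CPI0 against measured CPI isolates the memory contribution, which is the diagnostic use the abstract describes.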

  7. Application distribution model and related security attacks in VANET

    Science.gov (United States)

    Nikaein, Navid; Kanti Datta, Soumya; Marecar, Irshad; Bonnet, Christian

    2013-03-01

    In this paper, we present a model for application distribution and related security attacks in dense vehicular ad hoc networks (VANET) and sparse VANET which forms a delay tolerant network (DTN). We study the vulnerabilities of VANET to evaluate the attack scenarios and introduce a new attacker's model as an extension to the work done in [6]. Then a VANET model has been proposed that supports the application distribution through proxy app stores on top of mobile platforms installed in vehicles. The steps of application distribution have been studied in detail. We have identified key attacks (e.g. malware, spamming and phishing, software attack and threat to location privacy) for dense VANET and two attack scenarios for sparse VANET. It has been shown that attacks can be launched by distributing malicious applications and injecting malicious codes to On Board Unit (OBU) by exploiting OBU software security holes. Consequences of such security attacks have been described. Finally, countermeasures including the concepts of sandbox have also been presented in depth.

  8. Rater reliability and construct validity of a mobile application for posture analysis.

    Science.gov (United States)

    Szucs, Kimberly A; Brown, Elena V Donoso

    2018-01-01

    [Purpose] Measurement of posture is important for those with a clinical diagnosis as well as researchers aiming to understand the impact of faulty postures on the development of musculoskeletal disorders. A reliable, cost-effective and low tech posture measure may be beneficial for research and clinical applications. The purpose of this study was to determine rater reliability and construct validity of a posture screening mobile application in healthy young adults. [Subjects and Methods] Pictures of subjects were taken in three standing positions. Two raters independently digitized the static standing posture image twice. The app calculated posture variables, including sagittal and coronal plane translations and angulations. Intra- and inter-rater reliability were calculated using the appropriate ICC models for complete agreement. Construct validity was determined through comparison of known groups using repeated measures ANOVA. [Results] Intra-rater reliability ranged from 0.71 to 0.99. Inter-rater reliability was good to excellent for all translations. ICCs were stronger for translations versus angulations. The construct validity analysis found that the app was able to detect the change in the four variables selected. [Conclusion] The posture mobile application has demonstrated strong rater reliability and preliminary evidence of construct validity. This application may have utility in clinical and research settings.
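    The ICC models referenced above can be computed directly from two-way ANOVA mean squares; a self-contained sketch of one common variant, ICC(2,1) (two-way random effects, absolute agreement, single measurement), with hypothetical posture angles:

```python
import numpy as np

def icc_2_1(X):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    X is an (n subjects x k raters) matrix of measurements."""
    n, k = X.shape
    m = X.mean()
    row = X.mean(axis=1)   # subject means
    col = X.mean(axis=0)   # rater means
    msr = k * ((row - m) ** 2).sum() / (n - 1)                      # between subjects
    msc = n * ((col - m) ** 2).sum() / (k - 1)                      # between raters
    mse = ((X - row[:, None] - col[None, :] + m) ** 2).sum() / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# hypothetical sagittal-plane angles (degrees), 5 subjects x 2 raters
X = np.array([[10.2, 10.5], [14.1, 13.8], [8.9, 9.4], [12.0, 12.3], [15.5, 15.1]])
print(round(icc_2_1(X), 3))
```

With large between-subject spread and small rater disagreement, the index lands near 1, matching the "good to excellent" range reported.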

  9. Ionospheric Modeling for Precise GNSS Applications

    NARCIS (Netherlands)

    Memarzadeh, Y.

    2009-01-01

    The main objective of this thesis is to develop a procedure for modeling and predicting ionospheric Total Electron Content (TEC) for high precision differential GNSS applications. As the ionosphere is a highly dynamic medium, we believe that to have a reliable procedure it is necessary to transfer

  10. Degenerate RFID Channel Modeling for Positioning Applications

    Directory of Open Access Journals (Sweden)

    A. Povalac

    2012-12-01

    Full Text Available This paper introduces the theory of channel modeling for positioning applications in UHF RFID. It explains basic parameters for channel characterization from both the narrowband and wideband point of view. More details are given about ranging and direction finding. Finally, several positioning scenarios are analyzed with developed channel models. All the described models use a degenerate channel, i.e. combined signal propagation from the transmitter to the tag and from the tag to the receiver.

  11. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory in more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing and in computer and communication systems. • A chapter on ...

  12. Application of autoregressive moving average model in reactor noise analysis

    International Nuclear Information System (INIS)

    Tran Dinh Tri

    1993-01-01

    The application of an autoregressive (AR) model to estimating noise measurements has achieved many successes in reactor noise analysis in the last ten years. The physical processes that take place in the nuclear reactor, however, are described by an autoregressive moving average (ARMA) model rather than by an AR model. Consequently more correct results could be obtained by applying the ARMA model instead of the AR model to reactor noise analysis. In this paper the system of the generalised Yule-Walker equations is derived from the equation of an ARMA model, then a method for its solution is given. Numerical results show the applications of the method proposed. (author)
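    The ordinary (AR-only) Yule-Walker system that the generalised equations extend can be sketched in a few lines; the AR(2) coefficients below are invented for illustration, and the system R·φ = r is solved from sample autocovariances:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
n, phi = 20000, np.array([0.6, -0.3])
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(2, n):
    x[t] = phi[0] * x[t - 1] + phi[1] * x[t - 2] + e[t]

def autocov(x, lag):
    """Sample autocovariance at the given lag."""
    xc = x - x.mean()
    return (xc[: len(x) - lag] * xc[lag:]).mean()

# Yule-Walker: R @ phi_hat = r, with R the Toeplitz autocovariance matrix
p = 2
r = np.array([autocov(x, k) for k in range(p + 1)])
R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
phi_hat = np.linalg.solve(R, r[1:])
print(phi_hat)  # close to the true coefficients (0.6, -0.3)
```

For an ARMA process the moving-average terms correlate with past noise, so this plain system becomes biased; that is the motivation for the generalised Yule-Walker equations derived in the paper.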

  13. An Object-Oriented Information Model for Policy-based Management of Distributed Applications

    NARCIS (Netherlands)

    Diaz, G.; Gay, V.C.J.; Horlait, E.; Hamza, M.H.

    2002-01-01

    This paper presents an object-oriented information model to support policy-based management for distributed multimedia applications. The information base contains application-level information about the users, the applications, and their profiles. Our information model is described in detail and

  14. Brookhaven Regional Energy Facility Siting Model (REFS): model development and application

    Energy Technology Data Exchange (ETDEWEB)

    Meier, P.; Hobbs, B.; Ketcham, G.; McCoy, M.; Stern, R.

    1979-06-01

    A siting methodology developed specifically to bridge the gap between regional-energy-system scenarios and environmental transport models is documented. Development of the model is described in Chapter 1. Chapter 2 describes the basic structure of such a model. Additional chapters on model development cover: generation, transmission, demand disaggregation, the interface to other models, computational aspects, the coal sector, water resource considerations, and air quality considerations. These subjects comprise Part I. Part II, Model Applications, covers: analysis of water resource constraints, water resource issues in the New York Power Pool, water resource issues in the New England Power Pool, water resource issues in the Pennsylvania-Jersey-Maryland Power Pool, and a summary of water resource constraint analysis. (MCW)

  15. Hydraulic modeling development and application in water resources engineering

    Science.gov (United States)

    Simoes, Francisco J.; Yang, Chih Ted; Wang, Lawrence K.

    2015-01-01

    The use of modeling has become widespread in water resources engineering and science to study rivers, lakes, estuaries, and coastal regions. For example, computer models are commonly used to forecast anthropogenic effects on the environment, and to help provide advanced mitigation measures against catastrophic events such as natural and dam-break floods. Linking hydraulic models to vegetation and habitat models has expanded their use in multidisciplinary applications to the riparian corridor. Implementation of these models in software packages on personal desktop computers has made them accessible to the general engineering community, and their use has been popularized by the need of minimal training due to intuitive graphical user interface front ends. Models are, however, complex and nontrivial, to the extent that even common terminology is sometimes ambiguous and often applied incorrectly. In fact, many efforts are currently under way in order to standardize terminology and offer guidelines for good practice, but none has yet reached unanimous acceptance. This chapter provides a view of the elements involved in modeling surface flows for the application in environmental water resources engineering. It presents the concepts and steps necessary for rational model development and use by starting with the exploration of the ideas involved in defining a model. Tangible form of those ideas is provided by the development of a mathematical and corresponding numerical hydraulic model, which is given with a substantial amount of detail. The issues of model deployment in a practical and productive work environment are also addressed. The chapter ends by presenting a few model applications highlighting the need for good quality control in model validation.

  16. Update on Small Modular Reactors Dynamic System Modeling Tool: Web Application

    International Nuclear Information System (INIS)

    Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.; Batteh, John J; Tiller, Michael M.

    2015-01-01

    Previous reports focused on the development of component and system models as well as end-to-end system models using Modelica and Dymola for two advanced reactor architectures: (1) Advanced Liquid Metal Reactor and (2) fluoride high-temperature reactor (FHR). The focus of this report is the release of the first beta version of the web-based application for model use and collaboration, as well as an update on the FHR model. The web-based application allows novice users to configure end-to-end system models from preconfigured choices to investigate the instrumentation and controls implications of these designs and allows for the collaborative development of individual component models that can be benchmarked against test systems for potential inclusion in the model library. A description of this application is provided along with examples of its use and a listing and discussion of all the models that currently exist in the library.

  17. Moving objects management models, techniques and applications

    CERN Document Server

    Meng, Xiaofeng; Xu, Jiajie

    2014-01-01

    This book describes the topics of moving objects modeling and location tracking, indexing and querying, clustering, location uncertainty, traffic aware navigation and privacy issues as well as the application to intelligent transportation systems.

  18. Semi-empirical prediction of moisture build-up in an electronic enclosure using analysis of variance (ANOVA)

    DEFF Research Database (Denmark)

    Shojaee Nasirabadi, Parizad; Conseil, Helene; Mohanty, Sankhya

    2016-01-01

    Electronic systems are exposed to harsh environmental conditions such as high humidity in many applications. Moisture transfer into electronic enclosures and condensation can cause several problems as material degradation and corrosion. Therefore, it is important to control the moisture content...... and the relative humidity inside electronic enclosures. In this work, moisture transfer into a typical polycarbonate electronic enclosure with a cylindrical shape opening is studied. The effects of four influential parameters namely, initial relative humidity inside the enclosure, radius and length of the opening...... and temperature are studied. A set of experiments are done based on a fractional factorial design in order to estimate the time constant for moisture transfer into the enclosure by fitting the experimental data to an analytical quasi-steady-state model. According to the statistical analysis, temperature...
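    The analytical quasi-steady-state picture described above amounts to first-order exponential relaxation of interior humidity toward ambient; a minimal sketch with invented values (the actual time constants were fitted from the experiments):

```python
import math

# Quasi-steady-state moisture transfer sketch: the interior relative
# humidity relaxes exponentially toward ambient with time constant tau.
# The numbers below are hypothetical, not the paper's fitted values.
def interior_rh(t, rh0, rh_ambient, tau):
    """Relative humidity (%) inside the enclosure at time t (same units as tau)."""
    return rh_ambient + (rh0 - rh_ambient) * math.exp(-t / tau)

rh0, rh_amb, tau = 30.0, 70.0, 12.0  # %, %, hours (hypothetical)
print(round(interior_rh(0.0, rh0, rh_amb, tau), 1))   # starts at rh0
print(round(interior_rh(tau, rh0, rh_amb, tau), 1))   # ~63% of the gap to ambient closed
```

The fractional factorial design then quantifies how tau shifts with opening radius, opening length, initial humidity, and temperature.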

  19. Functional Median Polish

    KAUST Repository

    Sun, Ying

    2012-08-03

    This article proposes functional median polish, an extension of univariate median polish, for one-way and two-way functional analysis of variance (ANOVA). The functional median polish estimates the functional grand effect and functional main factor effects based on functional medians in an additive functional ANOVA model assuming no interaction among factors. A functional rank test is used to assess whether the functional main factor effects are significant. The robustness of the functional median polish is demonstrated by comparing its performance with the traditional functional ANOVA fitted by means under different outlier models in simulation studies. The functional median polish is illustrated on various applications in climate science, including one-way and two-way ANOVA when functional data are either curves or images. Specifically, Canadian temperature data, U.S. precipitation observations and outputs of global and regional climate models are considered, which can facilitate the research on the close link between local climate and the occurrence or severity of some diseases and other threats to human health. © 2012 International Biometric Society.
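    The univariate median polish that the article extends can be sketched in a few lines: the additive decomposition X ≈ grand + row + column + residual is obtained by alternately sweeping row and column medians out of the residual table (the data matrix below is invented and exactly additive, so the residuals vanish):

```python
import numpy as np

def median_polish(X, n_iter=10):
    """Tukey's univariate median polish: X = grand + row + col + residual."""
    X = np.asarray(X, dtype=float)
    grand, row, col = 0.0, np.zeros(X.shape[0]), np.zeros(X.shape[1])
    res = X.copy()
    for _ in range(n_iter):
        rmed = np.median(res, axis=1)        # sweep row medians into row effects
        row += rmed
        res -= rmed[:, None]
        grand += np.median(row)              # recentre row effects into the grand effect
        row -= np.median(row)
        cmed = np.median(res, axis=0)        # sweep column medians into column effects
        col += cmed
        res -= cmed[None, :]
        grand += np.median(col)              # recentre column effects
        col -= np.median(col)
    return grand, row, col, res

X = np.array([[12.0, 15.0, 14.0], [10.0, 13.0, 12.0], [9.0, 12.0, 11.0]])
grand, row, col, res = median_polish(X)
print(grand, row, col)
```

The functional version replaces each scalar median with a pointwise functional median over curves or images, but the sweep structure is the same.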

  20. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    Science.gov (United States)

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  1. Molecular modeling and multiscaling issues for electronic material applications

    CERN Document Server

    Iwamoto, Nancy; Yuen, Matthew; Fan, Haibo

    Volume 1: Molecular Modeling and Multiscaling Issues for Electronic Material Applications provides a snapshot on the progression of molecular modeling in the electronics industry and how molecular modeling is currently being used to understand material performance to solve relevant issues in this field. This book is intended to introduce the reader to the evolving role of molecular modeling, especially seen through the eyes of the IEEE community involved in material modeling for electronic applications.  Part I presents  the role that quantum mechanics can play in performance prediction, such as properties dependent upon electronic structure, but also shows examples how molecular models may be used in performance diagnostics, especially when chemistry is part of the performance issue.  Part II gives examples of large-scale atomistic methods in material failure and shows several examples of transitioning between grain boundary simulations (on the atomistic level)and large-scale models including an example ...

  2. Algebraic Modeling of Topological and Computational Structures and Applications

    CERN Document Server

    Theodorou, Doros; Stefaneas, Petros; Kauffman, Louis

    2017-01-01

    This interdisciplinary book covers a wide range of subjects, from pure mathematics (knots, braids, homotopy theory, number theory) to more applied mathematics (cryptography, algebraic specification of algorithms, dynamical systems) and concrete applications (modeling of polymers and ionic liquids, video, music and medical imaging). The main mathematical focus throughout the book is on algebraic modeling with particular emphasis on braid groups. The research methods include algebraic modeling using topological structures, such as knots, 3-manifolds, classical homotopy groups, and braid groups. The applications address the simulation of polymer chains and ionic liquids, as well as the modeling of natural phenomena via topological surgery. The treatment of computational structures, including finite fields and cryptography, focuses on the development of novel techniques. These techniques can be applied to the design of algebraic specifications for systems modeling and verification. This book is the outcome of a w...

  3. Application of capital replacement models with finite planning horizons

    NARCIS (Netherlands)

    Scarf, P.A.; Christer, A.H.

    1997-01-01

    Capital replacement models with finite planning horizons can be used to model replacement policies in complex operational contexts. They may also be used to investigate the cost consequences of technological change. This paper reviews the application of these models in various such contexts. We also

  4. Property Modelling for Applications in Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    such as database, property model library, model parameter regression, and, property-model based product-process design will be presented. The database contains pure component and mixture data for a wide range of organic chemicals. The property models are based on the combined group contribution and atom...... is missing, the atom connectivity based model is employed to predict the missing group interaction. In this way, a wide application range of the property modeling tool is ensured. Based on the property models, targeted computer-aided techniques have been developed for design and analysis of organic chemicals......, polymers, mixtures as well as separation processes. The presentation will highlight the framework (ICAS software) for property modeling, the property models and issues such as prediction accuracy, flexibility, maintenance and updating of the database. Also, application issues related to the use of property...

  5. Reputation based security model for android applications

    OpenAIRE

    Tesfay, Welderufael Berhane; Booth, Todd; Andersson, Karl

    2012-01-01

    The market for smart phones has been booming in the past few years. There are now over 400,000 applications on the Android market. Over 10 billion Android applications have been downloaded from the Android market. Due to Android's popularity, there are now a large number of malicious vendors targeting the platform. Many honest end users are being successfully hacked on a regular basis. In this work, a cloud based reputation security model has been proposed as a solution which greatly mitiga...

  6. Modeling and Optimization : Theory and Applications Conference

    CERN Document Server

    Terlaky, Tamás

    2017-01-01

    This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 17-19, 2016. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, health, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.

  7. Modeling and Optimization : Theory and Applications Conference

    CERN Document Server

    Terlaky, Tamás

    2015-01-01

    This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 13-15, 2014. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, healthcare, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.

  8. Comparing sensitivity analysis methods to advance lumped watershed model identification and evaluation

    Directory of Open Access Journals (Sweden)

    Y. Tang

    2007-01-01

    This study seeks to identify sensitivity tools that will advance our understanding of lumped hydrologic models for the purposes of model improvement, calibration efficiency and improved measurement schemes. Four sensitivity analysis methods were tested: (1) local analysis using parameter estimation software (PEST), (2) regional sensitivity analysis (RSA), (3) analysis of variance (ANOVA), and (4) Sobol's method. The methods' relative efficiencies and effectiveness have been analyzed and compared. These four sensitivity methods were applied to the lumped Sacramento soil moisture accounting model (SAC-SMA) coupled with SNOW-17. Results from this study characterize model sensitivities for two medium-sized watersheds within the Juniata River Basin in Pennsylvania, USA. Comparative results for the four sensitivity methods are presented for a 3-year time series with 1 h, 6 h, and 24 h time intervals. The results of this study show that model parameter sensitivities are heavily impacted by the choice of analysis method as well as the model time interval. Differences between the two adjacent watersheds also suggest strong influences of local physical characteristics on the sensitivity methods' results. This study also contributes a comprehensive assessment of the repeatability, robustness, efficiency, and ease of implementation of the four sensitivity methods. Overall, ANOVA and Sobol's method were shown to be superior to RSA and PEST. Relative to one another, ANOVA has reduced computational requirements and Sobol's method yields more robust sensitivity rankings.
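    For readers unfamiliar with Sobol's method, a first-order index can be estimated with a simple Monte Carlo scheme. This toy linear model (not SAC-SMA) has analytic first-order indices of 0.9 and 0.1, so the estimator can be checked:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x):                      # toy model: Y = 3*X1 + X2, Xi ~ U(0,1)
    return 3.0 * x[:, 0] + x[:, 1]

n, d = 100_000, 2
A = rng.uniform(size=(n, d))       # two independent sample matrices
B = rng.uniform(size=(n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

S1 = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]            # A with column i swapped in from B
    yABi = model(ABi)
    S1.append(np.mean(yB * (yABi - yA)) / var_y)  # first-order Sobol estimator
print([round(s, 2) for s in S1])
```

In the hydrologic setting each index instead measures how much of the variance of a streamflow objective function is attributable to one SAC-SMA parameter alone, at the cost of many model evaluations, which is why computational efficiency is part of the comparison above.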

  9. Developing a java android application of KMV-Merton default rate model

    Science.gov (United States)

    Yusof, Norliza Muhamad; Anuar, Aini Hayati; Isa, Norsyaheeda Natasha; Zulkafli, Sharifah Nursyuhada Syed; Sapini, Muhamad Luqman

    2017-11-01

    This paper presents a developed Java Android application for the KMV-Merton model in predicting the default rate of a firm. Predicting default rates is essential in the risk management area, as default risk can be immediately transmitted from one entity to another; this is the reason default risk is known as a global risk. Although there are several efforts, instruments and methods used to manage the risk, they are said to be insufficient. To the best of our knowledge, there has been limited innovation in developing default risk mathematical models into mobile applications. Therefore, through this study, default risk is predicted quantitatively using the KMV-Merton model. The KMV-Merton model has been integrated in the form of a Java program using the Android Studio software. The developed Java Android application is tested by predicting the levels of default risk of three differently rated companies. It is found that the levels of default risk are equivalent to the ratings of the respective companies. This shows that the default rate predicted by the KMV-Merton model using the developed Java Android application can be a significant tool in the risk management field. The developed Java Android application grants users an alternative way to predict the level of default risk with fewer procedures.
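    The core of the KMV-Merton calculation is the distance to default. A minimal sketch follows; it omits the iterative step that backs out unobservable asset value and volatility from equity data, and all inputs are invented:

```python
import math
from scipy.stats import norm

def merton_pd(V, D, mu, sigma, T=1.0):
    """Merton-style default probability: P(asset value < debt at horizon T).
    V: asset value, D: face value of debt, mu: asset drift, sigma: asset
    volatility, T: horizon in years. All inputs here are hypothetical."""
    dd = (math.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return norm.cdf(-dd), dd     # (default probability, distance to default)

pd, dd = merton_pd(V=120.0, D=100.0, mu=0.05, sigma=0.25, T=1.0)
print(round(dd, 3), round(pd, 4))
```

A mobile implementation such as the one described wraps this arithmetic (plus the asset-value calibration) behind an input form, which is why the computation ports easily to Java.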

  10. Formation of an Integrated Stock Price Forecast Model in Lithuania

    Directory of Open Access Journals (Sweden)

    Audrius Dzikevičius

    2016-12-01

    Technical and fundamental analyses are widely used to forecast stock prices due to lack of knowledge of other modern models and methods such as the Residual Income Model, ANN-APGARCH, Support Vector Machines, Probabilistic Neural Networks and Genetic Fuzzy Systems. Although stock price forecast models integrating both technical and fundamental analyses are currently in wide use, their integration is not justified comprehensively enough. This paper discusses theoretical one-factor and multi-factor stock price forecast models already applied by investors at a global level and determines the possibility of creating and practically applying a stock price forecast model which integrates fundamental and technical analysis with reference to the Lithuanian stock market. The research aims to determine the relationship between the stock prices of the 14 Lithuanian companies listed on the Main List of Nasdaq OMX Baltic and various fundamental variables. Based on the results of correlation and regression analysis and the application of the Chi-Squared test and ANOVA, a general stock price forecast model is generated. This paper discusses practical implications of how the developed model can be used to forecast stock prices by individual investors and suggests additional check measures.

  11. Applications of the k – ω Model in Stellar Evolutionary Models

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yan, E-mail: ly@ynao.ac.cn [Yunnan Observatories, Chinese Academy of Sciences, Kunming 650216 (China)

    2017-05-20

    The k – ω model for turbulence was first proposed by Kolmogorov. A new k – ω model for stellar convection was developed by Li, which could reasonably describe turbulent convection not only in the convectively unstable zone, but also in the overshooting regions. We revised the k – ω model by improving several model assumptions (including the macro-length of turbulence, convective heat flux, and turbulent mixing diffusivity, etc.), making it applicable not only to convective envelopes, but also to convective cores. Eight parameters are introduced in the revised k – ω model. It should be noted that the Reynolds stress (turbulent pressure) is neglected in the equation of hydrostatic support. We applied the model to solar models and 5 M {sub ⊙} stellar models to calibrate the eight model parameters, as well as to investigate the effects of convective overshooting on the Sun and on intermediate-mass stellar models.

  12. Stochastic linear hybrid systems: Modeling, estimation, and application

    Science.gov (United States)

    Seah, Chze Eng

    Hybrid systems are dynamical systems with interacting continuous states and discrete states (or modes). Accurate modeling and state estimation of hybrid systems are important in many applications. We propose a hybrid system model, known as the Stochastic Linear Hybrid System (SLHS), to describe hybrid systems with stochastic linear system dynamics in each mode and stochastic continuous-state-dependent mode transitions. We then develop a hybrid estimation algorithm, called the State-Dependent-Transition Hybrid Estimation (SDTHE) algorithm, to estimate the continuous state and discrete state of the SLHS from noisy measurements. It is shown that the SDTHE algorithm is more accurate or more computationally efficient than existing hybrid estimation algorithms. Next, we develop a performance analysis algorithm to evaluate the performance of the SDTHE algorithm in a given operating scenario. We also investigate sufficient conditions for the stability of the SDTHE algorithm. The proposed SLHS model and SDTHE algorithm are shown to be useful in several applications. In Air Traffic Control (ATC), accurate modeling and estimation of aircraft trajectories are needed to facilitate implementations of new, efficient operational concepts. In ATC, an aircraft's trajectory can be divided into a number of flight modes. Furthermore, as the aircraft is required to follow a given flight plan or clearance, its flight mode transitions are dependent on its continuous state. However, the flight mode transitions are also stochastic, due to navigation uncertainties or unknown pilot intents. Thus, we develop an aircraft dynamics model in ATC based on the SLHS. The SDTHE algorithm is then used in aircraft tracking applications to estimate the positions/velocities of aircraft and their flight modes accurately. Next, we develop an aircraft conformance monitoring algorithm to detect any deviations of aircraft trajectories in ATC that might compromise safety.
In this application, the SLHS

  13. Applicability of in silico genotoxicity models on food and feed ingredients.

    Science.gov (United States)

    Vuorinen, Anna; Bellion, Phillip; Beilstein, Paul

    2017-11-01

    Evaluation of the genotoxic potential of food and feed ingredients is required in the development of new substances and for their registration. In addition to in vitro and in vivo assays, in silico tools such as expert alert-based and statistical models can be used for data generation. These in silico models are commonly used among the pharmaceutical industry, whereas the food industry has not widely adopted them. In this study, the applicability of in silico tools for predicting genotoxicity was evaluated, with a focus on bacterial mutagenicity, in vitro and in vivo chromosome damage assays. For this purpose, a test set of 27 food and feed ingredients including vitamins, carotenoids, and nutraceuticals with experimental genotoxicity data was constructed from proprietary data. This dataset was run through multiple models and the model applicability was analyzed. The compounds were generally within the applicability domain of the models and the models predicted the compounds correctly in most of the cases. Although the regulatory acceptance of in silico tools as single data source is still limited, the models are applicable and can be used in the safety evaluation of food and feed ingredients. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Canadian Whole-Farm Model Holos - Development, Stakeholder Involvement, and Model Application

    Science.gov (United States)

    Kroebel, R.; Janzen, H.; Beauchemin, K. A.

    2017-12-01

    Agriculture and Agri-Food Canada's Holos model, based mostly on emission factors, aims to explore the effect of management on Canadian whole-farm greenhouse gas emissions. The model includes 27 commonly grown annual and perennial crops, summer fallow, grassland, and 8 types of tree plantings, along with beef, dairy, sheep, swine and other livestock or poultry operations. Model outputs encompass net emissions of CO2, CH4, and N2O (in CO2 equivalents), calculated for various farm components. Where possible, algorithms are drawn from peer-reviewed publications. For consistency, Holos is aligned with the Canadian sustainability indicator and national greenhouse gas inventory objectives. Although primarily an exploratory tool for research, the model's design makes it accessible and instructive also to agricultural producers, educators, and policy makers. Model development, therefore, proceeds iteratively, with extensive stakeholder feedback from training sessions or annual workshops. To make the model accessible to diverse users, the team developed a multi-layered interface, with general farming scenarios for general use, but giving access to detailed coefficients and assumptions to researchers. The model relies on extensive climate, soil, and agronomic databases to populate regionally-applicable default values thereby minimizing keyboard entries. In an initial application, the model was used to assess greenhouse gas emissions from the Canadian beef production system; it showed that enteric methane accounted for 63% of total GHG emissions and that 84% of emissions originated from the cow-calf herd. The model further showed that GHG emission intensity per kg beef, nationally, declined by 14% from 1981 to 2011, owing to gains in production efficiency. Holos is now being used to consider further potential advances through improved rations or other management options. We are now aiming to expand into questions of grazing management, and are developing a novel carbon

  15. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  16. Mixed models theory and applications with R

    CERN Document Server

    Demidenko, Eugene

    2013-01-01

    Mixed modeling is one of the most promising and exciting areas of statistical analysis, enabling the analysis of nontraditional, clustered data that may come in the form of shapes or images. This book provides in-depth mathematical coverage of mixed models' statistical properties and numerical algorithms, as well as applications such as the analysis of tumor regrowth, shape, and image. The new edition includes significant updating, over 300 exercises, stimulating chapter projects and model simulations, inclusion of R subroutines, and a revised text format. The target audience continues to be g

  17. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Science.gov (United States)

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  18. Distributionally Robust Return-Risk Optimization Models and Their Applications

    Directory of Open Access Journals (Sweden)

    Li Yang

    2014-01-01

    Full Text Available Based on the risk control of conditional value-at-risk, distributionally robust return-risk optimization models with box constraints on the random vector are proposed. They describe uncertainty in both the distribution form and the moments (mean and covariance matrix) of the random vector. It is difficult to solve them directly. Using conic duality theory and the minimax theorem, the models are reformulated as semidefinite programming problems, which can be solved by interior point algorithms in polynomial time. An important theoretical basis is therefore provided for applications of the models. Moreover, an application of the models to a practical example of portfolio selection is considered, and the example is evaluated using a historical data set of four stocks. Numerical results show that the proposed methods are robust and the investment strategy is safe.

  19. Evaluating the power consumption of wireless sensor network applications using models.

    Science.gov (United States)

    Dâmaso, Antônio; Freitas, Davi; Rosa, Nelson; Silva, Bruno; Maciel, Paulo

    2013-03-13

    Power consumption is the main concern in developing Wireless Sensor Network (WSN) applications. Consequently, several strategies have been proposed for investigating the power consumption of this kind of application. These strategies can help to predict the WSN lifetime, provide recommendations to application developers and may optimize the energy consumed by WSN applications. While measurement is a known and precise strategy for power consumption evaluation, it is very costly, tedious and may be unfeasible considering the (usually) large number of WSN nodes. Furthermore, due to the inherent dynamism of WSNs, the instrumentation required by measurement techniques makes their use difficult in several different scenarios. In this context, this paper presents an approach for evaluating the power consumption of WSN applications by using simulation models, along with a set of tools to automate the proposed approach. Starting from programming language code, we automatically generate consumption models used to predict the power consumption of WSN applications. In order to evaluate the proposed approach, we compare the results obtained by using the generated models against those obtained by measurement.

  20. Systems modeling and simulation applications for critical care medicine

    Science.gov (United States)

    2012-01-01

    Critical care delivery is a complex, expensive, error prone, medical specialty and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques, including a) pathophysiological model of acute lung injury, b) process modeling of critical care delivery, and c) an agent-based model to study interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area. PMID:22703718

  1. Systems modeling and simulation applications for critical care medicine.

    Science.gov (United States)

    Dong, Yue; Chbat, Nicolas W; Gupta, Ashish; Hadzikadic, Mirsad; Gajic, Ognjen

    2012-06-15

    Critical care delivery is a complex, expensive, error prone, medical specialty and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques, including a) pathophysiological model of acute lung injury, b) process modeling of critical care delivery, and c) an agent-based model to study interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area.

  2. Application of optimization technique for flood damage modeling in river system

    Science.gov (United States)

    Barman, Sangita Deb; Choudhury, Parthasarathi

    2018-04-01

    A river system is defined as a network of channels that drains different parts of a basin, uniting downstream to form a common outflow. Applying the various models found in the literature to a river system with multiple upstream flows is not always straightforward and involves a lengthy procedure; when data sets are unavailable, model calibration and application may become difficult. In the case of a river system, flow modeling can be simplified to a large extent if the channel network is replaced by an equivalent single channel. The present work covers optimization model formulations based on equivalent flow, and applications of a mixed-integer-programming-based pre-emptive goal programming model to evaluating flood control alternatives for a real-life river system in India.

  3. Estimation of the applicability domain of kernel-based machine learning models for virtual screening

    Directory of Open Access Journals (Sweden)

    Fechner Nikolas

    2010-03-01

    Full Text Available Abstract Background The virtual screening of large compound databases is an important application of structure-activity relationship models. Due to the high structural diversity of these data sets, it is impossible for machine learning based QSAR models, which rely on a specific training set, to give reliable results for all compounds. Thus, it is important to consider the subset of the chemical space in which the model is applicable. The approaches to this problem that have been published so far mostly use vectorial descriptor representations to define this domain of applicability of the model. Unfortunately, these cannot be extended easily to structured kernel-based machine learning models. For this reason, we propose three approaches to estimate the domain of applicability of a kernel-based QSAR model. Results We evaluated three kernel-based applicability domain estimations using three different structured kernels on three virtual screening tasks. Each experiment consisted of the training of a kernel-based QSAR model using support vector regression and the ranking of a disjoint screening data set according to the predicted activity. For each prediction, the applicability of the model for the respective compound is quantitatively described using a score obtained by an applicability domain formulation. The suitability of the applicability domain estimation is evaluated by comparing the model performance on the subsets of the screening data sets obtained by different thresholds for the applicability scores. This comparison indicates that it is possible to separate the part of the chemical space, in which the model gives reliable predictions, from the part consisting of structures too dissimilar to the training set to apply the model successfully. 
A closer inspection reveals that the virtual screening performance of the model is considerably improved if half of the molecules, those with the lowest applicability scores, are omitted from the screening
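
    One of the simpler applicability-domain formulations of this kind scores each query compound by its kernel similarity to its nearest training compounds and thresholds that score. The sketch below uses a Gaussian kernel on numeric descriptor vectors as a stand-in for the paper's structured kernels; the function names, parameters and data are all hypothetical.

    ```python
    import numpy as np

    def gaussian_kernel(a, b, gamma=0.5):
        """RBF kernel between descriptor vectors (stand-in for a structured kernel)."""
        return np.exp(-gamma * np.sum((a - b) ** 2))

    def applicability_score(query, train, k=3, gamma=0.5):
        """Mean kernel similarity to the k most similar training compounds.

        High scores mean the query lies close to the training set,
        i.e. inside the model's domain of applicability.
        """
        sims = sorted((gaussian_kernel(query, t, gamma) for t in train), reverse=True)
        return float(np.mean(sims[:k]))

    rng = np.random.default_rng(0)
    train = rng.normal(0.0, 1.0, size=(50, 4))  # hypothetical training descriptors

    inside = applicability_score(np.zeros(4), train)        # near the training data
    outside = applicability_score(np.full(4, 6.0), train)   # far from the training data
    ```

    Ranking screening compounds by such a score and discarding the lowest-scoring half mirrors the filtering strategy whose benefit the abstract reports.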

  4. Estimation of the applicability domain of kernel-based machine learning models for virtual screening.

    Science.gov (United States)

    Fechner, Nikolas; Jahn, Andreas; Hinselmann, Georg; Zell, Andreas

    2010-03-11

    The virtual screening of large compound databases is an important application of structure-activity relationship models. Due to the high structural diversity of these data sets, it is impossible for machine learning based QSAR models, which rely on a specific training set, to give reliable results for all compounds. Thus, it is important to consider the subset of the chemical space in which the model is applicable. The approaches to this problem that have been published so far mostly use vectorial descriptor representations to define this domain of applicability of the model. Unfortunately, these cannot be extended easily to structured kernel-based machine learning models. For this reason, we propose three approaches to estimate the domain of applicability of a kernel-based QSAR model. We evaluated three kernel-based applicability domain estimations using three different structured kernels on three virtual screening tasks. Each experiment consisted of the training of a kernel-based QSAR model using support vector regression and the ranking of a disjoint screening data set according to the predicted activity. For each prediction, the applicability of the model for the respective compound is quantitatively described using a score obtained by an applicability domain formulation. The suitability of the applicability domain estimation is evaluated by comparing the model performance on the subsets of the screening data sets obtained by different thresholds for the applicability scores. This comparison indicates that it is possible to separate the part of the chemical space, in which the model gives reliable predictions, from the part consisting of structures too dissimilar to the training set to apply the model successfully. A closer inspection reveals that the virtual screening performance of the model is considerably improved if half of the molecules, those with the lowest applicability scores, are omitted from the screening. 
The proposed applicability domain formulations

  5. Generalized Linear Models with Applications in Engineering and the Sciences

    CERN Document Server

    Myers, Raymond H; Vining, G Geoffrey; Robinson, Timothy J

    2012-01-01

    Praise for the First Edition "The obvious enthusiasm of Myers, Montgomery, and Vining and their reliance on their many examples as a major focus of their pedagogy make Generalized Linear Models a joy to read. Every statistician working in any area of applied science should buy it and experience the excitement of these new approaches to familiar activities."-Technometrics Generalized Linear Models: With Applications in Engineering and the Sciences, Second Edition continues to provide a clear introduction to the theoretical foundations and key applications of generalized linear models (GLMs). Ma

  6. Application of the Human Activity Assistive Technology model for occupational therapy research.

    Science.gov (United States)

    Giesbrecht, Ed

    2013-08-01

    Theoretical models provide a framework for describing practice and integrating evidence into systematic research. There are few models that relate specifically to the provision of assistive technology in occupational therapy practice. The Human Activity Assistive Technology model is an enduring example that has continued to develop by integrating a social model of disability, concepts from occupational therapy theory and principles of assistive technology adoption and abandonment. This study first describes the core concepts of the Human Activity Assistive Technology model and reviews its development over three successive published versions. A review of the research literature reflects application of the model to clinical practice, study design, outcome measure selection and interpretation of results, particularly among occupational therapists. An evaluative framework is used to critique the adequacy of the Human Activity Assistive Technology model for practice and research, exploring attributes of clarity, simplicity, generality, accessibility and importance. Finally, recommendations are proposed for continued development of the model and research applications. Most of the existing research literature employs the Human Activity Assistive Technology model for background and study design; there is emerging evidence to support the core concepts as predictive factors. Although the concepts are generally simple, clear and applicable to occupational therapy practice and research, evolving terminology and outcomes become more complex with the conflation of integrated theories. The development of the Human Activity Assistive Technology model offers enhanced access and application for occupational therapists, but poses challenges to clarity among concepts. Suggestions are made for further development and applications of the model. © 2013 Occupational Therapy Australia.

  7. Modeling Students' Memory for Application in Adaptive Educational Systems

    Science.gov (United States)

    Pelánek, Radek

    2015-01-01

    Human memory has been thoroughly studied and modeled in psychology, but mainly in laboratory setting under simplified conditions. For application in practical adaptive educational systems we need simple and robust models which can cope with aspects like varied prior knowledge or multiple-choice questions. We discuss and evaluate several models of…

  8. Model Oriented Application Generation for Industrial Control Systems

    CERN Document Server

    Copy, B; Blanco Vinuela, E; Fernandez Adiego, B; Nogueira Ferandes, R; Prieto Barreiro, I

    2011-01-01

    The CERN Unified Industrial Control Systems framework (UNICOS) is a software generation methodology and a collection of development tools that standardizes the design of industrial control applications [1]. A Software Factory, named the UNICOS Application Builder (UAB) [2], was introduced to ease extensibility and maintenance of the framework, introducing a stable metamodel, a set of platformindependent models and platformspecific configurations against which code generation plugins and configuration generation plugins can be written. Such plugins currently target PLC programming environments (Schneider and SIEMENS PLCs) as well as SIEMENS WinCC Open Architecture SCADA (previously known as ETM PVSS) but are being expanded to cover more and more aspects of process control systems. We present what constitutes the UNICOS metamodel and the models in use, how these models can be used to capture knowledge about industrial control systems and how this knowledge can be leveraged to generate both code and configuratio...

  9. Research on mixed network architecture collaborative application model

    Science.gov (United States)

    Jing, Changfeng; Zhao, Xi'an; Liang, Song

    2009-10-01

    When facing the complex requirements of city development, ever-growing spatial data, rapid development of geographical business and increasing business complexity, collaboration between multiple users and departments is urgently needed; however, conventional GIS software (following either the Client/Server or the Browser/Server model) does not support this well. Collaborative applications are one good resolution. A collaborative application has four main problems to resolve: consistency and co-edit conflicts, real-time responsiveness, unconstrained operation, and spatial data recoverability. In this paper, an application model called AMCM is put forward based on agents and a multi-level cache. AMCM can be used in a mixed network structure and supports distributed collaboration. An agent is an autonomous, interactive, initiative and reactive computing entity in a distributed environment. Agents have been used in many fields such as computer science and automation, and bring new methods for cooperation and for access to spatial data. A multi-level cache holds part of the full data; it reduces the network load and improves access to and handling of spatial data, especially when editing the spatial data. With agent technology, we make full use of agents' intelligent characteristics for managing the cache and for cooperative editing, which brings a new method for distributed cooperation and improves efficiency.

  10. Integrated climate modelling at the Kiel Institute for World Economics: The DART Model and its applications

    OpenAIRE

    Deke, Oliver; Peterson, Sonja

    2003-01-01

    The aim of this paper is to give an overview over the DART model and its applications. The main focus is on the implementation of climate impacts into DART in the course of coupling DART to the ocean-atmosphere model and on the associated empirical problems. The basic DART model and some applications are presented in the next section. Section 3 describes in detail how the economic impacts of climate change on the agricultural sector and the impact of sea level rise are implemented in DART. Se...

  11. Modelling, simulation and applications of longitudinal train dynamics

    Science.gov (United States)

    Cole, Colin; Spiryagin, Maksym; Wu, Qing; Sun, Yan Quan

    2017-10-01

    Significant developments in longitudinal train simulation are first presented, together with an overview of approaches to train models and to modelling vehicle force inputs. The most important modelling task, that of the wagon connection (consisting of energy absorption devices such as draft gears and buffers, draw gear stiffness, coupler slack and structural stiffness), is then presented. Detailed attention is given to the modelling approaches for friction-wedge-damped and polymer draft gears. A significant issue in longitudinal train dynamics is the modelling and calculation of the input forces, the co-dimensional problem. The need to push traction performance higher has led to research into, and improvement of, the accuracy of traction modelling, which is discussed. A co-simulation method that combines longitudinal train simulation, locomotive traction control and locomotive vehicle dynamics is presented. The modelling of other forces, including braking, propulsion resistance, curve drag and grade forces, is also discussed. As extensions to conventional longitudinal train dynamics, lateral forces and coupler impacts are examined with regard to their interaction with wagon lateral and vertical dynamics. Various applications of longitudinal train dynamics are then presented. As an alternative to the traditional single-wagon-mass approach to longitudinal train dynamics, an example incorporating fully detailed wagon dynamics is presented for a crash analysis problem. Further applications to starting traction, air braking, distributed power, energy analysis and tippler operation are also presented.
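
    At its simplest, the wagon-connection modelling described here reduces to point masses joined by spring-damper couplers. The following sketch (semi-implicit Euler integration, with made-up parameters far simpler than the friction-wedge and polymer draft-gear models the record discusses) illustrates the structure of such a simulation.

    ```python
    import numpy as np

    # Minimal longitudinal train sketch: n wagons as point masses joined by
    # linear spring-damper couplers. Parameters are hypothetical.
    n, m = 5, 80_000.0        # number of wagons, mass per wagon [kg]
    k, c = 5e6, 5e4           # coupler stiffness [N/m] and damping [N s/m]
    F_loco = 200_000.0        # constant traction force on the lead wagon [N]

    x = np.zeros(n)           # positions [m]
    v = np.zeros(n)           # velocities [m/s]
    dt = 1e-3
    for _ in range(10_000):   # simulate 10 s
        f = np.zeros(n)
        f[0] += F_loco
        for i in range(n - 1):  # in-train coupler force between wagons i and i+1
            fc = k * (x[i] - x[i + 1]) + c * (v[i] - v[i + 1])
            f[i] -= fc
            f[i + 1] += fc
        v += (f / m) * dt     # semi-implicit Euler: update velocity first
        x += v * dt
    ```

    Because the coupler forces are internal, total momentum grows only with the traction force, so after 10 s the mean train speed is F_loco * t / (n * m) = 5 m/s; the in-train forces `fc` are what the detailed draft-gear models refine.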

  12. Microwave propagation and remote sensing atmospheric influences with models and applications

    CERN Document Server

    Karmakar, Pranab Kumar

    2011-01-01

    Because prevailing atmospheric/troposcopic conditions greatly influence radio wave propagation above 10 GHz, the unguided propagation of microwaves in the neutral atmosphere can directly impact many vital applications in science and engineering. These include transmission of intelligence, and radar and radiometric applications used to probe the atmosphere, among others. Where most books address either one or the other, Microwave Propagation and Remote Sensing: Atmospheric Influences with Models and Applications melds coverage of these two subjects to help readers develop solutions to the problems they present. This reference offers a brief, elementary account of microwave propagation through the atmosphere and discusses radiometric applications in the microwave band used to characterize and model atmospheric constituents, which is also known as remote sensing. Summarizing the latest research results in the field, as well as radiometric models and measurement methods, this book covers topics including: Free sp...

  13. Constrained statistical inference : sample-size tables for ANOVA and regression

    NARCIS (Netherlands)

    Vanbrabant, Leonard; Van De Schoot, Rens; Rosseel, Yves

    2015-01-01

    Researchers in the social and behavioral sciences often have clear expectations about the order/direction of the parameters in their statistical model. For example, a researcher might expect that regression coefficient β1 is larger than β2 and β3. The corresponding hypothesis is H: β1 > {β2, β3} and

  14. Semantic Information Modeling for Emerging Applications in Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi; Natarajan, Sreedhar; Simmhan, Yogesh; Prasanna, Viktor

    2012-04-16

    Smart Grid modernizes the power grid by integrating digital and information technologies. Millions of smart meters, intelligent appliances and communication infrastructures are under deployment, allowing advanced IT applications to be developed to secure and manage power grid operations. Demand response (DR) is one such emerging application, which optimizes electricity demand by curtailing/shifting power load when peak load occurs. Existing DR approaches are mostly based on static plans such as pricing policies and load shedding schedules. However, improvements to power management applications rely on data emanating from existing and new information sources as the Smart Grid information space grows. In particular, dynamic DR algorithms depend on information from smart meters that report interval-based power consumption measurements, HVAC systems that monitor building heat and humidity, and even weather forecast services. In order for emerging Smart Grid applications to take advantage of this diverse data influx, extensible information integration is required. In this paper, we develop an integrated Smart Grid information model using Semantic Web techniques and present case studies of using semantic information for dynamic DR. We show that the semantic model facilitates information integration and knowledge representation for developing the next generation of Smart Grid applications.

  15. Wave model downscaling for coastal applications

    Science.gov (United States)

    Valchev, Nikolay; Davidan, Georgi; Trifonova, Ekaterina; Andreeva, Nataliya

    2010-05-01

    Downscaling is a suitable technique for obtaining high-resolution estimates from relatively coarse-resolution global models. Dynamical and statistical downscaling have been applied to multidecadal simulations of ocean waves. Even though large-scale variability can be plausibly estimated from these simulations, their value for small-scale applications such as the design of coastal protection structures and coastal risk assessment is limited due to their relatively coarse spatial and temporal resolutions. Another advantage of high-resolution wave modeling is that it accounts for shallow water effects. Therefore, it can be used both for wave forecasting at specific coastal locations and for engineering applications that require knowledge of extreme wave statistics at or near coastal facilities. In the present study, downscaling is applied to both the ECMWF and the NCEP/NCAR global reanalyses of atmospheric pressure over the Black Sea at 2.5 degrees spatial resolution. A simplified regional atmospheric model is employed to calculate the surface wind field at 0.5 degrees resolution, which serves as forcing for the wave models. Further, a high-resolution suite of nested WAM/SWAN wave models is applied for spatial downscaling. It aims at resolving the wave conditions in a limited area in close proximity to the shore. The pilot site is located in the northern part of the Bulgarian Black Sea shore. The system involves the WAM wave model adapted for basin-scale simulation at 0.5 degrees spatial resolution. The WAM output for significant wave height, mean wave period and mean angle of wave approach is used as external boundary conditions for the SWAN wave model, which is set up for the western Black Sea shelf at 4 km resolution. The same model, set up at about 400 m resolution, is nested to the first SWAN run. In this case, the SWAN 2D spectral output provides boundary conditions for the high-resolution model run. 
The models are implemented for a

  16. The microcomputer scientific software series 2: general linear model--regression.

    Science.gov (United States)

    Harold M. Rauscher

    1983-01-01

    The general linear model regression (GLMR) program provides the microcomputer user with a sophisticated regression analysis capability. The output provides a regression ANOVA table, estimators of the regression model coefficients, their confidence intervals, confidence intervals around the predicted Y-values, residuals for plotting, a check for multicollinearity, a...
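The core of such a regression ANOVA capability can be sketched with plain numpy: fit by least squares, then split the total sum of squares into regression and residual parts to form the overall F statistic. This is a generic illustration of a regression ANOVA table, not the GLMR program itself; the function name and test data are invented for the example.

```python
import numpy as np

def ols_anova(X, y):
    """Fit y = X b by least squares and return an ANOVA-style summary:
    coefficient estimates, regression and residual sums of squares,
    and the overall F statistic of the fitted model."""
    X = np.column_stack([np.ones(len(y)), X])    # prepend intercept column
    b, *_ = np.linalg.lstsq(X, y, rcond=None)    # least-squares coefficients
    yhat = X @ b
    ss_res = float(np.sum((y - yhat) ** 2))          # residual SS
    ss_reg = float(np.sum((yhat - y.mean()) ** 2))   # regression SS
    p = X.shape[1] - 1                               # predictors (no intercept)
    df_res = len(y) - X.shape[1]                     # residual df
    f_stat = (ss_reg / p) / (ss_res / df_res)        # F = MSR / MSE
    return b, ss_reg, ss_res, f_stat

# Simulated data: y = 1 + 2*x1 - 0.5*x2 + small noise
rng = np.random.default_rng(0)
x = rng.normal(size=(50, 2))
y = 1.0 + 2.0 * x[:, 0] - 0.5 * x[:, 1] + rng.normal(scale=0.1, size=50)
b, ss_reg, ss_res, f_stat = ols_anova(x, y)
```

With low noise the fitted coefficients recover the true values closely and the F statistic is large, exactly the pattern a regression ANOVA table summarizes.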

  17. MaMR: High-performance MapReduce programming model for material cloud applications

    Science.gov (United States)

    Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng

    2017-02-01

    With increasing data sizes in materials science, existing programming models no longer satisfy application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related data sets, and its processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined a programming model for material cloud applications, called MaMR, that supports multiple different Map and Reduce functions running concurrently, based on a hybrid shared-memory BSP model. An optimized data sharing strategy to supply shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework achieve effective performance improvements compared to previous work.
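The map/shuffle/reduce pipeline plus an extra merge phase can be illustrated with a toy word-count sketch. This is a generic illustration of the idea, not the MaMR framework's actual API; all names here are invented.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal MapReduce: map each record to (key, value) pairs,
    shuffle the pairs by key, then reduce each key's value list."""
    shuffled = defaultdict(list)
    for rec in records:
        for k, v in mapper(rec):
            shuffled[k].append(v)
    return {k: reducer(vs) for k, vs in shuffled.items()}

def merge(out_a, out_b, combine):
    """Merge phase: combine the outputs of two independent reduce jobs,
    loosely mirroring the extra merge step described above."""
    keys = set(out_a) | set(out_b)
    return {k: combine(out_a.get(k, 0), out_b.get(k, 0)) for k in keys}

# Two word-count jobs run separately, then merged into one result.
words_a = map_reduce("a b a".split(), lambda w: [(w, 1)], sum)
words_b = map_reduce("b c".split(), lambda w: [(w, 1)], sum)
total = merge(words_a, words_b, lambda x, y: x + y)
```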

  18. Current developments in soil organic matter modeling and the expansion of model applications: a review

    International Nuclear Information System (INIS)

    Campbell, Eleanor E; Paustian, Keith

    2015-01-01

    Soil organic matter (SOM) is an important natural resource. It is fundamental to soil and ecosystem functions across a wide range of scales, from site-specific soil fertility and water holding capacity to global biogeochemical cycling. It is also a highly complex material that is sensitive to direct and indirect human impacts. In SOM research, simulation models play an important role by providing a mathematical framework to integrate, examine, and test the understanding of SOM dynamics. Simulation models of SOM are also increasingly used in more ‘applied’ settings to evaluate human impacts on ecosystem function, and to manage SOM for greenhouse gas mitigation, improved soil health, and sustainable use as a natural resource. Within this context, there is a need to maintain a robust connection between scientific developments in SOM modeling approaches and SOM model applications. This need forms the basis of this review. In this review we first provide an overview of SOM modeling, focusing on SOM theory, data-model integration, and model development as evidenced by a quantitative review of SOM literature. Second, we present the landscape of SOM model applications, focusing on examples in climate change policy. We conclude by discussing five areas of recent developments in SOM modeling including: (1) microbial roles in SOM stabilization; (2) modeling SOM saturation kinetics; (3) temperature controls on decomposition; (4) SOM dynamics in deep soil layers; and (5) SOM representation in earth system models. Our aim is to comprehensively connect SOM model development to its applications, revealing knowledge gaps in need of focused interdisciplinary attention and exposing pitfalls that, if avoided, can lead to best use of SOM models to support policy initiatives and sustainable land management solutions. (topical review)

  19. 12th Workshop on Stochastic Models, Statistics and Their Applications

    CERN Document Server

    Rafajłowicz, Ewaryst; Szajowski, Krzysztof

    2015-01-01

    This volume presents the latest advances and trends in stochastic models and related statistical procedures. Selected peer-reviewed contributions focus on statistical inference, quality control, change-point analysis and detection, empirical processes, time series analysis, survival analysis and reliability, statistics for stochastic processes, big data in technology and the sciences, statistical genetics, experiment design, and stochastic models in engineering. Stochastic models and related statistical procedures play an important part in furthering our understanding of the challenging problems currently arising in areas of application such as the natural sciences, information technology, engineering, image analysis, genetics, energy and finance, to name but a few. This collection arises from the 12th Workshop on Stochastic Models, Statistics and Their Applications, Wroclaw, Poland.

  20. FUNCTIONAL MODELLING FOR FAULT DIAGNOSIS AND ITS APPLICATION FOR NPP

    Directory of Open Access Journals (Sweden)

    MORTEN LIND

    2014-12-01

    The paper presents functional modelling and its application for diagnosis in nuclear power plants. Functional modelling is defined and its relevance for coping with the complexity of diagnosis in large-scale systems like nuclear plants is explained. The diagnosis task is analyzed, and it is demonstrated that the levels of abstraction in models for diagnosis must reflect plant knowledge about goals and functions, which is represented in functional modelling. Multilevel flow modelling (MFM), which is a method for functional modelling, is introduced briefly and illustrated with a cooling system example. The use of MFM for reasoning about causes and consequences is explained in detail and demonstrated using the reasoning tool, the MFMSuite. MFM applications in nuclear power systems are described by two examples: a PWR and an FBR reactor. The PWR example shows how MFM can be used to model and reason about operating modes. The FBR example illustrates how the modelling development effort can be managed by proper strategies, including decomposition and reuse.

  1. Functional Modelling for fault diagnosis and its application for NPP

    Energy Technology Data Exchange (ETDEWEB)

    Lind, Morten; Zhang, Xin Xin [Dept. of Electrical Engineering, Technical University of Denmark, Kongens Lyngby (Denmark)

    2014-12-15

    The paper presents functional modelling and its application for diagnosis in nuclear power plants. Functional modelling is defined and its relevance for coping with the complexity of diagnosis in large-scale systems like nuclear plants is explained. The diagnosis task is analyzed, and it is demonstrated that the levels of abstraction in models for diagnosis must reflect plant knowledge about goals and functions, which is represented in functional modelling. Multilevel flow modelling (MFM), which is a method for functional modelling, is introduced briefly and illustrated with a cooling system example. The use of MFM for reasoning about causes and consequences is explained in detail and demonstrated using the reasoning tool, the MFMSuite. MFM applications in nuclear power systems are described by two examples: a PWR and an FBR reactor. The PWR example shows how MFM can be used to model and reason about operating modes. The FBR example illustrates how the modelling development effort can be managed by proper strategies, including decomposition and reuse.

  2. Functional Modelling for fault diagnosis and its application for NPP

    International Nuclear Information System (INIS)

    Lind, Morten; Zhang, Xin Xin

    2014-01-01

    The paper presents functional modelling and its application for diagnosis in nuclear power plants. Functional modelling is defined and its relevance for coping with the complexity of diagnosis in large-scale systems like nuclear plants is explained. The diagnosis task is analyzed, and it is demonstrated that the levels of abstraction in models for diagnosis must reflect plant knowledge about goals and functions, which is represented in functional modelling. Multilevel flow modelling (MFM), which is a method for functional modelling, is introduced briefly and illustrated with a cooling system example. The use of MFM for reasoning about causes and consequences is explained in detail and demonstrated using the reasoning tool, the MFMSuite. MFM applications in nuclear power systems are described by two examples: a PWR and an FBR reactor. The PWR example shows how MFM can be used to model and reason about operating modes. The FBR example illustrates how the modelling development effort can be managed by proper strategies, including decomposition and reuse.

  3. Top-Down Enterprise Application Integration with Reference Models

    Directory of Open Access Journals (Sweden)

    Willem-Jan van den Heuvel

    2000-11-01

    For Enterprise Resource Planning (ERP) systems such as SAP R/3 or IBM SanFrancisco, the tailoring of reference models for customizing the ERP systems to specific organizational contexts is an established approach. In this paper, we present a methodology that uses such reference models as a starting point for a top-down integration of enterprise applications. The re-engineered models of legacy systems are individually linked via cross-mapping specifications to the forward-engineered reference model's specification. The actual linking of reference and legacy models is done with a methodology for connecting (new) business objects with (old) legacy systems.

  4. Reliable real-time applications - and how to use tests to model and understand

    DEFF Research Database (Denmark)

    Jensen, Peter Krogsgaard

    Test and analysis of real-time applications, where temporal properties are inspected, analyzed, and verified in a model developed from timed traces originating from measured test results on a running application...

  5. Supply chain management models, applications, and research directions

    CERN Document Server

    Pardalos, Panos; Romeijn, H

    2005-01-01

    This work brings together some of the most up-to-date research in the application of operations research and mathematical modeling techniques to problems arising in supply chain management and e-Commerce. While research in the broad area of supply chain management encompasses a wide range of topics and methodologies, we believe this book provides a good snapshot of current quantitative modeling approaches, issues, and trends within the field. Each chapter is a self-contained study of a timely and relevant research problem in supply chain management. The individual works place a heavy emphasis on the application of modeling techniques to real world management problems. In many instances, the actual results from applying these techniques in practice are highlighted. In addition, each chapter provides important managerial insights that apply to general supply chain management practice. The book is divided into three parts. The first part contains chapters that address the new and rapidly growing role of the inte...

  6. Linear and Generalized Linear Mixed Models and Their Applications

    CERN Document Server

    Jiang, Jiming

    2007-01-01

    This book covers two major classes of mixed effects models, linear mixed models and generalized linear mixed models, and presents an up-to-date account of theory and methods for the analysis of these models as well as their applications in various fields. The book offers a systematic approach to inference about non-Gaussian linear mixed models. Furthermore, it includes recently developed methods, such as mixed model diagnostics, mixed model selection, and the jackknife method in the context of mixed models. The book is aimed at students, researchers and other practitioners who are interested

  7. Mathematical and numerical foundations of turbulence models and applications

    CERN Document Server

    Chacón Rebollo, Tomás

    2014-01-01

    With applications to climate, technology, and industry, the modeling and numerical simulation of turbulent flows are rich with history and modern relevance. The complexity of the problems that arise in the study of turbulence requires tools from various scientific disciplines, including mathematics, physics, engineering, and computer science. Authored by two experts in the area with a long history of collaboration, this monograph provides a current, detailed look at several turbulence models from both the theoretical and numerical perspectives. The k-epsilon, large-eddy simulation, and other models are rigorously derived and their performance is analyzed using benchmark simulations for real-world turbulent flows. Mathematical and Numerical Foundations of Turbulence Models and Applications is an ideal reference for students in applied mathematics and engineering, as well as researchers in mathematical and numerical fluid dynamics. It is also a valuable resource for advanced graduate students in fluid dynamics,...

  8. Environmental Impacts of Large Scale Biochar Application Through Spatial Modeling

    Science.gov (United States)

    Huber, I.; Archontoulis, S.

    2017-12-01

    In an effort to study the environmental (emissions, soil quality) and production (yield) impacts of biochar application at regional scales, we coupled the APSIM-Biochar model with the pSIMS parallel platform. So far the majority of biochar research has concentrated on lab-to-field studies to advance scientific knowledge; regional-scale assessments are highly needed to assist decision making. The overall objective of this simulation study was to identify areas in the USA that benefit most environmentally from biochar application, as well as areas for which our model predicts a notable yield increase due to the addition of biochar. We present the modifications in both the APSIM biochar and pSIMS components that were necessary to facilitate these large-scale model runs across several regions in the United States at a resolution of 5 arcminutes. This study uses the AgMERRA global climate data set (1980-2010) and the Global Soil Dataset for Earth Systems modeling as a basis for creating its simulations, as well as local management operations for maize and soybean cropping systems and different biochar application rates. The regional-scale simulation analysis is in progress. Preliminary results showed that the model predicts that high-quality soils (particularly those common to Iowa cropping systems) do not receive much, if any, production benefit from biochar. However, soils with low soil organic matter (below 0.5%) do get a noteworthy yield increase of around 5-10% in the best cases. We also found N2O emissions to be spatially and temporally specific; they increase in some areas and decrease in others due to biochar application. In contrast, we found increases in soil organic carbon and plant available water in all soils (top 30 cm) due to biochar application. The magnitude of these increases (% change from the control) was larger in soils with low organic matter (below 1.5%) and smaller in soils with high organic matter (above 3%), and also dependent on biochar

  9. Functional Median Polish

    KAUST Repository

    Sun, Ying; Genton, Marc G.

    2012-01-01

    polish is demonstrated by comparing its performance with the traditional functional ANOVA fitted by means under different outlier models in simulation studies. The functional median polish is illustrated on various applications in climate science
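Tukey's median polish, the building block behind the functional version, can be sketched in a few lines of numpy: alternately sweep row and column medians out of a two-way table, accumulating them as effects, so that outliers are left in the residuals rather than distorting the fit as they would under a mean-based ANOVA. A minimal sketch (the functional extension operates on curves rather than scalars):

```python
import numpy as np

def median_polish(table, n_iter=10):
    """Tukey's median polish: decompose a two-way table into
    overall + row effects + column effects + residuals using medians."""
    resid = np.asarray(table, dtype=float).copy()
    overall = 0.0
    row = np.zeros(resid.shape[0])
    col = np.zeros(resid.shape[1])
    for _ in range(n_iter):
        rmed = np.median(resid, axis=1)     # sweep row medians into row effects
        row += rmed
        resid -= rmed[:, None]
        shift = np.median(row)              # move common part into the overall term
        overall += shift
        row -= shift
        cmed = np.median(resid, axis=0)     # sweep column medians into column effects
        col += cmed
        resid -= cmed[None, :]
        shift = np.median(col)
        overall += shift
        col -= shift
    return overall, row, col, resid

# Additive 3x3 table with one large outlier at (0, 0).
t = np.add.outer([0.0, 2.0, 4.0], [1.0, 3.0, 5.0]) + 10.0
t[0, 0] += 100.0
overall, row, col, resid = median_polish(t)
```

The decomposition is exact (overall + row + column + residual reconstructs the table), and the outlier ends up almost entirely in the residual cell, illustrating the robustness the median polish offers over a mean-based fit.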

  10. Continuum-Kinetic Models and Numerical Methods for Multiphase Applications

    Science.gov (United States)

    Nault, Isaac Michael

    This thesis presents a continuum-kinetic approach for modeling general problems in multiphase solid mechanics. In this context, a continuum model refers to any model, typically on the macro-scale, in which continuous state variables are used to capture the most important physics: conservation of mass, momentum, and energy. A kinetic model refers to any model, typically on the meso-scale, which captures the statistical motion and evolution of microscopic entities. Multiphase phenomena usually involve non-negligible micro- or meso-scopic effects at the interfaces between phases. The approach developed in the thesis attempts to combine the computational performance benefits of a continuum model with the physical accuracy of a kinetic model when applied to a multiphase problem. The approach is applied to modeling a single particle impact in Cold Spray, an engineering process that intimately involves the interaction of crystal grains with high-magnitude elastic waves. Such a situation could be classified as a multiphase application due to the discrete nature of grains on the spatial scale of the problem. For this application, a hyper-elastoplastic model is solved by a finite volume method with an approximate Riemann solver. The results of this model are compared for two types of plastic closure: a phenomenological macro-scale constitutive law, and a physics-based meso-scale Crystal Plasticity model.

  11. Wound healing modulators in a tracheoplasty canine model.

    Science.gov (United States)

    Olmos-Zúñiga, J Raúl; Hernández-Jiménez, Claudia; Díaz-Martínez, Emmanuel; Jasso-Victoria, Rogelio; Sotres-Vega, Avelina; Gaxiola-Gaxiola, Miguel O; Villalba-Caloca, Jaime; Baltazares-Lipp, Matilde; Santillán-Doherty, Patricio; Santibáñez-Salgado, J Alfredo

    2007-01-01

    Postsurgical tracheal stenosis results from fibrosis formation due to ischemia. There are healing modulators, hyaluronic acid (HA) and collagen polyvinylpyrrolidone (CPVP), which reduce collagen fiber formation. Thus we can hypothesize that the topical application of one of these modulators can diminish postsurgical tracheal scarring and stenosis. The aim of this work was to evaluate the macroscopic, microscopic, and biochemical changes of tracheal healing after the application of HA or CPVP in a canine tracheoplasty model. The study design was a prospective experimental investigation in a canine model. Eighteen mongrel dogs underwent resection of three cervical tracheal rings and end-to-end anastomosis. They were randomized into three groups according to treatment: group I (control group) (n = 6), topical application of saline solution on the tracheal anastomosis; group II (n = 6), topical application of 15 microg HA on the tracheal anastomosis; and group III (n = 6), topical application of 2.5 mg CPVP on the tracheal anastomosis. They were evaluated clinically, radiologically, and tracheoscopically for 4 weeks and euthanized at the end of the study period. Macroscopic, microscopic, and biochemical changes of tracheal anastomosis healing were analyzed. Collagen formation was quantified by the Woessner method. All the animals survived the surgical procedure and study period. Macroscopic, radiologic, and endoscopic studies showed that animals in group I developed tracheal stenosis, inflammation, and firm fibrous tissue formation, and histological studies also showed severe inflammatory reaction and fibrosis formation. Groups II (HA) and III (CPVP) showed well-organized thin collagen fibers with minimal inflammatory response. Biochemical evaluation revealed a higher collagen concentration in group I animals (analysis of variance [ANOVA], p < 0.05). Topical application of the modulators on the anastomosis diminished the degree of stenosis and inflammatory reaction. Both modulators improved tracheal healing.

  12. Design and implementation of space physics multi-model application integration based on web

    Science.gov (United States)

    Jiang, Wenping; Zou, Ziming

    With the development of research on the space environment and space science, providing an online network computing environment for space weather, space environment, and space physics models to the Chinese scientific community has become increasingly important in recent years. Currently, there are two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The traditional, stand-alone C/S mode demands that a team or workshop drawn from many disciplines and specialties build its own multi-model application integrated system, and requires the client to be deployed in different physical regions when users visit the integrated system. This requirement brings two shortcomings: it reduces the efficiency of researchers who use the models to compute, and it makes accessing the data inconvenient. Therefore, it is necessary to create a shared network resource access environment that lets users reach the computing resources of space physics models quickly through a terminal, for conducting space science research and forecasting the space environment. The SPMAIS develops high-performance, first-principles computational models of the space environment in B/S mode and uses these models to predict "space weather", to understand space mission data, and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven online model operating environment. Up to now, the SPMAIS has contained dozens of space environment models, including the international AP8/AE8, IGRF, and T96 models, as well as a solar proton prediction model, a geomagnetic transmission model, etc., developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets, which offer input data for online high-speed model computation. In this paper, the service-oriented architecture (SOA) concept that divides the system into

  13. Development and application of degradation modeling to define maintenance practices

    International Nuclear Information System (INIS)

    Stock, D.; Samanta, P.; Vesely, W.

    1994-06-01

    This report presents the development and application of component degradation modeling to analyze degradation effects on reliability and to identify aspects of maintenance practices that mitigate degradation and aging effects. Using continuous time Markov approaches, a component degradation model is discussed that includes information about degradation and maintenance. The component model commonly used in probabilistic risk assessments is a simple case of this general model. The parameters used in the general model have engineering interpretations and can be estimated using data and engineering experience. The generation of equations for specific models, the solution of these equations, and a methodology for estimating the needed parameters are all discussed. Applications in this report show how these models can be used to quantitatively assess the benefits that are expected from maintaining a component, the effects of different maintenance efficiencies, the merits of different maintenance policies, and the interaction of surveillance test intervals with maintenance practices
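A minimal version of such a continuous-time Markov degradation model can be sketched with numpy: a three-state chain (good, degraded, failed) whose generator includes both degradation rates and a maintenance (repair) rate, integrated forward with small Euler steps of the Kolmogorov equation. The states, rates, and step counts below are invented for illustration and are far simpler than the report's general model:

```python
import numpy as np

# States: 0 = good, 1 = degraded, 2 = failed (absorbing here).
def state_probs(t, lam_d=0.01, lam_f=0.005, mu=0.1, steps=20000):
    """Transient state probabilities of a 3-state degradation CTMC.
    lam_d: good -> degraded rate; lam_f: degraded -> failed rate;
    mu: maintenance (repair) rate, degraded -> good.
    Integrates dp/dt = p Q with small forward-Euler steps."""
    Q = np.array([
        [-lam_d,         lam_d,   0.0],
        [    mu, -(mu + lam_f), lam_f],
        [   0.0,           0.0,   0.0],   # failure is absorbing
    ])
    p = np.array([1.0, 0.0, 0.0])          # start in the good state
    dt = t / steps
    for _ in range(steps):
        p = p + dt * (p @ Q)               # Kolmogorov forward equation
    return p

p_maint = state_probs(1000.0, mu=0.1)      # with maintenance
p_none = state_probs(1000.0, mu=0.0)       # no maintenance
```

Comparing the failure-state probability with and without the repair rate quantifies the benefit expected from maintaining the component, which is the kind of assessment the report's models support.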

  14. Health Promoting Self-Care Behaviors and Its Related Factors in Elderly: Application of Health Belief Model

    Directory of Open Access Journals (Sweden)

    Mojtaba Azadbakht

    2014-09-01

    Introduction: Health beliefs significantly affect health-promoting self-care behaviors. The most important model designed based on health beliefs is the Health Belief Model. This study examined the association of Health Belief Model constructs and demographic factors with behaviors in the elderly. Materials and Methods: This descriptive-analytical study was performed on 465 elders referring to Tehran's cultural centers, recruited with a multi-stage sampling method. Study instruments were questionnaires regarding demographic information, health beliefs, self-efficacy, and health-promoting self-care behaviors. Data analysis was performed using SPSS-22 software with the independent t-test, one-way ANOVA, Pearson correlation, and multiple linear regression. Results: The mean (±SD) age of subjects was 68.24±6.12 years and the mean general self-care score was 1.79±0.36. Gender (P=0.011), economy (P<0.001), education level (P<0.001), and age (P=0.008) were significantly associated with self-care behaviors. Regression analysis showed that perceived barriers, self-efficacy, and perceived severity were determinants of behavior (P<0.001). Conclusion: According to the results of this study, it is essential to pay special attention to self-efficacy, perceived severity, and perceived barriers when designing health education for the elderly.
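The one-way ANOVA used in such analyses reduces to a ratio of between-group to within-group mean squares. A minimal numpy sketch (the group data here are simulated for illustration, not the study's actual scores):

```python
import numpy as np

def one_way_anova(groups):
    """One-way ANOVA F statistic for k independent groups:
    F = (SS_between / df_between) / (SS_within / df_within)."""
    all_y = np.concatenate(groups)
    grand = all_y.mean()
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((g - np.mean(g)) ** 2).sum() for g in groups)
    df_b = len(groups) - 1
    df_w = len(all_y) - len(groups)
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

# Hypothetical self-care scores for three education-level groups.
rng = np.random.default_rng(1)
g1 = rng.normal(1.6, 0.3, 40)
g2 = rng.normal(1.8, 0.3, 40)
g3 = rng.normal(2.0, 0.3, 40)
f, df_b, df_w = one_way_anova([g1, g2, g3])
```

With group means that genuinely differ, the F statistic exceeds the critical value for (2, 117) degrees of freedom, which is what a reported significant association between education level and self-care score corresponds to.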

  15. Application of photometric models to asteroids

    International Nuclear Information System (INIS)

    Bowell, E.; Dominque, D.; Hapke, B.

    1989-01-01

    The way an asteroid or other atmosphereless solar system body varies in brightness in response to changing illumination and viewing geometry depends in a very complicated way on the physical and optical properties of its surface and on its overall shape. The authors summarize the formulation and application of recent photometric models by Hapke and by Lumme and Bowell. In both models, the brightness of a rough and porous surface is parametrized in terms of the optical properties of individual particles, shadowing between particles, and the way in which light is scattered among collections of particles. Both models succeed in their goal of fitting the observed photometric behavior of a wide variety of bodies, but neither has led to a very complete understanding of the properties of asteroid regoliths, primarily because in most cases the parameters in the present models cannot be adequately constrained by observations of integral brightness alone over a restricted range of phase angles.

  16. 2D modelling and its applications in engineering

    International Nuclear Information System (INIS)

    Altinbalik, M. Tahir; İRSEL, Gürkan

    2013-01-01

    A model, in computer aided engineering applications, may be created using either a two-dimensional or a three-dimensional design, depending on the purpose of the design. What matters most in this regard is the selection of the right method to meet system solution requirements in the most economical way. Manufacturability of a design that is developed by utilising computer aided engineering is important, but the usability in production of the data obtained in the course of design work is equally important. In applications involving production operations such as CNC or plasma cutting, two-dimensional designs can be used directly in production. These machines are equipped with interfaces that convert two-dimensional drawings into machine codes. In this way, a design can be transferred directly to production, and any adjustments during the production process can be evaluated synchronously. As a result, investment expenses will be lowered and costs reduced to some extent. In the presented study, we examined two-dimensional design applications and requirements. We created a two-dimensional design for a part for which a three-dimensional model had previously been generated; we then transferred this design to a plasma cutting machine, and the operation was realized experimentally. Key words: Plasma Cutting, 2D modelling, flexibility

  17. 2D modelling and its applications in engineering

    Energy Technology Data Exchange (ETDEWEB)

    Altinbalik, M. Tahir; İRSEL, Gürkan [Trakya University, Faculty of Engineering and Architecture Mechanical Engineering Department, Edİrne (Turkey)

    2013-07-01

    A model, in computer aided engineering applications, may be created using either a two-dimensional or a three-dimensional design, depending on the purpose of the design. What matters most in this regard is the selection of the right method to meet system solution requirements in the most economical way. Manufacturability of a design that is developed by utilising computer aided engineering is important, but the usability in production of the data obtained in the course of design work is equally important. In applications involving production operations such as CNC or plasma cutting, two-dimensional designs can be used directly in production. These machines are equipped with interfaces that convert two-dimensional drawings into machine codes. In this way, a design can be transferred directly to production, and any adjustments during the production process can be evaluated synchronously. As a result, investment expenses will be lowered and costs reduced to some extent. In the presented study, we examined two-dimensional design applications and requirements. We created a two-dimensional design for a part for which a three-dimensional model had previously been generated; we then transferred this design to a plasma cutting machine, and the operation was realized experimentally. Key words: Plasma Cutting, 2D modelling, flexibility.

  18. Models for dryout in debris beds. Review and application to the analysis of PAHR

    International Nuclear Information System (INIS)

    Yamakoshi, Yoshinori

    2000-03-01

    There are many models for dryout in debris beds, and various conditions under which these models are applicable. For a reliable analysis of post-accident heat removal (PAHR), it is important that the characteristics and applicability of each model be made clear. In this report, the formulation of the dryout models and their applicability are studied through comparison with experimental data. A new model for dryout prediction is also discussed. It is difficult to predict the dryout power, especially for a relatively shallow bed, using a conventional model for channeled beds. The new model, which is based on the one-dimensional model derived by Lipinski, includes the permeability of channels in the governing equation and enables prediction of the dryout power for relatively shallow beds. The following conclusions are derived from comparing the predicted dryout power with experimental data. The model for series heat removal is applicable to a packed bed, while DEBRIS-MD underestimates the dryout power for it. Either the original model, assuming channel formation on the top of the bed, or the modified model is applicable to a relatively deep bed with channels. For a relatively shallow bed with channels, the dryout power predicted by the modified model agrees with the experimental data better than that of the other models. (author)

  19. Genetic demographic networks: Mathematical model and applications.

    Science.gov (United States)

    Kimmel, Marek; Wojdyła, Tomasz

    2016-10-01

    Recent improvements in the quality of genetic data obtained from extinct human populations and their ancestors encourage searching for answers to basic questions regarding human population history. The most common and successful are model-based approaches, in which genetic data are compared to data obtained from an assumed demographic model. Using such an approach, it is possible to either validate or adjust the assumed demography. Model fit to data can be obtained based on reverse-time coalescent simulations or forward-time simulations. In this paper we introduce a computational method, based on a mathematical equation, that allows obtaining joint distributions of pairs of individuals under a specified demographic model, each individual characterized by a genetic variant at a chosen locus. The two individuals are randomly sampled from either the same or two different populations. The model assumes three types of demographic events (split, merge, and migration). Populations evolve according to the time-continuous Moran model with drift and Markov-process mutation. The latter process is described by the Lyapunov-type equation introduced by O'Brien and generalized in our previous works. Application of this equation constitutes an original contribution. In the results section of the paper we present sample applications of our model to both simulated and literature-based demographies. Among others, we include a study of the Slavs-Balts-Finns genetic relationship, in which we model splits and migrations between the Balts and Slavs. We also include another example that involves the migration rates between farmers and hunter-gatherers, based on modern and ancient DNA samples. The latter process was previously studied using coalescent simulations. Our results are in general agreement with the previous method, which provides validation of our approach. Although our model is not an alternative to simulation methods in the practical sense, it provides an algorithm to compute pairwise
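The Moran model mentioned above is straightforward to simulate directly in its discrete-event form: at each event one individual reproduces (with a small mutation probability applied to the offspring) and one individual dies. A minimal sketch with invented parameters, for a single two-allele population; it illustrates genetic drift, not the paper's equation-based method:

```python
import numpy as np

def moran_step(k, n, u, rng):
    """One Moran event for a two-allele population of size n holding
    k copies of allele A: sample one birth (mutating with prob u)
    and one death, both proportional to current allele frequency."""
    p = k / n
    birth_is_a = rng.random() < p
    if rng.random() < u:            # symmetric mutation on the offspring
        birth_is_a = not birth_is_a
    death_is_a = rng.random() < p
    return k + int(birth_is_a) - int(death_is_a)

# Drift-only run (no mutation): the allele count performs a bounded
# random walk between 0 and n, the two absorbing states.
rng = np.random.default_rng(42)
n, k, u = 100, 50, 0.0
traj = [k]
for _ in range(20000):
    k = moran_step(k, n, u, rng)
    traj.append(k)
```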

  20. PRACTICAL APPLICATION OF A MODEL FOR ASSESSING

    Directory of Open Access Journals (Sweden)

    Petr NOVOTNÝ

    2015-12-01

    Full Text Available Rail transport is an important sub-sector of transport infrastructure. Disruption of its operation due to emergencies can result in a reduction in the functional parameters of provided services, with consequent impacts on society. Identification of the critical elements of this system enables its timely and effective protection. On these grounds, the article presents a draft model for assessing the criticality of railway infrastructure elements. This model uses a systems approach and a multicriteria semi-quantitative analysis with weighted criteria for calculating the criticality of individual elements of the railway infrastructure. In the conclusion, it presents a practical application of the proposed model, including a discussion of the results.

  1. Review of Development Survey of Phase Change Material Models in Building Applications

    Directory of Open Access Journals (Sweden)

    Hussein J. Akeiber

    2014-01-01

    Full Text Available The application of phase change materials (PCMs) in green buildings has been increasing rapidly. PCM applications in green buildings include several development models. This paper briefly surveys the recent research and development activities of PCM technology in building applications. Firstly, a basic description of phase change materials and their principles is provided; the classification and applications of PCMs are also included. Secondly, PCM models in buildings are reviewed and discussed according to the wall, roof, floor, and cooling systems. Finally, conclusions are presented based on the collected data.

  2. A modified elastic foundation contact model for application in 3D models of the prosthetic knee.

    Science.gov (United States)

    Pérez-González, Antonio; Fenollosa-Esteve, Carlos; Sancho-Bru, Joaquín L; Sánchez-Marín, Francisco T; Vergara, Margarita; Rodríguez-Cervantes, Pablo J

    2008-04-01

    Different models have been used in the literature for the simulation of surface contact in biomechanical knee models. However, there is a lack of systematic comparisons of these models applied to the simulation of a common case, which will provide relevant information about their accuracy and suitability for application in models of the implanted knee. In this work a comparison of the Hertz model (HM), the elastic foundation model (EFM) and the finite element model (FEM) for the simulation of the elastic contact in a 3D model of the prosthetic knee is presented. From the results of this comparison it is found that although the nature of the EFM offers advantages when compared with that of the HM for its application to realistic prosthetic surfaces, and when compared with the FEM in CPU time, its predictions can differ from FEM in some circumstances. These differences are considerable if the comparison is performed for prescribed displacements, although they are less important for prescribed loads. To solve these problems a new modified elastic foundation model (mEFM) is proposed that maintains basically the simplicity of the original model while producing much more accurate results. In this paper it is shown that this new mEFM calculates pressure distribution and contact area with accuracy and short computation times for toroidal contacting surfaces. Although further work is needed to confirm its validity for more complex geometries the mEFM is envisaged as a good option for application in 3D knee models to predict prosthetic knee performance.
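
    The elastic foundation model compared above treats the contact layer as a bed of independent springs, so that local pressure is proportional to local penetration. The following minimal sketch illustrates that idea for a cylinder-on-plane contact; all parameter values are illustrative and nothing here reproduces the paper's mEFM.

    ```python
    import numpy as np

    # Elastic foundation ("bed of springs") contact: each point of the layer acts
    # as an independent spring, so pressure is proportional to local penetration.
    k = 2.0e9                                  # illustrative layer stiffness E/h, Pa/m
    R = 0.05                                   # indenter radius, m
    delta = 1e-4                               # prescribed approach, m

    x = np.linspace(-5e-3, 5e-3, 201)          # coordinate across the contact strip, m
    penetration = np.maximum(delta - x**2 / (2 * R), 0.0)  # parabolic gap approximation
    pressure = k * penetration                 # EFM: p(x) = (E/h) * w(x)

    half_width = np.sqrt(2 * R * delta)        # where the penetration falls to zero
    force = pressure.sum() * (x[1] - x[0])     # load per unit length, N/m (rectangle rule)
    ```

    Because each point is decoupled, no system of equations has to be solved; this is what makes the EFM (and the modified EFM) cheap enough to evaluate inside a 3D knee model, unlike a finite element solution.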

  3. Identifying the Factors That Influence Change in SEBD Using Logistic Regression Analysis

    Science.gov (United States)

    Camilleri, Liberato; Cefai, Carmel

    2013-01-01

    Multiple linear regression and ANOVA models are widely used in applications since they provide effective statistical tools for assessing the relationship between a continuous dependent variable and several predictors. However these models rely heavily on linearity and normality assumptions and they do not accommodate categorical dependent…
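
    As the abstract notes, a categorical (e.g., binary) dependent variable calls for logistic rather than linear regression. A generic sketch on simulated data — this is not the paper's analysis, just a minimal gradient-descent logistic fit:

    ```python
    import numpy as np

    def fit_logistic(X, y, lr=0.1, n_iter=5000):
        """Fit logistic regression by gradient ascent on the log-likelihood
        (illustrative; a statistics package would be used in practice)."""
        X = np.column_stack([np.ones(len(X)), X])   # add intercept column
        w = np.zeros(X.shape[1])
        for _ in range(n_iter):
            p = 1.0 / (1.0 + np.exp(-X @ w))        # predicted probabilities
            w += lr * X.T @ (y - p) / len(y)        # score (gradient) step
        return w

    def predict_proba(w, X):
        X = np.column_stack([np.ones(len(X)), X])
        return 1.0 / (1.0 + np.exp(-X @ w))

    # Toy data: a binary outcome driven by one continuous predictor
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = (x + rng.normal(scale=0.5, size=200) > 0).astype(float)
    w = fit_logistic(x.reshape(-1, 1), y)
    ```

    The model estimates P(y = 1 | x) through the logit link instead of assuming normally distributed residuals, which is why it accommodates categorical dependent variables where linear regression and ANOVA do not.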

  4. Political economy models and agricultural policy formation : empirical applicability and relevance for the CAP

    OpenAIRE

    Zee, van der, F.A.

    1997-01-01

    This study explores the relevance and applicability of political economy models for the explanation of agricultural policies. Part I (chapters 4-7) takes a general perspective and evaluates the empirical applicability of voting models and interest group models to agricultural policy formation in industrialised market economics. Part II (chapters 8-11) focuses on the empirical applicability of political economy models to agricultural policy formation and agricultural policy developmen...

  5. Road Assessment Model and Pilot Application in China

    Directory of Open Access Journals (Sweden)

    Tiejun Zhang

    2014-01-01

    Full Text Available Risk assessment of roads is an effective approach for road agencies to determine safety improvement investments. It can increase the cost-effective returns in crash and injury reductions. To build a powerful Chinese risk assessment model, the Research Institute of Highway (RIOH) is developing the China Road Assessment Programme (ChinaRAP) model to characterize traffic crashes in China, in partnership with the International Road Assessment Programme (iRAP). The ChinaRAP model is based upon RIOH’s achievements and iRAP models. This paper documents part of ChinaRAP’s research work, mainly including the RIOH model and its pilot application in a province in China.

  6. Multi-Resolution Multimedia QoE Models for IPTV Applications

    Directory of Open Access Journals (Sweden)

    Prasad Calyam

    2012-01-01

    Full Text Available Internet television (IPTV is rapidly gaining popularity and is being widely deployed in content delivery networks on the Internet. In order to proactively deliver optimum user quality of experience (QoE for IPTV, service providers need to identify network bottlenecks in real time. In this paper, we develop psycho-acoustic-visual models that can predict user QoE of multimedia applications in real time based on online network status measurements. Our models are neural network based and cater to multi-resolution IPTV applications that include QCIF, QVGA, SD, and HD resolutions encoded using popular audio and video codec combinations. On the network side, our models account for jitter and loss levels, as well as router queuing disciplines: packet-ordered and time-ordered FIFO. We evaluate the performance of our multi-resolution multimedia QoE models in terms of prediction characteristics, accuracy, speed, and consistency. Our evaluation results demonstrate that the models are pertinent for real-time QoE monitoring and resource adaptation in IPTV content delivery networks.

  7. Desublimation process: verification and applications of a theoretical model

    International Nuclear Information System (INIS)

    Eby, R.S.

    1979-01-01

    A theoretical model simulating the simultaneous heat and mass transfer which takes place during the desublimation of a gas to a solid is presented. Desublimer column loading profiles to experimentally verify the model were obtained using a gamma scintillation technique. The data indicate that, if the physical parameters of the desublimed frost material are known, the model can accurately predict the desublimation phenomenon. The usefulness of the model in different engineering applications is also addressed

  8. Applicability of common stomatal conductance models in maize under varying soil moisture conditions.

    Science.gov (United States)

    Wang, Qiuling; He, Qijin; Zhou, Guangsheng

    2018-07-01

    In the context of climate warming, the varying soil moisture caused by changing precipitation patterns will affect the applicability of stomatal conductance models, thereby affecting the simulation accuracy of carbon-nitrogen-water cycles in ecosystems. We studied the applicability of four common stomatal conductance models, the Jarvis, Ball-Woodrow-Berry (BWB), Ball-Berry-Leuning (BBL) and unified stomatal optimization (USO) models, based on summer maize leaf gas exchange data from a manipulation experiment with consecutively decreasing soil moisture. The results showed that under varying soil moisture conditions the USO model performed best, followed by the BBL and BWB models, while the Jarvis model performed worst. The effects of soil moisture made a difference in the relative performance of the models. Introducing a water response function improved the performance of the Jarvis, BWB, and USO models, decreasing the normalized root mean square error (NRMSE) by 15.7%, 16.6% and 3.9%, respectively; for the BBL model, however, the effect was negative, increasing the NRMSE by 5.3%. The Jarvis, BWB, BBL and USO models were applicable within different ranges of soil relative water content (55%-65%, 56%-67%, 37%-79% and 37%-95%, respectively) based on the 95% confidence limits. Moreover, introducing the water response function improved the applicability of the Jarvis and BWB models. The USO model performed best with or without the water response function and was applicable under varying soil moisture conditions. Our results provide a basis for selecting appropriate stomatal conductance models under drought conditions. Copyright © 2018 Elsevier B.V. All rights reserved.
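
    The model comparison above is reported in terms of NRMSE. Conventions differ on the normalizing constant; the sketch below normalizes by the observed range (one common choice — the paper's exact convention is not stated in the abstract):

    ```python
    import numpy as np

    def nrmse(obs, pred):
        """Root-mean-square error normalized by the range of the observations.
        (Normalizing by the mean of the observations is also widespread.)"""
        obs = np.asarray(obs, dtype=float)
        pred = np.asarray(pred, dtype=float)
        rmse = np.sqrt(np.mean((obs - pred) ** 2))
        return rmse / (obs.max() - obs.min())
    ```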

  9. Degradation modeling with application to aging and maintenance effectiveness evaluations

    International Nuclear Information System (INIS)

    Samanta, P.K.; Hsu, F.; Subudhi, M.; Vesely, W.E.

    1990-01-01

    This paper describes a modeling approach to analyze component degradation and failure data to understand the aging process of components. As used here, degradation modeling is the analysis of information on component degradation in order to develop models of the process and its implications. This particular modeling focuses on the analysis of the times of component degradations, to model how the rate of degradation changes with the age of the component. The methodology presented also discusses the effectiveness of maintenance as applicable to aging evaluations. The specific applications which are performed show quantitative models of component degradation rates and component failure rates from plant-specific data. The statistical techniques which are developed and applied allow aging trends to be effectively identified in the degradation data, and in the failure data. Initial estimates of the effectiveness of maintenance in limiting degradations from becoming failures also are developed. These results are important first steps in degradation modeling, and show that degradation can be modeled to identify aging trends. 2 refs., 8 figs
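
    The kind of age-dependent degradation rate described above can be illustrated by fitting a log-linear trend to binned degradation counts. The data below are invented for illustration and are not from the plant-specific study:

    ```python
    import numpy as np

    # Hypothetical degradation occurrences per year of component age
    age = np.arange(1, 11)                           # component age, years
    counts = np.array([2, 3, 3, 5, 6, 8, 9, 12, 15, 18])

    # Fit a log-linear aging model  lambda(t) = exp(a + b*t)  by least squares
    # on the log counts; b > 0 indicates an aging (increasing) degradation rate.
    b, a = np.polyfit(age, np.log(counts), 1)
    fractional_increase_per_year = np.exp(b) - 1.0
    ```

    A maximum-likelihood Poisson fit would weight the bins differently, but the least-squares version is enough to show how an aging trend is identified in degradation data.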

  10. Degradation modeling with application to aging and maintenance effectiveness evaluations

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.; Hsu, F.; Subudhi, M.

    1991-01-01

    This paper describes a modeling approach to analyze light water reactor component degradation and failure data to understand the aging process of components. As used here, degradation modeling is the analysis of information on component degradation in order to develop models of the process and its implications. This particular modeling focuses on the analysis of the times of component degradations, to model how the rate of degradation changes with the age of the component. The methodology presented also discusses the effectiveness of maintenance as applicable to aging evaluations. The specific applications which are performed show quantitative models of component degradation rates and component failure rates from plant-specific data. The statistical techniques which are developed and applied allow aging trends to be effectively identified in the degradation data, and in the failure data. Initial estimates of the effectiveness of maintenance in limiting degradations from becoming failures also are developed. These results are important first steps in degradation modeling, and show that degradation can be modeled to identify aging trends

  11. Performance Evaluation of UML2-Modeled Embedded Streaming Applications with System-Level Simulation

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2009-01-01

    Full Text Available This article presents an efficient method to capture abstract performance model of streaming data real-time embedded systems (RTESs. Unified Modeling Language version 2 (UML2 is used for the performance modeling and as a front-end for a tool framework that enables simulation-based performance evaluation and design-space exploration. The adopted application meta-model in UML resembles the Kahn Process Network (KPN model and it is targeted at simulation-based performance evaluation. The application workload modeling is done using UML2 activity diagrams, and platform is described with structural UML2 diagrams and model elements. These concepts are defined using a subset of the profile for Modeling and Analysis of Realtime and Embedded (MARTE systems from OMG and custom stereotype extensions. The goal of the performance modeling and simulation is to achieve early estimates on task response times, processing element, memory, and on-chip network utilizations, among other information that is used for design-space exploration. As a case study, a video codec application on multiple processors is modeled, evaluated, and explored. In comparison to related work, this is the first proposal that defines transformation between UML activity diagrams and streaming data application workload meta models and successfully adopts it for RTES performance evaluation.

  12. Application of a novel cellular automaton porosity prediction model to aluminium castings

    International Nuclear Information System (INIS)

    Atwood, R.C.; Chirazi, A.; Lee, P.D.

    2002-01-01

    A multiscale model was developed to predict the formation of porosity within a solidifying aluminium-silicon alloy. The diffusion of silicon and dissolved gas was simulated on a microscopic scale combined with cellular automaton models of gas porosity formation within the growing three-dimensional solidification microstructure. However, due to high computational cost, the modelled volume is limited to the millimetre range. This renders the application of direct modelling of complex shape castings unfeasible. Combining the microstructural modelling with a statistical response-surface prediction method allows application of the microstructural model results to industrial scale casts by incorporating them in commercial solidification software. (author)

  13. Multivariate Birnbaum-Saunders Distributions: Modelling and Applications

    Directory of Open Access Journals (Sweden)

    Robert G. Aykroyd

    2018-03-01

    Full Text Available Since its origins and numerous applications in material science, the Birnbaum–Saunders family of distributions has now found widespread uses in some areas of the applied sciences such as agriculture, environment and medicine, as well as in quality control, among others. It is able to model varied data behaviour and hence provides a flexible alternative to the most usual distributions. The family includes Birnbaum–Saunders and log-Birnbaum–Saunders distributions in univariate and multivariate versions. There are now well-developed methods for estimation and diagnostics that allow in-depth analyses. This paper gives a detailed review of existing methods and of relevant literature, introducing properties and theoretical results in a systematic way. To emphasise the range of suitable applications, full analyses are included of examples based on regression and diagnostics in material science, spatial data modelling in agricultural engineering and control charts for environmental monitoring. However, potential future uses in new areas such as business, economics, finance and insurance are also discussed. This work is presented to provide a full tool-kit of novel statistical models and methods to encourage other researchers to implement them in these new areas. It is expected that the methods will have the same positive impact in the new areas as they have had elsewhere.

  14. Application of Markowitz Model on Romanian Stock Market

    Directory of Open Access Journals (Sweden)

    Zavera Ioana Coralia

    2017-04-01

    Full Text Available Performance evaluation of financial instruments has become a concern for more and more economists as security trading activities have developed over time. “Modern portfolio theory” comprises statistical and mathematical models which describe various ways to evaluate, and especially analyse, the profitability and risk of portfolios. This article offers an application of one such model, the Markowitz model, to the Romanian stock market, focusing on portfolios comprising three securities and determining the efficient frontier and the minimum variance portfolio.
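
    For a three-security portfolio as in the article, the global minimum-variance portfolio has the closed form w = Σ⁻¹1 / (1ᵀΣ⁻¹1). A sketch with an invented covariance matrix (the article's Romanian-market estimates are not reproduced here):

    ```python
    import numpy as np

    # Hypothetical annualized covariance matrix for three securities
    Sigma = np.array([[0.040, 0.006, 0.012],
                      [0.006, 0.090, 0.018],
                      [0.012, 0.018, 0.160]])

    ones = np.ones(3)
    inv_ones = np.linalg.solve(Sigma, ones)   # Sigma^{-1} 1 without forming the inverse
    w_mv = inv_ones / (ones @ inv_ones)       # global minimum-variance weights, sum to 1
    var_mv = w_mv @ Sigma @ w_mv              # portfolio variance at the minimum
    ```

    Sweeping a target return constraint instead of only the budget constraint traces out the efficient frontier the article refers to.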

  15. Encoding Dissimilarity Data for Statistical Model Building.

    Science.gov (United States)

    Wahba, Grace

    2010-12-01

    We summarize, review and comment upon three papers which discuss the use of discrete, noisy, incomplete, scattered pairwise dissimilarity data in statistical model building. Convex cone optimization codes are used to embed the objects into a Euclidean space which respects the dissimilarity information while controlling the dimension of the space. A "newbie" algorithm is provided for embedding new objects into this space. This allows the dissimilarity information to be incorporated into a Smoothing Spline ANOVA penalized likelihood model, a Support Vector Machine, or any model that will admit Reproducing Kernel Hilbert Space components, for nonparametric regression, supervised learning, or semi-supervised learning. Future work and open questions are discussed. The papers are: F. Lu, S. Keles, S. Wright and G. Wahba 2005. A framework for kernel regularization with application to protein clustering. Proceedings of the National Academy of Sciences 102, 12332-1233. G. Corrada Bravo, G. Wahba, K. Lee, B. Klein, R. Klein and S. Iyengar 2009. Examining the relative influence of familial, genetic and environmental covariate information in flexible risk models. Proceedings of the National Academy of Sciences 106, 8128-8133. F. Lu, Y. Lin and G. Wahba. Robust manifold unfolding with kernel regularization. TR 1008, Department of Statistics, University of Wisconsin-Madison.
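
    Any model "that will admit Reproducing Kernel Hilbert Space components" reduces at fit time to a linear solve against a kernel matrix. A minimal kernel ridge regression sketch on synthetic data — generic RKHS machinery with arbitrary bandwidth and penalty values, not the papers' code:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    x = np.linspace(0.0, 1.0, 40)
    y = np.sin(2 * np.pi * x) + rng.normal(scale=0.1, size=40)   # noisy synthetic data

    # Gaussian (RBF) kernel matrix; the kernel supplies the RKHS inner product
    K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.1**2))

    lam = 1e-3                                                   # penalty weight (arbitrary)
    c = np.linalg.solve(K + len(x) * lam * np.eye(len(x)), y)    # representer coefficients
    fitted = K @ c                                               # fitted values at the design points
    ```

    A kernel derived from the dissimilarity-based embedding described above could be substituted for the RBF kernel here without changing the solve.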

  16. Reviewing model application to support animal health decision making.

    Science.gov (United States)

    Singer, Alexander; Salman, Mo; Thulke, Hans-Hermann

    2011-04-01

    Animal health is of societal importance as it affects human welfare, and anthropogenic interests shape decision making to assure animal health. Scientific advice to support decision making is manifold. Modelling, as one piece of the scientific toolbox, is appreciated for its ability to describe and structure data, to give insight into complex processes and to predict future outcomes. In this paper we study the application of scientific modelling to support practical animal health decisions. We reviewed the 35 animal health related scientific opinions adopted by the Animal Health and Animal Welfare Panel of the European Food Safety Authority (EFSA). Thirteen of these documents were based on the application of models. The review took two viewpoints, the decision maker's need and the modeller's approach. In the reviewed material three types of modelling questions were addressed by four specific model types. The correspondence between tasks and models underpinned the importance of the modelling question in triggering the modelling approach. End point quantifications were the dominating request from decision makers, implying that prediction of risk is a major need. However, due to knowledge gaps, the corresponding modelling studies often shied away from providing exact numbers. Instead, comparative scenario analyses were performed, furthering the understanding of the decision problem and the effects of alternative management options. In conclusion, the most adequate scientific support for decision making - including available modelling capacity - might be expected if the required advice is clearly stated. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. An ontology model for execution records of Grid scientific applications

    NARCIS (Netherlands)

    Baliś, B.; Bubak, M.

    2008-01-01

    Records of past application executions are particularly important in the case of loosely-coupled, workflow driven scientific applications which are used to conduct in silico experiments, often on top of Grid infrastructures. In this paper, we propose an ontology-based model for storing and querying

  18. Modelling of a cross flow evaporator for CSP application

    DEFF Research Database (Denmark)

    Sørensen, Kim; Franco, Alessandro; Pelagotti, Leonardo

    2016-01-01

    Heat exchangers consisting of bundles of horizontal plain tubes with boiling on the shell side are widely used in industrial and energy systems applications. A recent particular specific interest for the use of this special heat exchanger is in connection with Concentrated Solar Power (CSP) applications. Heat transfer and pressure drop prediction methods are an important tool for design and modelling of diabatic, two-phase, shell-side flow over a horizontal plain tube bundle for a vertical up-flow evaporator. With the objective of developing a model for a specific type of cross flow evaporator … the available correlations for the definition of two-phase flow heat transfer, void fraction and pressure drop in connection with the operation of steam generators, it focuses attention on a comparison of the results obtained using several different models resulting from different combinations of correlations …

  19. Introduction: Special issue on advances in topobathymetric mapping, models, and applications

    Science.gov (United States)

    Gesch, Dean B.; Brock, John C.; Parrish, Christopher E.; Rogers, Jeffrey N.; Wright, C. Wayne

    2016-01-01

    Detailed knowledge of near-shore topography and bathymetry is required for many geospatial data applications in the coastal environment. New data sources and processing methods are facilitating development of seamless, regional-scale topobathymetric digital elevation models. These elevation models integrate disparate multi-sensor, multi-temporal topographic and bathymetric datasets to provide a coherent base layer for coastal science applications such as wetlands mapping and monitoring, sea-level rise assessment, benthic habitat mapping, erosion monitoring, and storm impact assessment. The focus of this special issue is on recent advances in the source data, data processing and integration methods, and applications of topobathymetric datasets.

  20. Models and applications of the UEDGE code

    International Nuclear Information System (INIS)

    Rensink, M.E.; Knoll, D.A.; Porter, G.D.; Rognlien, T.D.; Smith, G.R.; Wising, F.

    1996-09-01

    The transport of particles and energy from the core of a tokamak to nearby material surfaces is an important problem for understanding present experiments and for designing reactor-grade devices. A number of fluid transport codes have been developed to model the plasma in the edge and scrape-off layer (SOL) regions. This report will focus on recent model improvements and illustrative results from the UEDGE code. Some geometric and mesh considerations are introduced, followed by a general description of the plasma and neutral fluid models. A few comments on computational issues are given and then two important applications are illustrated concerning benchmarking and the ITER radiative divertor. Finally, we report on some recent work to improve the models in UEDGE by coupling to a Monte Carlo neutrals code and by utilizing an adaptive grid

  1. Applications of model theory to functional analysis

    CERN Document Server

    Iovino, Jose

    2014-01-01

    During the last two decades, methods that originated within mathematical logic, particularly set theory and model theory, have exhibited powerful applications to Banach space theory. This volume constitutes the first self-contained introduction to techniques of model theory in Banach space theory. The area of research has grown rapidly since this monograph's first appearance, but much of this material is still not readily available elsewhere. For instance, this volume offers a unified presentation of Krivine's theorem and the Krivine-Maurey theorem on stable Banach spaces, with emphasis on the

  2. The effects of training based on BASNEF model and acupressure at GB21 point on the infants’ physical growth indicators

    Directory of Open Access Journals (Sweden)

    Marzieh Akbarzadeh

    2014-08-01

    Full Text Available Objective: Educational models are used to study behavior, to plan for changing it, and to determine the factors that affect individuals’ decisions to perform a behavior. This study aimed to compare the effects of an educational program based on the BASNEF model and of acupressure at the GB21 point on infants’ physical growth indicators. Methods: This clinical trial was conducted on 150 (50 per group) pregnant women in 2011-2012. The interventions included an educational program based on the BASNEF model and application of acupressure at the GB21 point. The infants’ physical indicators were compared to the control group one and three months after birth. The study data were analyzed using the repeated measurement test, paired sample t-test, one-way ANOVA, and Tukey’s test. Findings: The results showed a significant difference between the intervention and the control groups regarding the infants’ weight and height one and three months after birth (P<0.05). Also, no significant difference was observed among the three groups concerning the infants’ head and arm circumference (P>0.05). Conclusion: The BASNEF model improved the infants’ height and weight. Application of acupressure also improved the infants’ height, weight, and head and arm circumference compared to the control group. Hence, learning and application of these techniques and models by the medical team are highly essential.
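
    The one-way ANOVA used in the analysis above compares between-group to within-group variability. A self-contained sketch of the F statistic on made-up numbers (not the study's data):

    ```python
    import numpy as np

    def one_way_anova_F(*groups):
        """F statistic for a one-way ANOVA: ratio of between-group to
        within-group mean squares."""
        groups = [np.asarray(g, dtype=float) for g in groups]
        all_data = np.concatenate(groups)
        grand = all_data.mean()
        ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
        ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
        df_between = len(groups) - 1
        df_within = len(all_data) - len(groups)
        return (ss_between / df_between) / (ss_within / df_within)

    F = one_way_anova_F([1, 2, 3], [2, 3, 4], [6, 7, 8])
    ```

    Here F = 21 on (2, 6) degrees of freedom, well above the 5% critical value of about 5.14, so the group means differ; Tukey's test would then locate which pairs differ, as in the study.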

  3. Examples of mixed-effects modeling with crossed random effects and with binomial data

    NARCIS (Netherlands)

    Quené, H.; van den Bergh, H.

    2008-01-01

    Psycholinguistic data are often analyzed with repeated-measures analyses of variance (ANOVA), but this paper argues that mixed-effects (multilevel) models provide a better alternative method. First, models are discussed in which the two random factors of participants and items are crossed, and not

  4. Estimating linear effects in ANOVA designs: the easy way.

    Science.gov (United States)

    Pinhas, Michal; Tzelgov, Joseph; Ganor-Stern, Dana

    2012-09-01

    Research in cognitive science has documented numerous phenomena that are approximated by linear relationships. In the domain of numerical cognition, the use of linear regression for estimating linear effects (e.g., distance and SNARC effects) became common following Fias, Brysbaert, Geypens, and d'Ydewalle's (1996) study on the SNARC effect. While their work has become the model for analyzing linear effects in the field, it requires statistical analysis of individual participants and does not provide measures of the proportions of variability accounted for (cf. Lorch & Myers, 1990). In the present methodological note, using both the distance and SNARC effects as examples, we demonstrate how linear effects can be estimated in a simple way within the framework of repeated measures analysis of variance. This method allows for estimating effect sizes in terms of both slope and proportions of variability accounted for. Finally, we show that our method can easily be extended to estimate linear interaction effects, not just linear effects calculated as main effects.
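
    The approach described — estimating a linear effect within repeated measures ANOVA — amounts to applying a linear contrast to each participant's cell means and testing the resulting slopes. A minimal sketch with simulated reaction times (the true slope of -15 ms per step is invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_subj = 20
    levels = np.array([1, 2, 3, 4])            # e.g., four numerical distances
    # Simulated RTs: a linear decrease of 15 ms per step plus noise
    rt = 500 - 15 * levels + rng.normal(scale=20, size=(n_subj, 4))

    # Centered levels serve as the linear contrast weights
    c = levels - levels.mean()
    slopes = (rt * c).sum(axis=1) / (c ** 2).sum()   # per-participant regression slope

    # One-sample t-test on the slopes == test of the linear trend in the ANOVA
    t = slopes.mean() / (slopes.std(ddof=1) / np.sqrt(n_subj))
    ```

    The mean slope reports the effect size in original units (ms per step), and t squared equals the F for the single-df linear trend in the corresponding repeated measures ANOVA, which is the equivalence the note exploits.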

  5. Clinical application of the five-factor model.

    Science.gov (United States)

    Widiger, Thomas A; Presnall, Jennifer Ruth

    2013-12-01

    The Five-Factor Model (FFM) has become the predominant dimensional model of general personality structure. The purpose of this paper is to suggest a clinical application. A substantial body of research indicates that the personality disorders included within the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) can be understood as extreme and/or maladaptive variants of the FFM (the acronym "DSM" refers to any particular edition of the APA DSM). In addition, the current proposal for the forthcoming fifth edition of the DSM (i.e., DSM-5) is shifting closely toward an FFM dimensional trait model of personality disorder. Advantages of this shifting conceptualization are discussed, including treatment planning. © 2012 Wiley Periodicals, Inc.

  6. The sound of friction: Real-time models, playability and musical applications

    Science.gov (United States)

    Serafin, Stefania

    Friction, the tangential force between objects in contact, in most engineering applications needs to be removed as a source of noise and instabilities. In musical applications, friction is a desirable component, being the sound production mechanism of different musical instruments such as bowed strings, musical saws, rubbed bowls and any other sonority produced by interactions between rubbed dry surfaces. The goal of the dissertation is to simulate different instruments whose main excitation mechanism is friction. An efficient yet accurate model of a bowed string instrument, which combines the latest results in violin acoustics with the efficient digital waveguide approach, is provided. In particular, the bowed string physical model proposed uses a thermodynamic friction model in which the finite width of the bow is taken into account; this solution is compared to the recently developed elasto-plastic friction models used in haptics and robotics. Different solutions are also proposed to model the body of the instrument. Other less common instruments driven by friction are also proposed, and the elasto-plastic model is used to provide audio-visual simulations of everyday friction sounds such as squeaking doors and rubbed wine glasses. Finally, playability evaluations and musical applications in which the models have been used are discussed.
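
    Bowed-string models of the kind described are driven by a falling friction-velocity characteristic. A static (memoryless) friction curve is sketched below; the parameter values are illustrative and this is simpler than the thermodynamic and elasto-plastic models the dissertation actually uses:

    ```python
    import numpy as np

    def bow_friction(v_rel, mu_s=0.8, mu_d=0.3, v0=0.2):
        """Velocity-dependent friction coefficient with a falling characteristic:
        near-static friction mu_s at rest, decaying toward mu_d at high sliding
        speed. Parameter values are illustrative, not from the dissertation."""
        return np.sign(v_rel) * (mu_d + (mu_s - mu_d) / (1.0 + np.abs(v_rel) / v0))
    ```

    The negative slope of this curve with respect to sliding velocity is what sustains the stick-slip oscillation of a bowed string.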

  7. Systems and models with anticipation in physics and its applications

    International Nuclear Information System (INIS)

    Makarenko, A

    2012-01-01

    Investigations of recent physical processes and real applications of models require new, increasingly refined models which should involve new properties. One such property is anticipation (that is, taking into account some anticipatory effects). We consider a special kind of anticipatory system, namely the strong anticipatory systems introduced by D. Dubois. Some definitions, examples and peculiarities of the solutions are described. The main feature is the presumable multivaluedness of the solutions. Presumable physical examples of such systems are proposed: self-organization problems; dynamical chaos; synchronization; advanced potentials; structures at the micro-, meso- and macro-levels; cellular automata; computing; neural network theory. Some applications for modelling social, economical, technical and natural systems are also described.

  8. Animal models of enterovirus 71 infection: applications and limitations

    Science.gov (United States)

    2014-01-01

    Human enterovirus 71 (EV71) has emerged as a neuroinvasive virus that is responsible for several outbreaks in the Asia-Pacific region over the past 15 years. Appropriate animal models are needed to understand EV71 neuropathogenesis better and to facilitate the development of effective vaccines and drugs. Non-human primate models have been used to characterize and evaluate the neurovirulence of EV71 after the early outbreaks in late 1990s. However, these models were not suitable for assessing the neurovirulence level of the virus and were associated with ethical and economic difficulties in terms of broad application. Several strategies have been applied to develop mouse models of EV71 infection, including strategies that employ virus adaption and immunodeficient hosts. Although these mouse models do not closely mimic human disease, they have been applied to determine the pathogenesis of and treatment and prevention of the disease. EV71 receptor-transgenic mouse models have recently been developed and have significantly advanced our understanding of the biological features of the virus and the host-parasite interactions. Overall, each of these models has advantages and disadvantages, and these models are differentially suited for studies of EV71 pathogenesis and/or the pre-clinical testing of antiviral drugs and vaccines. In this paper, we review the characteristics, applications and limitation of these EV71 animal models, including non-human primate and mouse models. PMID:24742252

  9. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    Science.gov (United States)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full scale. However, in some applications enhanced performance is sought at the low end of the range, and expressing the accuracy as a percent of reading should therefore be considered as a modeling strategy. For example, it is common to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which has broad application in all types of transducer applications where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
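
A common way to turn a percent-of-reading requirement into a fitting criterion is to weight each calibration residual by the inverse square of the reading, so that errors at small readings count as heavily as errors near full scale. The sketch below illustrates that idea with a closed-form weighted least-squares line fit; the data values and the specific weighting are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch: fitting a linear calibration model where residuals are
# weighted by 1/reading^2, so the fit minimizes percent-of-reading error
# rather than absolute (percent-of-full-scale) error.

def weighted_linear_fit(x, y, w):
    """Closed-form weighted least squares for y ~ a + b*x with weights w."""
    sw = sum(w)
    swx = sum(wi * xi for wi, xi in zip(w, x))
    swy = sum(wi * yi for wi, yi in zip(w, y))
    swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    swxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b = (sw * swxy - swx * swy) / (sw * swxx - swx * swx)
    a = (swy - b * swx) / sw
    return a, b

# Invented example: applied loads and sensor readings (not the paper's data).
loads = [1.0, 2.0, 5.0, 10.0, 50.0, 100.0]
readings = [2.1, 4.0, 10.2, 19.9, 100.3, 199.8]

# Percent-of-reading weighting: small readings count as much as large ones.
weights = [1.0 / r**2 for r in readings]
a, b = weighted_linear_fit(readings, loads, weights)
print(a, b)  # calibration coefficients: load ≈ a + b * reading
```

With exact linear data the weights do not change the answer; they matter only when residuals must be traded off between the low and high ends of the range.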

  10. Conceptual study of the application software manager using the Xlet model in the nuclear fields

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon Koo; Park, Heui Youn; Koo, In Soo [KAERI, Taejon (Korea, Republic of); Park, Hee Seok; Kim, Jung Seon; Sohn, Chang Ho [Samchang Enterprise Co., Ltd., Taejon (Korea, Republic of)

    2003-10-01

    In order to reduce the cost of software maintenance, including software modification, we suggest an object-oriented program that checks the version of the application program, written in the Java language, together with a technique for executing a downloaded application program over the network using an application manager. To replace the traditional scheduler with the application manager we have adopted the Xlet concept, over a network, in the nuclear field. Usually, an Xlet means a Java application that runs on a digital television receiver. The Java TV Application Program Interface (API) defines an application model called the Xlet application lifecycle. Java applications that use this lifecycle model are called Xlets. The Xlet application lifecycle is compatible with the existing application environment and virtual machine technology. The Xlet application lifecycle model defines the dialog (protocol) between an Xlet and its environment.

  11. Conceptual study of the application software manager using the Xlet model in the nuclear fields

    International Nuclear Information System (INIS)

    Lee, Joon Koo; Park, Heui Youn; Koo, In Soo; Park, Hee Seok; Kim, Jung Seon; Sohn, Chang Ho

    2003-01-01

    In order to reduce the cost of software maintenance, including software modification, we suggest an object-oriented program that checks the version of the application program, written in the Java language, together with a technique for executing a downloaded application program over the network using an application manager. To replace the traditional scheduler with the application manager we have adopted the Xlet concept, over a network, in the nuclear field. Usually, an Xlet means a Java application that runs on a digital television receiver. The Java TV Application Program Interface (API) defines an application model called the Xlet application lifecycle. Java applications that use this lifecycle model are called Xlets. The Xlet application lifecycle is compatible with the existing application environment and virtual machine technology. The Xlet application lifecycle model defines the dialog (protocol) between an Xlet and its environment

  12. Chapter 5: Summary of model application

    International Nuclear Information System (INIS)

    1995-01-01

    This chapter provides a brief summary of the model applications described in Volume III of the Final Report. It deals with the selected water management regimes; ground water flow regimes; agriculture; ground water quality; hydrodynamics, sediment transport and water quality in the Danube; hydrodynamics, sediment transport and water quality in the river branch system; hydrodynamics, sediment transport and water quality in the Hrusov reservoir; and with ecology in this area of the Danube

  13. Model-generated air quality statistics for application in vegetation response models in Alberta

    International Nuclear Information System (INIS)

    McVehil, G.E.; Nosal, M.

    1990-01-01

    To test and apply vegetation response models in Alberta, air pollution statistics representative of various parts of the Province are required. At this time, air quality monitoring data of the requisite accuracy and time resolution are not available for most parts of Alberta. Therefore, a need exists to develop appropriate air quality statistics. The objectives of the work reported here were to determine the applicability of model-generated air quality statistics and to develop, by modelling, realistic and representative time series of hourly SO2 concentrations that could be used to generate the statistics demanded by vegetation response models

  14. Study on team evaluation (5). On application of behavior observation-based teamwork evaluation sheet for power plant operator team

    International Nuclear Information System (INIS)

    Sasou, Kunihide; Sugihara, Yoshikuni

    2009-01-01

    This report discusses the range of application of the behavior observation-based teamwork evaluation sheet. Under this method, a teamwork evaluation sheet is developed that assumes a certain single failure (failure of a feed water transmitter). The evaluation sheets were applied to evaluate the teamwork of 26 combined-cycle thermal power plant operator teams under the abnormal operating conditions of a failure of a feed water transmitter, a feed draft fan or a steam flow governor. ANOVA finds no differences among the 3 kinds of single failure. In addition, a similar analysis was applied to 3 kinds of multiple failures (steam generator tube rupture, loss of coolant accident and loss of secondary coolant accident), under which 7 PWR nuclear power plant operator teams were evaluated. Again, ANOVA shows no differences among the 3 kinds of multiple failures. These results indicate that a behavior observation-based teamwork evaluation sheet designed for a certain abnormal condition is applicable to abnormal conditions that share the same development. (author)
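
The comparison reported in this record rests on one-way ANOVA: the between-group variance of teamwork scores across failure types is tested against the within-group variance. A minimal sketch of that F statistic, with invented scores standing in for the study's data, is:

```python
# Minimal sketch of the one-way ANOVA behind comparisons like the one above.
# The scores are invented illustrations, not the study's data.

def one_way_anova(groups):
    """Return the F statistic for a one-way ANOVA over a list of groups."""
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand = sum(sum(g) for g in groups) / n  # grand mean
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

scores_a = [7.1, 6.8, 7.4, 7.0]   # e.g. feed water transmitter failure
scores_b = [6.9, 7.2, 7.1, 6.7]   # e.g. draft fan failure
scores_c = [7.0, 7.3, 6.8, 7.2]   # e.g. governor failure
print(one_way_anova([scores_a, scores_b, scores_c]))
```

A small F value (compared against the F distribution with the stated degrees of freedom) corresponds to the "no differences between failure types" conclusion the record reports.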

  15. On the upscaling of process-based models in deltaic applications

    Science.gov (United States)

    Li, L.; Storms, J. E. A.; Walstra, D. J. R.

    2018-03-01

    Process-based numerical models are increasingly used to study the evolution of marine and terrestrial depositional environments. Whilst a detailed description of small-scale processes provides an accurate representation of reality, application on geological timescales is restrained by the associated increase in computational time. In order to reduce the computational time, a number of acceleration methods are combined and evaluated for a schematic supply-driven delta (static base level) and an accommodation-driven delta (variable base level). The performance of the combined acceleration methods is evaluated by comparing morphological indicators, such as distributary channel networking and delta volumes, derived from the model predictions at various levels of acceleration. The results of the accelerated models are compared to the outcomes of a series of simulations designed to capture autogenic variability, which is quantified by re-running identical models on an initial bathymetry with 1 cm of added noise. The overall results show that the variability of the accelerated models falls within the autogenic variability range, suggesting that the acceleration methods do not significantly affect the simulated delta evolution. The time-scale compression method (the acceleration method introduced in this paper) increases computational efficiency by 75% without adversely affecting the simulated delta evolution relative to a base case. Combining the time-scale compression method with the existing acceleration methods has the potential to extend the application range of process-based models towards geologic timescales.

  16. Neural Network Based Models for Fusion Applications

    Science.gov (United States)

    Meneghini, Orso; Tema Biwole, Arsene; Luda, Teobaldo; Zywicki, Bailey; Rea, Cristina; Smith, Sterling; Snyder, Phil; Belli, Emily; Staebler, Gary; Canty, Jeff

    2017-10-01

    Whole device modeling, engineering design, experimental planning and control applications demand models that are simultaneously physically accurate and fast. This poster reports on the ongoing effort towards the development and validation of a series of models that leverage neural-network (NN) multidimensional regression techniques to accelerate some of the most mission-critical first-principles models for the fusion community, such as: the EPED workflow for prediction of the H-Mode and Super H-Mode pedestal structure; the TGLF and NEO models for the prediction of the turbulent and neoclassical particle, energy and momentum fluxes; and the NEO model for the drift-kinetic solution of the bootstrap current. We also applied NNs to DIII-D experimental data for disruption prediction and for quantifying the effect of RMPs on the pedestal and ELMs. All of these projects were supported by the infrastructure provided by the OMFIT integrated modeling framework. Work supported by US DOE under DE-SC0012656, DE-FG02-95ER54309, DE-FC02-04ER54698.

  17. Applications of Multilevel Structural Equation Modeling to Cross-Cultural Research

    Science.gov (United States)

    Cheung, Mike W.-L.; Au, Kevin

    2005-01-01

    Multilevel structural equation modeling (MSEM) has been proposed as an extension to structural equation modeling for analyzing data with nested structure. We have begun to see a few applications in cross-cultural research in which MSEM fits well as the statistical model. However, given that cross-cultural studies can only afford collecting data…

  18. Mathematical annuity models application in cash flow analysis ...

    African Journals Online (AJOL)

    Mathematical annuity models application in cash flow analysis. ... We also compare the cost efficiency between Amortisation and Sinking fund loan repayment as prevalent in financial institutions. Keywords: Annuity, Amortisation, Sinking Fund, Present and Future Value Annuity, Maturity date and Redemption value.

  19. Credibilistic programming an introduction to models and applications

    CERN Document Server

    2014-01-01

    It provides a fuzzy programming approach to solving real-life decision problems in a fuzzy environment. Within the framework of credibility theory, it provides a self-contained, comprehensive and up-to-date presentation of fuzzy programming models, algorithms and applications in portfolio analysis.

  20. New applications of a model of electromechanical impedance for SHM

    Science.gov (United States)

    Pavelko, Vitalijs

    2014-03-01

    The paper focuses on the further development of the model of the electromechanical impedance (EMI) of a piezoceramic transducer (PZT) and its application to aircraft structural health monitoring (SHM). An expression for the electromechanical impedance common to models of any dimension (1D, 2D, 3D) and directly independent of the imposed constraints was obtained. Determination of the dynamic response of the system "host structure - PZT", which is crucial for practical application, presupposes the use of modal analysis. This provides a general tool for determining the EMI regardless of the specific features of a particular application. Earlier, a technique for separate determination of the dynamic response of the PZT and of the structural element was considered. Here another version, involving joint modal analysis of the entire system "host structure - PZT", is presented. As a result, the dynamic response is obtained in the form of a modal decomposition of the transducer mechanical strains. The use of models for the free and constrained transducer and an analysis of the impact of the adhesive layer on the EMI are demonstrated. In all cases the influence of the dimension of the model (2D and 3D) was analyzed. The validity of the model is confirmed by experimental studies. The correlation between the fatigue crack length in a thin-walled Al plate and the EMI of an embedded PZT was simulated and compared with test results.

  1. Application of Chaos Theory to Psychological Models

    Science.gov (United States)

    Blackerby, Rae Fortunato

    This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. 
In some cases, these fluctuations converge to one value; in other cases, they diverge in
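
The abstract describes behavior driven by nonlinear feedback loops whose iterates sometimes converge to a single value and sometimes do not. The standard minimal illustration of this (our choice of equation, not the dissertation's model) is the logistic map:

```python
# Illustrative sketch (our choice of equation, not the dissertation's model):
# the logistic map x -> r*x*(1-x) is the textbook nonlinear feedback loop.
# Depending on the control parameter r, iterates settle to a fixed value
# or fluctuate chaotically.

def iterate_logistic(r, x0, steps):
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)  # each step feeds back into the next
    return x

# r = 2.5: the feedback converges to the fixed point 1 - 1/r = 0.6.
print(iterate_logistic(2.5, 0.2, 200))

# r = 4.0: chaotic regime; nearby starting points separate rapidly.
a = iterate_logistic(4.0, 0.200, 50)
b = iterate_logistic(4.0, 0.201, 50)
print(abs(a - b))  # sensitive dependence on initial conditions
```

The same structure carries over to the dissertation's framing: the "individual, family and community" factors would play the role of the control parameter, determining whether the modeled behavior settles or fluctuates.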

  2. Review on the HVAC System Modeling Types and the Shortcomings of Their Application

    Directory of Open Access Journals (Sweden)

    Raad Z. Homod

    2013-01-01

    Full Text Available The modeling of the heating, ventilation, and air conditioning (HVAC) system is a prominent topic because of its relationship with energy savings and environmental, economical, and technological issues. The modeling of the HVAC system is concerned with the indoor thermal sensation, which is related to the modeling of the building, the air handling unit (AHU) equipment, and indoor thermal processes. Many HVAC system modeling approaches are now available, and the techniques have become quite mature. But there are shortcomings in the application and integration methods for the different types of HVAC model. The application and integration processes accumulate the defective characteristics of both the AHU equipment and building models, such as nonlinearity, pure lag time, high thermal inertia, uncertain disturbance factors, large-scale systems, and constraints. This paper surveys the types of HVAC model and the advantages and disadvantages of each, and finds that the gray-box type is the best one for representing indoor thermal comfort. But its application fails at the integration stage, where its response deviates toward unrealistic behavior.

  3. Using the object modeling system for hydrological model development and application

    Directory of Open Access Journals (Sweden)

    S. Kralisch

    2005-01-01

    Full Text Available State-of-the-art challenges in the sustainable management of water resources have created demand for integrated, flexible and easy-to-use hydrological models which are able to simulate the quantitative and qualitative aspects of the hydrological cycle with a sufficient degree of certainty. Existing models which have been developed to fit these needs are often constrained to specific scales or purposes and thus cannot be easily adapted to meet different challenges. As a solution for flexible and modularised model development and application, the Object Modeling System (OMS) has been developed in a joint approach by the USDA-ARS, GPSRU (Fort Collins, CO, USA), USGS (Denver, CO, USA), and the FSU (Jena, Germany). The OMS provides a modern modelling framework which allows the implementation of single process components to be compiled and applied as custom-tailored model assemblies. This paper describes the basic principles of the OMS and its main components and explains in more detail how problems arising during the coupling of models or model components are solved inside the system. It highlights the integration of different spatial and temporal scales by their representation as spatial modelling entities embedded into time compound components. As an example, the implementation of the hydrological model J2000 is discussed.

  4. Remote sensing sensors and applications in environmental resources mapping and modeling

    Science.gov (United States)

    Melesse, Assefa M.; Weng, Qihao; Thenkabail, Prasad S.; Senay, Gabriel B.

    2007-01-01

    The history of remote sensing and the development of different sensors for environmental and natural resources mapping and data acquisition are reviewed. Application examples in urban studies, hydrological modeling (such as land-cover and floodplain mapping), fractional vegetation cover and impervious surface area mapping, surface energy flux, and micro-topography correlation studies are discussed. The review also discusses the use of remotely sensed rainfall and potential evapotranspiration for estimating the crop water requirement satisfaction index, which provides early-warning information for growers. The review is not an exhaustive treatment of remote sensing techniques but rather a summary of some important applications in environmental studies and modeling.

  5. Simple mathematical models of symmetry breaking. Application to particle physics

    International Nuclear Information System (INIS)

    Michel, L.

    1976-01-01

    Some mathematical facts relevant to symmetry breaking are presented. A first mathematical model deals with the smooth action of compact Lie groups on real manifolds, a second model considers linear action of any group on real or complex finite dimensional vector spaces. Application of the mathematical models to particle physics is considered. (B.R.H.)

  6. Structural equation modeling with EQS basic concepts, applications, and programming

    CERN Document Server

    Byrne, Barbara M

    2013-01-01

    Readers who want a less mathematical alternative to the EQS manual will find exactly what they're looking for in this practical text. Written specifically for those with little to no knowledge of structural equation modeling (SEM) or EQS, the author's goal is to provide a non-mathematical introduction to the basic concepts of SEM by applying these principles to EQS, Version 6.1. The book clearly demonstrates a wide variety of SEM/EQS applications that include confirmatory factor analytic and full latent variable models. Written in a "user-friendly" style, the author "walks" the reader through the varied steps involved in the process of testing SEM models: model specification and estimation, assessment of model fit, EQS output, and interpretation of findings. Each of the book's applications is accompanied by: a statement of the hypothesis being tested, a schematic representation of the model, explanations of the EQS input and output files, tips on how to use the pull-down menus, and the data file upon which ...

  7. Hydrodynamic Modeling and Its Application in AUC.

    Science.gov (United States)

    Rocco, Mattia; Byron, Olwyn

    2015-01-01

    The hydrodynamic parameters measured in an AUC experiment, s(20,w) and D(t)(20,w)(0), can be used to gain information on the solution structure of (bio)macromolecules and their assemblies. This entails comparing the measured parameters with those that can be computed from usually "dry" structures by "hydrodynamic modeling." In this chapter, we will first briefly put hydrodynamic modeling in perspective and present the basic physics behind it as implemented in the most commonly used methods. The important "hydration" issue is also touched upon, and the distinction between rigid bodies versus those for which flexibility must be considered in the modeling process is then made. The available hydrodynamic modeling/computation programs, HYDROPRO, BEST, SoMo, AtoB, and Zeno, the latter four all implemented within the US-SOMO suite, are described and their performance evaluated. Finally, some literature examples are presented to illustrate the potential applications of hydrodynamics in the expanding field of multiresolution modeling. © 2015 Elsevier Inc. All rights reserved.

  8. Description and availability of the SMARTS spectral model for photovoltaic applications

    Science.gov (United States)

    Myers, Daryl R.; Gueymard, Christian A.

    2004-11-01

    The limited spectral response range of photovoltaic (PV) devices requires that device performance be characterized with respect to widely varying terrestrial solar spectra. The FORTRAN code "Simple Model for Atmospheric Transmission of Sunshine" (SMARTS) was developed for various clear-sky solar renewable energy applications. The model is partly based on parameterizations of transmittance functions in the MODTRAN/LOWTRAN band model family of radiative transfer codes. SMARTS computes spectra with a resolution of 0.5 nanometers (nm) below 400 nm, 1.0 nm from 400 nm to 1700 nm, and 5 nm from 1700 nm to 4000 nm. Fewer than 20 input parameters are required to compute spectral irradiance distributions, including spectral direct beam, total, and diffuse hemispherical radiation, and up to 30 other spectral parameters. A spreadsheet-based graphical user interface can be used to simplify the construction of input files for the model. The model is the basis for new terrestrial reference spectra developed by the American Society for Testing and Materials (ASTM) for photovoltaic and materials degradation applications. We describe the model's accuracy, functionality, and the availability of source and executable code. Applications to PV rating and efficiency and the combined effects of spectral selectivity and varying atmospheric conditions are briefly discussed.

  9. Political economy models and agricultural policy formation : empirical applicability and relevance for the CAP

    NARCIS (Netherlands)

    Zee, van der F.A.

    1997-01-01

    This study explores the relevance and applicability of political economy models for the explanation of agricultural policies. Part I (chapters 4-7) takes a general perspective and evaluates the empirical applicability of voting models and interest group models to agricultural policy

  10. A comprehensive model for piezoceramic actuators: modelling, validation and application

    International Nuclear Information System (INIS)

    Quant, Mario; Elizalde, Hugo; Flores, Abiud; Ramírez, Ricardo; Orta, Pedro; Song, Gangbing

    2009-01-01

    This paper presents a comprehensive model for piezoceramic actuators (PAs), which accounts for hysteresis, non-linear electric field and dynamic effects. The hysteresis model is based on the widely used general Maxwell slip model, while an enhanced electro-mechanical non-linear model replaces the linear constitutive equations commonly used. Further on, a linear second order model compensates the frequency response of the actuator. Each individual model is fully characterized from experimental data yielded by a specific PA, then incorporated into a comprehensive 'direct' model able to determine the output strain based on the applied input voltage, fully compensating the aforementioned effects, where the term 'direct' represents an electrical-to-mechanical operating path. The 'direct' model was implemented in a Matlab/Simulink environment and successfully validated via experimental results, exhibiting higher accuracy and simplicity than many published models. This simplicity would allow a straightforward inclusion of other behaviour such as creep, ageing, material non-linearity, etc, if such parameters are important for a particular application. Based on the same formulation, two other models are also presented: the first is an 'alternate' model intended to operate within a force-controlled scheme (instead of a displacement/position control), thus able to capture the complex mechanical interactions occurring between a PA and its host structure. The second development is an 'inverse' model, able to operate within an open-loop control scheme, that is, yielding a 'linearized' PA behaviour. The performance of the developed models is demonstrated via a numerical sample case simulated in Matlab/Simulink, consisting of a PA coupled to a simple mechanical system, aimed at shifting the natural frequency of the latter

  11. Agent Based Modeling Applications for Geosciences

    Science.gov (United States)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently has been introduced to various disciplines in the geosciences. This technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. Some of the challenges associated with this modeling method include: significant computational requirements in order to keep track of thousands to millions of agents; methods and strategies for model validation are lacking, as is a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes, and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics, economics, and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on a regional and temporal scale. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications.
A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in
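
The "bottom-up" idea described above can be sketched in a few lines: individually simple agents drawing on a shared resource produce aggregate demand dynamics that no single rule states explicitly. The rules and numbers below are invented for illustration and are not taken from any published geoscience model.

```python
# Toy agent-based sketch (invented rules, not a published model): agents draw
# from a shared water resource; each agent halves its demand when the
# resource drops below its own conservation threshold.
import random

def simulate(n_agents, steps, recharge, seed=42):
    rng = random.Random(seed)  # seeded for reproducible runs
    resource = 100.0
    # Each agent: (base demand per step, conservation threshold).
    agents = [(rng.uniform(0.5, 1.5), rng.uniform(20.0, 60.0))
              for _ in range(n_agents)]
    history = []
    for _ in range(steps):
        demand = sum(d if resource > t else d * 0.5 for d, t in agents)
        resource = max(0.0, resource - demand) + recharge
        history.append(resource)
    return history

levels = simulate(n_agents=50, steps=100, recharge=40.0)
print(levels[-1])  # resource level after the final step
```

Even this toy version shows the method's character: the aggregate trajectory emerges from heterogeneous thresholds rather than from a closed-form demand equation, and scaling it to millions of agents is where the computational challenge noted above appears.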

  12. Durango: Scalable Synthetic Workload Generation for Extreme-Scale Application Performance Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carothers, Christopher D. [Rensselaer Polytechnic Institute (RPI); Meredith, Jeremy S. [ORNL; Blanco, Marc [Rensselaer Polytechnic Institute (RPI); Vetter, Jeffrey S. [ORNL; Mubarak, Misbah [Argonne National Laboratory; LaPre, Justin [Rensselaer Polytechnic Institute (RPI); Moore, Shirley V. [ORNL

    2017-05-01

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next generation supercomputing systems because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug in which the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when

  13. Taking the mystery out of mathematical model applications to karst aquifers—A primer

    Science.gov (United States)

    Kuniansky, Eve L.

    2014-01-01

    Advances in mathematical model applications toward the understanding of the complex flow, characterization, and water-supply management issues for karst aquifers have occurred in recent years. Different types of mathematical models can be applied successfully if appropriate information is available and the problems are adequately identified. The mathematical approaches discussed in this paper are divided into three major categories: 1) distributed parameter models, 2) lumped parameter models, and 3) fitting models. The modeling approaches are described conceptually with examples (but without equations) to help non-mathematicians understand the applications.
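
Of the three categories the primer distinguishes, lumped parameter models are the simplest to state: the aquifer is treated as one storage unit whose discharge depends on total storage, not on spatial detail. A common minimal form (our illustrative choice, not an example from the primer) is the linear reservoir, in which spring discharge is proportional to storage and therefore recedes exponentially when recharge stops:

```python
# Minimal lumped-parameter sketch (illustrative, not from the primer):
# a single linear reservoir with storage S and spring discharge Q = k*S,
# stepped forward with explicit Euler. With no recharge, discharge
# recedes geometrically, approximating Q(t) = Q0 * exp(-k*t).

def linear_reservoir(s0, k, recharge, dt):
    """Return the discharge series Q = k*S for a recharge time series."""
    s, discharge = s0, []
    for r in recharge:
        q = k * s
        s = s + (r - q) * dt  # water balance: dS/dt = R - Q
        discharge.append(q)
    return discharge

# No recharge: classic spring recession starting from Q0 = k * S0 = 50.
q = linear_reservoir(s0=1000.0, k=0.05, recharge=[0.0] * 100, dt=1.0)
print(q[0], q[-1])
```

Distributed parameter models replace this single storage with a spatial grid of such balances, while fitting models skip the water balance entirely and match an assumed response function to observed spring discharge.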

  14. Application of adversarial risk analysis model in pricing strategies with remanufacturing

    Directory of Open Access Journals (Sweden)

    Liurui Deng

    2015-01-01

    Full Text Available Purpose: This paper mainly focuses on the application of adversarial risk analysis (ARA) to pricing strategy with remanufacturing. We hope to obtain more realistic results than the classical model. Moreover, we also wish that our research improves the development of ARA in the pricing strategy of manufacturing or remanufacturing. Approach: In order to obtain more realistic results, combining adversarial risk analysis, we explore the pricing strategy with remanufacturing based on the Stackelberg model. In particular, we build the OEM's 1-order ARA model and further study manufacturers' and remanufacturers' pricing strategies. Findings: We find the OEM's 1-order ARA model for the OEM's product cost C, and we derive the corresponding manufacturers' and remanufacturers' pricing strategies. The pricing strategies based on the 1-order ARA model have an advantage over the classical model for both OEMs and remanufacturers. Research implications: The research on the application of ARA implies that we can obtain more realistic results with this kind of modern risk analysis method, and that ARA can be used extensively in the pricing strategies of supply chains. Value: Our research improves the application of ARA in the remanufacturing industry. Meanwhile, inspired by this analysis, we can also create different ARA models for different parameters. Furthermore, some of the results and analysis methods can be applied to other pricing strategies of supply chains.

  15. Multi-level decision making models, methods and applications

    CERN Document Server

    Zhang, Guangquan; Gao, Ya

    2015-01-01

    This monograph presents new developments in multi-level decision-making theory, techniques and methods, covering both modeling and solution issues. In particular, it presents how a decision support system can support managers in reaching a solution to a multi-level decision problem in practice. This monograph combines decision theories, methods, algorithms and applications effectively. It discusses in detail the models and solution algorithms of each issue of bi-level and tri-level decision-making, such as multi-leaders, multi-followers, multi-objectives, rule-set-based, and fuzzy parameters. Potential readers include organizational managers and practicing professionals, who can use the methods and software provided to solve their real decision problems; PhD students and researchers in the areas of bi-level and multi-level decision-making and decision support systems; and students at advanced undergraduate or master's level in information systems, business administration, or applied computer science.

  16. Variance-based sensitivity indices for models with dependent inputs

    International Nuclear Information System (INIS)

    Mara, Thierry A.; Tarantola, Stefano

    2012-01-01

    Computational models are intensively used in engineering for risk analysis or prediction of future outcomes. Uncertainty and sensitivity analyses are of great help for these purposes. Although several methods exist to perform variance-based sensitivity analysis of model output with independent inputs, only a few have been proposed in the literature for the case of dependent inputs. This is explained by the fact that the theoretical framework for the independent case is established and a univocal set of variance-based sensitivity indices is defined. In the present work, we propose a set of variance-based sensitivity indices to perform sensitivity analysis of models with dependent inputs. These measures allow us to distinguish between the mutual dependent contribution and the independent contribution of an input to the model response variance. Their definition relies on a specific orthogonalisation of the inputs and ANOVA representations of the model output. In the applications, we show the interest of the new sensitivity indices in a model simplification setting. - Highlights: ► Uncertainty and sensitivity analyses are of great help in engineering. ► Several methods exist to perform variance-based sensitivity analysis of model output with independent inputs. ► We define a set of variance-based sensitivity indices for models with dependent inputs. ► Inputs' mutual contributions are distinguished from their independent contributions. ► Analytical and computational tests are performed and discussed.
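For the independent-input baseline that this paper generalizes, first-order variance-based (Sobol') indices can be estimated with a standard pick-freeze Monte Carlo scheme. The additive toy model and sample size below are my own illustrative choices, not taken from the paper.

```python
import numpy as np

# Pick-freeze estimate of first-order Sobol' indices for independent inputs.
rng = np.random.default_rng(0)

def model(x):
    # additive toy model: variance contributions 1, 4, 9, so S = (1, 4, 9)/14
    return x[:, 0] + 2.0 * x[:, 1] + 3.0 * x[:, 2]

n, d = 200_000, 3
A = rng.standard_normal((n, d))
B = rng.standard_normal((n, d))
fA = model(A)
var = fA.var()

S = []
for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]               # freeze input i, resample the others
    # first-order index: Cov(f(A), f(AB_i)) / Var(f)
    S.append(np.mean(fA * (model(ABi) - fA.mean())) / var)

print([round(s, 3) for s in S])       # theoretical values: 1/14, 4/14, 9/14
```

With dependent inputs this simple scheme no longer applies directly, which is the gap the paper's orthogonalisation-based indices address.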

  17. Application of queueing models to multiprogrammed computer systems operating in a time-critical environment

    Science.gov (United States)

    Eckhardt, D. E., Jr.

    1979-01-01

    A model of a central processor (CPU) which services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study which support this application of queueing models are presented.
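The basic M/M/1 quantities behind this model are easy to reproduce. The rates and the interrupt schedule below are my own illustrative numbers, and the interrupt effect is approximated by a simple capacity rescaling rather than the paper's Laplace-transform analysis.

```python
# Background jobs on the CPU as an M/M/1 queue (illustrative rates,
# not the paper's parameters).
lam, mu = 5.0, 10.0            # arrival and service rates (jobs/s)
rho = lam / mu                 # CPU utilisation by background work
W = 1.0 / (mu - lam)           # mean time in system for a background job

# First-order effect of the periodic, deterministic time-critical process:
# it consumes a fixed fraction of CPU capacity, so rescale the service rate.
period, burst = 0.5, 0.1       # the RT task runs 0.1 s in every 0.5 s window
mu_eff = mu * (1.0 - burst / period)
W_int = 1.0 / (mu_eff - lam)   # stability requires mu_eff > lam

print(rho, W, round(W_int, 3))
```

Even this crude approximation shows the qualitative effect the paper studies: the time-critical load stretches background sojourn times well before the CPU saturates.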

  18. Application of computer-aided multi-scale modelling framework – Aerosol case study

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Glarborg, Peter

    2011-01-01

    Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy and water. This trend is set to continue due to the substantial benefits computer-aided...... methods provide. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task involving...... numerous steps, expert skills and different modelling tools. This motivates the development of a computer-aided modelling framework that supports the user during model development, documentation, analysis, identification, application and re-use with the goal to increase the efficiency of the modelling...

  19. A critical view on temperature modelling for application in weather derivatives markets

    International Nuclear Information System (INIS)

    Šaltytė Benth, Jūratė; Benth, Fred Espen

    2012-01-01

    In this paper we present a stochastic model for daily average temperature. The model contains seasonality, a low-order autoregressive component and a variance describing the heteroskedastic residuals. The model is estimated on daily average temperature records from Stockholm (Sweden). By comparing the proposed model with the popular model of Campbell and Diebold (2005), we point out some important issues to be addressed when modelling the temperature for application in weather derivatives market. - Highlights: ► We present a stochastic model for daily average temperature, containing seasonality, a low-order autoregressive component and a variance describing the heteroskedastic residuals. ► We compare the proposed model with the popular model of Campbell and Diebold (2005). ► Some important issues to be addressed when modelling the temperature for application in weather derivatives market are pointed out.

  20. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2013-01-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.
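A drastically simplified version of such a contention-plus-communication model can be written down directly. The functional form and all parameters below are my own sketch, not the authors' calibrated framework for POWER4, POWER5+ or BlueGene/P.

```python
import math

# Toy performance model: per-core compute time stretches when the cores on a
# node demand more memory bandwidth than the node sustains, plus a
# tree-structured communication term for weak scaling. All values are
# illustrative assumptions.
def predicted_time(t1, cores_per_node, bw_per_core, bw_sustained, nodes,
                   t_msg=1e-4):
    contention = cores_per_node * bw_per_core / bw_sustained
    t_comp = t1 * max(1.0, contention)          # bandwidth contention factor
    t_comm = t_msg * math.log2(max(2, nodes))   # log-depth exchange cost
    return t_comp + t_comm

print(predicted_time(1.0, 4, 5.0, 10.0, 64))
```

The real framework calibrates the contention term from STREAM-style measured sustained bandwidth and the communication term from MPI benchmarks, which is what keeps its GTC predictions within the reported 7.77% error.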

  1. Performance modeling of hybrid MPI/OpenMP scientific applications on large-scale multicore supercomputers

    KAUST Repository

    Wu, Xingfu

    2013-12-01

    In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore supercomputers: IBM POWER4, POWER5+ and BlueGene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks and Intel's MPI benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore supercomputers because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application: Gyrokinetic Toroidal Code (GTC) in magnetic fusion to validate our performance model of the hybrid application on these multicore supercomputers. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore supercomputers. © 2013 Elsevier Inc.

  2. APPLICATION OF KMV MODEL TO ASSESS CREDIT RISK OF INDIVIDUAL ENTREPRENEURS

    Directory of Open Access Journals (Sweden)

    Taishin A. A.

    2014-09-01

    Full Text Available The problem of credit risk is relevant for banks. The purpose of this research is to develop a technique for adapting and applying the KMV model to evaluate the credit risk of Russian entrepreneurs. The proposed KMV-based method for evaluating the credit risk of Russian entrepreneurs has many advantages: automation of the calculations, based on plausible assumptions, significantly reduces the time needed to process a customer's request. The article analyses the KMV model in the light of up-to-date theoretical results. The author investigates possible modifications and generalizations of the model, and a practical implementation of the KMV default-risk estimate using the Visual Basic for Applications software package, illustrated on an entrepreneur's management reporting, showing the features of its application in view of modern achievements in the theory and practice of financial analysis. The article presents the finished risk-evaluation procedure for Russian entrepreneurs and offers precise recommendations for the practical use of KMV as a basic risk-assessment tool.
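The core of the KMV approach is a Merton-style distance-to-default. A minimal sketch follows, with made-up balance-sheet numbers; in the full KMV procedure the asset value and asset volatility are backed out from observed equity via the Merton equations, whereas here they are simply taken as given.

```python
from math import log, sqrt, erf

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Illustrative inputs (not the paper's data):
V, sigma_V = 120.0, 0.25   # asset value and asset volatility
DP = 80.0                  # default point ~= short-term debt + 0.5 * long-term debt
mu, T = 0.05, 1.0          # asset drift and horizon (years)

# distance-to-default and the normal-model expected default frequency
dd = (log(V / DP) + (mu - 0.5 * sigma_V**2) * T) / (sigma_V * sqrt(T))
edf = norm_cdf(-dd)
print(round(dd, 2), round(edf, 4))
```

KMV proper maps the distance-to-default to an empirical default frequency rather than the Gaussian tail used here, which is one of the adaptations the article discusses.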

  3. Modelling a New Product Model on the Basis of an Existing STEP Application Protocol

    Directory of Open Access Journals (Sweden)

    B.-R. Hoehn

    2005-01-01

    Full Text Available During the last years, a great range of computer-aided tools has been generated to support the development process of various products. The goal of a continuous data flow, needed for high efficiency, requires powerful standards for data exchange. At the FZG (Gear Research Centre of the Technical University of Munich) there was a need for a common gear data format for data exchange between gear calculation programs. The STEP standard ISO 10303 was developed for this type of purpose, but a suitable definition of gear data was still missing, even in the Application Protocol AP 214, developed for the design process in the automotive industry. The creation of a new STEP Application Protocol, or the extension of an existing protocol, would be a very time-consuming normative process. So a new method was introduced by FZG. Some very general definitions of an Application Protocol (here AP 214) were used to determine rules for an exact specification of the required kind of data. In this case a product model for gear units was defined based on elements of AP 214, so no change to the Application Protocol is necessary. Meanwhile the product model for gear units has been published as a VDMA paper and successfully introduced for data exchange within the German gear industry associated with FVA (German Research Organisation for Gears and Transmissions). This method can also be adopted for other applications not yet sufficiently defined by STEP.

  4. A comparative study of inelastic scattering models at energy levels ranging from 0.5 keV to 10 keV

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Chia-Yu [Department of Photonics, National Cheng Kung University, Tainan 701, Taiwan (China); Lin, Chun-Hung, E-mail: chlin@mail.ncku.edu.tw [Department of Photonics, National Cheng Kung University, Tainan 701, Taiwan (China); Advanced Optoelectronic Technology Center, National Cheng Kung University, Tainan 701, Taiwan (China)

    2017-03-01

    Six models, including a single-scattering model, four hybrid models, and one dielectric function model, were evaluated using Monte Carlo simulations for aluminum and copper at incident beam energies ranging from 0.5 keV to 10 keV. The inelastic mean free path, mean energy loss per unit path length, and backscattering coefficients obtained by these models are compared and discussed to understand the merits of the various models. ANOVA (analysis of variance) statistical models were used to quantify the effects of the inelastic cross-section and energy loss models on the basis of the deviation of the simulated results from the experimental data for the inelastic mean free path, the mean energy loss per unit path length, and the backscattering coefficient, as well as their correlations. This study is believed to be the first application of ANOVA models to the evaluation of inelastic electron beam scattering models. This approach is an improvement over the traditional approach, which involves only visual estimation of the difference between the experimental data and simulated results. The data suggest that optimizing the effective electron number per atom, binding energy, and cut-off energy of an inelastic model for different materials at different beam energies is more important than the selection of inelastic models for Monte Carlo electron scattering simulation. During the simulations, parameters in the equations should be tuned according to the material and beam energy rather than merely employing default parameters for an arbitrary material. Energy loss models and cross-section formulas are not the main factors influencing energy loss. Comparison of the deviation of the simulated results from the experimental data shows a significant correlation (p < 0.05) between the backscattering coefficient and energy loss per unit path length. The inclusion of backscattering electrons generated by both primary and secondary electrons for
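The ANOVA comparison described here can be illustrated with a one-way layout: per-model deviations from experiment tested for a common mean. The three-model grouping and all numbers below are fabricated for illustration; the paper applies the same test to its six models and real measurements.

```python
import numpy as np

# One-way ANOVA by hand: F = MS_between / MS_within on made-up deviation data.
dev = {
    "single":     np.array([0.12, 0.15, 0.11, 0.14]),
    "hybrid":     np.array([0.09, 0.08, 0.11, 0.10]),
    "dielectric": np.array([0.05, 0.06, 0.04, 0.07]),
}
groups = list(dev.values())
k = len(groups)                         # number of models (treatment levels)
n = sum(len(g) for g in groups)         # total observations
grand = np.concatenate(groups).mean()

ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
F = (ss_between / (k - 1)) / (ss_within / (n - k))
print(round(float(F), 2))
```

Here F far exceeds the 5% critical value F(2, 9) = 4.26, so the mean deviations of the (fabricated) models would be judged significantly different.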

  5. Methodology and application of combined watershed and ground-water models in Kansas

    Science.gov (United States)

    Sophocleous, M.; Perkins, S.P.

    2000-01-01

    Increased irrigation in Kansas and other regions during the last several decades has caused serious water depletion, making the development of comprehensive strategies and tools to resolve such problems increasingly important. This paper makes the case for an intermediate complexity, quasi-distributed, comprehensive, large-watershed model, which falls between the fully distributed, physically based hydrological modeling system of the type of the SHE model and the lumped, conceptual rainfall-runoff modeling system of the type of the Stanford watershed model. This is achieved by integrating the quasi-distributed watershed model SWAT with the fully-distributed ground-water model MODFLOW. The advantage of this approach is the appreciably smaller input data requirements and the use of readily available data (compared to the fully distributed, physically based models), the statistical handling of watershed heterogeneities by employing the hydrologic-response-unit concept, and the significantly increased flexibility in handling stream-aquifer interactions, distributed well withdrawals, and multiple land uses. The mechanics of integrating the component watershed and ground-water models are outlined, and three real-world management applications of the integrated model from Kansas are briefly presented. Three different aspects of the integrated model are emphasized: (1) management applications of a Decision Support System for the integrated model (Rattlesnake Creek subbasin); (2) alternative conceptual models of spatial heterogeneity related to the presence or absence of an underlying aquifer with shallow or deep water table (Lower Republican River basin); and (3) the general nature of the integrated model linkage by employing a watershed simulator other than SWAT (Wet Walnut Creek basin). These applications demonstrate the practicality and versatility of this relatively simple and conceptually clear approach, making public acceptance of the integrated watershed modeling

  6. Discrete non-parametric kernel estimation for global sensitivity analysis

    International Nuclear Information System (INIS)

    Senga Kiessé, Tristan; Ventura, Anne

    2016-01-01

    This work investigates the discrete kernel approach for evaluating the contribution of the variance of discrete input variables to the variance of model output, via analysis of variance (ANOVA) decomposition. Until recently, only the continuous kernel approach had been applied as a metamodeling approach within the sensitivity analysis framework, for both discrete and continuous input variables. The discrete kernel estimation is now known to be suitable for smoothing discrete functions. We present a discrete non-parametric kernel estimator of the ANOVA decomposition of a given model. An estimator of sensitivity indices is also presented, with its asymptotic convergence rate. Simulations on a test function and a real case study from agriculture have shown that the discrete kernel approach outperforms the continuous kernel one for evaluating the contribution of moderately or most influential discrete parameters to the model output. - Highlights: • We study a discrete kernel estimation for sensitivity analysis of a model. • A discrete kernel estimator of the ANOVA decomposition of the model is presented. • Sensitivity indices are calculated for discrete input parameters. • An estimator of sensitivity indices is also presented with its convergence rate. • An application is realized for improving the reliability of environmental models.
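A minimal sketch of discrete kernel smoothing, assuming the binomial associated kernel commonly used in this literature (a Binomial(x+1, (x+h)/(x+1)) weight centred near the integer point x); the data, bandwidth, and Nadaraya-Watson form are my own illustrative choices, not the paper's estimator.

```python
from math import comb

def binom_pmf(k, n, p):
    # binomial probability mass function, zero outside the support
    return comb(n, k) * p**k * (1 - p)**(n - k) if 0 <= k <= n else 0.0

def discrete_kernel(x, y, h):
    # binomial associated kernel at target x, bandwidth h in (0, 1]
    return binom_pmf(y, x + 1, (x + h) / (x + 1))

def smooth(xs, ys, h=0.3):
    # Nadaraya-Watson smoothing of a discrete function on integer support
    out = []
    for x in xs:
        w = [discrete_kernel(x, xi, h) for xi in xs]
        out.append(sum(wi * yi for wi, yi in zip(w, ys)) / sum(w))
    return out

xs = list(range(8))
ys = [3, 5, 4, 6, 9, 8, 11, 10]     # noisy discrete response
print([round(v, 1) for v in smooth(xs, ys)])
```

Unlike a continuous kernel, this weight is a probability mass function on the integers, so no mass is wasted between support points, which is the advantage the paper exploits for discrete inputs.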

  7. A review of model applications for structured soils: b) Pesticide transport.

    Science.gov (United States)

    Köhne, John Maximilian; Köhne, Sigrid; Simůnek, Jirka

    2009-02-16

    The past decade has seen considerable progress in the development of models simulating pesticide transport in structured soils subject to preferential flow (PF). Most PF pesticide transport models are based on the two-region concept and usually assume one (vertical) dimensional flow and transport. Stochastic parameter sets are sometimes used to account for the effects of spatial variability at the field scale. In the past decade, PF pesticide models were also coupled with Geographical Information Systems (GIS) and groundwater flow models for application at the catchment and larger regional scales. A review of PF pesticide model applications reveals that the principal difficulty of their application is still the appropriate parameterization of PF and pesticide processes. Experimental solution strategies involve improving measurement techniques and experimental designs. Model strategies aim at enhancing process descriptions, studying parameter sensitivity, uncertainty, inverse parameter identification, model calibration, and effects of spatial variability, as well as generating model emulators and databases. Model comparison studies demonstrated that, after calibration, PF pesticide models clearly outperform chromatographic models for structured soils. Considering nonlinear and kinetic sorption reactions further enhanced the pesticide transport description. However, inverse techniques combined with typically available experimental data are often limited in their ability to simultaneously identify parameters for describing PF, sorption, degradation and other processes. On the other hand, the predictive capacity of uncalibrated PF pesticide models currently allows at best an approximate (order-of-magnitude) estimation of concentrations. Moreover, models should target the entire soil-plant-atmosphere system, including often neglected above-ground processes such as pesticide volatilization, interception, sorption to plant residues, root uptake, and losses by runoff. 

  8. Application of Markovian model to school enrolment projection ...

    African Journals Online (AJOL)

    Application of Markovian model to school enrolment projection process. VU Ekhosuehi, AA Osagiede. Abstract. No Abstract. Global Journal of Mathematical Sciences Vol. 5(1) 2006: 9-16. Full Text: EMAIL FREE FULL TEXT EMAIL FREE FULL TEXT · DOWNLOAD FULL TEXT DOWNLOAD FULL TEXT.

  9. Bayesian Comparison of Alternative Graded Response Models for Performance Assessment Applications

    Science.gov (United States)

    Zhu, Xiaowen; Stone, Clement A.

    2012-01-01

    This study examined the relative effectiveness of Bayesian model comparison methods in selecting an appropriate graded response (GR) model for performance assessment applications. Three popular methods were considered: deviance information criterion (DIC), conditional predictive ordinate (CPO), and posterior predictive model checking (PPMC). Using…

  10. Business model driven service architecture design for enterprise application integration

    OpenAIRE

    Gacitua-Decar, Veronica; Pahl, Claus

    2008-01-01

    Increasingly, organisations are using a Service-Oriented Architecture (SOA) as an approach to Enterprise Application Integration (EAI), which is required for the automation of business processes. This paper presents an architecture development process which guides the transition from business models to a service-based software architecture. The process is supported by business reference models and patterns. Firstly, the business process models are enhanced with domain model elements, applicat...

  11. Virtual 3D City Modeling: Techniques and Applications

    Science.gov (United States)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2013-08-01

    A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation, and other man-made features belonging to an urban area. Various terms are used for 3D city models, such as "Cybertown", "Cybercity", "Virtual City", or "Digital City". A 3D city model is basically a computerized or digital model of a city containing the graphic representation of buildings and other objects in 2.5 or 3D. Generally, three main Geomatics approaches are used for generating virtual 3-D city models: in the first approach, researchers use conventional techniques such as vector map data, DEMs, and aerial images; the second approach is based on high-resolution satellite images with laser scanning; and in the third, many researchers use terrestrial images through close-range photogrammetry with DSM and texture mapping. We start this paper with an introduction to the various Geomatics techniques for 3D city modeling. These techniques are divided into two main categories: one based on automation (automatic, semi-automatic and manual methods), and another based on data input techniques (photogrammetry and laser techniques). This paper gives an overview of the generation of virtual 3-D city models using Geomatics techniques and of the applications of virtual 3D city models, together with a short justification and analysis and the present trend in 3D city modeling. Photogrammetry (close-range, aerial, satellite), lasergrammetry, GPS, or a combination of these modern Geomatics techniques play a major role in creating a virtual 3-D city model. Each technique and method has some advantages and some drawbacks. Point cloud models are a modern trend for virtual 3-D city models. 
Photo-realistic, Scalable, Geo-referenced virtual 3

  12. Multilevel Modeling and Policy Development: Guidelines and Applications to Medical Travel

    Science.gov (United States)

    Garcia-Garzon, Eduardo; Zhukovsky, Peter; Haller, Elisa; Plakolm, Sara; Fink, David; Petrova, Dafina; Mahalingam, Vaishali; Menezes, Igor G.; Ruggeri, Kai

    2016-01-01

    Medical travel has expanded rapidly in recent years, resulting in new markets and increased access to medical care. Whereas several studies investigated the motives of individuals seeking healthcare abroad, the conventional analytical approach is limited by substantial caveats. Classical techniques as found in the literature cannot provide sufficient insight due to the nested nature of data generated. The application of adequate analytical techniques, specifically multilevel modeling, is scarce to non-existent in the context of medical travel. This study introduces the guidelines for application of multilevel techniques in public health research by presenting an application of multilevel modeling in analyzing the decision-making patterns of potential medical travelers. Benefits and potential limitations are discussed. PMID:27252672

  13. Multilevel Modeling and Policy Development: Guidelines and Applications to Medical Travel.

    Science.gov (United States)

    Garcia-Garzon, Eduardo; Zhukovsky, Peter; Haller, Elisa; Plakolm, Sara; Fink, David; Petrova, Dafina; Mahalingam, Vaishali; Menezes, Igor G; Ruggeri, Kai

    2016-01-01

    Medical travel has expanded rapidly in recent years, resulting in new markets and increased access to medical care. Whereas several studies investigated the motives of individuals seeking healthcare abroad, the conventional analytical approach is limited by substantial caveats. Classical techniques as found in the literature cannot provide sufficient insight due to the nested nature of data generated. The application of adequate analytical techniques, specifically multilevel modeling, is scarce to non-existent in the context of medical travel. This study introduces the guidelines for application of multilevel techniques in public health research by presenting an application of multilevel modeling in analyzing the decision-making patterns of potential medical travelers. Benefits and potential limitations are discussed.
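The "nested nature of data" argument can be made concrete with the intraclass correlation (ICC): when respondents are nested in countries, a one-way random-effects ANOVA estimator quantifies how much of the variance sits at the group level. The balanced data below are simulated, not the study's medical-travel data.

```python
import numpy as np

# Simulate respondents nested in countries: true between-country sd 1.0,
# within-country sd 2.0, so the true ICC is 1 / (1 + 4) = 0.20.
rng = np.random.default_rng(42)
g, n = 20, 30                                  # 20 countries, 30 respondents each
country = rng.normal(0.0, 1.0, size=(g, 1))
y = 5.0 + country + rng.normal(0.0, 2.0, size=(g, n))

# One-way random-effects ANOVA estimator of the ICC (balanced design)
ms_between = n * y.mean(axis=1).var(ddof=1)    # mean square between countries
ms_within = y.var(axis=1, ddof=1).mean()       # pooled within-country variance
icc = (ms_between - ms_within) / (ms_between + (n - 1) * ms_within)
print(round(float(icc), 2))                    # true value is 0.20
```

An ICC well above zero is exactly the situation where the classical single-level techniques criticized in the abstract understate standard errors, motivating multilevel modeling.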

  14. Digital terrain modelling development and applications in a policy support environment

    CERN Document Server

    Peckham, Robert Joseph

    2007-01-01

    This publication is the first book on the development and application of digital terrain modelling for regional planning and policy support. It is a compilation of research results by international research groups at the European Commission's Joint Research Centre providing scientific support to the development and implementation of EU environmental policy. Applications include the pan-European River and Catchment Database, European Flood Alert System, European Digital Soil Database and alternative solar energy resources, all discussed in a GIS framework in the context of the INfrastructure for SPatial InfoRmation in Europe (INSPIRE). This practice-oriented book is recommended to practicing environmental modellers and GIS experts working on regional planning and policy support applications.

  15. Cellular potts models multiscale extensions and biological applications

    CERN Document Server

    Scianna, Marco

    2013-01-01

    A flexible, cell-level, and lattice-based technique, the cellular Potts model accurately describes the phenomenological mechanisms involved in many biological processes. Cellular Potts Models: Multiscale Extensions and Biological Applications gives an interdisciplinary, accessible treatment of these models, from the original methodologies to the latest developments. The book first explains the biophysical bases, main merits, and limitations of the cellular Potts model. It then proposes several innovative extensions, focusing on ways to integrate and interface the basic cellular Potts model at the mesoscopic scale with approaches that accurately model microscopic dynamics. These extensions are designed to create a nested and hybrid environment, where the evolution of a biological system is realistically driven by the constant interplay and flux of information between the different levels of description. Through several biological examples, the authors demonstrate a qualitative and quantitative agreement with t...

  16. Constitutive Modeling of Geomaterials Advances and New Applications

    CERN Document Server

    Zhang, Jian-Min; Zheng, Hong; Yao, Yangping

    2013-01-01

    The Second International Symposium on Constitutive Modeling of Geomaterials: Advances and New Applications (IS-Model 2012), is to be held in Beijing, China, during October 15-16, 2012. The symposium is organized by Tsinghua University, the International Association for Computer Methods and Advances in Geomechanics (IACMAG), the Committee of Numerical and Physical Modeling of Rock Mass, Chinese Society for Rock Mechanics and Engineering, and the Committee of Constitutive Relations and Strength Theory, China Institution of Soil Mechanics and Geotechnical Engineering, China Civil Engineering Society. This Symposium follows the first successful International Workshop on Constitutive Modeling held in Hong Kong, which was organized by Prof. JH Yin in 2007.   Constitutive modeling of geomaterials has been an active research area for a long period of time. Different approaches have been used in the development of various constitutive models. A number of models have been implemented in the numerical analyses of geote...

  17. Application of model based control to robotic manipulators

    Science.gov (United States)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1988-01-01

    A robot that can duplicate human motion capabilities in such activities as balancing, reaching, lifting, and moving has been built and tested. These capabilities are achieved through the use of real-time Model-Based Control (MBC) techniques which have recently been demonstrated. MBC accounts for all manipulator inertial forces and provides stable manipulator motion control even at high speeds. To effectively demonstrate the unique capabilities of MBC, an experimental robotic manipulator was constructed, which stands upright, balancing on a two-wheel base. The mathematical modeling of dynamics inherent in MBC permits the control system to perform functions that are impossible with conventional non-model-based methods. These capabilities include: (1) stable control at all speeds of operation; (2) operations requiring dynamic stability such as balancing; (3) detection and monitoring of applied forces without the use of load sensors; (4) manipulator safing via detection of abnormal loads. The full potential of MBC has yet to be realized. The experiments performed for this research are only an indication of the potential applications. MBC has no inherent stability limitations and its range of applicability is limited only by the attainable sampling rate, modeling accuracy, and sensor resolution. Manipulators could be designed to operate at the highest speed mechanically attainable without being limited by control inadequacies. Manipulators capable of operating many times faster than current machines would certainly increase productivity for many tasks.

  18. Modelling of Electrokinetic Processes in Civil and Environmental Engineering Applications

    DEFF Research Database (Denmark)

    Paz-Garcia, Juan Manuel; Johannesson, Björn; Ottosen, Lisbeth M.

    2011-01-01

    A mathematical model for the electrokinetic phenomena is described. Numerical simulations of different applications of electrokinetic techniques to the fields of civil and environmental engineering are included, showing the versatility and consistency of the model. The electrokinetics phenomena......-Nernst-Planck system of equations, accounting for ionic migration, chemical diffusion and advection, is used for modeling the transport process. The advection term contribution is studied by including in the system the water transport through the porous media, mainly due to electroosmosis. The pore solution filling......conditions are assumed between the aqueous species and the solid matrix for a set of feasible chemical equilibrium reactions defined for each specific application. A module for re-establishing the chemical equilibrium has been developed and included in the system for this purpose. Changes in the porosity......

  19. Geometric subspace updates with applications to online adaptive nonlinear model reduction

    DEFF Research Database (Denmark)

    Zimmermann, Ralf; Peherstorfer, Benjamin; Willcox, Karen

    2018-01-01

    In many scientific applications, including model reduction and image processing, subspaces are used as ansatz spaces for the low-dimensional approximation and reconstruction of the state vectors of interest. We introduce a procedure for adapting an existing subspace based on information from … Grassmannian Rank-One Update Subspace Estimation (GROUSE). We establish for GROUSE a closed-form expression for the residual function along the geodesic descent direction. Specific applications of subspace adaptation are discussed in the context of image processing and model reduction of nonlinear partial differential equation systems.

  20. Polynomial model inversion control: numerical tests and applications

    OpenAIRE

    Novara, Carlo

    2015-01-01

    A novel control design approach for general nonlinear systems is described in this paper. The approach is based on the identification of a polynomial model of the system to control and on the on-line inversion of this model. Extensive simulations are carried out to test the numerical efficiency of the approach. Numerical examples of practical interest are presented, concerned with control of the Duffing oscillator, control of a robot manipulator, and insulin regulation in a type 1 diabetic patient.
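The on-line model-inversion step this abstract describes can be illustrated in miniature: given an already-identified polynomial model y = f(u), solve f(u) = y_ref for the control input by Newton iteration. The model coefficients and setpoint below are invented for illustration and are not taken from the paper.

```python
# Sketch of the model-inversion step only: given an identified polynomial
# model y = f(u), find the input u that yields a desired output y_ref by
# Newton iteration on f(u) - y_ref = 0. Coefficients are assumed values.
COEFFS = [0.5, 2.0, 0.3]   # f(u) = 0.5 + 2.0*u + 0.3*u**2 (invented)

def f(u):
    return sum(c * u**k for k, c in enumerate(COEFFS))

def fprime(u):
    return sum(k * c * u**(k - 1) for k, c in enumerate(COEFFS) if k > 0)

def invert(y_ref, u0=0.0, tol=1e-10, max_iter=50):
    u = u0
    for _ in range(max_iter):
        step = (f(u) - y_ref) / fprime(u)
        u -= step
        if abs(step) < tol:
            break
    return u

u = invert(3.0)            # control input that produces y_ref = 3.0
print(round(f(u), 6))      # → 3.0
```

In a real controller this inversion would run once per sampling period, with the previous input as the Newton starting point.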

  1. THE EFFECTS OF COOPERATIVE LEARNING MODEL GROUP INVESTIGATION AND MOTIVATION TOWARD PHYSICS LEARNING RESULTS MAN TANJUNGBALAI

    Directory of Open Access Journals (Sweden)

    Amalia Febri Aristi

    2014-12-01

    Full Text Available This study aimed to determine: (1) whether there is a difference in students' learning outcomes between the Group Investigation and Direct Instruction teaching models; (2) whether there is a difference in students' learning motivation between the two models; and (3) whether there is an interaction between the teaching models and motivation in improving physics learning outcomes. This research is a quasi-experiment. The study population consisted of class XII students of MAN Tanjung Balai, and the sample was selected by randomizing classes. The instruments consisted of: (1) an essay-format achievement test and (2) a student motivation questionnaire. The data were analyzed using two-way ANOVA. The results showed that: (1) there were differences in learning outcomes between students taught with the Group Investigation model and those taught with the Direct Instruction model; (2) there was a difference in learning outcomes between students with low learning motivation and those with high learning motivation in both the Group Investigation and Direct Instruction classrooms; and (3) there was an interaction between the teaching models and learning motivation in improving physics learning outcomes.
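The "ANOVA of two paths" used in studies like this one is a two-way ANOVA: total variance is partitioned into a teaching-model effect, a motivation effect, their interaction, and error. A minimal balanced-design sketch, with invented scores (not the study's data):

```python
# Balanced two-way ANOVA sketch: factor A = teaching model, factor B =
# motivation level, r replicates per cell. Scores are invented.
data = {                                   # (model, motivation) -> scores
    ("GI", "high"): [85, 88, 90], ("GI", "low"): [78, 75, 80],
    ("DI", "high"): [80, 82, 79], ("DI", "low"): [70, 72, 68],
}
A = sorted({a for a, _ in data})
B = sorted({b for _, b in data})
r = len(next(iter(data.values())))

def mean(xs):
    return sum(xs) / len(xs)

allv = [x for v in data.values() for x in v]
grand = mean(allv)
a_mean = {a: mean([x for b in B for x in data[(a, b)]]) for a in A}
b_mean = {b: mean([x for a in A for x in data[(a, b)]]) for b in B}
cell = {k: mean(v) for k, v in data.items()}

# Sums of squares for the main effects, interaction, and error
ss_a = r * len(B) * sum((a_mean[a] - grand) ** 2 for a in A)
ss_b = r * len(A) * sum((b_mean[b] - grand) ** 2 for b in B)
ss_ab = r * sum((cell[(a, b)] - a_mean[a] - b_mean[b] + grand) ** 2
                for a in A for b in B)
ss_e = sum((x - cell[k]) ** 2 for k, v in data.items() for x in v)
ss_t = sum((x - grand) ** 2 for x in allv)

df_a, df_b = len(A) - 1, len(B) - 1
df_ab, df_e = df_a * df_b, len(A) * len(B) * (r - 1)
f_a = (ss_a / df_a) / (ss_e / df_e)      # teaching-model main effect
f_b = (ss_b / df_b) / (ss_e / df_e)      # motivation main effect
f_ab = (ss_ab / df_ab) / (ss_e / df_e)   # interaction
print(abs(ss_t - (ss_a + ss_b + ss_ab + ss_e)) < 1e-9)  # partition identity → True
```

Each F ratio is then compared against an F distribution with the matching degrees of freedom to decide significance.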

  2. Recent Advances in Material and Geometrical Modelling in Dental Applications

    Directory of Open Access Journals (Sweden)

    Waleed M. S. Al Qahtani

    2018-06-01

    Full Text Available This article briefly reviews recent advances in dental materials and geometric modelling in dental applications. The most common categories of dental materials, such as metallic alloys, composites, ceramics and nanomaterials, are briefly described. Nanotechnology has improved the quality of dental biomaterials: it enhances the properties of many existing materials and introduces new materials with superior properties, covering a wide range of applications in dentistry. Geometric modelling is discussed as a concept, with examples, within this article. Geometric modelling with engineering Computer-Aided-Design (CAD) systems is highly satisfactory for further analysis or Computer-Aided-Manufacturing (CAM) processes. Geometric modelling extracted from Computed-Tomography (CT) images (or similar techniques) for CAM purposes has also reached a sufficient level of accuracy, although obtaining an efficient solid model without considerable effort in healing body surfaces, faces, and gaps remains questionable. This article is a compilation of knowledge drawn from lectures, workshops, books, journal articles, internet sources, dental forums, and scientific group discussions.

  3. Modelling and simulation of diffusive processes methods and applications

    CERN Document Server

    Basu, SK

    2014-01-01

    This book addresses the key issues in the modeling and simulation of diffusive processes from a wide spectrum of different applications across a broad range of disciplines. Features: discusses diffusion and molecular transport in living cells and suspended sediment in open channels; examines the modeling of peristaltic transport of nanofluids, and isotachophoretic separation of ionic samples in microfluidics; reviews thermal characterization of non-homogeneous media and scale-dependent porous dispersion resulting from velocity fluctuations; describes the modeling of nitrogen fate and transport

  4. A conceptual holding model for veterinary applications

    Directory of Open Access Journals (Sweden)

    Nicola Ferrè

    2014-05-01

    Full Text Available Spatial references are required when geographical information systems (GIS are used for the collection, storage and management of data. In the veterinary domain, the spatial component of a holding (of animals is usually defined by coordinates, and no other relevant information needs to be interpreted or used for manipulation of the data in the GIS environment provided. Users trying to integrate or reuse spatial data organised in such a way, frequently face the problem of data incompatibility and inconsistency. The root of the problem lies in differences with respect to syntax as well as variations in the semantic, spatial and temporal representations of the geographic features. To overcome these problems and to facilitate the inter-operability of different GIS, spatial data must be defined according to a “schema” that includes the definition, acquisition, analysis, access, presentation and transfer of such data between different users and systems. We propose an application “schema” of holdings for GIS applications in the veterinary domain according to the European directive framework (directive 2007/2/EC - INSPIRE. The conceptual model put forward has been developed at two specific levels to produce the essential and the abstract model, respectively. The former establishes the conceptual linkage of the system design to the real world, while the latter describes how the system or software works. The result is an application “schema” that formalises and unifies the information-theoretic foundations of how to spatially represent a holding in order to ensure straightforward information-sharing within the veterinary community.

  5. An application of artificial intelligence for rainfall–runoff modeling

    Indian Academy of Sciences (India)

    This study proposes an application of two techniques of artificial intelligence (AI) ... (2006) applied rainfall–runoff modeling using ANN ... in artificial intelligence, engineering and science ... usually be estimated from a sample of observations.

  6. The social networking application success model : An empirical study of Facebook and Twitter

    NARCIS (Netherlands)

    Ou, Carol; Davison, R.M.; Huang, Q.

    2016-01-01

    Social networking applications (SNAs) are among the fastest growing web applications of recent years. In this paper, we propose a causal model to assess the success of SNAs, grounded on DeLone and McLean’s updated information systems (IS) success model. In addition to their original three dimensions

  7. Surrogate Model for Recirculation Phase LBLOCA and DET Application

    International Nuclear Information System (INIS)

    Fynan, Douglas A; Ahn, Kwang-Il; Lee, John C.

    2014-01-01

    In the nuclear safety field, response surfaces were used in the first demonstration of the code scaling, applicability, and uncertainty (CSAU) methodology to quantify the uncertainty of the peak clad temperature (PCT) during a large-break loss-of-coolant accident (LBLOCA). Surrogates could have applications in other nuclear safety areas such as dynamic probabilistic safety assessment (PSA). Dynamic PSA attempts to couple the probabilistic nature of failure events, component transitions, and human reliability to deterministic calculations of time-dependent nuclear power plant (NPP) responses, usually through the use of thermal-hydraulic (TH) system codes. The overall mathematical complexity of dynamic PSA architectures, with many embedded computationally expensive TH code calculations and large input/output data streams, has limited realistic studies of NPPs. This paper presents a time-dependent surrogate model for the recirculation phase of a hot leg LBLOCA in the OPR-1000. The surrogate model is developed through the ACE algorithm, a powerful nonparametric regression technique, trained on RELAP5 simulations of the LBLOCA. Benchmarking of the surrogate is presented, along with an application to a simplified dynamic event tree (DET). A time-dependent surrogate model to predict core subcooling during the recirculation phase of a hot leg LBLOCA in the OPR-1000 has been developed. The surrogate assumed the structure of a general discrete-time dynamic model and learned the nonlinear functional form by performing nonparametric regression on RELAP5 simulations with the ACE algorithm. The surrogate model input parameters represent mass and energy flux terms to the RCS that appeared as user-supplied or code-calculated boundary conditions in the RELAP5 model. The surrogate accurately predicted the TH behavior of the core for a variety of HPSI system performance and containment conditions when compared with RELAP5 simulations. The surrogate was applied in a DET application replacing …

  8. Temperature modulated differential scanning calorimetry. Modelling and applications

    International Nuclear Information System (INIS)

    Jiang, Z.

    2000-01-01

    The research focused on the TMDSC technique with respect to both theoretical problems and applications. Modelling has been performed to address the effects of heat transfer on the quantitative measurement in TMDSC experiments. A procedure has been suggested to correct the effect on the phase angle obtained by dynamic TMDSC. The effects under quasi-isothermal conditions have been investigated using improved models in terms of various heat-transfer interface qualities, sample properties and sensor properties. The contributions of the sensor's properties to the heat transfer are, for the first time, separated from the overall effects. All the modelling results are compared with the corresponding experimental data, and they are in good agreement. Ripples and fluctuations on the experimental signals during some transitions have been simulated using a simple model and shown to be artefacts of the Fourier transformation process. The applications of TMDSC to both research and commercial samples are reported, varying either the experimental conditions or the thermal history of the sample, for studies of the glass transition, cold crystallisation, the melting transition, the clearing transition of a liquid crystal polymer, and the vitrification of an epoxy resin under quasi-isothermal conditions. The results show that the interpretation of some quantities obtained by TMDSC for some physical transitions still needs to be clarified by further work. The applications also show the ability of TMDSC to combine the sensitivity of a measurement at a high instantaneous heating rate with the resolution obtained by measuring at a low underlying heating rate, and to measure the heat capacity of the sample and its variation under quasi-isothermal conditions. The frequency-dependent complex heat capacity during the glass transition provides a window to measure the apparent activation energy of the transition, which is different from the window used by conventional …

  9. A Contrast-Based Computational Model of Surprise and Its Applications.

    Science.gov (United States)

    Macedo, Luis; Cardoso, Amílcar

    2017-11-19

    We review our work on a contrast-based computational model of surprise and its applications. The review is contextualized within related research from psychology, philosophy, and particularly artificial intelligence. Influenced by psychological theories of surprise, the model assumes that surprise-eliciting events initiate a series of cognitive processes that begin with the appraisal of the event as unexpected, continue with the interruption of ongoing activity and the focusing of attention on the unexpected event, and culminate in the analysis and evaluation of the event and the revision of beliefs. It is assumed that the intensity of surprise elicited by an event is a nonlinear function of the difference or contrast between the subjective probability of the event and that of the most probable alternative event (which is usually the expected event); and that the agent's behavior is partly controlled by actual and anticipated surprise. We describe applications of artificial agents that incorporate the proposed surprise model in three domains: the exploration of unknown environments, creativity, and intelligent transportation systems. These applications demonstrate the importance of surprise for decision making, active learning, creative reasoning, and selective attention. Copyright © 2017 Cognitive Science Society, Inc.
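The contrast idea in the record above, surprise as a nonlinear function of the gap between the observed event's probability and that of the most probable alternative, can be sketched as follows. The specific log-based form here is an assumption chosen for illustration, not necessarily the authors' exact function.

```python
import math

# One plausible instantiation (an assumption, not necessarily the
# authors' exact formula) of contrast-based surprise: surprise grows
# nonlinearly with the gap between the probability of the most expected
# event and that of the observed event.
def surprise(p, event):
    """p: dict mapping events to probabilities; event: observed outcome."""
    p_max = max(p.values())
    return math.log2(1 + p_max - p[event])

forecast = {"sunny": 0.7, "cloudy": 0.2, "rain": 0.1}  # invented example
print(surprise(forecast, "sunny"))                      # expected event → 0.0
print(surprise(forecast, "rain") > surprise(forecast, "cloudy"))  # → True
```

The key property the abstract requires is preserved: the expected event elicits zero surprise, and less probable outcomes elicit monotonically more.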

  10. On the application of a single-species dynamic population model | Iguda ...

    African Journals Online (AJOL)

    The mathematical models of Malthus and Verhulst were applied to ten years of data collected from Magaram Poultry Farm to determine the nature of population growth, population decay or constant ... Keywords: Birth rate, sustainable population, overcrowding, harvesting, independent t-test and one-way ANOVA.
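The two population models named in the record have simple closed forms: Malthusian (exponential) growth, and the Verhulst logistic model, which saturates at a carrying capacity K. A sketch with invented parameters (not the farm's data):

```python
import math

# Closed-form solutions of the Malthus and Verhulst population models.
# Parameter values below are invented for illustration.
def malthus(p0, r, t):
    """Exponential growth: dP/dt = r*P."""
    return p0 * math.exp(r * t)

def verhulst(p0, r, K, t):
    """Logistic growth: dP/dt = r*P*(1 - P/K), carrying capacity K."""
    return K / (1 + (K / p0 - 1) * math.exp(-r * t))

p0, r, K = 100.0, 0.3, 5000.0
print(round(malthus(p0, r, 10), 1))        # → 2008.6 (unbounded growth)
print(round(verhulst(p0, r, K, 10), 1))    # → 1453.7 (slowed by K)
print(verhulst(p0, r, K, 1e9) == K)        # logistic saturates at K → True
```

The contrast between the two outputs is the "growth, decay or constant" distinction the abstract refers to: Malthus grows without bound, while Verhulst levels off at the carrying capacity.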

  11. Modelling and application of stochastic processes

    CERN Document Server

    1986-01-01

    The subject of modelling and application of stochastic processes is too vast to be exhausted in a single volume. In this book, attention is focused on a small subset of this vast subject. The primary emphasis is on realization and approximation of stochastic systems. Recently there has been considerable interest in the stochastic realization problem, and hence an attempt has been made here to collect in one place some of the more recent approaches and algorithms for solving the stochastic realization problem. Various different approaches for realizing linear minimum-phase systems, linear nonminimum-phase systems, and bilinear systems are presented. These approaches range from time-domain methods to spectral-domain methods. An overview of the chapter contents briefly describes these approaches. Also, in most of these chapters special attention is given to the problem of developing numerically efficient algorithms for obtaining reduced-order (approximate) stochastic realizations. On the application side,...

  12. Hydromechanical modelling with application in sealing for underground waste deposition

    Energy Technology Data Exchange (ETDEWEB)

    Hasal, Martin, E-mail: martin.hasal@vsb.cz; Michalec, Zdeněk; Blaheta, Radim [Institute of Geonics AS CR, Studentska 1768, 70800 Ostrava-Poruba (Czech Republic)

    2015-03-10

    Hydro-mechanical models appear in the simulation of many environmental problems related to the construction of engineering barriers against contaminant spreading. The presented work aims at modelling bentonite-sand barriers, which can be used for nuclear waste isolation and similar problems. In particular, we use a hydro-mechanical model coupling unsaturated flow and (nonlinear) elasticity, implement this model in COMSOL software, and show its application in the simulation of an infiltration test (2D axisymmetric model) and the SEALEX Water test WT1 experiment (3D model). Finally, we discuss the needs and possibilities of parallel high-performance computing.

  13. Models of the Organizational Life Cycle: Applications to Higher Education.

    Science.gov (United States)

    Cameron, Kim S.; Whetten, David A.

    1983-01-01

    A review of models of group and organization life cycle development is provided, and the applicability of those models to institutions of higher education is discussed. An understanding of the problems and characteristics present in different life cycle stages can help institutions manage transitions more effectively. (Author/MLW)

  14. SWAT Check: A Screening Tool to Assist Users in the Identification of Potential Model Application Problems.

    Science.gov (United States)

    White, Michael J; Harmel, R Daren; Arnold, Jeff G; Williams, Jimmy R

    2014-01-01

    The Soil and Water Assessment Tool (SWAT) is a basin-scale hydrologic model developed by the United States Department of Agriculture, Agricultural Research Service. SWAT's broad applicability, user-friendly model interfaces, and automatic calibration software have led to a rapid increase in the number of new users. These advancements also allow less experienced users to conduct SWAT modeling applications. In particular, the use of automated calibration software may produce simulated values that appear appropriate because they adequately mimic measured data used in calibration and validation. Autocalibrated model applications (and often those of inexperienced modelers) may contain input data errors and inappropriate parameter adjustments not readily identified by users or the autocalibration software. The objective of this research was to develop a program to assist users in the identification of potential model application problems. The resulting "SWAT Check" is a stand-alone Microsoft Windows program that (i) reads selected SWAT output and alerts users to values outside the typical range; (ii) creates process-based figures for visualization of the appropriateness of output values, including important outputs that are commonly ignored; and (iii) detects and alerts users to common model application errors. By alerting users to potential model application problems, this software should assist the SWAT community in developing more reliable modeling applications. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
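The screening behaviour described in (i) above, reading outputs and flagging values outside typical ranges, can be sketched generically. The variable names and ranges below are invented placeholders, not SWAT's actual outputs or SWAT Check's actual thresholds.

```python
# Generic sketch of range-based output screening in the style of SWAT
# Check. Output names and "typical" ranges are invented placeholders.
TYPICAL_RANGES = {                 # output -> (low, high), assumed values
    "et_mm": (300.0, 900.0),               # annual evapotranspiration
    "surface_runoff_mm": (0.0, 600.0),
    "sediment_yield_t_ha": (0.0, 20.0),
}

def check_outputs(outputs):
    """Return an alert string for every output outside its typical range."""
    alerts = []
    for name, value in outputs.items():
        lo, hi = TYPICAL_RANGES.get(name, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            alerts.append(f"{name}={value} outside typical range [{lo}, {hi}]")
    return alerts

print(check_outputs({"et_mm": 1200.0, "surface_runoff_mm": 150.0}))
```

The value of this kind of screen is that it catches physically implausible outputs even when calibration statistics look good.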

  15. Chemical Equilibrium Modeling of Hanford Waste Tank Processing: Applications of Fundamental Science

    International Nuclear Information System (INIS)

    Felmy, Andrew R.; Wang, Zheming; Dixon, David A.; Hess, Nancy J.

    2004-01-01

    The development of computational models based upon fundamental science is one means of quantitatively transferring the results of scientific investigations to practical application by engineers in laboratory and field situations. This manuscript describes one example of such efforts, specifically the development and application of chemical equilibrium models to different waste management issues at the U.S. Department of Energy (DOE) Hanford Site. The development of the chemical models is described with an emphasis on the fundamental science investigations undertaken in model development, followed by examples of different waste management applications. The waste management issues include the leaching of waste slurries to selectively remove non-hazardous components and the separation of Sr-90 and transuranics from the waste supernatants. The fundamental science contributions include: molecular simulations of the energetics of different molecular clusters to assist in determining the species present in solution, advanced synchrotron research to determine the chemical form of precipitates, and laser-based spectroscopic studies of solutions and solids.

  16. Electromagnetic Modelling of MMIC CPWs for High Frequency Applications

    Science.gov (United States)

    Sinulingga, E. P.; Kyabaggu, P. B. K.; Rezazadeh, A. A.

    2018-02-01

    The theoretical electrical characteristics of components can be realised through modelling using computer-aided design (CAD) simulation tools. If the simulation model provides the expected characteristics, the fabrication process of a Monolithic Microwave Integrated Circuit (MMIC) can be performed for experimental verification purposes, and improvements can be suggested before mass fabrication takes place. This research concentrates on the development of MMIC technology by providing accurate predictions of the characteristics of MMIC components using an improved electromagnetic (EM) modelling technique. The knowledge acquired from the modelling and characterisation process in this work can be adopted by circuit designers for various high-frequency applications.

  17. Modelling of a Hybrid Energy System for Autonomous Application

    Directory of Open Access Journals (Sweden)

    Yang He

    2013-10-01

    Full Text Available A hybrid energy system (HES) is a trending power supply solution for autonomous devices. With the help of an accurate system model, HES development can be efficient and well directed. Despite the availability of various precise unit models, complete HES system models are rarely developed. This paper proposes a system modelling approach which applies power flux conservation as the governing equation and adapts and modifies unit models of solar cells, piezoelectric generators, a Li-ion battery and a super-capacitor. A generalized power harvest, storage and management strategy is also suggested to adapt to various application scenarios.

  18. Application of the Technology Acceptance Model (TAM) in electronic ...

    African Journals Online (AJOL)

    Application of the Technology Acceptance Model (TAM) in electronic ticket purchase for ... current study examined the perceived usefulness and ease of use of online technology ... The findings are discussed in the light of these perspectives.

  19. Application of snowmelt runoff model (SRM in mountainous watersheds: A review

    Directory of Open Access Journals (Sweden)

    Shalamu Abudu

    2012-06-01

    Full Text Available The snowmelt runoff model (SRM) has been widely used in simulation and forecasting of streamflow in snow-dominated mountainous basins around the world. This paper presents an overall review of worldwide applications of SRM in mountainous watersheds, particularly in data-sparse watersheds of northwestern China. Issues related to the proper selection of input climate variables and parameters, and the determination of the snow cover area (SCA) using remote sensing data in snowmelt runoff modeling, are discussed through an extensive review of the literature. Preliminary applications of SRM in northwestern China have shown that the model accuracies are relatively acceptable, although most of the watersheds lack measured hydro-meteorological data. Future research could explore the feasibility of modeling snowmelt runoff in data-sparse mountainous watersheds in northwestern China by utilizing snow and glacier cover remote sensing data, geographic information system (GIS) tools, field measurements, and innovative ways of model parameterization.
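The SRM recursion commonly stated in the literature this review surveys (after Martinec and Rango) computes the next day's discharge from degree-day snowmelt, rainfall, and a recession coefficient. A one-step sketch, with invented parameter values:

```python
# One step of the standard SRM recursion, as commonly stated in the
# SRM literature; all parameter values below are invented:
#   Q[n+1] = [cS*a*(T+dT)*S + cR*P] * (A*10000/86400) * (1-k) + Q[n]*k
def srm_step(q_prev, cS, cR, a, T, dT, S, P, A_km2, k):
    melt = cS * a * (T + dT) * S         # degree-day snowmelt depth (cm)
    rain = cR * P                        # rainfall contribution (cm)
    to_m3s = A_km2 * 10000.0 / 86400.0   # cm*km^2/day -> m^3/s
    return (melt + rain) * to_m3s * (1.0 - k) + q_prev * k

# cS, cR: runoff coefficients; a: degree-day factor (cm/C/day); T+dT:
# degree-days (C*day); S: snow cover fraction; P: precipitation (cm);
# A_km2: basin area; k: recession coefficient.
q = srm_step(q_prev=20.0, cS=0.5, cR=0.6, a=0.45, T=4.0, dT=1.0,
             S=0.7, P=0.2, A_km2=250.0, k=0.9)
print(round(q, 2))  # → 20.63
```

The snow cover fraction S is the term the review discusses estimating from remote sensing (the SCA), which is why data-sparse basins can still be modeled.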

  20. Challenges of Microgrids in Remote Communities: A STEEP Model Application

    Directory of Open Access Journals (Sweden)

    Daniel Akinyele

    2018-02-01

    Full Text Available There is a growing interest in the application of microgrids around the world because of their potential for achieving a flexible, reliable, efficient and smart electrical grid system and for supplying energy to off-grid communities, including their economic benefits. Several research studies have examined the application issues of microgrids. However, a lack of in-depth consideration of the enabling planning conditions has been identified as a major reason why microgrids fail in several off-grid communities. This development requires research efforts that consider better strategies and frameworks for sustainable microgrids in remote communities. This paper first presents a comprehensive review of microgrid technologies and their applications. It then proposes the STEEP model to critically examine the failure factors from the social, technical, economic, environmental and policy (STEEP) perspectives. The model details the key dimensions and actions necessary for addressing the challenge of microgrid failure in remote communities. The study uses remote communities within Nigeria, West Africa, as case studies and demonstrates the need for the STEEP approach for a better understanding of microgrid planning and development. Better insights into microgrid systems are expected to address the drawbacks and improve the situation, leading to widespread and sustainable applications in off-grid communities around the world in the future. The paper introduces the sustainable planning framework (SPF) based on the STEEP model, which can form a general basis for planning microgrids in any remote location.

  1. Application of Prognostic Mesoscale Modeling in the Southeast United States

    International Nuclear Information System (INIS)

    Buckley, R.L.

    1999-01-01

    A prognostic model is being used to provide regional forecasts for a variety of applications at the Savannah River Site (SRS). Emergency response dispersion models available at SRS use the space and time-dependent meteorological data provided by this model to supplement local and regional observations. Output from the model is also used locally to aid in forecasting at SRS, and regionally in providing forecasts of the potential time and location of hurricane landfall within the southeast United States

  2. Fault Tolerance Assistant (FTA): An Exception Handling Programming Model for MPI Applications

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Aiman [Univ. of Chicago, IL (United States). Dept. of Computer Science; Laguna, Ignacio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sato, Kento [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Islam, Tanzima [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-05-23

    Future high-performance computing systems may face frequent failures with their rapid increase in scale and complexity. Resilience to faults has become a major challenge for large-scale applications running on supercomputers, which demands fault tolerance support for prevalent MPI applications. Among failure scenarios, process failures are one of the most severe issues as they usually lead to termination of applications. However, the widely used MPI implementations do not provide mechanisms for fault tolerance. We propose FTA-MPI (Fault Tolerance Assistant MPI), a programming model that provides support for failure detection, failure notification and recovery. Specifically, FTA-MPI exploits a try/catch model that enables failure localization and transparent recovery of process failures in MPI applications. We demonstrate FTA-MPI with synthetic applications and a molecular dynamics code CoMD, and show that FTA-MPI provides high programmability for users and enables convenient and flexible recovery of process failures.

  3. Counseling Model Application: A Student Career Development Guidance for Decision Maker and Consultation

    Science.gov (United States)

    Irwan; Gustientiedina; Sunarti; Desnelita, Yenny

    2017-12-01

    The purpose of this study is to design a counseling model application for a decision-making and consultation system. The application serves as an alternative guidance and individual career development tool for students, covering career knowledge, planning and alternative options, built on an expert tool based on knowledge and rules to provide solutions for students' career decisions. This research produces a counseling model application that obtains important information about student career development and facilitates individual student development, connecting students' plans with their careers according to their talents, interests, abilities, knowledge, personality and other supporting factors. This application model can be used as a tool to obtain information faster and more flexibly for student guidance and counseling, helping students make selections and decisions appropriate to their choice of work.

  4. Applications of Nonlinear Dynamics Model and Design of Complex Systems

    CERN Document Server

    In, Visarath; Palacios, Antonio

    2009-01-01

    This edited book is aimed at interdisciplinary, device-oriented applications of nonlinear science theory and methods in complex systems, in particular applications directed to nonlinear phenomena with space and time characteristics. Examples include: complex networks of magnetic sensor systems, coupled nano-mechanical oscillators, nano-detectors, microscale devices, stochastic resonance in multi-dimensional chaotic systems, biosensors, and stochastic signal quantization. "Applications of Nonlinear Dynamics: Model and Design of Complex Systems" brings together the work of scientists and engineers who are applying ideas and methods from nonlinear dynamics to design and fabricate complex systems.

  5. Econometric Models of Education, Some Applications. Education and Development, Technical Reports.

    Science.gov (United States)

    Tinbergen, Jan; And Others

    This report contains five papers which describe mathematical models of the educational system as it relates to economic growth. Experimental applications of the models to particular educational systems are discussed. Three papers, by L. J. Emmerij, J. Blum, and G. Williams, discuss planning models for the calculation of educational requirements…

  6. Procedure for Application of Software Reliability Growth Models to NPP PSA

    International Nuclear Information System (INIS)

    Son, Han Seong; Kang, Hyun Gook; Chang, Seung Cheol

    2009-01-01

    As the use of software increases at nuclear power plants (NPPs), the necessity of including software reliability and/or safety in the NPP Probabilistic Safety Assessment (PSA) rises. This work proposes an application procedure for software reliability growth models (RGMs), which are the most widely used means of quantifying software reliability, to NPP PSA. Through the proposed procedure, it can be determined whether a software reliability growth model can be applied to the NPP PSA before its actual application. The procedure proposed in this work is expected to be very helpful for incorporating software into NPP PSA.

  7. On the limits of application of the Stephens model

    International Nuclear Information System (INIS)

    Issa, A.; Piepenbring, R.

    1977-01-01

    The limits of the rotation alignment model of Stephens are studied. The conditions of applicability of the assumption of constant j for a unique-parity isolated sub-shell (extended to N=4 and N=3) are discussed and explanations are given. A correct treatment of the eigenstates of the intrinsic motion allows, however, a simple extension of the model to non-isolated sub-shells without enlarging the basis of diagonalisation [fr]

  8. Addressing challenges in obtaining high coverage when model checking android applications

    CSIR Research Space (South Africa)

    Botha, Heila-Marie

    2017-07-01

    Full Text Available … -state-comparator and can be excluded from the state using JPF's @FilterField annotation or other configuration options discussed in [5]. JPF-Android further optimizes state-matching by pre-loading all application classes and application-specific models. It also rede...

  9. Application of declarative modeling approaches for external events

    International Nuclear Information System (INIS)

    Anoba, R.C.

    2005-01-01

Probabilistic Safety Assessments (PSAs) are increasingly being used as a tool for supporting the acceptability of design, procurement, construction, operation, and maintenance activities at nuclear power plants. Since the issuance of Generic Letter 88-20 and subsequent IPE/IPEEE assessments, the NRC has issued several Regulatory Guides, such as RG 1.174, to describe the use of PSA in risk-informed regulation activities. Most PSAs have the capability to address internal events, including internal floods. As more demands are placed on the PSA to support risk-informed applications, there has been a growing need to integrate other external events (seismic, fire, etc.) into the logic models. Most external events involve spatial dependencies and usually impact the logic models at the component level. Therefore, manual insertion of external event impacts into a complex integrated fault tree model may be too cumbersome for routine uses of the PSA. Within the past year, a declarative modeling approach has been developed to automate the injection of external events into the PSA. The intent of this paper is to introduce the concept of declarative modeling in the context of external event applications. A declarative modeling approach involves the definition of rules for the injection of external event impacts into the fault tree logic. A software tool such as EPRI's XInit program can be used to interpret the pre-defined rules and automatically inject external event elements into the PSA. The injection process can easily be repeated, as required, to address plant changes, sensitivity issues, changes in boundary conditions, etc. External event elements may include fire initiating events, seismic initiating events, seismic fragilities, fire-induced hot short events, special human failure events, etc. This approach has been applied at a number of US nuclear power plants, as well as at a nuclear power plant in Romania. (authors)

  10. Functional Modelling for Fault Diagnosis and its application for NPP

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2014-01-01

The paper presents functional modelling and its application for diagnosis in nuclear power plants. Functional modelling is defined, and its relevance for coping with the complexity of diagnosis in large-scale systems like nuclear plants is explained. The diagnosis task is analyzed and it is demon...... operating modes. The FBR example illustrates how the modeling development effort can be managed by proper strategies including decomposition and reuse....

  11. THE APPLICATION OF THE CAPITAL ASSET PRICING MODEL ON THE CROATIAN CAPITAL MARKET

    Directory of Open Access Journals (Sweden)

    Bojan Tomic

    2013-12-01

Full Text Available The paper describes and analyzes the application of the capital asset pricing model (CAPM) and the single-index model on the Zagreb stock exchange during the drop in total trade turnover, mostly in the trade of equity securities. Through its analysis techniques, the model estimates the systematic risk of each share relative to the market portfolio. The model also quantifies the environment in which a company and its stocks exist, expressing it as risk through the beta coefficient. Furthermore, with respect to the market stagnation, one can also question the usefulness of the model, especially if the quality of the input data is questionable. In this regard, the proper application and interpretation of the results obtained from the model during the stagnation of the market, and especially during the stagnation of trade in equity securities, gains even greater importance and significance. On the other hand, the results obtained through the analysis of the data point to problems arising during the application of the model. It turns out that the main problem in applying the CAPM is a market index with negative returns during the observation period.
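The beta estimation the abstract refers to is conventionally an ordinary least squares regression of a stock's returns on the market's returns. A minimal sketch in plain Python, using invented illustrative return series rather than Zagreb exchange data:

```python
def beta_alpha(stock, market):
    """Estimate CAPM beta and alpha by ordinary least squares
    of stock returns on market returns."""
    n = len(stock)
    mean_s = sum(stock) / n
    mean_m = sum(market) / n
    cov = sum((s - mean_s) * (m - mean_m) for s, m in zip(stock, market)) / n
    var_m = sum((m - mean_m) ** 2 for m in market) / n
    beta = cov / var_m           # sensitivity to market movements
    alpha = mean_s - beta * mean_m
    return beta, alpha

# Hypothetical monthly returns, for illustration only
market = [0.01, -0.02, 0.015, -0.03, 0.02, -0.01]
stock = [0.012, -0.025, 0.02, -0.04, 0.028, -0.015]
b, a = beta_alpha(stock, market)
```

A beta above 1 indicates the stock amplifies market movements; as the abstract notes, a market index with negative mean returns makes the interpretation of such estimates problematic.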

  12. An Overview of Generalized Gamma Mittag–Leffler Model and Its Applications

    Directory of Open Access Journals (Sweden)

    Seema S. Nair

    2015-08-01

Full Text Available Recently, probability models with thicker or thinner tails have gained more importance among statisticians and physicists because of their vast applications in random walks, Lévy flights, financial modeling, etc. In this connection, we introduce here a new family of generalized probability distributions associated with the Mittag–Leffler function. This family extends the generalized gamma family, opens up a vast area of potential applications and establishes connections to the topics of fractional calculus, nonextensive statistical mechanics, Tsallis statistics, superstatistics, the Mittag–Leffler stochastic process, the Lévy process and time series. Apart from examining the properties, the matrix-variate analogue and the connection to fractional calculus are also explained. By using the pathway model of Mathai, the model is further generalized. Connections to Mittag–Leffler distributions and corresponding autoregressive processes are also discussed.

  13. A review on application of finite element modelling in bone biomechanics

    Directory of Open Access Journals (Sweden)

    Sandeep Kumar Parashar

    2016-09-01

Full Text Available In the past few decades, finite element modelling has been developed as an effective tool for the modelling and simulation of biomedical engineering systems. Finite element modelling (FEM) is a computational technique which can be used to solve biomedical engineering problems based on the theories of continuum mechanics. This paper presents a state-of-the-art review of finite element modelling applications in four areas of bone biomechanics, i.e., analysis of stress and strain, determination of mechanical properties, fracture fixation design (implants), and fracture load prediction. The aim of this review is to provide comprehensive detail about developments in the application of FEM in bone biomechanics during the last decades. It will help researchers and clinicians alike in the better treatment of patients and the future development of new fixation designs.

  14. Bifurcation software in Matlab with applications in neuronal modeling.

    Science.gov (United States)

    Govaerts, Willy; Sautois, Bart

    2005-02-01

    Many biological phenomena, notably in neuroscience, can be modeled by dynamical systems. We describe a recent improvement of a Matlab software package for dynamical systems with applications to modeling single neurons and all-to-all connected networks of neurons. The new software features consist of an object-oriented approach to bifurcation computations and the partial inclusion of C-code to speed up the computation. As an application, we study the origin of the spiking behaviour of neurons when the equilibrium state is destabilized by an incoming current. We show that Class II behaviour, i.e. firing with a finite frequency, is possible even if the destabilization occurs through a saddle-node bifurcation. Furthermore, we show that synchronization of an all-to-all connected network of such neurons with only excitatory connections is also possible in this case.

  15. Model of students’ sport-oriented physical education with application of information technologies

    Directory of Open Access Journals (Sweden)

    O.M. Olkhovy

    2015-06-01

Full Text Available Purpose: to develop and practically apply approaches to improving the functioning of the physical education system. Material: students (92 boys and 45 girls aged 18-20 years) took part in the research. Results: a structural model of students' sport-oriented physical education with application of information technologies was formed. The main purpose of creating this model was to cultivate students' demand for physical activity and to form a healthy lifestyle in the student environment. The model includes orienting, executive and control components; within it, both commonly accepted physical education groups and sport-oriented groups function. Conclusions: the main structural components of the created model have been determined: conceptual, motivation-active and resulting.

  16. Performance Modeling of Hybrid MPI/OpenMP Scientific Applications on Large-scale Multicore Cluster Systems

    KAUST Repository

    Wu, Xingfu; Taylor, Valerie

    2011-01-01

In this paper, we present a performance modeling framework based on memory bandwidth contention time and a parameterized communication model to predict the performance of OpenMP, MPI and hybrid applications with weak scaling on three large-scale multicore clusters: IBM POWER4, POWER5+ and Blue Gene/P, and analyze the performance of these MPI, OpenMP and hybrid applications. We use STREAM memory benchmarks to provide initial performance analysis and model validation of MPI and OpenMP applications on these multicore clusters, because the measured sustained memory bandwidth can provide insight into the memory bandwidth that a system should sustain on scientific applications with the same amount of workload per core. In addition to using these benchmarks, we also use a weak-scaling hybrid MPI/OpenMP large-scale scientific application, the Gyrokinetic Toroidal Code (GTC) in magnetic fusion, to validate our performance model of the hybrid application on these multicore clusters. The validation results for our performance modeling method show less than 7.77% error rate in predicting the performance of hybrid MPI/OpenMP GTC on up to 512 cores on these multicore clusters. © 2011 IEEE.

  18. Radiant heat transfers in turbojet engines. Two applications, three levels of modeling; Transferts radiatifs dans les foyers de turboreacteurs. Deux applications, trois niveaux de modelisation

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, J L; Desaulty, M [SNECMA, Centre de Villaroche, 77 - Moissy-Cramayel (France); Taine, J [Ecole Centrale de Paris, Laboratoire EM2C. CNRS, 92 - Chatenay-Malabry (France)

    1997-12-31

Several applications linked with the dimensioning of turbojet engines require modeling of radiant heat transfer. Two different applications are presented in this study: the modeling of heat transfers in the main combustion chamber, and the modeling of the infrared signature of the post-combustion chamber of a military engine. In the first application, two types of radiant heat transfer modeling are presented: a global modeling based on empirical considerations and used in rapid pre-dimensioning methods, and a modeling based on a grey-gas concept combined with a ray-shooting technique allowing the determination of local radiant heat flux values. In the second application, a specific modeling of the radiant heat flux is used in the framework of a ray-shooting method. Each model represents a different level of successive approximation of the radiant heat transfer, adapted to the specificities of the flow and the required performance. (J.S.) 16 refs.

  20. Knowledge governance model and its applications in OTRIs. Two cases

    International Nuclear Information System (INIS)

    Bueno Campos, E.; Plaz Landela, R.; Albert Berenguer, J.

    2007-01-01

The importance of R&D and knowledge transfer in European economies, and of technology and innovation in Spain, is a key issue in achieving the European target for 2010 of becoming the knowledge society for growth. This article shows, in some detail, the structure and functions of the MTT model, used as a reference for the processes needed to fulfil the mission of an OTRI and its function of knowledge transfer. Two concrete applications show the effectiveness and functionality of the model; the CARTA and PRISMA applications are case studies of the MTT implementation process. They represent a first step toward new developments that are being carried out in other OTRIs. (Author) 35 refs

  1. MOGO: Model-Oriented Global Optimization of Petascale Applications

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D.; Shende, Sameer S.

    2012-09-14

The MOGO project was initiated in 2008 under the DOE Program Announcement for Software Development Tools for Improved Ease-of-Use on Petascale Systems (LAB 08-19). The MOGO team consisted of Oak Ridge National Lab, Argonne National Lab, and the University of Oregon. The overall goal of MOGO was to attack petascale performance analysis by developing a general framework in which empirical performance data could be efficiently and accurately compared with performance expectations at various levels of abstraction. This information could then be used to automatically identify and remediate performance problems. MOGO was based on performance models derived from application knowledge, performance experiments, and symbolic analysis. MOGO made a reasonable impact on existing DOE applications and systems. New tools and techniques were developed which, in turn, were used on important DOE applications on DOE LCF systems to show significant performance improvements.

  2. Application of the rainfall infiltration breakthrough (RIB) model for ...

    African Journals Online (AJOL)

    2012-05-23

    May 23, 2012 ... In this paper, the physical meaning of parameters in the CRD and previous ... ity; the utility of the RIB model for application in different climatic areas under ...... TMG Aquifer feasibility study and pilot project ecological and.

  3. Protein loop modeling using a new hybrid energy function and its application to modeling in inaccurate structural environments.

    Directory of Open Access Journals (Sweden)

    Hahnbeom Park

    Full Text Available Protein loop modeling is a tool for predicting protein local structures of particular interest, providing opportunities for applications involving protein structure prediction and de novo protein design. Until recently, the majority of loop modeling methods have been developed and tested by reconstructing loops in frameworks of experimentally resolved structures. In many practical applications, however, the protein loops to be modeled are located in inaccurate structural environments. These include loops in model structures, low-resolution experimental structures, or experimental structures of different functional forms. Accordingly, discrepancies in the accuracy of the structural environment assumed in development of the method and that in practical applications present additional challenges to modern loop modeling methods. This study demonstrates a new strategy for employing a hybrid energy function combining physics-based and knowledge-based components to help tackle this challenge. The hybrid energy function is designed to combine the strengths of each energy component, simultaneously maintaining accurate loop structure prediction in a high-resolution framework structure and tolerating minor environmental errors in low-resolution structures. A loop modeling method based on global optimization of this new energy function is tested on loop targets situated in different levels of environmental errors, ranging from experimental structures to structures perturbed in backbone as well as side chains and template-based model structures. The new method performs comparably to force field-based approaches in loop reconstruction in crystal structures and better in loop prediction in inaccurate framework structures. This result suggests that higher-accuracy predictions would be possible for a broader range of applications. The web server for this method is available at http://galaxy.seoklab.org/loop with the PS2 option for the scoring function.

  4. Risk Measurement and Risk Modelling Using Applications of Vine Copulas

    Directory of Open Access Journals (Sweden)

    David E. Allen

    2017-09-01

Full Text Available This paper features an application of Regular Vine copulas, a novel and recently developed statistical and mathematical tool which can be applied to the assessment of composite financial risk. Copula-based dependence modelling is a popular tool in financial applications, but is usually applied to pairs of securities. By contrast, Vine copulas provide greater flexibility and permit the modelling of complex dependency patterns using the rich variety of bivariate copulas, which may be arranged and analysed in a tree structure to explore multiple dependencies. The paper features the use of Regular Vine copulas in an analysis of the co-dependencies of 10 major European stock markets, as represented by individual market indices and the composite STOXX 50 index. The sample runs from 2005 to the end of 2013 to permit an exploration of how correlations change in different economic circumstances, using three sample periods: pre-GFC (January 2005–July 2007), GFC (July 2007–September 2009) and post-GFC (September 2009–December 2013). The empirical results suggest that the dependencies change in a complex manner and are subject to change in different economic circumstances. One of the attractions of this approach to risk modelling is the flexibility in the choice of distributions used to model co-dependencies. The practical application of Regular Vine metrics is demonstrated via an example of the calculation of the VaR of a portfolio made up of the indices.

  5. Hidden Markov Model Application to Transfer The Trader Online Forex Brokers

    Directory of Open Access Journals (Sweden)

    Farida Suharleni

    2012-05-01

Full Text Available The Hidden Markov Model is an elaboration of the Markov chain that applies to cases in which the states cannot be observed directly. In this research, a Hidden Markov Model is used to study traders' transitions between online forex brokers. In a Hidden Markov Model, the observed states form the observable part and the hidden states the hidden part, so the model can describe systems in which observed and hidden states are interrelated. Here the observed states in a trader's transition are category 1, category 2, category 3, category 4 and category 5, conditioned on each online forex broker, whereas the hidden states are the online forex brokers Marketiva, Masterforex, Instaforex, FBS and others. The first step in applying the Hidden Markov Model in this research is constructing the model by building the transition probability matrix (A) over the brokers. The next step is building the observation probability matrix (B) from the conditional probabilities of the five categories given each broker, and determining the initial state probabilities (π) for the brokers. The last step is using the Viterbi algorithm to find the hidden state sequence, i.e. the broker sequence that is most probable given the model and the observed category sequence. The Hidden Markov Model is applied via a program implementing the Viterbi algorithm, written in Delphi 7.0, with observed states based on simulated data. Example: for a number of observations T = 5 and the observed state sequence O = (2,4,3,5,1), the most probable hidden state sequence is found to be X1 = FBS, X2 = Masterforex, X3 = Marketiva, X4 = Others, and X5 = Instaforex.
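The Viterbi step described in the abstract is a standard dynamic-programming algorithm. A minimal generic sketch in Python, using a toy two-state HMM whose parameters are invented for illustration (not the paper's broker data):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable hidden-state sequence for the
    observation sequence obs, by dynamic programming."""
    # V[t][s] = (best probability of any path ending in s at time t, predecessor)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states)
            V[t][s] = (prob, prev)
    # Backtrack from the best final state
    last = max(V[-1], key=lambda s: V[-1][s][0])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path))

# Toy two-state example with hypothetical parameters
states = ("A", "B")
start = {"A": 0.6, "B": 0.4}
trans = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}
emit = {"A": {1: 0.5, 2: 0.5}, "B": {1: 0.1, 2: 0.9}}
best = viterbi([1, 2, 2], states, start, trans, emit)
```

In the paper's setting, the states would be the five brokers and the observations the five categories; the matrices A, B and π play the roles of `trans`, `emit` and `start`.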

  6. Computer-controlled mechanical lung model for application in pulmonary function studies

    NARCIS (Netherlands)

    A.F.M. Verbraak (Anton); J.E.W. Beneken; J.M. Bogaard (Jan); A. Versprille (Adrian)

    1995-01-01

    textabstractA computer controlled mechanical lung model has been developed for testing lung function equipment, validation of computer programs and simulation of impaired pulmonary mechanics. The construction, function and some applications are described. The physical model is constructed from two

  7. Bayesian statistical methods and their application in probabilistic simulation models

    Directory of Open Access Journals (Sweden)

    Sergio Iannazzo

    2007-03-01

Full Text Available Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success probably lie in the theoretical foundations of the discipline, which make these techniques more appealing for decision analysis. To this should be added modern IT progress, which has produced several flexible and powerful statistical software frameworks. Among them, probably one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach reside in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs in the economic model.
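A Markov-chain economic model of the kind described above can be sketched generically. The following is a minimal Monte Carlo cohort simulation in plain Python; the three-state model and its transition probabilities are hypothetical, not taken from the paper:

```python
import random

def simulate_markov_cohort(trans, start, n_patients, n_cycles, seed=0):
    """Monte Carlo simulation of a cohort moving through health
    states under a Markov transition matrix; returns the final
    distribution of patients over states."""
    rng = random.Random(seed)
    counts = {s: 0 for s in trans}
    for _ in range(n_patients):
        state = start
        for _ in range(n_cycles):
            r = rng.random()
            cum = 0.0
            for nxt, p in trans[state].items():
                cum += p
                if r < cum:
                    state = nxt
                    break
        counts[state] += 1
    return counts

# Hypothetical three-state model: Well, Sick, Dead (absorbing)
trans = {
    "Well": {"Well": 0.85, "Sick": 0.10, "Dead": 0.05},
    "Sick": {"Well": 0.20, "Sick": 0.60, "Dead": 0.20},
    "Dead": {"Dead": 1.0},
}
final = simulate_markov_cohort(trans, "Well", 1000, 20)
```

In a Bayesian setup such as WinBUGS, the transition probabilities themselves would be drawn from posterior distributions rather than fixed, and the simulation repeated over those draws.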

  8. Towards Industrial Application of Damage Models for Sheet Metal Forming

    Science.gov (United States)

    Doig, M.; Roll, K.

    2011-05-01

Due to global warming and the financial situation, the demand to reduce CO2 emissions and production costs drives the continual development of new materials. In the automotive industry, occupant safety is an additional requirement. Bringing these arguments together, the preferred approach for the lightweight design of car components, especially for body-in-white, is the use of modern steels. Such steel grades, also called advanced high strength steels (AHSS), exhibit high strength as well as high formability. Not only the material behavior but also the damage behavior of AHSS differs from that of standard steels. Conventional methods for damage prediction in industry, like the forming limit curve (FLC), are not reliable for AHSS. Physically based damage models are often used in crash and bulk forming simulations. The still open question is the industrial application of these models to sheet metal forming. This paper evaluates the Gurson-Tvergaard-Needleman (GTN) model and the model of Lemaitre within commercial codes with the goal of industrial application.

  9. Degradation of ticarcillin by subcritical water oxidation method: Application of response surface methodology and artificial neural network modeling.

    Science.gov (United States)

    Yabalak, Erdal

    2018-05-18

This study was performed to investigate the mineralization of ticarcillin in an artificially prepared aqueous solution representing ticarcillin-contaminated waters, which constitute a serious problem for human health. Removals of 81.99% of total organic carbon, 79.65% of chemical oxygen demand, and 94.35% of ticarcillin were achieved using the eco-friendly, time-saving, powerful and easy-to-apply subcritical water oxidation method in the presence of a safe-to-use oxidizing agent, hydrogen peroxide. Central composite design, which belongs to response surface methodology, was applied to design the degradation experiments, to optimize the method, and to evaluate the effects of the system variables, namely temperature, hydrogen peroxide concentration, and treatment time, on the responses. In addition, theoretical equations were proposed for each removal process. ANOVA tests were utilized to evaluate the reliability of the fitted models. F values of 245.79, 88.74, and 48.22 were found for total organic carbon removal, chemical oxygen demand removal, and ticarcillin removal, respectively. Moreover, artificial neural network modeling was applied to estimate the response in each case, and its prediction and optimization performance was statistically examined and compared to that of the central composite design.
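The ANOVA F values quoted above compare between-group to within-group variability. A minimal one-way ANOVA sketch in plain Python, with invented removal percentages at three hypothetical temperature levels (not the study's measurements):

```python
def one_way_anova_F(groups):
    """Compute the one-way ANOVA F statistic:
    between-group mean square over within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (k - 1)   # k - 1 degrees of freedom
    ms_within = ss_within / (n - k)     # n - k degrees of freedom
    return ms_between / ms_within

# Hypothetical removal percentages at three temperature levels
groups = [[60.1, 62.3, 61.0], [70.5, 72.0, 71.2], [80.2, 79.8, 81.1]]
F = one_way_anova_F(groups)
```

A large F relative to the critical value of the F(k-1, n-k) distribution indicates that at least one group mean differs significantly, which is how the reported F values support the fitted models' significance.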

  10. Development and application of modeling tools for sodium fast reactor inspection

    Energy Technology Data Exchange (ETDEWEB)

    Le Bourdais, Florian; Marchand, Benoît; Baronian, Vahan [CEA LIST, Centre de Saclay F-91191 Gif-sur-Yvette (France)

    2014-02-18

    To support the development of in-service inspection methods for the Advanced Sodium Test Reactor for Industrial Demonstration (ASTRID) project led by the French Atomic Energy Commission (CEA), several tools that allow situations specific to Sodium cooled Fast Reactors (SFR) to be modeled have been implemented in the CIVA software and exploited. This paper details specific applications and results obtained. For instance, a new specular reflection model allows the calculation of complex echoes from scattering structures inside the reactor vessel. EMAT transducer simulation models have been implemented to develop new transducers for sodium visualization and imaging. Guided wave analysis tools have been developed to permit defect detection in the vessel shell. Application examples and comparisons with experimental data are presented.

  11. A Surface Modeling Paradigm for Electromagnetic Applications in Aerospace Structures

    OpenAIRE

    Jha, RM; Bokhari, SA; Sudhakar, V; Mahapatra, PR

    1989-01-01

A systematic approach has been developed to model the surfaces encountered in aerospace engineering for EM applications. The basis of this modeling is the quadric canonical shapes, which are the coordinate surfaces of the Eisenhart coordinate systems. The building blocks are visualized as sections of quadric cylinders and surfaces of revolution. These truncated quadrics can successfully model realistic aerospace structures, which are termed hybrid quadrics, of which the satellite launch veh...

  12. Applications of computational modeling in metabolic engineering of yeast

    DEFF Research Database (Denmark)

    Kerkhoven, Eduard J.; Lahtvee, Petri-Jaan; Nielsen, Jens

    2015-01-01

a preferred flux distribution. These methods point to strategies for altering gene expression; however, fluxes are often controlled by post-transcriptional events. Moreover, GEMs usually do not take into account metabolic regulation, thermodynamics and enzyme kinetics. To facilitate metabolic engineering......, it is necessary to expand the modeling of metabolism to consider the kinetics of individual processes. This review gives an overview of the models available for metabolic engineering of yeast and discusses their applications....

  13. Recent developments in volatility modeling and applications

    Directory of Open Access Journals (Sweden)

    A. Thavaneswaran

    2006-01-01

Full Text Available In financial modeling, it has been constantly pointed out that volatility clustering and conditional non-normality induce the leptokurtosis observed in high-frequency data. Financial time series data are not adequately modeled by the normal distribution, and empirical evidence on the non-normality assumption is well documented in the financial literature (details are illustrated by Engle (1982) and Bollerslev (1986)). An ARMA representation was used by Thavaneswaran et al. in 2005 to derive the kurtosis of various classes of GARCH models such as power GARCH, non-Gaussian GARCH, nonstationary and random coefficient GARCH. Several empirical studies have shown that mixture distributions are more likely to capture the heteroskedasticity observed in high-frequency data than the normal distribution. In this paper, some results on moment properties are generalized to stationary ARMA processes with GARCH errors. Applications to volatility forecasts and option pricing are also discussed in some detail.
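The volatility clustering discussed above is what a GARCH(1,1) process produces: a large shock raises the next period's conditional variance. A minimal simulation sketch in plain Python, with hypothetical parameters chosen so that alpha + beta < 1 (the stationary case):

```python
import random

def simulate_garch11(omega, alpha, beta, n, seed=1):
    """Simulate n returns from a GARCH(1,1) process:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    rng = random.Random(seed)
    sigma2 = omega / (1 - alpha - beta)   # start at the unconditional variance
    returns = []
    for _ in range(n):
        r = rng.gauss(0.0, sigma2 ** 0.5)
        returns.append(r)
        sigma2 = omega + alpha * r * r + beta * sigma2
    return returns

# Hypothetical parameters; unconditional variance = omega/(1-alpha-beta) = 1.0
rets = simulate_garch11(omega=0.1, alpha=0.1, beta=0.8, n=5000)
```

The simulated series has the leptokurtic, clustered-volatility character the abstract describes, even though each conditional innovation is Gaussian.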

  14. Novel Method for Superposing 3D Digital Models for Monitoring Orthodontic Tooth Movement.

    Science.gov (United States)

    Schmidt, Falko; Kilic, Fatih; Piro, Neltje Emma; Geiger, Martin Eberhard; Lapatki, Bernd Georg

    2018-04-18

Quantitative three-dimensional analysis of orthodontic tooth movement (OTM) is possible by superposition of digital jaw models made at different times during treatment. Conventional methods rely on surface alignment at palatal soft-tissue areas, which is applicable to the maxilla only. We introduce two novel numerical methods applicable to both maxilla and mandible. The OTM from the initial phase of multi-bracket appliance treatment of ten pairs of maxillary models was evaluated and compared with four conventional methods. The median range of deviation of OTM for three users was 13-72% smaller for the novel methods than for the conventional methods, indicating greater inter-observer agreement. Total tooth translation and rotation were significantly different (ANOVA, p < 0.01) for OTM determined by use of the two numerical and four conventional methods. Directional decomposition of OTM from the novel methods showed clinically acceptable agreement with reference results except for vertical translations (deviations of medians greater than 0.6 mm). The difference in vertical translational OTM can be explained by maxillary vertical growth during the observation period, which is additionally recorded by conventional methods. The novel approaches are, thus, particularly suitable for evaluation of pure treatment effects, because growth-related changes are ignored.

  15. A complex autoregressive model and application to monthly temperature forecasts

    Directory of Open Access Journals (Sweden)

    X. Gu

    2005-11-01

    Full Text Available A complex autoregressive model was established based on the mathematical derivation of least squares in the complex number domain, referred to here as complex least squares. The model differs from the conventional approach, in which the real and imaginary parts are calculated separately. In an application to predicting monthly temperature anomalies in July at 160 meteorological stations in mainland China, the new model gives better forecasts than conventional statistical models, namely an autoregressive model in which the real and imaginary parts are treated separately, an autoregressive model in the real number domain, and a persistence-forecast model.
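The core idea, solving least squares directly in the complex domain rather than fitting real and imaginary parts separately, can be sketched as follows. The coefficient and series here are a toy complex AR(1) with an assumed value, not the authors' temperature data:

```python
import numpy as np

rng = np.random.default_rng(1)
phi_true = 0.6 + 0.3j            # assumed complex AR(1) coefficient
n = 5000
x = np.zeros(n, dtype=complex)
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) * 0.1
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + noise[t]

# Complex least squares in one step: numpy's lstsq operates natively on
# complex arrays, so no separate real/imaginary regressions are needed.
A = x[:-1].reshape(-1, 1)
b = x[1:]
phi_hat = np.linalg.lstsq(A, b, rcond=None)[0][0]
print(phi_hat)                   # close to 0.6+0.3j
```

The single complex regression recovers both the real and imaginary parts of the coefficient jointly.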

  16. gPKPDSim: a SimBiology®-based GUI application for PKPD modeling in drug development.

    Science.gov (United States)

    Hosseini, Iraj; Gajjala, Anita; Bumbaca Yadav, Daniela; Sukumaran, Siddharth; Ramanujan, Saroja; Paxson, Ricardo; Gadkar, Kapil

    2018-04-01

    Modeling and simulation (M&S) is increasingly used in drug development to characterize pharmacokinetic-pharmacodynamic (PKPD) relationships and to support efforts such as target feasibility assessment, molecule selection, human PK projection, and preclinical and clinical dose and schedule determination. While model development typically requires mathematical modeling expertise, model exploration and simulation could in many cases be performed by scientists in various disciplines to support the design, analysis and interpretation of experimental studies. To this end, we have developed a versatile graphical user interface (GUI) application that enables easy use of any model constructed in SimBiology® to execute various common PKPD analyses. The MATLAB®-based GUI application, called gPKPDSim, has a single-screen interface and provides functionality including simulation, data fitting (parameter estimation), population simulation (exploring the impact of parameter variability on the outputs of interest) and non-compartmental PK analysis. Further, gPKPDSim is a user-friendly tool with capabilities including interactive visualization, export of results and generation of presentation-ready figures. gPKPDSim was designed primarily for use in preclinical and translational drug development, although broader applications exist. gPKPDSim is a MATLAB®-based open-source application and is publicly available to download from MATLAB® Central™. We illustrate the use and features of gPKPDSim using multiple PKPD models, demonstrating the wide applications of this tool in the pharmaceutical sciences. Overall, gPKPDSim provides an integrated, multi-purpose, user-friendly GUI application enabling efficient use of PKPD models by scientists from various disciplines, regardless of their modeling expertise.
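One of the analyses listed, non-compartmental PK analysis, reduces to simple summary statistics of the concentration-time profile. A minimal sketch with a hypothetical profile (linear trapezoidal AUC, Cmax and Tmax; this is illustrative, not gPKPDSim code):

```python
import numpy as np

# Hypothetical concentration-time profile after an oral dose
t = np.array([0.0, 1.0, 2.0, 4.0, 8.0])   # time (h)
c = np.array([0.0, 8.0, 6.0, 3.0, 1.0])   # concentration (mg/L)

# Linear trapezoidal AUC over the observed interval
auc = np.sum((c[1:] + c[:-1]) / 2 * np.diff(t))
cmax = c.max()                             # peak concentration
tmax = t[c.argmax()]                       # time of the peak
print(auc, cmax, tmax)                     # 28.0 8.0 1.0
```

These three quantities (AUC, Cmax, Tmax) are the usual starting outputs of a non-compartmental analysis.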

  17. An application to model traffic intensity of agricultural machinery at field scale

    Science.gov (United States)

    Augustin, Katja; Kuhwald, Michael; Duttmann, Rainer

    2017-04-01

    Several soil-pressure models address the impact of agricultural machines on soils. In many cases these models were applied to single spots and assume a static machine configuration, so statements about the spatial distribution of soil compaction risk for entire working processes are limited. The aim of this study is the development of an application for the spatial modelling of traffic lanes of agricultural vehicles, including wheel load, ground pressure and wheel passages, at the field scale. The application is based on open-source software and data formats and is written in the Python programming language. Minimum input parameters are GPS positions, the vehicles and tires (manufacturer and model) and the tire inflation pressure. Five working processes are distinguished: soil tillage, manuring, plant protection, sowing and harvest. Currently, two different models (Diserens 2009, Rücknagel et al. 2015) are implemented to calculate the soil pressure. The application was tested at a study site in Lower Saxony, Germany. Since 2015, field traffic has been recorded by RTK-GPS and the machine set-ups used have been noted. Using this input information, the traffic lanes, wheel loads and soil pressures were calculated for all working processes. For instance, the maize harvest in 2016, with a crop chopper and one transport vehicle, crossed about 55% of the total field area. In some places the machines passed over up to 46 times. Approximately 35% of the total area was affected by wheel loads over 7 tons and soil pressures between 163 and 193 kPa. With this information about the spatial distribution of wheel passages, wheel load and soil pressure, it is possible to identify hot spots of intensive field traffic. Additionally, the application enables the analysis of soil compaction risk induced by agricultural machines over long- and short-term periods.

  18. Application of multiple objective models to water resources planning and management

    International Nuclear Information System (INIS)

    North, R.M.

    1993-01-01

    Over the past 30 years, we have seen the birth and growth of multiple objective analysis from an idea without tools to one with useful applications. Models have been developed and applications have been researched to address the multiple purposes and objectives inherent in the development and management of water resources. A practical approach to multiple objective modelling incorporates macroeconomic-based policies and expectations in order to optimize the results from both engineering (structural) and management (non-structural) alternatives, while taking into account the economic and environmental trade-offs. (author). 27 refs, 4 figs, 3 tabs

  19. Modeling Phosphorous Losses from Seasonal Manure Application Schemes

    Science.gov (United States)

    Menzies, E.; Walter, M. T.

    2015-12-01

    Excess nutrient loading, especially of nitrogen and phosphorus, to surface waters is a common and significant problem throughout the United States. While pollution remediation efforts are continuously improving, the most effective treatment remains limiting the source. Appropriate timing of fertilizer application to reduce nutrient losses is currently a hotly debated topic in the northeastern United States; winter spreading of manure is under special scrutiny. We plan to evaluate the loss of phosphorus to surface waters from agricultural systems under varying seasonal fertilization schemes, in an effort to determine the impacts of fertilizers applied throughout the year. The Cayuga Lake basin, located in the Finger Lakes region of New York State, is a watershed dominated by agriculture where a wide array of land management strategies can be found. The evaluation will be conducted on the Fall Creek Watershed, a large sub-basin of the Cayuga Lake Watershed. The Fall Creek Watershed covers approximately 33,000 ha in central New York State, with approximately 50% of this land used for agriculture. We plan to use the Soil and Water Assessment Tool (SWAT) to model a number of seasonal fertilization regimes, such as summer-only spreading and year-round spreading (including winter applications), among others. We will use the model to quantify the phosphorus load to surface waters from these different fertilization schemes and determine the impacts of manure applied at different times throughout the year. More detailed knowledge of how seasonal fertilization schemes affect phosphorus losses will give stakeholders more information on the impacts of agriculture on surface water quality. Our results will help farmers and extensionists make more informed decisions about the appropriate timing of manure application to reduce phosphorus losses and surface water degradation, and will aid lawmakers in improving policy on manure application.

  20. PNN-based Rockburst Prediction Model and Its Applications

    Directory of Open Access Journals (Sweden)

    Yu Zhou

    2017-07-01

    Full Text Available Rock burst is one of the main engineering geological problems significantly threatening the safety of construction. Its prediction is always an important issue concerning the safety of workers and equipment in tunnels. In this paper, a novel PNN-based rock burst prediction model is proposed to determine whether rock burst will happen in an underground rock project and, if so, what its intensity will be. The probabilistic neural network (PNN) is developed based on Bayesian criteria for multivariate pattern classification. Because the PNN has the advantages of low training complexity, high stability, quick convergence and simple construction, it is well suited to rock burst prediction. The main controlling factors, namely the rock's maximum tangential stress, uniaxial compressive strength, uniaxial tensile strength and elastic energy index, are chosen as the characteristic vector of the PNN. The PNN model is trained on data sets of rock burst samples from underground rock projects at home and abroad, and the remaining samples are used for testing. The test results agree with the practical records. In addition, two real-world applications are used to verify the proposed method; the predictions match both the results of existing methods and what actually happened on site, which verifies the effectiveness and applicability of the proposed work.
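A PNN of the kind described is essentially a Parzen-window classifier: each class density is estimated by a sum of Gaussian kernels centred on that class's training samples, and the class with the highest estimated density wins. A minimal sketch with hypothetical two-feature data (the paper's actual characteristic vector has four rock parameters):

```python
import numpy as np

def pnn_predict(X_train, y_train, X_new, sigma=0.5):
    """Probabilistic neural network: Gaussian Parzen-window class densities."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        # squared distances from each new point to each training point of class c
        d2 = ((X_new[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        scores.append(np.exp(-d2 / (2 * sigma ** 2)).mean(axis=1))
    return classes[np.argmax(np.stack(scores), axis=0)]

# Hypothetical 2-feature samples: class 0 = no burst, class 1 = burst
X = np.array([[0.2, 0.1], [0.3, 0.2], [0.1, 0.3],
              [1.8, 1.9], [2.0, 2.1], [1.9, 1.7]])
y = np.array([0, 0, 0, 1, 1, 1])
preds = pnn_predict(X, y, np.array([[0.25, 0.2], [1.9, 2.0]]))
print(preds)   # [0 1]
```

There is no iterative training, which is why PNN training complexity is low: the "model" is just the stored samples plus the smoothing parameter sigma.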

  1. Bilayer Graphene Application on NO2 Sensor Modelling

    Directory of Open Access Journals (Sweden)

    Elnaz Akbari

    2014-01-01

    Full Text Available Graphene is a carbon allotrope: a single-atom-thick layer of sp2-hybridized carbon with a two-dimensional (2D) honeycomb structure. As an outstanding material exhibiting unique mechanical, electrical and chemical characteristics, including high strength, high conductivity and high surface area, graphene has earned a remarkable position in today's experimental and theoretical studies as well as in industrial applications. One such application is using graphene to achieve higher accuracy and speed in gas-sensing devices. Although there are plenty of experimental studies in this field, analytical models are still scarce. As a starting point for modelling, a field-effect-transistor (FET) based structure is chosen as the platform, and the effect of NO2 adsorption on the density of states of bilayer graphene is discussed. The chemical reaction between graphene and the gas creates new carriers in the graphene, which changes the carrier density and eventually the carrier velocity. In the presence of NO2 gas, electrons are donated to the FET channel, which serves as the sensing mechanism. To evaluate the accuracy of the proposed models, the results obtained are compared with existing experimental data, and acceptable agreement is reported.

  2. New weighted sum of gray gases model applicable to Computational Fluid Dynamics (CFD) modeling of oxy-fuel combustion

    DEFF Research Database (Denmark)

    Yin, Chungen; Johansen, Lars Christian Riis; Rosendahl, Lasse

    2010-01-01

    A new weighted sum of gray gases model (WSGGM) is derived, which is applicable to computational fluid dynamics (CFD) modeling of both air-fuel and oxy-fuel combustion. First, a computer code is developed to evaluate the emissivity of any gas mixture at any condition by using the exponential wide band model (EWBM)…

  3. Modelling and application of the inactivation of microorganism

    International Nuclear Information System (INIS)

    Oğuzhan, P.; Yangılar, F.

    2013-01-01

    Preventing the consumption of food contaminated with toxic microorganisms that cause infections, and developing food-protection and new microbial-inactivation methods, are essential. Food microbiology is mainly concerned with unwanted microorganisms that spoil food during processing and transport and cause disease. Detection of pathogenic microorganisms is important for human health, both to identify and prevent hazards and to extend shelf life. Inactivation of pathogenic microorganisms can ensure food safety and reduce nutrient losses. To this end, various methods are used, such as classical thermal processes (pasteurisation, sterilisation), pulsed electric fields (PEF), ionising radiation, high pressure, ultrasonic waves and plasma sterilisation. Modeling microbial inactivation is a reliable and effective tool in food production. New microbiological applications can give useful results for risk assessment in food, inactivation of microorganisms and improvement of shelf life. Application and control methods should be developed and supported by scientific research and industrial applications.
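The simplest inactivation model of the kind referred to is the log-linear (first-order) model, in which each D minutes of treatment reduces the surviving population tenfold. A sketch with assumed values (the D-value and counts are illustrative, not taken from the text):

```python
import math

def survivors(n0, t, d_value):
    """Log-linear inactivation: every D minutes reduce the count 10-fold."""
    return n0 * 10 ** (-t / d_value)

# Hypothetical pasteurisation step: D = 2 min at the process temperature
n0 = 1e6                          # initial count (CFU/ml)
n_after = survivors(n0, 6.0, 2.0)
print(n_after)                    # 1000.0: a 3-log reduction after 6 minutes
```

Fitting the D-value to survival data and checking the log-linearity assumption is the core of this kind of inactivation modelling.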

  4. Application of Statistical Model in Wastewater Treatment Process Modeling Using Data Analysis

    Directory of Open Access Journals (Sweden)

    Alireza Raygan Shirazinezhad

    2015-06-01

    Full Text Available Background: Wastewater treatment involves very complex and interrelated physical, chemical and biological processes, which can nonetheless be modeled rigorously and simply using data-analysis techniques. Materials and Methods: In this study, data on wastewater treatment processes from the water and wastewater company of Kohgiluyeh and Boyer-Ahmad were used. A total of 3306 records of COD, TSS, pH and turbidity were collected, then analyzed with SPSS 16 (descriptive statistics) and IBM SPSS Modeler 14.2 using nine algorithms. Results: Logistic regression, neural networks, Bayesian networks, discriminant analysis, the C5 decision tree, the C&R tree, CHAID, QUEST and SVM achieved accuracies of 90.16, 94.17, 81.37, 70.48, 97.89, 96.56, 96.46, 96.84 and 88.92 percent, respectively. Discussion and conclusion: The C5 algorithm, with an accuracy of 97.89 percent, was chosen as the best and most applicable algorithm for modeling wastewater treatment processes; the most influential variables in this model were pH, COD, TSS and turbidity.

  5. Twin support vector machines models, extensions and applications

    CERN Document Server

    Jayadeva; Chandra, Suresh

    2017-01-01

    This book provides a systematic and focused study of the various aspects of twin support vector machines (TWSVM) and related developments for classification and regression. In addition to presenting most of the basic models of TWSVM and twin support vector regression (TWSVR) available in the literature, it also discusses the important and challenging applications of this new machine learning methodology. A chapter on “Additional Topics” has been included to discuss kernel optimization and support tensor machine topics, which are comparatively new but have great potential in applications. It is primarily written for graduate students and researchers in the area of machine learning and related topics in computer science, mathematics, electrical engineering, management science and finance.

  6. Equicontrollability and its application to model-following and decoupling.

    Science.gov (United States)

    Curran, R. T.

    1971-01-01

    Discussion of 'model following,' a term used to describe a class of problems characterized by having two dynamic systems, generically known as the 'plant' and the 'model,' it being required to find a controller to attach to the plant so as to make the resultant compensated system behave, in an input/output sense, in the same way as the model. The approach presented to the problem takes a structural point of view. The result is a complex but informative definition which solves the problem as posed. The application of both the algorithm and its basis, equicontrollability, to the decoupling problem is considered.

  7. Calibration of a surface mass balance model for global-scale applications

    NARCIS (Netherlands)

    Giesen, R. H.; Oerlemans, J.

    2012-01-01

    Global applications of surface mass balance models have large uncertainties, as a result of poor climate input data and limited availability of mass balance measurements. This study addresses several possible consequences of these limitations for the modelled mass balance. This is done by applying a

  8. Comparing the Applicability of Commonly Used Hydrological Ecosystem Services Models for Integrated Decision-Support

    Directory of Open Access Journals (Sweden)

    Anna Lüke

    2018-01-01

    Full Text Available Different simulation models are used in science and practice to incorporate hydrological ecosystem services in decision-making processes. This contribution compares three simulation models: the Soil and Water Assessment Tool, a traditional hydrological model, and two ecosystem services models, the Integrated Valuation of Ecosystem Services and Trade-offs model and the Resource Investment Optimization System model. The three models are compared on a theoretical and conceptual basis as well as in a comparative case-study application. The application of the models to a study area in Nicaragua reveals that there is generally a practical benefit in applying these models to different questions in decision-making. However, modelling hydrological ecosystem services entails considerable application effort and requires input data that may not always be available. The degree of detail in the temporal and spatial variability of ecosystem service provision is higher with the Soil and Water Assessment Tool than with the two ecosystem service models. In contrast, the ecosystem service models have lower requirements on input data and process knowledge. A relationship between service provision and beneficiaries is readily produced and can be visualized as a model output. This visualization is especially useful in a practical decision-making context.

  9. Mathematical modeling for surface hardness in investment casting applications

    International Nuclear Information System (INIS)

    Singh, Rupinder

    2012-01-01

    Investment casting (IC) has many potential engineering applications. Little work has hitherto been reported on modeling the surface hardness (SH) of industrial components produced by IC. In the present study, the outcome of a Taguchi-based macro model has been used to develop a mathematical model for SH, using Buckingham's π theorem. Three input parameters, namely the volume/surface-area (V/A) ratio of the cast components, the slurry layer combination (LC) and the molten metal pouring temperature, were selected, with SH as the output. This study provides the main effects of these variables on SH and sheds light on the SH mechanism in IC. The comparison with experimental results also serves as further validation of the model.

  10. Sparse representation, modeling and learning in visual recognition theory, algorithms and applications

    CERN Document Server

    Cheng, Hong

    2015-01-01

    This unique text/reference presents a comprehensive review of the state of the art in sparse representations, modeling and learning. The book examines both the theoretical foundations and details of algorithm implementation, highlighting the practical application of compressed sensing research in visual recognition and computer vision. Topics and features: provides a thorough introduction to the fundamentals of sparse representation, modeling and learning, and the application of these techniques in visual recognition; describes sparse recovery approaches, robust and efficient sparse represen

  11. Studying and modelling variable density turbulent flows for industrial applications

    International Nuclear Information System (INIS)

    Chabard, J.P.; Simonin, O.; Caruso, A.; Delalondre, C.; Dalsecco, S.; Mechitoua, N.

    1996-07-01

    Industrial applications are presented in the various fields of interest to EDF. The first example deals with transferred electric arcs, coupling the flow and thermal transfer in the arc and in the metal bath, and relates to applications of electricity. The second is combustion modelling in burners of fossil-fuel power plants. The last comes from nuclear power plants and concerns stratified flows in a nuclear reactor building. (K.A.)

  12. Applicability of models to estimate traffic noise for urban roads.

    Science.gov (United States)

    Melo, Ricardo A; Pimentel, Roberto L; Lacerda, Diego M; Silva, Wekisley M

    2015-01-01

    Traffic noise is a highly relevant environmental impact in cities. Models to estimate traffic noise, in turn, can be useful tools to guide mitigation measures. In this paper, the applicability of models to estimate noise levels produced by a continuous flow of vehicles on urban roads is investigated. The aim is to identify which models are more appropriate for estimating traffic noise in urban areas, since several available models were conceived to estimate noise from highway traffic. First, measurements of traffic noise, vehicle count and speed were carried out on five arterial urban roads of a Brazilian city. Together with geometric measurements of lane width and the distance from the noise meter to the lanes, these data were input into several models to estimate traffic noise. The predicted noise levels were then compared with the respective measured counterparts for each road investigated. In addition, a chart showing the mean differences in noise between estimations and measurements is presented to evaluate the overall performance of the models. Measured Leq values varied from 69 to 79 dB(A) for traffic flows varying from 1618 to 5220 vehicles/h. Mean noise level differences between estimations and measurements for all urban roads investigated ranged from -3.5 to 5.5 dB(A). According to the results, deficiencies of some models are discussed, while other models are identified as applicable to noise estimation on urban roads under continuous-flow conditions. Key issues in applying such models to urban roads are highlighted.
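The equivalent continuous level Leq reported here is an energy average, not an arithmetic one, which is why loud events dominate the result. A minimal sketch using the extremes of the measured range as hypothetical samples:

```python
import math

def leq(levels_db):
    """Energy-average equivalent level from a set of dB(A) samples."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels_db) / len(levels_db))

# Hypothetical samples spanning the range measured on the urban roads
samples = [69, 79]
print(round(leq(samples), 1))   # 76.4 dB(A): well above the arithmetic mean of 74
```

Because the averaging happens on the energy (10^(L/10)) scale, the 79 dB(A) sample contributes ten times the energy of the 69 dB(A) one.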

  13. Exponential Models of Legislative Turnover. [and] The Dynamics of Political Mobilization, I: A Model of the Mobilization Process, II: Deductive Consequences and Empirical Application of the Model. Applications of Calculus to American Politics. [and] Public Support for Presidents. Applications of Algebra to American Politics. Modules and Monographs in Undergraduate Mathematics and Its Applications Project. UMAP Units 296-300.

    Science.gov (United States)

    Casstevens, Thomas W.; And Others

    This document consists of five units, all presenting applications of mathematics to American politics. The first three present applications of calculus; the last two deal with applications of algebra. The first module is geared to teach a student how to: 1) compute estimates of the value of the parameters in negative exponential models; and draw…

  14. Where to from here? Future applications of mental models of complex performance

    International Nuclear Information System (INIS)

    Hahn, H.A.; Nelson, W.R.; Blackman, H.S.

    1988-01-01

    The purpose of this paper is to raise issues for discussion regarding the applications of mental models in the study of complex performance. Applications for training, expert systems and decision aids, job selection, workstation design, and other complex environments are considered. 1 ref

  15. SP@CE - An SP-based programming model for consumer electronics streaming applications

    NARCIS (Netherlands)

    Varbanescu, Ana Lucia; Nijhuis, Maik; Escribano, Arturo González; Sips, Henk; Bos, Herbert; Bal, Henri

    2007-01-01

    Efficient programming of multimedia streaming applications for Consumer Electronics (CE) devices is not trivial. As a solution for this problem, we present SP@CE, a novel programming model designed to balance the specific requirements of CE streaming applications with the simplicity and efficiency

  16. Applicability of the PROSPECT model for Norway spruce needles

    NARCIS (Netherlands)

    Malenovsky, Z.; Albrechtova, J.; Lhotakova, Z.; Zurita Milla, R.; Clevers, J.G.P.W.; Schaepman, M.E.; Cudlin, P.

    2006-01-01

    The potential applicability of the leaf radiative transfer model PROSPECT (version 3.01) was tested for Norway spruce (Picea abies (L.) Karst.) needles collected from stress resistant and resilient trees. Direct comparison of the measured and simulated leaf optical properties between 450–1000 nm

  17. An application of artificial intelligence for rainfall–runoff modeling

    Indian Academy of Sciences (India)

    This study proposes an application of two techniques of artificial intelligence (AI) for rainfall–runoff modeling: the artificial neural networks (ANN) and the evolutionary computation (EC). Two different ANN techniques, the feed forward back propagation (FFBP) and generalized regression neural network (GRNN) methods ...

  18. A COMPARATIVE STUDY OF THE THINK PAIR SQUARE AND THINK PAIR SHARE LEARNING MODELS ON STUDENTS' MOTIVATION AND LEARNING ACHIEVEMENT IN THE GRADE X ICT SUBJECT AT SMA N 1 SUKASADA

    Directory of Open Access Journals (Sweden)

    Putu Deli Januartini

    2016-10-01

    Abstract The purpose of this study was to determine (1) the significance of the effect of applying the Think Pair Square and Think Pair Share learning models on students' learning achievement, (2) which of the two models yields the better achievement, (3) students' motivation, and (4) students' responses. The research was a quasi-experiment with a post-test-only control-group design. The population was all students in grade X. The samples were class X1, taught with the Think Pair Square model; class X3, taught with the Think Pair Share model; and class X5, taught with the Direct Instruction model. Data were collected by cognitive and psychomotor tests. The achievement data passed the prerequisite tests (all three groups normally distributed and homogeneous), and the hypothesis was tested by one-way ANOVA, which showed a significant effect of the Think Pair Square, Think Pair Share and Direct Instruction learning models. A follow-up Scheffé t-test showed differences in achievement between the Think Pair Square, Think Pair Share and Direct Instruction models. Based on the averages, we conclude that Think Pair Square was the better learning model, with higher student achievement. The questionnaire results show a very highly positive response and very high learning motivation for Think Pair Square, and a highly positive response and very high learning motivation for Think Pair Share.   Keywords: Think Pair Square, Think Pair Share, Direct Instruction, learning achievement, learning motivation, student response.
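The one-way ANOVA used for the hypothesis test can be computed from first principles: the F statistic is the ratio of the between-group to the within-group mean square. A sketch with hypothetical scores for three groups (not the study's data):

```python
import numpy as np

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA, computed from sums of squares."""
    all_x = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    grand = all_x.mean()
    k, n = len(groups), len(all_x)
    # between-group sum of squares (group means vs. grand mean)
    ssb = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    # within-group sum of squares (observations vs. their group mean)
    ssw = sum(((np.asarray(g, dtype=float) - np.mean(g)) ** 2).sum() for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical post-test scores for three teaching-model groups
f = one_way_anova_f([1, 2, 3], [2, 3, 4], [3, 4, 5])
print(f)   # 3.0
```

The F value would then be compared with the F distribution on (k-1, n-k) degrees of freedom (here 2 and 6) to decide significance.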

  19. Applicability of Kinematic and Diffusive models for mud-flows: a steady state analysis

    Science.gov (United States)

    Di Cristo, Cristiana; Iervolino, Michele; Vacca, Andrea

    2018-04-01

    The paper investigates the applicability of kinematic and diffusive wave models for mud-flows with a power-law shear-thinning rheology. In analogy with a well-known approach for turbulent clear-water flows, the study compares the steady flow-depth profiles predicted by the approximated models with those of the full dynamic wave model. For all the models, and assuming an infinitely wide channel, the analytical solution of the flow-depth profiles is derived in terms of hypergeometric functions. The accuracy of the approximated models is assessed by computing the average of the errors along the channel length for several values of the Froude and kinematic wave numbers. Assuming a threshold error of 5%, the applicability conditions of the two approximations have been identified for several values of the power-law exponent, showing the crucial role of rheology. The comparison with clear-water results indicates that applicability criteria for clear-water flows do not carry over to shear-thinning fluids, potentially leading to an incorrect use of the approximated models if the rheology is not properly accounted for.

  20. Polycrystalline CVD diamond device level modeling for particle detection applications

    Science.gov (United States)

    Morozzi, A.; Passeri, D.; Kanxheri, K.; Servoli, L.; Lagomarsino, S.; Sciortino, S.

    2016-12-01

    Diamond is a promising material whose excellent physical properties foster its use for radiation detection applications, in particular in hostile operating environments where the behavior of silicon-based detectors is limited by the high radiation fluence. Within this framework, the application of Technology Computer Aided Design (TCAD) simulation tools is highly desirable for the study, optimization and predictive analysis of sensing devices. Because diamond is a novel material in electronics, it is not included in the libraries of commercial, state-of-the-art TCAD software tools. In this work, we propose the development, application and validation of numerical models to simulate the electrical behavior of polycrystalline (pc)CVD diamond conceived for particle-detection sensors. The model focuses on the characterization of a physically based pcCVD diamond bandgap, taking into account deep-level defects acting as recombination centers and/or trap states. While a definitive picture of the polycrystalline diamond bandgap is still debated, the effect of the main parameters (e.g. trap densities, capture cross-sections, etc.) can be investigated in depth through simulation. The charge collection efficiency under β-particle irradiation of diamond materials provided by different vendors and with different electrode configurations has been selected as the figure of merit for model validation. The good agreement between measurements and simulation findings, with the trap density as the only fitting parameter, establishes the suitability of the TCAD modeling approach as a predictive tool for the design and optimization of diamond-based radiation detectors.

  1. Polycrystalline CVD diamond device level modeling for particle detection applications

    International Nuclear Information System (INIS)

    Morozzi, A.; Passeri, D.; Kanxheri, K.; Servoli, L.; Lagomarsino, S.; Sciortino, S.

    2016-01-01

    Diamond is a promising material whose excellent physical properties foster its use for radiation detection applications, in particular in hostile operating environments where the behavior of silicon-based detectors is limited by the high radiation fluence. Within this framework, the application of Technology Computer Aided Design (TCAD) simulation tools is highly desirable for the study, optimization and predictive analysis of sensing devices. Because diamond is a novel material in electronics, it is not included in the libraries of commercial, state-of-the-art TCAD software tools. In this work, we propose the development, application and validation of numerical models to simulate the electrical behavior of polycrystalline (pc)CVD diamond conceived for particle-detection sensors. The model focuses on the characterization of a physically based pcCVD diamond bandgap, taking into account deep-level defects acting as recombination centers and/or trap states. While a definitive picture of the polycrystalline diamond bandgap is still debated, the effect of the main parameters (e.g. trap densities, capture cross-sections, etc.) can be investigated in depth through simulation. The charge collection efficiency under β-particle irradiation of diamond materials provided by different vendors and with different electrode configurations has been selected as the figure of merit for model validation. The good agreement between measurements and simulation findings, with the trap density as the only fitting parameter, establishes the suitability of the TCAD modeling approach as a predictive tool for the design and optimization of diamond-based radiation detectors.

  2. An efficient hysteresis modeling methodology and its implementation in field computation applications

    Energy Technology Data Exchange (ETDEWEB)

    Adly, A.A., E-mail: adlyamr@gmail.com [Electrical Power and Machines Dept., Faculty of Engineering, Cairo University, Giza 12613 (Egypt); Abd-El-Hafiz, S.K. [Engineering Mathematics Department, Faculty of Engineering, Cairo University, Giza 12613 (Egypt)

    2017-07-15

    Highlights: • An approach to simulate hysteresis while taking shape anisotropy into consideration. • Utilizing the ensemble of triangular sub-region hysteresis models in field computation. • A novel tool capable of carrying out field computation while keeping track of hysteresis losses. • The approach may be extended to 3D tetrahedral sub-volumes. - Abstract: Field computation in media exhibiting hysteresis is crucial to a variety of applications such as magnetic recording processes and accurate determination of core losses in power devices. Recently, Hopfield neural networks (HNN) have been successfully configured to construct scalar and vector hysteresis models. This paper presents an efficient hysteresis modeling methodology and its implementation in field computation applications. The methodology is based on the application of the integral equation approach on discretized triangular magnetic sub-regions. Within every triangular sub-region, hysteresis properties are realized using a 3-node HNN. Details of the approach and sample computation results are given in the paper.

  3. Fuzzy bilevel programming with multiple non-cooperative followers: model, algorithm and application

    Science.gov (United States)

    Ke, Hua; Huang, Hu; Ralescu, Dan A.; Wang, Lei

    2016-04-01

    In centralized decision problems, it is not complicated for decision-makers to make modelling technique selections under uncertainty. When a decentralized decision problem is considered, however, choosing appropriate models is no longer easy due to the difficulty in estimating the other decision-makers' inconclusive decision criteria. These decision criteria may vary with different decision-makers because of their particular risk tolerances and management requirements. Considering the general differences among the decision-makers in decentralized systems, we propose a general framework of fuzzy bilevel programming including hybrid models (integrated with different modelling methods at different levels). Specifically, we discuss two of these models which may have wide applications in many fields. Furthermore, we apply the two proposed models to formulate a pricing decision problem in a decentralized supply chain with fuzzy coefficients. In order to solve these models, a hybrid intelligent algorithm integrating fuzzy simulation, neural network and particle swarm optimization based on a penalty function approach is designed. Some suggestions on the applications of these models are also presented.

  4. MEASUREMENT FOR ACCEPTANCE OF SUPPLY CHAIN SIMULATOR APPLICATION USING TECHNOLOGY ACCEPTANCE MODEL

    Directory of Open Access Journals (Sweden)

    Mulyati E.

    2018-03-01

    Full Text Available The aim of this research was to measure the user acceptance of a simulator application built as a tool for students learning about supply chains, particularly the bullwhip effect problem. The acceptance of the supply chain simulator application was measured with the Technology Acceptance Model on 162 samples, which were analyzed with Confirmatory Factor Analysis and Structural Equation Modelling. The results indicated that user acceptance (shown by customer participation) of the supply chain simulator was directly influenced by the perceived usefulness of the application (positive and significant); indirectly influenced by the perceived ease of use of the application (positive but not significant); and indirectly influenced by perceived enjoyment when the application was used. The research gives students a better understanding of the bullwhip effect and a richer experience than conventional learning without such tools.

  5. LINCOM wind flow model: Application to complex terrain with thermal stratification

    DEFF Research Database (Denmark)

    Dunkerley, F.; Moreno, J.; Mikkelsen, T.

    2001-01-01

    LINCOM is a fast linearised and spectral wind flow model for use over hilly terrain. It is designed to rapidly generate mean wind field predictions which provide input to atmospheric dispersion models and wind engineering applications. The thermal module, LINCOM-T, has recently been improved to p...

  6. Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review

    Science.gov (United States)

    Fahimi, Farzad; Yaseen, Zaher Mundher; El-shafie, Ahmed

    2017-05-01

    Since the middle of the twentieth century, artificial intelligence (AI) models have been used widely in engineering and science problems. Water resource variable modeling and prediction are among the most challenging issues in water engineering. The artificial neural network (ANN) is a common approach used to tackle this problem with viable and efficient models. Numerous ANN models have been successfully developed to achieve more accurate results. In the current review, different ANN models in water resource applications and hydrological variable predictions are reviewed and outlined. In addition, recent hybrid models and their structures, input preprocessing, and optimization techniques are discussed, and the results are compared with similar previous studies. Moreover, to achieve a comprehensive view of the literature, many articles that applied ANN models together with other techniques are included. Consequently, the coupling procedure, model evaluation, and performance comparison of hybrid models with conventional ANN models are assessed, as well as the taxonomy and structures of hybrid ANN models. Finally, current challenges and recommendations for future research are indicated and new hybrid approaches are proposed.

  7. Application of the actor model to large scale NDE data analysis

    Science.gov (United States)

    Coughlin, Chris

    2018-03-01

    The Actor model of concurrent computation discretizes a problem into a series of independent units or actors that interact only through the exchange of messages. Without direct coupling between individual components, an Actor-based system is inherently concurrent and fault-tolerant. These traits lend themselves to so-called "Big Data" applications in which the volume of data to analyze requires a distributed multi-system design. For a practical demonstration of the Actor computational model, a system was developed to assist with the automated analysis of Nondestructive Evaluation (NDE) datasets using the open source Myriad Data Reduction Framework. A machine learning model trained to detect damage in two-dimensional slices of C-Scan data was deployed in a streaming data processing pipeline. To demonstrate the flexibility of the Actor model, the pipeline was deployed on a local system and re-deployed as a distributed system without recompiling, reconfiguring, or restarting the running application.
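
The message-passing decoupling described above can be sketched in a few lines. This toy actor (a mailbox queue plus a worker thread) illustrates the general model only; it is not the Myriad framework's API:

```python
# Minimal actor sketch: each actor owns a mailbox and a worker thread, and
# interacts with the outside world only through messages.
import queue
import threading

class Actor:
    def __init__(self, handler):
        self.mailbox = queue.Queue()
        self.handler = handler
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def send(self, msg):
        self.mailbox.put(msg)

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:          # poison pill stops the actor
                break
            self.handler(msg)

# Example: an "analysis" actor that flags C-scan slices above an assumed
# damage threshold (0.8 is an invented value).
results = []
analyst = Actor(lambda slice_max: results.append(slice_max > 0.8))
for amplitude in (0.2, 0.95, 0.5):
    analyst.send(amplitude)
analyst.send(None)
analyst.thread.join()
print(results)  # [False, True, False]
```

Because actors share no state, the same pipeline can be re-deployed across machines by replacing the in-process queue with a network transport.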

  8. Fine‐Grained Mobile Application Clustering Model Using Retrofitted Document Embedding

    Directory of Open Access Journals (Sweden)

    Yeo‐Chan Yoon

    2017-08-01

    Full Text Available In this paper, we propose a fine‐grained mobile application clustering model using retrofitted document embedding. To automatically determine the clusters and their numbers with no predefined categories, the proposed model initializes the clusters based on title keywords and then merges similar clusters. For improved clustering performance, the proposed model distinguishes between an accurate clustering step with titles and an expansive clustering step with descriptions. During the accurate clustering step, an automatically tagged set is constructed as a result. This set is utilized to learn a high‐performance document vector. During the expansive clustering step, more applications are then classified using this document vector. Experimental results showed that the purity of the proposed model increased by 0.19, and the entropy decreased by 1.18, compared with the K‐means algorithm. In addition, the mean average precision improved by more than 0.09 in a comparison with a support vector machine classifier.
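
The purity and entropy figures reported above can be computed as follows; the toy clusters and category labels here are invented for illustration:

```python
# Cluster purity: fraction of items belonging to the majority class of their
# cluster (higher is better). Entropy: class mixing within clusters, weighted
# by cluster size (lower is better).
import math
from collections import Counter

def purity(clusters):
    n = sum(len(c) for c in clusters)
    return sum(Counter(c).most_common(1)[0][1] for c in clusters) / n

def entropy(clusters):
    n = sum(len(c) for c in clusters)
    total = 0.0
    for c in clusters:
        for count in Counter(c).values():
            p = count / len(c)
            total += (len(c) / n) * (-p * math.log2(p))
    return total

# Each inner list holds the true category of the apps grouped into one cluster
clusters = [["game", "game", "game"], ["tool", "tool", "game"]]
print(round(purity(clusters), 3), round(entropy(clusters), 3))  # → 0.833 0.459
```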

  9. Computational Modeling for Enhancing Soft Tissue Image Guided Surgery: An Application in Neurosurgery.

    Science.gov (United States)

    Miga, Michael I

    2016-01-01

    With the recent advances in computing, the opportunities to translate computational models to more integrated roles in patient treatment are expanding at an exciting rate. One area of considerable development has been directed towards correcting soft tissue deformation within image guided neurosurgery applications. This review captures the efforts that have been undertaken towards enhancing neuronavigation by the integration of soft tissue biomechanical models, imaging and sensing technologies, and algorithmic developments. In addition, the review speaks to the evolving role of modeling frameworks within surgery and concludes with some future directions beyond neurosurgical applications.

  10. On the applicability of deformed jellium model to the description of metal clusters

    DEFF Research Database (Denmark)

    Lyalin, Andrey G.; Matveentsev, Anton; Solov'yov, Ilia

    2003-01-01

    This work is devoted to elucidating the applicability of the jellium model to the description of alkali cluster properties, on the basis of comparison of the jellium model results with those derived from experiment and within ab initio theoretical frameworks. On the basis of the Hartree-Fock and local-density approximation deformed jellium model we have calculated the binding energies per atom, ionization potentials, deformation parameters and the optimized values of the Wigner-Seitz radii for neutral and singly charged sodium clusters with the number of atoms $N0$. These characteristics are compared ... shape deformations in the formation of cluster properties and the quite reasonable level of applicability of the deformed jellium model.

  11. Specification, Estimation and Evaluation of Vector Smooth Transition Autoregressive Models with Applications

    DEFF Research Database (Denmark)

    Teräsvirta, Timo; Yang, Yukai

    is illustrated by two applications. In the first one, the dynamic relationship between the US gasoline price and consumption is studied and possible asymmetries in it considered. The second application consists of modelling two well known Icelandic riverflow series, previously considered by many hydrologists...

  12. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  13. Code Development for Control Design Applications: Phase I: Structural Modeling

    International Nuclear Information System (INIS)

    Bir, G. S.; Robinson, M.

    1998-01-01

    The design of integrated controls for a complex system like a wind turbine relies on a system model in an explicit format, e.g., state-space format. Current wind turbine codes focus on turbine simulation and not on system characterization, which is desired for controls design as well as applications like operating turbine model analysis, optimal design, and aeroelastic stability analysis. This paper reviews structural modeling that comprises three major steps: formation of component equations, assembly into system equations, and linearization

  14. Bacteriophages: update on application as models for viruses in water

    African Journals Online (AJOL)

    Bacteriophages: update on application as models for viruses in water. ... the resistance of human viruses to water treatment and disinfection processes. ... highly sensitive molecular techniques viruses have been detected in drinking water ...

  15. Taguchi method for partial differential equations with application in tumor growth.

    Science.gov (United States)

    Ilea, M; Turnea, M; Rotariu, M; Arotăriţei, D; Popescu, Marilena

    2014-01-01

    The growth of tumors is a highly complex process, and mathematical models are needed to describe it. A variety of partial differential mathematical models for tumor growth have been developed and studied, most of them based on reaction-diffusion equations and the mass conservation law. Systems of time-dependent partial differential equations occur in many branches of applied mathematics, and the vast majority of mathematical models of tumor growth are formulated in terms of partial differential equations. We propose a mathematical model for the interactions between three cancer cell populations. The Taguchi methods are widely used by quality engineering scientists to compare the effects of multiple variables, together with their interactions, with a simple and manageable experimental design; in Taguchi's design of experiments, variation is more interesting to study than the average. First, Taguchi methods are utilized to search for the significant factors and the optimal level combination of parameters; apart from the three parameter levels, other factor levels are not considered. Second, cutting parameters, namely cutting speed, depth of cut, and feed rate, are designed using the Taguchi method. Finally, the adequacy of the developed mathematical model is proved by ANOVA, according to which the percentage contribution of the combined error is small. Many mathematical models can be quantitatively characterized by partial differential equations. The use of MATLAB and the Taguchi method in this article illustrates the important role of informatics in research on mathematical modeling. The study of tumor growth cells is an exciting and important topic in cancer research and will profit considerably from theoretical input, interpreted through an ongoing collaboration between mathematicians and medical oncologists.
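
The "percentage contribution" idea behind Taguchi-style ANOVA can be illustrated on a toy orthogonal array; the design and responses below are invented, not the tumor-growth data:

```python
# Taguchi-style percentage contribution: for each factor in an orthogonal
# array, the between-level sum of squares as a share of the total variation.
# Toy L4(2^3) design with made-up responses.

L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]  # levels of factors A, B, C
y = [10.0, 12.0, 15.0, 17.0]                        # one response per run

grand = sum(y) / len(y)
ss_total = sum((v - grand) ** 2 for v in y)

def factor_ss(col):
    """Between-level sum of squares for one factor column."""
    ss = 0.0
    for level in (0, 1):
        vals = [v for row, v in zip(L4, y) if row[col] == level]
        ss += len(vals) * (sum(vals) / len(vals) - grand) ** 2
    return ss

contrib = [100.0 * factor_ss(c) / ss_total for c in range(3)]
print([round(c, 1) for c in contrib])  # → [86.2, 13.8, 0.0]
```

Factor A dominates in this toy example, which is the kind of ranking the abstract's ANOVA step produces before the model adequacy check.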

  16. Modeling Types of Pedal Applications Using a Driving Simulator.

    Science.gov (United States)

    Wu, Yuqing; Boyle, Linda Ng; McGehee, Daniel; Roe, Cheryl A; Ebe, Kazutoshi; Foley, James

    2015-11-01

    The aim of this study was to examine variations in drivers' foot behavior and identify factors associated with pedal misapplications. Few studies have focused on the foot behavior while in the vehicle and the mishaps that a driver can encounter during a potentially hazardous situation. A driving simulation study was used to understand how drivers move their right foot toward the pedals. The study included data from 43 drivers as they responded to a series of rapid traffic signal phase changes. Pedal application types were classified as (a) direct hit, (b) hesitated, (c) corrected trajectory, and (d) pedal errors (incorrect trajectories, misses, slips, or pressed both pedals). A mixed-effects multinomial logit model was used to predict the likelihood of one of these pedal applications, and linear mixed models with repeated measures were used to examine the response time and pedal duration given the various experimental conditions (stimuli color and location). Younger drivers had higher probabilities of direct hits when compared to other age groups. Participants tended to have more pedal errors when responding to a red signal or when the signal appeared to be closer. Traffic signal phases and locations were associated with pedal response time and duration. The response time and pedal duration affected the likelihood of being in one of the four pedal application types. Findings from this study suggest that age-related and situational factors may play a role in pedal errors, and the stimuli locations could affect the type of pedal application. © 2015, Human Factors and Ergonomics Society.
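
A fixed-effects-only sketch of the multinomial logit step follows; the study's actual model also includes random (mixed) effects, and every coefficient here is invented:

```python
# Multinomial logit over the four pedal-application types from the abstract:
# scores are linear in the predictors, probabilities come from a softmax.
import math

TYPES = ["direct_hit", "hesitated", "corrected", "pedal_error"]

def predict(age_young: int, signal_red: int, weights):
    """weights: per-type (intercept, b_age_young, b_signal_red), all assumed."""
    scores = [w0 + w1 * age_young + w2 * signal_red for w0, w1, w2 in weights]
    z = [math.exp(s) for s in scores]
    total = sum(z)
    return {t: v / total for t, v in zip(TYPES, z)}

# Invented coefficients: younger drivers favor direct hits, red signals
# raise the pedal-error score -- mirroring the abstract's findings.
W = [(1.0, 0.8, -0.2), (0.2, 0.0, 0.1), (0.0, 0.0, 0.2), (-1.0, -0.5, 0.6)]
p = predict(age_young=1, signal_red=0, weights=W)
print(max(p, key=p.get))  # → direct_hit
```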

  17. A parameter for the selection of an optimum balance calibration model by Monte Carlo simulation

    CSIR Research Space (South Africa)

    Bidgood, Peter M

    2013-09-01

    Full Text Available The current trend in balance calibration-matrix generation is to use non-linear regression and statistical methods. Methods typically include Modified-Design-of-Experiment (MDOE), Response-Surface-Models (RSMs) and Analysis of Variance (ANOVA...

  18. 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Koziel, Slawomir; Kacprzyk, Janusz; Leifsson, Leifur; Ören, Tuncer

    2015-01-01

    This book includes extended and revised versions of a set of selected papers from the 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2013), which was co-organized by Reykjavik University (RU) and sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). SIMULTECH 2013 was held in cooperation with the ACM SIGSIM - Special Interest Group (SIG) on SImulation and Modeling (SIM), Movimento Italiano Modellazione e Simulazione (MIMOS) and the AIS Special Interest Group on Modeling and Simulation (AIS SIGMAS), and technically co-sponsored by the Society for Modeling & Simulation International (SCS), Liophant Simulation, Simulation Team and the International Federation for Information Processing (IFIP). These proceedings bring together researchers, engineers, applied mathematicians and practitioners working on advances and applications in the field of system simulation.

  19. The application of XML in the effluents data modeling of nuclear facilities

    International Nuclear Information System (INIS)

    Yue Feng; Lin Quanyi; Yue Huiguo; Zhang Yan; Zhang Peng; Cao Jun; Chen Bo

    2013-01-01

    The radioactive effluent data, which can provide information to distinguish whether facilities, waste disposal, and control systems run normally, are an important basis of safety regulation and emergency management; they can also provide the information needed to start the emergency alarm system as soon as possible. XML technology is an effective tool to realize a standard for effluent data exchange, in favor of data collection, statistics and analysis, strengthening the effectiveness of effluent regulation. This paper first introduces the concept of XML and the choice of effluent data modeling method, then describes the process of effluent modeling, and finally presents the model and its application. While the application of XML to the effluent data modeling of nuclear facilities still has deficiencies, it is a beneficial attempt at the informatization of effluent management. (authors)

  20. Application of mixed models for the assessment genotype and ...

    African Journals Online (AJOL)

    Application of mixed models for the assessment genotype and environment interactions in cotton ( Gossypium hirsutum ) cultivars in Mozambique. ... The cultivars ISA 205, STAM 42 and REMU 40 showed superior productivity when they were selected by the Harmonic Mean of Genotypic Values (HMGV) criterion in relation ...

  1. Application of the rainfall infiltration breakthrough (RIB) model for ...

    African Journals Online (AJOL)

    Application of the rainfall infiltration breakthrough (RIB) model for groundwater recharge estimation in west coastal South Africa. ... the data from Oudebosch with different rainfall and groundwater abstraction inputs are simulated to explore individual effects on water levels as well as recharge rate estimated on a daily basis.

  2. ONE WAY ANOVA RANDOMIZED COMPLETE BLOCKS

    African Journals Online (AJOL)

    ******

    2012-04-24

    Apr 24, 2012 ... Key words: Grey mullet, growth, foreign DNA, genetically modified. INTRODUCTION ... ration, food quality and preservation (Shears et al., 1991;. Chen et al. ... fish eggs (Khoo et al., 1992) and 4) direct injection of foreign DNA ...

  3. Default Bayes factors for ANOVA designs

    NARCIS (Netherlands)

    Rouder, Jeffrey N.; Morey, Richard D.; Speckman, Paul L.; Province, Jordan M.

    2012-01-01

    Bayes factors have been advocated as superior to p-values for assessing statistical evidence in data. Despite the advantages of Bayes factors and the drawbacks of p-values, inference by p-values is still nearly ubiquitous. One impediment to the adoption of Bayes factors is a lack of practical

  4. Good Modeling Practice for PAT Applications: Propagation of Input Uncertainty and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Eliasson Lantz, Anna

    2009-01-01

    The uncertainty and sensitivity analysis are evaluated for their usefulness as part of the model-building within Process Analytical Technology applications. A mechanistic model describing a batch cultivation of Streptomyces coelicolor for antibiotic production was used as case study. The input...... compared to the large uncertainty observed in the antibiotic and off-gas CO2 predictions. The output uncertainty was observed to be lower during the exponential growth phase, while higher in the stationary and death phases - meaning the model describes some periods better than others. To understand which...... promising for helping to build reliable mechanistic models and to interpret the model outputs properly. These tools make part of good modeling practice, which can contribute to successful PAT applications for increased process understanding, operation and control purposes. © 2009 American Institute...
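
Input-uncertainty propagation of the kind evaluated above can be sketched with plain Monte Carlo sampling; the first-order growth model and the parameter uncertainty below are stand-ins, not the Streptomyces cultivation model:

```python
# Monte Carlo propagation: sample the uncertain input (growth rate mu),
# run the model for each sample, and summarize the output spread.
import math
import random
import statistics

random.seed(1)

def model(mu, x0=0.1, t=10.0):
    """Toy first-order growth: biomass after time t (assumed form)."""
    return x0 * math.exp(mu * t)

# Assumed input uncertainty: mu ~ N(0.2, 0.02)
samples = [model(random.gauss(0.2, 0.02)) for _ in range(2000)]
print(round(statistics.mean(samples), 3), round(statistics.stdev(samples), 3))
```

The output standard deviation relative to the mean is the kind of phase-dependent uncertainty band the abstract describes for the biomass and CO2 predictions.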

  5. An evaluation of gas release modelling approaches as to their applicability in fuel behaviour models

    International Nuclear Information System (INIS)

    Mattila, L.J.; Sairanen, R.T.

    1980-01-01

    The release of fission gas from uranium oxide fuel to the voids in the fuel rod affects in many ways the behaviour of LWR fuel rods, both during normal operating conditions including anticipated transients and during off-normal and accident conditions. The current trend towards significantly increased discharge burnup of LWR fuel will increase the importance of fission gas release considerations from both the design and safety viewpoints. In the paper, fission gas release models are classified into five categories on the basis of complexity and physical sophistication. For each category, the basic approach common to the models in the category is described, a few representative models are singled out and, in some cases, briefly commented on, the advantages and drawbacks of the approach are listed and discussed, and conclusions on the practical feasibility of the approach are drawn. The evaluation is based both on a literature survey and on our experience in working with integral fuel behaviour models; the work has included verification efforts, attempts to improve certain features of the codes, and engineering applications. The classification of fission gas release models regarding their applicability in fuel behaviour codes can of course be done only in a coarse manner: the boundaries between the different categories are vague, and a model may well be refined in a way that transfers it to a higher category. Finally, some current trends in fuel behaviour research are discussed which seem to motivate further extensive efforts in fission product release modelling and are certain to affect the prioritizing of those efforts. (author)

  6. Quantitative Structure-Use Relationship Model thresholds for Model Validation, Domain of Applicability, and Candidate Alternative Selection

    Data.gov (United States)

    U.S. Environmental Protection Agency — This file contains value of the model training set confusion matrix, domain of applicability evaluation based on training set to predicted chemicals structural...

  7. On Helical Projection and Its Application in Screw Modeling

    Directory of Open Access Journals (Sweden)

    Riliang Liu

    2014-04-01

    Full Text Available As helical surfaces, in their many and varied forms, are finding more and more applications in engineering, new approaches to their efficient design and manufacture are desired. To that end, the helical projection method that uses curvilinear projection lines to map a space object to a plane is examined in this paper, focusing on its mathematical model and characteristics in terms of graphical representation of helical objects. A number of interesting projective properties are identified in regard to straight lines, curves, and planes, and then the method is further investigated with respect to screws. The result shows that the helical projection of a cylindrical screw turns out to be a Jordan curve, which is determined by the screw's axial profile and number of flights. Based on the projection theory, a practical approach to the modeling of screws and helical surfaces is proposed and illustrated with examples, and its possible application in screw manufacturing is discussed.
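
One way to read the helical projection described above is to carry each point along a helix of the given pitch down to the plane z = 0, i.e. rotate it by an angle proportional to its height while dropping z. The mapping below is a simplified sketch of that idea, not the paper's full construction:

```python
# Helical projection sketch: rotate (x, y) by -2*pi*z/pitch and drop z.
import math

def helical_project(x, y, z, pitch):
    theta = -2.0 * math.pi * z / pitch
    xp = x * math.cos(theta) - y * math.sin(theta)
    yp = x * math.sin(theta) + y * math.cos(theta)
    return xp, yp

# A helix of the same pitch projects to a single point -- the property that
# lets a screw flight map onto a closed (Jordan) curve in the plane.
p = 4.0
pts = [helical_project(math.cos(2 * math.pi * z / p),
                       math.sin(2 * math.pi * z / p), z, p)
       for z in (0.0, 1.0, 2.0)]
print([(round(a, 6), round(b, 6)) for a, b in pts])
```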

  8. Crop model application to soybean irrigation management in the mid-south USA

    Science.gov (United States)

    Since mid 1990s, there have been a rapid development and application of crop growth models such as APEX (the Agricultural Policy/Environmental eXtender) and RZWQM2 (Root Zone Water Quality Model). Such process-oriented models have been designed to study the interactions of genetypes, weather, soil, ...

  9. Nuclear model developments in FLUKA for present and future applications

    Science.gov (United States)

    Cerutti, Francesco; Empl, Anton; Fedynitch, Anatoli; Ferrari, Alfredo; Garcia Alia, Ruben; Sala, Paola R.; Smirnov, George; Vlachoudis, Vasilis

    2017-09-01

    The FLUKA code [1-3] is used in research laboratories all around the world for challenging applications spanning a very wide range of energies, projectiles and targets. FLUKA is also extensively used in hadrontherapy research studies and clinical planning systems. In this paper some of the recent developments in the FLUKA nuclear physics models of relevance for very different application fields, including medical physics, are presented. A few examples are shown demonstrating the effectiveness of the upgraded code.

  10. Studying and modelling variable density turbulent flows for industrial applications

    Energy Technology Data Exchange (ETDEWEB)

    Chabard, J.P.; Simonin, O.; Caruso, A.; Delalondre, C.; Dalsecco, S.; Mechitoua, N.

    1996-07-01

    Industrial applications are presented in the various fields of interest for EDF. A first example deals with transferred electric arcs, coupling the flow and thermal transfer in the arc and in the bath of metal, and is related to applications of electricity. The second one is combustion modelling in burners of fossil power plants. The last one comes from nuclear power plants and concerns the stratified flows in a nuclear reactor building. (K.A.). 18 refs.

  11. A novel modular multilevel converter modelling technique based on semi-analytical models for HVDC application

    Directory of Open Access Journals (Sweden)

    Ahmed Zama

    2016-12-01

    Full Text Available Thanks to its scalability, performance and efficiency, the Modular Multilevel Converter (MMC) has, since its invention, become an attractive topology in industrial applications such as high voltage direct current (HVDC) transmission systems. However, modelling challenges related to the high number of switching elements in the MMC are highlighted when such systems are integrated into large simulated networks for stability or protection algorithm testing. In this work, novel dynamic models for the MMC are proposed, intended to simplify these modelling challenges. The models can easily be used to simulate the converter for stability analysis or protection algorithms for HVDC grids.

  12. Recommendations for analysis of repeated-measures designs: testing and correcting for sphericity and use of manova and mixed model analysis.

    Science.gov (United States)

    Armstrong, Richard A

    2017-09-01

    A common experimental design in ophthalmic research is the repeated-measures design in which at least one variable is a within-subject factor. This design is vulnerable to lack of 'sphericity' which assumes that the variances of the differences among all possible pairs of within-subject means are equal. Traditionally, this design has been analysed using a repeated-measures analysis of variance (RM-anova) but increasingly more complex methods such as multivariate anova (manova) and mixed model analysis (MMA) are being used. This article surveys current practice in the analysis of designs incorporating different factors in research articles published in three optometric journals, namely Ophthalmic and Physiological Optics (OPO), Optometry and Vision Science (OVS), and Clinical and Experimental Optometry (CXO), and provides advice to authors regarding the analysis of repeated-measures designs. Of the total sample of articles, 66% used a repeated-measures design. Of those articles using a repeated-measures design, 59% and 8% analysed the data using RM-anova or manova respectively and 33% used MMA. The use of MMA relative to RM-anova has increased significantly since 2009/10. A further search using terms to select those papers testing and correcting for sphericity ('Mauchly's test', 'Greenhouse-Geisser', 'Huynh and Feld') identified 66 articles, 62% of which were published from 2012 to the present. If the design is balanced without missing data then manova should be used rather than RM-anova as it gives better protection against lack of sphericity. If the design is unbalanced or with missing data then MMA is the method of choice. However, MMA is a more complex analysis and can be difficult to set up and run, and care should be taken first, to define appropriate models to be tested and second, to ensure that sample sizes are adequate. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.
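
The sphericity assumption discussed above can be checked descriptively by computing the variances of the differences among all pairs of within-subject conditions; the data below are toy numbers, and Mauchly's formal test is not reproduced here:

```python
# Sphericity (descriptive check): the variances of all pairwise differences
# between repeated conditions should be roughly equal.
import statistics
from itertools import combinations

# rows = subjects, columns = three repeated-measures conditions (toy data)
data = [
    (10.0, 12.0, 15.0),
    (11.0, 13.0, 18.0),
    ( 9.0, 11.0, 13.0),
    (12.0, 15.0, 20.0),
]

pair_vars = {}
for i, j in combinations(range(3), 2):
    diffs = [row[i] - row[j] for row in data]
    pair_vars[(i, j)] = statistics.variance(diffs)

for pair, v in pair_vars.items():
    print(pair, round(v, 2))
```

Here the pairwise variances differ markedly, the situation in which an uncorrected RM-ANOVA is unsafe and MANOVA or mixed model analysis is preferred.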

  13. Conceptual Model of an Application for Automated Generation of Webpage Mobile Versions

    Directory of Open Access Journals (Sweden)

    Todor Rachovski

    2017-11-01

    Full Text Available Accessing webpages through various types of mobile devices with different screen sizes and using different browsers has put new demands on web developers. The main challenge is the development of websites with responsive design that adapts to the mobile device used. The article presents a conceptual model of an application for automated generation of mobile pages. It has a five-layer architecture: database, database management layer, business logic layer, web services layer and presentation layer. The database stores all the data needed to run the application. The database management layer uses an ORM model to convert relational data into an object-oriented format and to control access to them. The business logic layer contains components that perform the actual work of building a mobile version of the page, including parsing, building a hierarchical model of the page and a number of transformations. The web services layer provides external applications with access to lower-level functionalities, and the presentation layer is responsible for choosing and using the appropriate CSS. A web application that uses the proposed model was developed and experiments were conducted.

  14. A Global Modeling Framework for Plasma Kinetics: Development and Applications

    Science.gov (United States)

    Parsey, Guy Morland

    The modern study of plasmas, and applications thereof, has developed synchronously with computer capabilities since the mid-1950s. Complexities inherent to these charged-particle, many-body systems have resulted in the development of multiple simulation methods (particle-in-cell, fluid, global modeling, etc.) in order to both explain observed phenomena and predict outcomes of plasma applications. Recognizing that different algorithms are chosen to best address specific topics of interest, this thesis centers around the development of an open-source global model framework for the focused study of non-equilibrium plasma kinetics. After verification and validation of the framework, it was used to study two physical phenomena: plasma-assisted combustion and the recently proposed optically-pumped rare gas metastable laser. Global models permeate chemistry and plasma science, relying on spatial averaging to focus attention on the dynamics of reaction networks. Defined by a set of species continuity and energy conservation equations, the required data and constructed systems are conceptually similar across most applications, providing a light platform for exploratory and result-search parameter scanning. Unfortunately, it is common practice for custom code to be developed for each application, an enormous duplication of effort which negatively affects the quality of the software produced. Presented herein, the Python-based Kinetic Global Modeling framework (KGMf) was designed to support all modeling phases: collection and analysis of reaction data, construction of an exportable system of model ODEs, and a platform for interactive evaluation and post-processing analysis. A symbolic ODE system is constructed for interactive manipulation and generation of a Jacobian, both of which are compiled as operation-optimized C-code. 
Plasma-assisted combustion and ignition (PAC/PAI) embody the modernization of burning fuel by opening up new avenues of control and optimization.
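
    The core objects a global model framework like the one described must produce are a right-hand-side function for the species ODEs and its Jacobian. As a hedged illustration (a hand-coded toy network, not KGMf's symbolic/compiled machinery), the sketch below evaluates a two-reaction kinetic network and a finite-difference Jacobian.

```python
import numpy as np

def rhs(y, k):
    """Toy kinetic network A -> B -> C with rate constants k[0], k[1].
    Returns dy/dt for species densities y = [A, B, C]."""
    a, b, c = y
    return np.array([-k[0] * a,
                     k[0] * a - k[1] * b,
                     k[1] * b])

def jacobian_fd(f, y, h=1e-7):
    """Forward finite-difference Jacobian J[i, j] = d f_i / d y_j.
    (KGMf generates this symbolically; this is a numeric stand-in.)"""
    n = len(y)
    J = np.empty((n, n))
    f0 = f(y)
    for j in range(n):
        yp = y.copy()
        yp[j] += h
        J[:, j] = (f(yp) - f0) / h
    return J

k = np.array([2.0, 1.0])
y0 = np.array([1.0, 0.0, 0.0])
J = jacobian_fd(lambda y: rhs(y, k), y0)
print(J)
```

    For this linear network the Jacobian is constant, [[-2, 0, 0], [2, -1, 0], [0, 1, 0]], which is what a stiff ODE integrator would be handed at each step.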

  15. Active Brownian motion models and applications to ratchets

    Science.gov (United States)

    Fiasconaro, A.; Ebeling, W.; Gudowska-Nowak, E.

    2008-10-01

    We give an overview over recent studies on the model of Active Brownian Motion (ABM) coupled to reservoirs providing free energy which may be converted into kinetic energy of motion. First, we present an introduction to a general concept of active Brownian particles which are capable to take up energy from the source and transform part of it in order to perform various activities. In the second part of our presentation we consider applications of ABM to ratchet systems with different forms of differentiable potentials. Both analytical and numerical evaluations are discussed for three cases of sinusoidal, staircaselike and Mateos ratchet potentials, also with the additional loads modelled by tilted potential structure. In addition, stochastic character of the kinetics is investigated by considering perturbation by Gaussian white noise which is shown to be responsible for driving the directionality of the asymptotic flux in the ratchet. This stochastically driven directionality effect is visualized as a strong nonmonotonic dependence of the statistics of the right versus left trajectories of motion leading to a net current of particles. Possible applications of the ratchet systems to molecular motors are also briefly discussed.
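
    The ratchet dynamics described above reduce, in the overdamped limit, to a Langevin equation that is easy to integrate. The following Python is an illustrative Euler-Maruyama sketch with a tilted sinusoidal potential (the paper's sinusoidal case with a load; the potential and parameters here are assumptions, not the authors').

```python
import numpy as np

def simulate_ratchet(n_steps=20000, dt=1e-3, F_tilt=0.5, D=0.1, seed=0):
    """Overdamped Langevin dynamics in a tilted sinusoidal potential
    V(x) = sin(2*pi*x) - F_tilt*x, integrated by Euler-Maruyama with
    Gaussian white noise of strength D."""
    rng = np.random.default_rng(seed)
    x = 0.0
    traj = np.empty(n_steps)
    for i in range(n_steps):
        force = -2 * np.pi * np.cos(2 * np.pi * x) + F_tilt   # -dV/dx
        x += force * dt + np.sqrt(2 * D * dt) * rng.standard_normal()
        traj[i] = x
    return traj

traj = simulate_ratchet()
drift = traj[-1] / (20000 * 1e-3)   # crude estimate of the net velocity
print(drift)
```

    Averaging such trajectories over many noise realizations gives the asymptotic flux whose nonmonotonic dependence on noise strength the abstract describes.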

  16. Instructional Storytelling: Application of the Clinical Judgment Model in Nursing.

    Science.gov (United States)

    Timbrell, Jessica

    2017-05-01

    Little is known about the teaching and learning implications of instructional storytelling (IST) in nursing education or its potential connection to nursing theory. The literature establishes storytelling as a powerful teaching-learning method in the educational, business, humanities, and health sectors, but little exploration exists that is specific to nursing. An example of a story demonstrating application of the domains of Tanner's clinical judgment model links storytelling with learning outcomes appropriate for the novice nursing student. Application of Tanner's clinical judgment model offers consistency of learning experience while preserving the creativity inherent in IST. Further research into student learning outcomes achievement using IST is warranted as a step toward establishing best practices with IST in nursing education. [J Nurs Educ. 2017;56(5):305-308.]. Copyright 2017, SLACK Incorporated.

  17. Option Price Decomposition in Spot-Dependent Volatility Models and Some Applications

    Directory of Open Access Journals (Sweden)

    Raúl Merino

    2017-01-01

    Full Text Available We obtain a Hull and White type option price decomposition for a general local volatility model. We apply the obtained formula to the CEV model. As an application, we give an approximate closed formula for the call option price under the CEV model and an approximate short-term implied volatility surface. These approximate formulas are used to estimate model parameters. A numerical comparison of our new method with exact and approximate formulas existing in the literature is performed.

  18. Critical properties of the double-frequency sine-Gordon model with applications

    International Nuclear Information System (INIS)

    Fabrizio, M.; Gogolin, A.O.; Nersesyan, A.A.

    2000-01-01

    We study the properties of the double-frequency sine-Gordon model in the vicinity of the Ising quantum phase transition displayed by this model. Using a mapping onto a generalized lattice quantum Ashkin-Teller model, we obtain critical and nearly-off-critical correlation functions of various operators. We discuss applications of the double-sine-Gordon model to one-dimensional physical systems, like spin chains in a staggered external field and interacting electrons in a staggered potential

  19. Optimizing Injection Molding Parameters of Different Halloysites Type-Reinforced Thermoplastic Polyurethane Nanocomposites via Taguchi Complemented with ANOVA

    Directory of Open Access Journals (Sweden)

    Tayser Sumer Gaaz

    2016-11-01

    Taguchi and ANOVA approaches. Seemingly, mHNTs have played a very important role in the resulting product.

  20. Predictive market segmentation model: An application of logistic regression model and CHAID procedure

    Directory of Open Access Journals (Sweden)

    Soldić-Aleksić Jasna

    2009-01-01

    Full Text Available Market segmentation is one of the key concepts of modern marketing. The main goal of market segmentation is to create groups (segments) of customers that have similar characteristics, needs, wishes and/or similar behavior regarding the purchase of a concrete product/service. Companies can create a specific marketing plan for each of these segments and thereby gain a short- or long-term competitive advantage on the market. Depending on the concrete marketing goal, different segmentation schemes and techniques may be applied. This paper presents a predictive market segmentation model based on the application of a logistic regression model and CHAID analysis. The logistic regression model was used to select, from the initial pool of eleven variables, those which are statistically significant for explaining the dependent variable. The selected variables were afterwards included in the CHAID procedure, which generated the predictive market segmentation model. The model results are presented on a concrete empirical example in the following form: summary model results, CHAID tree, gain chart, index chart, and risk and classification tables.
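
    The first stage of the two-step procedure, screening predictors with a logistic regression before handing them to a tree method, can be sketched in a few lines. This is a generic numpy illustration on synthetic data (plain gradient descent, not the paper's SPSS estimation, and selection by coefficient magnitude rather than formal significance tests).

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Gradient-descent logistic regression on centred predictors X.
    Returns the coefficient vector."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)   # gradient of the log-loss
    return w

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))
# Only predictors 0 and 2 actually drive the (binary) response.
logits = 2.0 * X[:, 0] - 1.5 * X[:, 2]
y = (logits + rng.standard_normal(200) * 0.5 > 0).astype(float)

w = fit_logistic(X, y)
selected = np.argsort(-np.abs(w))[:2]   # keep the strongest predictors
print(sorted(selected.tolist()))
```

    The retained predictors would then feed a CHAID-style procedure, which grows the segmentation tree by chi-square tests on candidate splits.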

  1. Swelling of polymer networks with topological constraints: Application of the Helmis-Heinrich-Straube model

    Directory of Open Access Journals (Sweden)

    B. Basterra-Beroiz

    2018-08-01

    Full Text Available For the first time since its formulation in 1986, the theoretical approach proposed by Helmis, Heinrich and Straube (the HHS model), which considers the contribution of topological restrictions from entanglements to the swelling of polymer networks, is applied to experimental data. The main aspects and key equations are reviewed and their application is illustrated for unfilled rubber compounds. The HHS model is based on real networks and gives new perspectives on the interpretation of experimental swelling data, for which entanglement contributions are usually neglected by considering phantom network models. This investigation applies a reliable constrained-chain approach through a deformation-dependent tube model for defining the elastic contribution of swollen networks, overcoming one of the main limitations on the applicability of the classical (affine) Flory-Rehner and (non-affine) phantom models. This short communication intends to provide a baseline for the application and validation of this modern approach for a broader class of rubber materials.

  2. Bridging the Radiative Transfer Models for Meteorology and Solar Energy Applications

    Science.gov (United States)

    Xie, Y.; Sengupta, M.

    2017-12-01

    Radiative transfer models are used to compute solar radiation reaching the earth surface and play an important role in both meteorology and solar energy studies. Therefore, they are designed to meet the needs of specialized applications. For instance, radiative transfer models for meteorology seek to provide more accurate cloudy-sky radiation compared to models used in solar energy, which are geared towards accuracy in clear-sky conditions associated with the maximum solar resource. However, models for solar energy applications are often computationally faster, as the complex solution of the radiative transfer equation is parameterized by atmospheric properties that can be acquired from surface- or satellite-based observations. This study introduces the National Renewable Energy Laboratory's (NREL's) recent efforts to combine the advantages of radiative transfer models designed for meteorology and solar energy applications. A fast all-sky radiation model, FARMS-NIT, was developed to efficiently compute narrowband all-sky irradiances over inclined photovoltaic (PV) panels. This new model utilizes the optical properties from a solar energy model, SMARTS, to compute surface radiation by considering all possible paths of photon transmission and the relevant scattering and absorption attenuation. For cloudy-sky conditions, cloud bidirectional transmittance distribution functions (BTDFs) are provided by a lookup table (LUT) precomputed with LibRadtran. Our initial results indicate that FARMS-NIT has an accuracy similar to that of LibRadtran, a highly accurate multi-stream model, but is significantly more efficient. The development and validation of this model will be presented.

  3. Advances in Photonics Design and Modeling for Nano- and Bio-photonics Applications

    DEFF Research Database (Denmark)

    Tanev, Stoyan

    2010-01-01

    In this invited paper we focus on the discussion of two recent unique applications of the Finite-Difference Time-Domain (FDTD) simulation method to the design and modeling of advanced nano- and bio-photonic problems. We will first discuss the application of a traditional formulation of the FDTD...

  4. Using a Hydrological Model to Determine Environmentally Safer Windows for Herbicide Application

    Science.gov (United States)

    J.L. Michael; M.C. Smith; W.G. Knisel; D.G. Neary; W.P. Fowler; D.J. Turton

    1996-01-01

    A modification of the GLEAMS model was used to determine application windows which would optimise efficacy and environmental safety for herbicide application to a forest site. Herbicide/soil partition coefficients were determined using soil samples collected from the study site for two herbicides (imazapyr, Koc=46, triclopyr ester, K

  5. Dosimetric applications of the new ICRP lung model

    International Nuclear Information System (INIS)

    James, A.C.

    1994-06-01

    The International Commission on Radiological Protection (ICRP) has adopted a new dosimetric model of the human respiratory tract, to be issued as ICRP Publication 66. This chapter presents a summary of the main features of the new model. The model is a general update of that in Publication 30, but is significantly broader in scope. It applies explicitly to workers and all members of the public: for inhalation of particles, gases and vapors; evaluation of dose per unit intake or exposure; and interpretation of bioassay data. The approach is fundamentally different from that of the Publication 30 model, which calculates only the average dose to the lungs. The new model takes account of differences in the radiosensitivity of respiratory tract tissues, and the wide range of doses they may receive, and calculates specific tissue doses. The model readily incorporates specific information related to the subject (age, physical activity, smoking or health status) or the exposure (aerosol size and chemical form). The application of the new model to calculate equivalent lung dose and effective dose per unit intake is illustrated for several α- and β-emitting radionuclides, and the new values obtained are compared with those given by the ICRP Publication 30 lung model.

  6. Combining UML2 Application and SystemC Platform Modelling for Performance Evaluation of Real-Time Embedded Systems

    Directory of Open Access Journals (Sweden)

    Qu Yang

    2008-01-01

    Full Text Available Abstract Future mobile devices will be based on heterogeneous multiprocessing platforms accommodating several stand-alone applications. The network-on-chip communication and device networking combine the design challenges of conventional distributed systems and resource constrained real-time embedded systems. Interoperable design space exploration for both the application and platform development is required. Application designer needs abstract platform models to rapidly check the feasibility of a new feature or application. Platform designer needs abstract application models for defining platform computation and communication capacities. We propose a layered UML application/workload and SystemC platform modelling approach that allow application and platform to be modelled at several levels of abstraction, which enables early performance evaluation of the resulting system. The overall approach has been experimented with a mobile video player case study, while different load extraction methods have been validated by applying them to MPEG-4 encoder, Quake2 3D game, and MP3 decoder case studies previously.

  7. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, obtained using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize an RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method
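
    One simple way to get sensitivities directly from existing code runs, with no response-surface fit, is a rank-correlation measure between sampled inputs and the code's outputs. The sketch below is a generic illustration of that idea on synthetic samples, not the paper's specific method.

```python
import numpy as np

def rank_correlation_sensitivity(X, y):
    """Spearman-style sensitivity indices computed directly from existing
    model runs (input samples X, outputs y), with no surrogate model."""
    def ranks(a):
        r = np.empty(len(a), dtype=float)
        r[np.argsort(a)] = np.arange(len(a))
        return r
    ry = ranks(y)
    return np.array([np.corrcoef(ranks(X[:, j]), ry)[0, 1]
                     for j in range(X.shape[1])])

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(500, 3))                      # sampled inputs
y = 5 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, 500) # code output; X[:,2] inert
s = rank_correlation_sensitivity(X, y)
print(s)
```

    The dominant input gets an index near 1, the inert one near 0; because only the original samples are reused, no accuracy is lost to a surrogate fit.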

  8. Jet Noise Modeling for Supersonic Business Jet Application

    Science.gov (United States)

    Stone, James R.; Krejsa, Eugene A.; Clark, Bruce J.

    2004-01-01

    This document describes the development of an improved predictive model for coannular jet noise, including noise suppression modifications applicable to small supersonic-cruise aircraft such as the Supersonic Business Jet (SBJ), for NASA Langley Research Center (LaRC). For such aircraft a wide range of propulsion and integration options are under consideration. Thus there is a need for very versatile design tools, including a noise prediction model. The approach used is similar to that used with great success by the Modern Technologies Corporation (MTC) in developing a noise prediction model for two-dimensional mixer ejector (2DME) nozzles under the High Speed Research Program and in developing a more recent model for coannular nozzles over a wide range of conditions. If highly suppressed configurations are ultimately required, the 2DME model is expected to provide reasonable prediction for these smaller scales, although this has not been demonstrated. It is considered likely that more modest suppression approaches, such as dual stream nozzles featuring chevron or chute suppressors, perhaps in conjunction with inverted velocity profiles (IVP), will be sufficient for the SBJ.

  9. Nonequilibrium thermodynamic models and applications to hydrogen plasma

    International Nuclear Information System (INIS)

    Cho, K.Y.

    1988-01-01

    A generalized multithermal equilibrium (GMTE) thermodynamic model is developed and presented with applications to hydrogen. A new chemical equilibrium equation for GMTE is obtained without the ensemble temperature concept used by a previous MTE model. The effects of the GMTE model on the derivation and calculation of the thermodynamic, transport, and radiative properties are presented, and significant differences from local thermal equilibrium (LTE) and two-temperature models are discussed. When the electron translational temperature (Te) is higher than the translational temperature of the heavy particles, the effects of hydrogen molecular species on the properties are significant at high Te compared with LTE results. The densities of minor species vary by orders of magnitude with kinetic nonequilibrium at a constant electron temperature. A collisional-radiative model is also developed with the GMTE chemical equilibrium equation to study the effects of radiative transfer and ambipolar diffusion on the population distribution of the excited atoms. The nonlocal radiative transfer effect is parameterized by an absorption factor, which is defined as the ratio of the absorbed intensity to the spontaneous emission coefficient

  10. Some hybrid models applicable to dose-response relationships

    International Nuclear Information System (INIS)

    Kumazawa, Shigeru

    1992-01-01

    A new type of model of dose-response relationships has been studied as an initial stage in exploring a reliable extrapolation of relationships determined from high-dose data to the low-dose range relevant to radiation protection. The approach is to use a 'hybrid scale' of linear and logarithmic scales; the first model is that the normalized surviving fraction (ρS > 0) on a hybrid scale decreases linearly with dose on a linear scale, and the second is that the induction on a log scale increases linearly with the normalized dose (τD > 0) on a hybrid scale. The hybrid scale may reflect the overall effectiveness of a complex system against adverse events caused by various agents. Some data on leukemia in the atomic bomb survivors and from rodent experiments were used to show the applicability of hybrid scale models. The results proved that the proposed models fit these data at least as well as the popular linear-quadratic models, providing a possible interpretation of the shapes of dose-response curves, e.g. shouldered survival curves varied by recovery time. (author)

  11. Joint Models for Longitudinal and Time-to-Event Data With Applications in R

    CERN Document Server

    Rizopoulos, Dimitris

    2012-01-01

    In longitudinal studies it is often of interest to investigate how a marker that is repeatedly measured in time is associated with a time to an event of interest, e.g., prostate cancer studies where longitudinal PSA level measurements are collected in conjunction with the time-to-recurrence. Joint Models for Longitudinal and Time-to-Event Data: With Applications in R provides a full treatment of random effects joint models for longitudinal and time-to-event outcomes that can be utilized to analyze such data. The content is primarily explanatory, focusing on applications of joint modeling, but

  12. Application of Simple CFD Models in Smoke Ventilation Design

    DEFF Research Database (Denmark)

    Brohus, Henrik; Nielsen, Peter Vilhelm; la Cour-Harbo, Hans

    2004-01-01

    The paper examines the possibilities of using simple CFD models in practical smoke ventilation design. The aim is to assess if it is possible with a reasonable accuracy to predict the behaviour of smoke transport in case of a fire. A CFD code mainly applicable for “ordinary” ventilation design...

  13. An Application Of Receptor Modeling To Identify Airborne Particulate ...

    African Journals Online (AJOL)

    An Application Of Receptor Modeling To Identify Airborne Particulate Sources In Lagos, Nigeria. FS Olise, OK Owoade, HB Olaniyi. Abstract. There have been no clear demarcations between industrial and residential areas of Lagos with focus on industry as the major source. There is need to identify potential source types in ...

  14. An Application of Taylor Models to the Nakao Method on ODEs

    OpenAIRE

    Yamamoto, Nobito; Komori, Takashi

    2009-01-01

    The authors give a short survey of validated computation of initial value problems for ODEs, especially Taylor model methods. They then propose an application of Taylor models to the Nakao method, which has been developed for numerical verification methods on PDEs, and apply it to initial value problems for ODEs, with some numerical experiments.

  15. Modelling of Argon Cold Atmospheric Plasmas for Biomedical Applications

    Science.gov (United States)

    Atanasova, M.; Benova, E.; Degrez, G.; van der Mullen, J. A. M.

    2018-02-01

    Plasmas for biomedical applications are one of the newest fields of plasma utilization. Interest in the use of plasma in medicine is especially high. Promising results have been achieved in blood coagulation, wound healing, and the treatment of some forms of cancer, diabetic complications, etc. However, investigations of the biomedical applications from the biological and medical viewpoints are much more advanced than studies of the dynamics of the plasma. In this work we aim to address some specific challenges in the field of plasma modelling arising from biomedical applications: what are the spatial distributions of the plasma reactive species and electric fields, and what are their production mechanisms; what fluxes and energies do the various components of the plasma deliver to the treated surfaces; what is the gas flow pattern? The focus is on two devices, namely the capacitively coupled plasma jet and the microwave surface-wave sustained discharge. The devices are representatives of the so-called cold atmospheric plasmas (CAPs). These are discharges characterized by low gas temperature - less than 40°C at the point of application - and non-equilibrium chemistry.

  16. Lifestyle of Employees working in Hamadan Departments: An Application of the Trans-Theoretical Model

    Directory of Open Access Journals (Sweden)

    Jalal Abdi

    2014-06-01

    Full Text Available Introduction: A healthy lifestyle is a valuable resource for reducing the prevalence of health problems and promoting health. Given the key role of employees as valuable human resources, the aim of this study was to evaluate the lifestyle and obesity of governmental employees in Hamadan and their position in the process of change, based on the Trans-Theoretical Model (TTM). Materials & Methods: This descriptive-analytical study was performed on 1200 government employees selected using stratified sampling. Data collection was performed using a three-section questionnaire containing demographic characteristics, the FANTASTIC lifestyle questionnaire and Marcus et al.'s five-part questionnaire. Data were analyzed by correlation tests, Chi-square, t-test and ANOVA using SPSS-20. Results: The lifestyle status of most employees (61.7 percent) was satisfactory. About half of the employees were in the preparation stage of the TTM. With regard to physical activity and healthy eating habits, most employees were in poor condition. Women had higher scores than men on most items. The associations between lifestyle and age, gender, work experience, income satisfaction and marital status were significant. Moreover, the associations between obesity and work experience, marital status, number of children and gender were significant (p<0.05). Conclusion: Planning health education interventions for employees through effective approaches seems to be necessary.

  17. Modeling microbiological and chemical processes in municipal solid waste bioreactor, Part II: Application of numerical model BIOKEMOD-3P.

    Science.gov (United States)

    Gawande, Nitin A; Reinhart, Debra R; Yeh, Gour-Tsyh

    2010-02-01

    Biodegradation process modeling of municipal solid waste (MSW) bioreactor landfills requires the knowledge of various process reactions and corresponding kinetic parameters. Mechanistic models available to date are able to simulate biodegradation processes with the help of pre-defined species and reactions. Some of these models consider the effect of critical parameters such as moisture content, pH, and temperature. Biomass concentration is a vital parameter for any biomass growth model and often not compared with field and laboratory results. A more complex biodegradation model includes a large number of chemical and microbiological species. Increasing the number of species and user defined process reactions in the simulation requires a robust numerical tool. A generalized microbiological and chemical model, BIOKEMOD-3P, was developed to simulate biodegradation processes in three-phases (Gawande et al. 2009). This paper presents the application of this model to simulate laboratory-scale MSW bioreactors under anaerobic conditions. BIOKEMOD-3P was able to closely simulate the experimental data. The results from this study may help in application of this model to full-scale landfill operation.
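
    The kinetic building blocks such a biodegradation model chains together are substrate-limited growth equations. As a hedged illustration (Monod kinetics with invented parameters, not BIOKEMOD-3P's species set or solver), the sketch below integrates one biomass/substrate pair with forward Euler.

```python
import numpy as np

def simulate_monod(S0=10.0, X0=0.1, mu_max=0.3, Ks=2.0, Y_xs=0.5,
                   dt=0.01, t_end=100.0):
    """Forward-Euler integration of Monod biomass growth on one substrate.
    S = substrate concentration, X = biomass concentration; Y_xs is the
    biomass yield per unit substrate consumed (all parameters illustrative)."""
    S, X = S0, X0
    for _ in range(int(t_end / dt)):
        mu = mu_max * S / (Ks + S)     # specific growth rate
        dX = mu * X                    # biomass growth rate
        dS = -dX / Y_xs                # substrate consumption rate
        X += dX * dt
        S = max(S + dS * dt, 0.0)
    return S, X

S, X = simulate_monod()
print(S, X)
```

    A useful check on any such integration is the mass balance X + Y_xs*S, which stays at its initial value (here 0.1 + 0.5*10 = 5.1) throughout the run; multi-species, multi-phase models stack many coupled equations of this form.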

  18. Application of Interval Predictor Models to Space Radiation Shielding

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.; Norman, Ryan B.; Blattnig, Steve R.

    2016-01-01

    This paper develops techniques for predicting the uncertainty range of an output variable given input-output data. These models are called Interval Predictor Models (IPMs) because they yield an interval-valued function of the input. This paper develops IPMs having a radial basis structure. This structure enables the formal description of (i) the uncertainty in the model's parameters, (ii) the predicted output interval, and (iii) the probability that a future observation would fall in such an interval. In contrast to other metamodeling techniques, this probabilistic certificate of correctness does not require making any assumptions about the structure of the mechanism from which the data are drawn. Optimization-based strategies for calculating IPMs having minimal spread while containing all the data are developed. Constraints for bounding the minimum interval spread over the continuum of inputs, regulating the IPM's variation/oscillation, and centering its spread about a target point are used to prevent data overfitting. Furthermore, we develop an approach for using expert opinion during extrapolation. This metamodeling technique is illustrated using a radiation shielding application for space exploration. In this application, we use IPMs to describe the error incurred in predicting the flux of particles resulting from the interaction between a high-energy incident beam and a target.
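
    The "minimal spread while containing all the data" idea can be shown with a drastically simplified sketch: a least-squares polynomial centre plus the smallest constant symmetric spread that still covers every observation. This is an illustrative simplification of the paper's radial-basis IPMs, not their formulation.

```python
import numpy as np

def fit_interval_model(x, y, degree=2):
    """Constant-width interval predictor: least-squares polynomial centre,
    plus the smallest symmetric spread containing all data points."""
    V = np.vander(x, degree + 1)
    coef, *_ = np.linalg.lstsq(V, y, rcond=None)
    spread = np.abs(y - V @ coef).max()   # minimal width covering all data
    def predict(x_new):
        c = np.vander(np.atleast_1d(x_new), degree + 1) @ coef
        return c - spread, c + spread     # lower and upper bounds
    return predict

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, 50)
y = 1.0 + 2.0 * x - x**2 + rng.normal(0, 0.1, 50)
predict = fit_interval_model(x, y)
lo, hi = predict(x)
print((lo <= y).all() and (y <= hi).all())
```

    The paper's IPMs go further by letting the spread vary with the input and by attaching a probabilistic certificate to future observations; the containment property above is the common starting point.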

  19. Homogeneity tests for variances and mean test under heterogeneity conditions in a single way ANOVA method

    International Nuclear Information System (INIS)

    Morales P, J.R.; Avila P, P.

    1996-01-01

    If we consider the maximum permissible levels shown for the case of oysters, collecting oysters turns out to be forbidden at the four stations of the El Chijol Channel (Veracruz, Mexico), as well as along the channel itself, because the concentrations of the metals studied exceed these limits. In this case the application of Welch tests was not necessary. For the water hyacinth, the treatment means were unequal for Fe, Cu, Ni and Zn. This case is more illustrative, since the conclusion was reached through the application of Welch tests to treatments with heterogeneous variances. (Author)
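
    For the two-sample case, the Welch procedure referred to above amounts to a t statistic with the Welch-Satterthwaite degrees of freedom; a minimal numpy sketch (illustrative, not the study's code):

```python
import numpy as np

def welch_t(a, b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom for
    two samples with unequal variances, i.e. the situation where the pooled
    one-way ANOVA assumption of homogeneous variances fails."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    v1 = a.var(ddof=1) / len(a)
    v2 = b.var(ddof=1) / len(b)
    t = (a.mean() - b.mean()) / np.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (len(a) - 1) + v2 ** 2 / (len(b) - 1))
    return t, df

t, df = welch_t([1, 2, 3, 4], [2, 4, 6, 8, 10])
print(t, df)
```

    Unlike the pooled test, the degrees of freedom here are data-dependent and generally non-integer; Welch's ANOVA extends the same weighting idea to more than two groups.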

  20. Application of the Social Marketing Model to Unemployment Counseling: A Theoretical Perspective

    Science.gov (United States)

    Englert, Paul; Sommerville, Susannah; Guenole, Nigel

    2009-01-01

    A. R. Andreasen's (1995) social marketing model (SMM) is applied to structure feedback counseling for individuals who are unemployed. The authors discuss techniques used in commercial marketing and how they are equally applicable to solving societal problems; SMM and its application to social interventions; and structured feedback that moves a…

  1. Implementation of a Sales and Purchasing Application Program Using the Rapid Application Development Model

    Directory of Open Access Journals (Sweden)

    Annisa Febriani

    2017-09-01

    Abstract Information technology is currently developing quickly and rapidly, supported by one key tool, the computer. Computers are equipped with applications that help people manage the data of an organization or company so as to obtain accurate results that meet their needs. Observations showed that sales and purchasing activities are still carried out with manual systems, among them at a clothing store: processing of goods data, checking stock, purchase transactions, sales transactions, and storing other data associated with these activities. This can cause losses for the store owner, errors in record keeping, and inaccurate reports. Given the large number of transactions at clothing stores, faster and more accurate information systems are required. The author therefore built a computer-based program architecture using the Microsoft Visual Basic.net programming language and the MySQL database, so that information and activities can be handled quickly and accurately. The method used in building the program architecture is the Rapid Application Development (RAD) model. The RAD model is an adaptation of the waterfall model for high-speed development of each of its software components. The result of this work is a ready-to-use sales and purchasing application program. In this case, using the application program is the best solution to the existing problems; with it, activities can be carried out effectively and efficiently, especially for addressing the problems of sales and purchasing.   Keywords: Sales Program, Purchasing Program.

  2. Modeling and identification of induction micromachines in microelectromechanical systems applications

    Energy Technology Data Exchange (ETDEWEB)

    Lyshevski, S.E. [Purdue University at Indianapolis (United States). Dept. of Electrical and Computer Engineering

    2002-11-01

    Microelectromechanical systems (MEMS), which integrate motion microstructures, radiating-energy microdevices, and controlling and signal-processing integrated circuits (ICs), are widely used. Rotational and translational electromagnetic micromachines are used in MEMS as actuators and sensors. Brushless high-performance micromachines are the preferred choice in many MEMS applications, and therefore synchronous and induction micromachines are the best candidates. Affordability, good performance characteristics (efficiency, controllability, robustness, reliability, power and torque densities, etc.) and expanded operating envelopes result in a strong interest in the application of induction micromachines. In addition, induction micromachines can be easily fabricated using surface micromachining and high-aspect-ratio fabrication technologies. Thus, it is anticipated that induction micromachines, controlled using different control algorithms implemented in ICs, will be widely used in MEMS. Controllers can be implemented using specifically designed ICs to attain superior performance, maximize efficiency and controllability, minimize losses and electromagnetic interference, reduce noise and vibration, etc. In order to design controllers, the induction micromachine must be modeled, and its mathematical model parameters must be identified. Using microelectromechanics, nonlinear mathematical models are derived. This paper illustrates the application of nonlinear identification methods to identify the unknown parameters of three-phase induction micromachines. Two identification methods are studied: the nonlinear error mapping technique and least squares identification. Analytical and numerical results, as well as practical capabilities and effectiveness, are illustrated by identifying the unknown parameters of a three-phase brushless induction micromotor. Experimental results fully support the identification methods. (author)
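    The least-squares identification step described above can be sketched for any model that is linear in its unknown parameters. The winding equation, waveform, and parameter values below are hypothetical illustrations, not the paper's micromachine model:

```python
import numpy as np

# Hypothetical sketch: a winding voltage equation v = R*i + L*di/dt is
# linear in the unknowns (R, L), so sampled waveforms let least squares
# recover both. R_true, L_true and the current waveform are invented.
rng = np.random.default_rng(0)
R_true, L_true = 2.0, 0.05                      # "unknown" parameters
t = np.linspace(0.0, 0.1, 200)
i = 3.0 * (1.0 - np.exp(-t / 0.02))             # synthetic current waveform
didt = np.gradient(i, t)                        # numerical derivative
v = R_true * i + L_true * didt
v_meas = v + rng.normal(0.0, 0.01, v.shape)     # add measurement noise

A = np.column_stack([i, didt])                  # regressor matrix [i, di/dt]
theta, *_ = np.linalg.lstsq(A, v_meas, rcond=None)
R_est, L_est = theta                            # estimates close to (2.0, 0.05)
```

    The same normal-equation machinery carries over when the regressors are nonlinear functions of the data, as long as the parameters enter linearly.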

  3. A Study on the Application Model of B2B E-Commerce in the Agricultural Sector

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jinlong; DU Xiaofang

    2004-01-01

    There are two main application models of B2B e-commerce that are best suited to the agricultural sector. One is the e-market intermediation model (EMIM), and the other is the integrative content center model (ICCM). Based on an analysis of these two models in the agricultural application field, it is concluded that both will be the main application models of agricultural e-commerce at present, while ICCM will serve as a transition from local e-commerce to integrative e-commerce. The future development of agricultural e-commerce will follow the direction of integrative e-commerce, which is based on the supply chain model on E-Hubs. A new framework for integrative e-commerce is presented as a conclusion.

  4. HYDROSCAPE: A SCAlable and ParallelizablE Rainfall Runoff Model for Hydrological Applications

    Science.gov (United States)

    Piccolroaz, S.; Di Lazzaro, M.; Zarlenga, A.; Majone, B.; Bellin, A.; Fiori, A.

    2015-12-01

    In this work we present HYDROSCAPE, an innovative streamflow routing method based on the travel time approach and modeled through a fine-scale geomorphological description of hydrological flow paths. The model is designed to be easily coupled with weather forecast or climate models providing the hydrological forcing, while preserving the geomorphological dispersion of the river network, which is kept unchanged independently of the grid size of the rainfall input. This makes HYDROSCAPE particularly suitable for multi-scale applications, ranging from medium-size catchments up to the continental scale, and for investigating the effects of extreme rainfall events that require an accurate description of basin response timing. A key feature of the model is its computational efficiency, which allows a large number of simulations to be performed for sensitivity/uncertainty analyses in a Monte Carlo framework. Further, the model is highly parsimonious, involving the calibration of only three parameters: one defining the residence time of the hillslope response, one for channel velocity, and a multiplicative factor accounting for uncertainties in the identification of the potential maximum soil moisture retention in the SCS-CN method. HYDROSCAPE is designed with a simple and flexible modular structure, which makes it particularly amenable to massive parallelization, to customization according to specific user needs and preferences (e.g., the rainfall-runoff model), and to continuous development and improvement. Finally, the possibility to specify the desired computational time step and to evaluate streamflow at any location in the domain makes HYDROSCAPE an attractive tool for many hydrological applications, and a valuable alternative to more complex and highly parametrized large-scale hydrological models. Together with model development and features, we present an application to the Upper Tiber River basin (Italy), providing a practical example of model performance and
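    The travel-time idea underlying such routing schemes can be illustrated as the convolution of effective rainfall with a travel-time distribution. The exponential kernel, rainfall burst, and parameter values below are invented for illustration and are not HYDROSCAPE's actual formulation:

```python
import numpy as np

# Toy travel-time routing: streamflow as the convolution of effective
# rainfall with a travel-time (response) distribution. The exponential
# kernel, rainfall burst, and parameters are hypothetical.
dt = 1.0                                    # time step [h]
t = np.arange(0.0, 48.0, dt)
k = 6.0                                     # mean residence time [h]
kernel = np.exp(-t / k) / k * dt            # discrete travel-time kernel

rain = np.zeros_like(t)
rain[2:5] = [4.0, 10.0, 3.0]                # effective rainfall burst [mm/h]

flow = np.convolve(rain, kernel)[: t.size]  # routed hydrograph
# The flow peak is delayed and attenuated relative to the rainfall peak.
```

    A geomorphological model replaces this single exponential kernel with a network-derived distribution of travel times, but the convolution structure is the same.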

  5. Practical applications of age-dependent reliability models and analysis of operational data

    Energy Technology Data Exchange (ETDEWEB)

    Lannoy, A.; Nitoi, M.; Backstrom, O.; Burgazzi, L.; Couallier, V.; Nikulin, M.; Derode, A.; Rodionov, A.; Atwood, C.; Fradet, F.; Antonov, A.; Berezhnoy, A.; Choi, S.Y.; Starr, F.; Dawson, J.; Palmen, H.; Clerjaud, L

    2005-07-01

    The purpose of the workshop was to present the experience of practical application of time-dependent reliability models. The program of the workshop comprised the following sessions: -) aging management and aging PSA (Probabilistic Safety Assessment), -) modeling, -) operating experience, and -) accelerated aging tests. In order to introduce the time-aging effect of a particular component into the PSA model, it has been proposed to use constant unavailability values over a short period of time (one year, for example) calculated on the basis of age-dependent reliability models. As for modeling, it appears that the problem with overly detailed statistical models is the lack of data for the required parameters. As for operating experience, several methods of operating experience analysis were presented (algorithms for reliability data elaboration and statistical identification of aging trends). As for accelerated aging tests, it was demonstrated that a combination of operating experience analysis with the results of accelerated aging tests of naturally aged equipment could provide a good basis for continuous operation of instrumentation and control systems.

  6. Practical applications of age-dependent reliability models and analysis of operational data

    International Nuclear Information System (INIS)

    Lannoy, A.; Nitoi, M.; Backstrom, O.; Burgazzi, L.; Couallier, V.; Nikulin, M.; Derode, A.; Rodionov, A.; Atwood, C.; Fradet, F.; Antonov, A.; Berezhnoy, A.; Choi, S.Y.; Starr, F.; Dawson, J.; Palmen, H.; Clerjaud, L.

    2005-01-01

    The purpose of the workshop was to present the experience of practical application of time-dependent reliability models. The program of the workshop comprised the following sessions: -) aging management and aging PSA (Probabilistic Safety Assessment), -) modeling, -) operating experience, and -) accelerated aging tests. In order to introduce the time-aging effect of a particular component into the PSA model, it has been proposed to use constant unavailability values over a short period of time (one year, for example) calculated on the basis of age-dependent reliability models. As for modeling, it appears that the problem with overly detailed statistical models is the lack of data for the required parameters. As for operating experience, several methods of operating experience analysis were presented (algorithms for reliability data elaboration and statistical identification of aging trends). As for accelerated aging tests, it was demonstrated that a combination of operating experience analysis with the results of accelerated aging tests of naturally aged equipment could provide a good basis for continuous operation of instrumentation and control systems.

  7. A bootstrap based space-time surveillance model with an application to crime occurrences

    Science.gov (United States)

    Kim, Youngho; O'Kelly, Morton

    2008-06-01

    This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, which use population-at-risk data to generate expected values, produce hotspots bounded by administrative area units and are of limited use for near-real-time applications because of the population data needed. This study instead generates expected values for local hotspots from past occurrences rather than population at risk, and bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, without the requirement of population-at-risk data, (1) is free from administrative area restrictions, (2) enables more frequent surveillance of continuously updated registry databases, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic, as shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
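    The permutation idea can be sketched in a few lines: under the null hypothesis that the latest window is exchangeable with past occurrences, the observed excess is compared against randomly shuffled reassignments. The counts and window below are hypothetical, and this is only a sketch of the general technique, not the authors' exact procedure:

```python
import numpy as np

# Illustrative permutation test: is the recent window's mean count
# unusually high relative to the cell's own history?
rng = np.random.default_rng(1)
past = np.array([3, 4, 2, 5, 3, 4, 3, 2, 4, 3])   # hypothetical past counts
recent = np.array([8, 9, 10])                     # counts in current window

observed = recent.mean() - past.mean()
pool = np.concatenate([past, recent])

n_perm = 10_000
exceed = 0
for _ in range(n_perm):
    perm = rng.permutation(pool)                  # reassign counts at random
    diff = perm[:recent.size].mean() - perm[recent.size:].mean()
    if diff >= observed:
        exceed += 1
p_value = (exceed + 1) / (n_perm + 1)             # small => emerging hotspot
```

    Because the reference distribution comes from the cell's own history, no population-at-risk denominator is needed, which is the practical advantage the record emphasizes.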

  8. A Multiphase Non-Linear Mixed Effects Model: An Application to Spirometry after Lung Transplantation

    Science.gov (United States)

    Rajeswaran, Jeevanantham; Blackstone, Eugene H.

    2014-01-01

    In medical sciences, we often encounter longitudinal temporal relationships that are non-linear in nature. The influence of risk factors may also change across longitudinal follow-up. A multiphase non-linear mixed effects model is presented to model temporal patterns of longitudinal continuous measurements, with temporal decomposition to identify the phases and the risk factors within each phase. Application of this model is illustrated with spirometry data after lung transplantation, using readily available statistical software. This application illustrates the usefulness of our flexible model when dealing with complex non-linear patterns and time-varying coefficients. PMID:24919830

  9. Application of a unified fatigue modelling to some thermomechanical fatigue problems

    International Nuclear Information System (INIS)

    Dang, K. van; Maitournam, H.; Moumni, Z.

    2005-01-01

    Fatigue under thermomechanical loadings is an important topic for nuclear industries. For instance, thermal fatigue cracking is observed in the mixing zones of the nuclear reactor. Classical computations using existing methods based on strain amplitude or fracture mechanics are not sufficiently predictive. In this paper an alternative approach is proposed, based on multiscale modelling and the shakedown hypothesis. Examples of predictive results are presented. Finally an application to the RHR problem is discussed. Main ideas of the fatigue modelling: following an idea of Professor D. Drucker, who wrote in 1963 that 'when applied to the microstructure there is a hope that the concept of endurance limit and shakedown are related, and that fatigue failure can be related to energy dissipated in idealized material when shakedown does not occur', we have developed a theory of fatigue based on this concept which differs from classical fatigue approaches. Many predictive applications have already been carried out, particularly for the automotive industry. The fatigue resistance of structures undergoing thermomechanical loadings in the high-cycle regime as well as in the low-cycle regime is calculated using this modelling. However, this fatigue theory has until now rarely been used in nuclear engineering. After recalling the main points of the theory, we present some relevant applications carried out in different industrial sectors, and we apply the modelling to the prediction of the thermal cracking observed in the mixing zones of RHR. (authors)

  10. Application of multilinear regression analysis in modeling of soil ...

    African Journals Online (AJOL)

    The application of the Multi-Linear Regression Analysis (MLRA) model for predicting soil properties in Calabar South offers a technical guide and solution to foundation design problems in the area. Forty-five soil samples were collected from fifteen different boreholes at different depths, and 270 tests were carried out for CBR, ...
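    A minimal multi-linear regression of this kind can be fitted by ordinary least squares. The predictors, coefficients, and synthetic "CBR" response below are invented for illustration, not the study's actual data:

```python
import numpy as np

# Hedged MLRA sketch with three hypothetical soil predictors and a
# synthetic response standing in for CBR; all numbers are invented.
rng = np.random.default_rng(7)
n = 45                                              # soil samples
X = rng.uniform(0.0, 1.0, size=(n, 3))              # e.g. moisture, clay, depth
beta_true = np.array([12.0, -8.0, 5.0])
y = 20.0 + X @ beta_true + rng.normal(0.0, 0.2, n)  # synthetic "CBR" values

A = np.column_stack([np.ones(n), X])                # prepend intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, slopes = coef[0], coef[1:]               # recovered coefficients
```

    With the fitted coefficients in hand, predictions at new boreholes are just `intercept + x_new @ slopes`.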

  11. Quantummechanical multi-step direct models for nuclear data applications

    International Nuclear Information System (INIS)

    Koning, A.J.

    1992-10-01

    Various multi-step direct models have been derived and compared on a theoretical level. Subsequently, these models have been implemented in the computer code system KAPSIES, enabling a consistent comparison on the basis of the same set of nuclear parameters and the same set of numerical techniques. Continuum cross sections in the energy region between 10 and several hundreds of MeV have successfully been analysed. Both angular distributions and energy spectra can be predicted in an essentially parameter-free manner. It is demonstrated that the quantum-mechanical MSD models (in particular the FKK model) give an improved prediction of pre-equilibrium angular distributions as compared to the experiment-based systematics of Kalbach. This makes KAPSIES a reliable tool for nuclear data applications in the aforementioned energy region. (author). 10 refs., 2 figs

  12. Models and applications of chaos theory in modern sciences

    CERN Document Server

    Zeraoulia, Elhadj

    2011-01-01

    This book presents a select group of papers that provide a comprehensive view of the models and applications of chaos theory in medicine, biology, ecology, economy, electronics, mechanics, and the human sciences. Covering both the experimental and theoretical aspects of the subject, it examines a range of current topics of interest. It considers the problems arising in the study of discrete and continuous time chaotic dynamical systems modeling several phenomena in nature and society, highlighting powerful techniques being developed to meet these challenges that stem from the area of nonli

  13. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
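    As a minimal illustration of the ANOVA machinery the book builds on, the one-way F statistic can be computed directly. The three "pixel-region" samples below are invented, and the decision rule is the standard F test rather than anything specific to this book:

```python
import numpy as np

# Minimal one-way ANOVA: the F statistic for k groups, the building block
# applied here to image regions. The three samples are invented.
def one_way_anova_F(groups):
    pooled = np.concatenate(groups)
    grand_mean = pooled.mean()
    k, n = len(groups), pooled.size
    ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

region1 = np.array([10.0, 11.0, 9.0, 10.5])    # similar mean intensities
region2 = np.array([10.2, 9.8, 10.1, 10.4])
region3 = np.array([14.0, 15.0, 14.5, 13.8])   # clearly shifted region
F = one_way_anova_F([region1, region2, region3])
# A large F (far above the 5% critical value ~4.26 for df 2, 9) signals
# that at least one region mean differs, e.g. an edge or object boundary.
```
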

  14. Rapid Calibration of High Resolution Geologic Models to Dynamic Data Using Inverse Modeling: Field Application and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Akhil Datta-Gupta

    2008-03-31

    Streamline-based assisted and automatic history matching techniques have shown great potential in reconciling high-resolution geologic models to production data. However, a major drawback of these approaches has been the assumption of incompressible or slightly compressible flow, which has limited applications to two-phase water-oil displacements only. We propose an approach to history matching three-phase flow using a novel compressible streamline formulation and streamline-derived analytic sensitivities. First, we utilize a generalized streamline model to account for compressible flow by introducing an 'effective density' of total fluids along streamlines. Second, we analytically compute parameter sensitivities that define the relationship between the reservoir properties and the production response, viz. water-cut and gas/oil ratio (GOR). These sensitivities are an integral part of history matching, and streamline models permit efficient computation of these sensitivities through a single flow simulation. We calibrate geologic models to production data by matching the water-cut and gas/oil ratio using our previously proposed generalized travel time inversion (GTTI) technique. For field applications, however, the highly non-monotonic profile of the gas/oil ratio data often presents a challenge to this technique. In this work we present a transformation of the field production data that makes it more amenable to GTTI. Further, we generalize the approach to incorporate bottom-hole flowing pressure during three-phase history matching. We examine the practical feasibility of the method using a field-scale synthetic example (the SPE-9 comparative study) and a field application. Recently, Ensemble Kalman Filtering (EnKF) has gained increased attention for history matching and continuous reservoir model updating using data from permanent downhole sensors. It is a sequential Monte-Carlo approach that works with an ensemble of reservoir models. Specifically, the method

  15. Application of the Aquifer Impact Model to support decisions at a CO2 sequestration site: Modeling and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bacon, Diana Holford [Pacific Northwest National Laboratory, Richland WA USA; Locke II, Randall A. [University of Illinois, Illinois State Geological Survey Champaign IL USA; Keating, Elizabeth [Los Alamos National Laboratory, Los Alamos NM USA; Carroll, Susan [Lawrence Livermore National Laboratory, Livermore CA USA; Iranmanesh, Abbas [University of Illinois, Illinois State Geological Survey Champaign IL USA; Mansoor, Kayyum [Lawrence Livermore National Laboratory, Livermore CA USA; Wimmer, Bracken [University of Illinois, Illinois State Geological Survey Champaign IL USA; Zheng, Liange [Lawrence Berkeley National Laboratory, Berkeley CA USA; Shao, Hongbo [University of Illinois, Illinois State Geological Survey Champaign IL USA; Greenberg, Sallie E. [University of Illinois, Illinois State Geological Survey Champaign IL USA

    2017-10-04

    The National Risk Assessment Partnership (NRAP) has developed a suite of tools to assess and manage risk at CO2 sequestration sites (1). The NRAP tool suite includes the Aquifer Impact Model (AIM), based on reduced order models (ROMs) developed using site-specific data from two aquifers (alluvium and carbonate). The models accept aquifer parameters as a range of variable inputs so that they may have broader applicability, and guidelines have been developed for determining the aquifer types for which the ROMs should be applicable. This paper considers the applicability of the aquifer models in AIM to predicting the impact of CO2 or brine leakage were it to occur at the Illinois Basin Decatur Project (IBDP). Based on the results of the sensitivity analysis, the hydraulic parameters and the leakage source term magnitude are more sensitive than clay fraction or cation exchange capacity. Sand permeability was the only hydraulic parameter measured at the IBDP site; more information on the other hydraulic parameters, such as sand fraction and sand/clay correlation lengths, could reduce uncertainty in risk estimates. Some non-adjustable parameters, such as the initial pH and TDS and the pH no-impact threshold, are significantly different for the ROM than for the observations at the IBDP site. The reduced order model could be made more useful to a wider range of sites if the initial conditions and no-impact threshold values were adjustable parameters.

  16. On the application of copula in modeling maintenance contract

    International Nuclear Information System (INIS)

    Iskandar, B P; Husniah, H

    2016-01-01

    This paper deals with the application of copula in maintenance contracts for a repairable item. Failures of the item are modeled using a two-dimensional approach involving the age and the usage of the item, which requires a bivariate distribution to model failures. When the item fails, corrective maintenance (CM) is performed as minimal repair. CM can be outsourced to an external agent or done in house. The decision problem for the owner is to find the maximum total profit, whilst for the agent it is to determine the optimal price of the contract. We obtain mathematical models of the decision problems for the owner as well as the agent using a Nash game theory formulation. (paper)
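    The bivariate construction can be sketched with a copula chosen purely for illustration. The Clayton family below, its dependence parameter, and the exponential age/usage marginals are all assumptions, not the paper's model:

```python
import numpy as np

# Hedged illustration: couple age and usage through a Clayton copula so
# that the two failure dimensions are positively dependent. The copula
# family, theta, and the exponential marginals are assumptions.
rng = np.random.default_rng(3)
theta = 2.0                       # Clayton dependence (Kendall tau = 0.5)
n = 20_000

u = rng.uniform(size=n)
w = rng.uniform(size=n)
# Conditional-inversion sampler for the Clayton copula C(u, v):
v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)

age = -np.log(1.0 - u) / 0.5      # exponential age-to-failure marginal
usage = -np.log(1.0 - v) / 0.1    # exponential usage-to-failure marginal
corr = np.corrcoef(age, usage)[0, 1]   # clearly positive dependence
```

    Any bivariate failure quantity needed by the contract model (e.g. the expected number of failures inside an age-usage region) can then be estimated from such coupled samples.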

  17. Modeling survival: application of the Andersen-Gill model to Yellowstone grizzly bears

    Science.gov (United States)

    Johnson, Christopher J.; Boyce, Mark S.; Schwartz, Charles C.; Haroldson, Mark A.

    2004-01-01

     Wildlife ecologists often use the Kaplan-Meier procedure or Cox proportional hazards model to estimate survival rates, distributions, and magnitude of risk factors. The Andersen-Gill formulation (A-G) of the Cox proportional hazards model has seen limited application to mark-resight data but has a number of advantages, including the ability to accommodate left-censored data, time-varying covariates, multiple events, and discontinuous intervals of risks. We introduce the A-G model including structure of data, interpretation of results, and assessment of assumptions. We then apply the model to 22 years of radiotelemetry data for grizzly bears (Ursus arctos) of the Greater Yellowstone Grizzly Bear Recovery Zone in Montana, Idaho, and Wyoming, USA. We used Akaike's Information Criterion (AICc) and multi-model inference to assess a number of potentially useful predictive models relative to explanatory covariates for demography, human disturbance, and habitat. Using the most parsimonious models, we generated risk ratios, hypothetical survival curves, and a map of the spatial distribution of high-risk areas across the recovery zone. Our results were in agreement with past studies of mortality factors for Yellowstone grizzly bears. Holding other covariates constant, mortality was highest for bears that were subjected to repeated management actions and inhabited areas with high road densities outside Yellowstone National Park. Hazard models developed with covariates descriptive of foraging habitats were not the most parsimonious, but they suggested that high-elevation areas offered lower risks of mortality when compared to agricultural areas.

  18. 76 FR 29249 - Medicare Program; Pioneer Accountable Care Organization Model: Request for Applications

    Science.gov (United States)

    2011-05-20

    ... Affordable Care Act, to test innovative payment and service delivery models that reduce spending under.... This Model will test the effectiveness of a combination of the following: Payment arrangements that... Medicare Program; Pioneer Accountable Care Organization Model: Request for Applications AGENCY: Centers for...

  19. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    Science.gov (United States)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) only with statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of reduced PCEs is verified by comparing against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.
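    The factorial screening idea, identifying which parameters (and interactions) matter before keeping terms in a reduced expansion, can be illustrated on a two-level full factorial design. The toy model and its effect sizes below are hypothetical, not the hydrologic model of the study:

```python
import numpy as np
from itertools import product

# Factorial screening sketch: run a toy "model" over a two-level full
# factorial design and rank parameters by main-effect magnitude.
def toy_model(x1, x2, x3):
    return 5.0 * x1 + 0.1 * x2 + 2.0 * x1 * x3   # x2 is nearly inert

design = np.array(list(product([-1.0, 1.0], repeat=3)))   # all 2^3 runs
y = np.array([toy_model(*row) for row in design])

# Main effect of factor j: mean response at +1 minus mean at -1.
main_effects = np.array([
    y[design[:, j] == 1.0].mean() - y[design[:, j] == -1.0].mean()
    for j in range(3)
])
# main_effects ~ [10.0, 0.2, 0.0]: only x1 is important on its own;
# x3 matters solely through its interaction with x1.
```

    A factorial ANOVA then partitions the response variance over these effects, and terms whose contribution is statistically insignificant can be dropped from the expansion, which is the dimension-reduction step the record describes.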

  20. A Practical Model for Forecasting New Freshman Enrollment during the Application Period.

    Science.gov (United States)

    Paulsen, Michael B.

    1989-01-01

    A simple and effective model for forecasting freshman enrollment during the application period is presented step by step. The model requires minimal and readily available information, uses a simple linear regression analysis on a personal computer, and provides updated monthly forecasts. (MSE)
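    A minimal sketch of such a forecast regresses final enrollment on the application count observed by a fixed point in the cycle; all figures below are invented for illustration:

```python
import numpy as np

# Hedged sketch of the forecasting idea: regress final enrollment on the
# application count observed by a fixed month, then forecast the new
# cycle from its running application count. All figures are invented.
apps_by_march = np.array([820.0, 900.0, 760.0, 980.0, 870.0])   # past years
final_enrollment = np.array([410.0, 452.0, 385.0, 490.0, 438.0])

slope, intercept = np.polyfit(apps_by_march, final_enrollment, 1)
new_apps = 910.0                        # applications received so far this year
forecast = slope * new_apps + intercept
# Refitting with each month's cumulative count gives the updated
# monthly forecasts the record describes.
```
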

  1. Application of Metamodels to Identification of Metallic Materials Models

    Directory of Open Access Journals (Sweden)

    Maciej Pietrzyk

    2016-01-01

    Improvement of the efficiency of the inverse analysis (IA) for various material tests was the objective of the paper. Flow stress models and microstructure evolution models of varying mathematical complexity were considered. Different types of experiments were performed and the results were used for the identification of the models. Sensitivity analysis was performed for all the models and the importance of the parameters in these models was evaluated. Metamodels based on artificial neural networks were proposed to simulate the experiments in the inverse solution. The analysis showed that a significant decrease in computing time can be achieved when metamodels substitute the finite element model in the inverse analysis, which is the case in the identification of flow stress models. Application of metamodels gave good results for flow stress models based on closed-form equations accounting for the influence of temperature, strain, and strain rate (4 coefficients), additionally for softening due to recrystallization (5 coefficients), and for softening and saturation (7 coefficients). Good accuracy and high efficiency of the IA were confirmed. On the contrary, identification of microstructure evolution models, including phase transformation models, did not give a noticeable reduction of the computing time.

  2. Finite element modelling of the foot for clinical application: A systematic review.

    Science.gov (United States)

    Behforootan, Sara; Chatzistergos, Panagiotis; Naemi, Roozbeh; Chockalingam, Nachiappan

    2017-01-01

    Over the last two decades finite element modelling has been widely used to give new insight on foot and footwear biomechanics. However its actual contribution for the improvement of the therapeutic outcome of different pathological conditions of the foot, such as the diabetic foot, remains relatively limited. This is mainly because finite element modelling has only been used within the research domain. Clinically applicable finite element modelling can open the way for novel diagnostic techniques and novel methods for treatment planning/optimisation which would significantly enhance clinical practice. In this context this review aims to provide an overview of modelling techniques in the field of foot and footwear biomechanics and to investigate their applicability in a clinical setting. Even though no integrated modelling system exists that could be directly used in the clinic and considerable progress is still required, current literature includes a comprehensive toolbox for future work towards clinically applicable finite element modelling. The key challenges include collecting the information that is needed for geometry design, the assignment of material properties and loading on a patient-specific basis and in a cost-effective and non-invasive way. The ultimate challenge for the implementation of any computational system into clinical practice is to ensure that it can produce reliable results for any person that belongs in the population for which it was developed. Consequently this highlights the need for thorough and extensive validation of each individual step of the modelling process as well as for the overall validation of the final integrated system. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.

  3. Numerical modeling of friction welding of bi-metal joints for electrical applications

    Science.gov (United States)

    Velu, P. Shenbaga; Hynes, N. Rajesh Jesudoss

    2018-05-01

    In the manufacturing industries, and more especially in electrical engineering applications, the usage of non-ferrous materials plays a vital role. Today's engineering applications rely upon significant properties such as good corrosion resistance, good mechanical properties, good heat conductivity, and high electrical conductivity. The copper-aluminum bi-metal joint is one such combination that meets the requirements of electrical applications. In this work, the numerical simulation of AA 6061 T6 alloy/copper joints was carried out under joining conditions. Using the developed model, the temperature distribution along the length of the dissimilar joint is predicted, and the time-temperature profile has also been generated. In addition, a finite element model has been developed using the numerical simulation tool ABAQUS. This FEM is helpful in predicting various output parameters during friction welding of this dissimilar joint combination.

  4. Computational Modeling of Human Metabolism and Its Application to Systems Biomedicine.

    Science.gov (United States)

    Aurich, Maike K; Thiele, Ines

    2016-01-01

    Modern high-throughput techniques offer immense opportunities to investigate whole-systems behavior, such as those underlying human diseases. However, the complexity of the data presents challenges in interpretation, and new avenues are needed to address the complexity of both diseases and data. Constraint-based modeling is one formalism applied in systems biology. It relies on a genome-scale reconstruction that captures extensive biochemical knowledge regarding an organism. The human genome-scale metabolic reconstruction is increasingly used to understand normal cellular and disease states because metabolism is an important factor in many human diseases. The application of the human genome-scale reconstruction ranges from mere querying of the model as a knowledge base to studies that take advantage of the model's topology and, most notably, to functional predictions based on cell- and condition-specific metabolic models built from omics data. An increasing number and diversity of biomedical questions are being addressed using constraint-based modeling and metabolic models. One of the most successful biomedical applications to date is cancer metabolism, but constraint-based modeling also holds great potential for inborn errors of metabolism or obesity. In addition, it offers great prospects for individualized approaches to diagnostics and the design of disease prevention and intervention strategies. Metabolic models support this endeavor by providing easy access to complex high-throughput datasets. Personalized metabolic models have been introduced. Finally, constraint-based modeling can be used to model whole-body metabolism, which will enable the elucidation of metabolic interactions between organs and disturbances of these interactions as either causes or consequences of metabolic diseases. This chapter introduces constraint-based modeling and describes some of its contributions to systems biomedicine.
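    The core constraint-based computation, flux balance analysis posed as a linear program, can be sketched on a toy two-metabolite network. The stoichiometry and bounds below are hypothetical, not values from the human reconstruction:

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis: maximize a "biomass" flux subject to
# steady-state mass balance S v = 0 and flux bounds. The network
# (uptake -> A -> B -> biomass) is invented for illustration.
S = np.array([
    [1.0, -1.0,  0.0],   # metabolite A: made by uptake, used by conversion
    [0.0,  1.0, -1.0],   # metabolite B: made by conversion, used by biomass
])
bounds = [(0.0, 10.0), (0.0, 8.0), (0.0, None)]  # uptake, conversion, biomass
c = np.array([0.0, 0.0, -1.0])                   # linprog minimizes: negate

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
optimal_biomass = -res.fun            # limited by the conversion bound (8.0)
```

    Condition-specific models arise by tightening or closing bounds (e.g. from omics data) and re-solving the same linear program.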

  5. Dynamic modeling of brushless dc motor-power conditioner unit for electromechanical actuator application

    Science.gov (United States)

    Demerdash, N. A.; Nehl, T. W.

    1979-01-01

    A comprehensive digital model for the analysis of the dynamic-instantaneous performance of a power-conditioner-fed samarium-cobalt permanent-magnet brushless DC motor is presented. The particular power conditioner-machine system at hand, for which this model was developed, is a component of an actual prototype electromechanical actuator built for NASA-JSC as a possible alternative to hydraulic actuators in feasibility studies for shuttle orbiter applications. Excellent correlation between digitally simulated and experimentally obtained performance data was achieved for this specific prototype. Details of one component of the model, its applications and the corresponding results are given in this paper.

  6. A mathematical model for interpretable clinical decision support with applications in gynecology.

    Directory of Open Access Journals (Sweden)

    Vanya M C A Van Belle

    Full Text Available Over time, methods for the development of clinical decision support (CDS) systems have evolved from interpretable and easy-to-use scoring systems to very complex and non-interpretable mathematical models. In order to accomplish effective decision support, CDS systems should provide information on how the model arrives at a certain decision. To address the issue of incompatibility between performance, interpretability and applicability of CDS systems, this paper proposes an innovative model structure, automatically leading to interpretable and easily applicable models. The resulting models can be used to guide clinicians when deciding upon the appropriate treatment, to estimate patient-specific risks and to improve communication with patients. We propose the interval coded scoring (ICS) system, which imposes that the effect of each variable on the estimated risk is constant within consecutive intervals. The number and position of the intervals are automatically obtained by solving an optimization problem, which additionally performs variable selection. The resulting model can be visualised by means of appealing scoring tables and color bars. ICS models can be used within software packages, in smartphone applications, or on paper, which is particularly useful for bedside medicine and home monitoring. The ICS approach is illustrated on two gynecological problems: diagnosis of malignancy of ovarian tumors using a dataset containing 3,511 patients, and prediction of first trimester viability of pregnancies using a dataset of 1,435 women. Comparison of the performance of the ICS approach with a range of prediction models proposed in the literature illustrates the ability of ICS to combine optimal performance with the interpretability of simple scoring systems. The ICS approach can improve patient-clinician communication and will provide additional insights into the importance and influence of available variables.
Future challenges include extensions of the
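
    The interval-coded scoring idea — each variable's contribution to the risk score is constant within consecutive intervals — amounts to a table lookup once the optimization has placed the cutpoints. A sketch with invented cutpoints and points (not the paper's fitted gynecology models):

```python
from bisect import bisect_right

# Hypothetical ICS row for one variable, e.g. patient age:
# intervals (-inf,20), [20,35), [35,40), [40,inf) with fixed point values.
cutpoints = [20, 35, 40]   # interval boundaries (invented)
points    = [1, 0, 2, 4]   # score contribution per interval (invented)

def ics_score(age):
    """Locate the interval containing `age` and return its point value."""
    return points[bisect_right(cutpoints, age)]

contribution = ics_score(28)   # falls in [20, 35) -> contributes 0 points
```

    A full ICS model sums such contributions over all selected variables and maps the total through a risk table, which is why the result fits on paper or a color bar.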

  7. Application of two forest succession models at sites in Northeast Germany

    International Nuclear Information System (INIS)

    Lasch, P.; Lindner, M.

    1995-06-01

    In order to simulate potential impacts of climate change on forests, two succession models were applied to sites in the Northeast German lowlands. The models, which had been developed for Alpine (FORECE) and Boreal (FORSKA) forests, differ from each other in the way they represent tree growth processes and the impact of environmental factors on establishment and growth. Both models were adjusted and compared with each other at sites situated along an ecological gradient from maritime to subcontinental climate. These sites extend the former environmental space of model application towards water-limited conditions, which under a predicted climatic change may have increasing importance for European forests. First results showed that FORECE was unrealistically sensitive to changes in soil moisture. On the other hand, FORSKA generally simulated very low biomasses. Since the structure of FORSKA seemed better suited for the simulation of changing environmental conditions, this model was chosen for further model development, applications and sensitivity analyses. Among other changes, establishment rates were increased and some environmental response factors were analysed. The function accounting for resource depletion was modified. After the modifications for Central European conditions were made, there was a decrease in performance for the Boreal site: both the simulated total biomasses and the species composition had changed. We conclude that with currently available models, realistic forest dynamics within different climatic zones of Europe cannot be simulated without more substantial model modifications. (orig.)

  8. Applicability of western chemical dietary exposure models to the Chinese population.

    Science.gov (United States)

    Zhao, Shizhen; Price, Oliver; Liu, Zhengtao; Jones, Kevin C; Sweetman, Andrew J

    2015-07-01

    A range of exposure models, which have been developed in Europe and North America, are playing an increasingly important role in priority setting and the risk assessment of chemicals. However, the applicability of these tools, which are based on Western dietary exposure pathways, to estimate chemical exposure of the Chinese population, and thus to support the development of risk-based environmental and exposure assessment, is unclear. Three frequently used modelling tools, EUSES, RAIDAR and ACC-HUMANsteady, have been evaluated in terms of human dietary exposure estimation by application to a range of chemicals with different physicochemical properties under both model default and Chinese dietary scenarios. Hence, the modelling approaches were assessed by considering dietary pattern differences only. The predicted dietary exposure pathways were compared under both scenarios using a range of hypothetical and current emerging contaminants. Although the differences across models are greater than those between dietary scenarios, model predictions indicated that dietary preference can have a significant impact on human exposure, with the relatively high consumption of vegetables and cereals resulting in higher exposure via plant-based foodstuffs under Chinese consumption patterns compared to Western diets. The selected models demonstrated a good ability to identify key dietary exposure pathways, which can be used for screening purposes and evaluative risk assessment. However, some model adaptations will be required to cover a number of important Chinese exposure pathways, such as freshwater farmed fish, grains and pork. Copyright © 2015 Elsevier Inc. All rights reserved.
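
    The dietary-pathway comparison underlying such screening models reduces, at its simplest, to summing intake rate times chemical concentration over food groups and normalising by body weight. A sketch with entirely hypothetical intake rates, concentrations and body weight:

```python
# Screening-level dietary dose (ng per kg body weight per day).
# All numbers below are invented for illustration, not from the study.
intake_g_day = {"vegetables": 350, "cereals": 400, "fish": 30, "pork": 60}
conc_ng_g    = {"vegetables": 0.8, "cereals": 0.5, "fish": 5.0, "pork": 2.0}
body_weight_kg = 60.0

daily_dose = sum(intake_g_day[f] * conc_ng_g[f] for f in intake_g_day)
daily_dose /= body_weight_kg
print(f"{daily_dose:.1f} ng/kg bw/day")  # -> 12.5 ng/kg bw/day
```

    Swapping in a Western-style consumption pattern lowers the vegetable and cereal terms and changes the pathway ranking — the comparison the evaluated models automate across many chemicals.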

  9. Applications of the SWAT Model Special Section: Overview and Insights.

    Science.gov (United States)

    Gassman, Philip W; Sadeghi, Ali M; Srinivasan, Raghavan

    2014-01-01

    The Soil and Water Assessment Tool (SWAT) model has emerged as one of the most widely used water quality watershed- and river basin-scale models worldwide, applied extensively for a broad range of hydrologic and/or environmental problems. The international use of SWAT can be attributed to its flexibility in addressing water resource problems, extensive networking via dozens of training workshops and the several international conferences that have been held during the past decade, comprehensive online documentation and supporting software, and an open source code that can be adapted by model users for specific application needs. The catalyst for this special collection of papers was the 2011 International SWAT Conference & Workshops held in Toledo, Spain, which featured over 160 scientific presentations representing SWAT applications in 37 countries. This special collection presents 22 specific SWAT-related studies, most of which were presented at the 2011 SWAT Conference; it represents SWAT applications on five different continents, with the majority of studies being conducted in Europe and North America. The papers cover a variety of topics, including hydrologic testing at a wide range of watershed scales, transport of pollutants in northern European lowland watersheds, data input and routing method effects on sediment transport, development and testing of potential new model algorithms, and description and testing of supporting software. In this introduction to the special section, we provide a synthesis of these studies within four main categories: (i) hydrologic foundations, (ii) sediment transport and routing analyses, (iii) nutrient and pesticide transport, and (iv) scenario analyses. We conclude with a brief summary of key SWAT research and development needs. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  10. Human eye modelling for ophthalmic simulators project for clinic applications

    International Nuclear Information System (INIS)

    Sanchez, Andrea; Santos, Adimir dos; Yoriyaz, Helio

    2002-01-01

    Most eye tumors are treated by surgical means, which involves the enucleation of the affected eye. For the treatment and control of such diseases there is also brachytherapy, which often utilizes small applicators of Co-60, I-125, Ru-106, Ir-192, etc. These methods have been shown to be very efficient but highly costly. The objective of this work is to propose a detailed simulator model for eye characterization. Additionally, this study can contribute to the design and construction of a new applicator in order to reduce the cost and to allow more patients to be treated

  11. Mathematical and computational modeling with applications in natural and social sciences, engineering, and the arts

    CERN Document Server

    Melnik, Roderick

    2015-01-01

    Illustrates the application of mathematical and computational modeling in a variety of disciplines With an emphasis on the interdisciplinary nature of mathematical and computational modeling, Mathematical and Computational Modeling: With Applications in the Natural and Social Sciences, Engineering, and the Arts features chapters written by well-known, international experts in these fields and presents readers with a host of state-of-the-art achievements in the development of mathematical modeling and computational experiment methodology. The book is a valuable guide to the methods, ideas,

  12. 76 FR 46330 - NUREG-1934, Nuclear Power Plant Fire Modeling Application Guide (NPP FIRE MAG); Second Draft...

    Science.gov (United States)

    2011-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2009-0568] NUREG-1934, Nuclear Power Plant Fire Modeling... 1023259), ``Nuclear Power Plant Fire Modeling Application Guide (NPP FIRE MAG), Second Draft Report for...), ``Nuclear Power Plant Fire Modeling Application Guide (NPP FIRE MAG), Second Draft for Comment,'' is...

  13. GRace: a MATLAB-based application for fitting the discrimination-association model.

    Science.gov (United States)

    Stefanutti, Luca; Vianello, Michelangelo; Anselmi, Pasquale; Robusto, Egidio

    2014-10-28

    The Implicit Association Test (IAT) is a computerized two-choice discrimination task in which stimuli have to be categorized as belonging to target categories or attribute categories by pressing, as quickly and accurately as possible, one of two response keys. The discrimination association model has been recently proposed for the analysis of reaction time and accuracy of an individual respondent to the IAT. The model disentangles the influences of three qualitatively different components on the responses to the IAT: stimuli discrimination, automatic association, and termination criterion. The article presents General Race (GRace), a MATLAB-based application for fitting the discrimination association model to IAT data. GRace has been developed for Windows as a standalone application. It is user-friendly and does not require any programming experience. The use of GRace is illustrated on the data of a Coca Cola-Pepsi Cola IAT, and the results of the analysis are interpreted and discussed.

  14. Reversible polymorphism-aware phylogenetic models and their application to tree inference.

    Science.gov (United States)

    Schrempf, Dominik; Minh, Bui Quang; De Maio, Nicola; von Haeseler, Arndt; Kosiol, Carolin

    2016-10-21

    We present a reversible Polymorphism-Aware Phylogenetic Model (revPoMo) for species tree estimation from genome-wide data. revPoMo enables the reconstruction of large-scale species trees for many within-species samples. It expands the alphabet of DNA substitution models to include polymorphic states, thereby naturally accounting for incomplete lineage sorting. We implemented revPoMo in the maximum likelihood software IQ-TREE. A simulation study and an application to great ape data show that the runtimes of our approach and standard substitution models are comparable but that revPoMo has much better accuracy in estimating trees, divergence times and mutation rates. The advantage of revPoMo is that an increase of sample size per species improves estimations but does not increase runtime. Therefore, revPoMo is a valuable tool with several applications, from speciation dating to species tree reconstruction. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. [Application of Land-use Regression Models in Spatial-temporal Differentiation of Air Pollution].

    Science.gov (United States)

    Wu, Jian-sheng; Xie, Wu-dan; Li, Jia-cheng

    2016-02-15

    With the rapid development of urbanization, industrialization and motorization, air pollution has become one of the most serious environmental problems in our country, with negative impacts on public health and the ecological environment. The LUR model is one of the common methods for simulating the spatial-temporal differentiation of air pollution at the city scale. It has been applied widely in Europe and North America, but only rarely in China. Drawing on many studies at home and abroad, this study reviews the main steps in developing an LUR model: obtaining the monitoring data, generating variables, developing the model, model validation and regression mapping. A conclusion is then drawn on the progress of LUR models in the spatial-temporal differentiation of air pollution. Furthermore, future research directions are discussed, including highlighting spatial-temporal differentiation, increasing the classes of model variables and improving the methods of model development. This paper aims to promote the application of the LUR model in China and to provide a methodological basis for human exposure, epidemiologic studies and health risk assessment.
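
    In its simplest form the LUR workflow (monitoring data, GIS-derived predictor variables, regression, mapping) is ordinary least squares. A sketch on synthetic data; the predictor names and coefficients are invented for illustration:

```python
# Minimal land-use regression: regress measured concentrations at
# monitoring sites on GIS-derived predictors, then apply the fitted
# coefficients to gridded predictors to map the pollutant.
import numpy as np

rng = np.random.default_rng(0)
n = 40                                   # monitoring sites
road_km  = rng.uniform(0, 5, n)          # major-road length in 500 m buffer
industry = rng.uniform(0, 1, n)          # industrial land-use fraction
no2 = 20 + 4.0 * road_km + 10.0 * industry + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), road_km, industry])
beta, *_ = np.linalg.lstsq(X, no2, rcond=None)
print(beta)  # ~ [20, 4, 10]: intercept, road and industry effects
```

    Real LUR studies add supervised variable selection and cross-validation over dozens of candidate buffers, but the fitted object is this coefficient vector.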

  16. Improved dual sided doped memristor: modelling and applications

    Directory of Open Access Journals (Sweden)

    Anup Shrivastava

    2014-05-01

    Full Text Available The memristor, a novel and emerging electronic device with a vast range of applications, suffers from poor frequency response and limited saturation length. In this paper, the authors present a novel and innovative device structure for the memristor with two active layers, together with a non-linear ionic drift model, for improved frequency response and saturation length. The authors investigated and compared the I–V characteristics of the proposed model with those of conventional memristors and found better results in each case (different window functions) for the proposed dual-sided doped memristor. For circuit-level simulation, they developed a SPICE model of the proposed memristor and designed some logic gates based on hybrid complementary metal oxide semiconductor memristive logic (memristor ratioed logic). The proposed memristor yields improved results in terms of noise margin, delay time and dynamic hazards compared with conventional (single active layer) memristors.
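
    For orientation, the conventional single-active-layer baseline such a device is compared against can be sketched as an HP-style nonlinear ionic drift model with a Joglekar window. All parameter values below are illustrative, not taken from the paper:

```python
# Single-active-layer memristor: series Ron/Roff model with nonlinear
# dopant drift limited by a Joglekar window. Values are illustrative.
import numpy as np

Ron, Roff = 100.0, 16e3      # doped / undoped layer resistances (ohm)
D, mu = 10e-9, 1e-14         # film thickness (m), ion mobility (m^2/V·s)
p = 1                        # Joglekar window exponent
dt, f = 1e-5, 1.0            # time step (s), drive frequency (Hz)
x = 0.1                      # normalised doped-layer width, 0..1
vs, cur = [], []
for t in np.arange(0.0, 1.0, dt):
    v = np.sin(2 * np.pi * f * t)              # sinusoidal drive
    R = Ron * x + Roff * (1.0 - x)             # state-dependent resistance
    i = v / R
    window = 1.0 - (2.0 * x - 1.0) ** (2 * p)  # suppresses drift at edges
    x += mu * Ron / D**2 * i * window * dt     # ionic drift of the boundary
    x = min(max(x, 0.0), 1.0)
    vs.append(v)
    cur.append(i)
```

    Plotting cur against vs gives the pinched hysteresis loop characteristic of memristive devices; a dual-layer structure changes the drift term and window, which is where the claimed frequency-response improvement enters.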

  17. Explicit Nonlinear Model Predictive Control Theory and Applications

    CERN Document Server

    Grancharova, Alexandra

    2012-01-01

    Nonlinear Model Predictive Control (NMPC) has become the accepted methodology to solve complex control problems related to process industries. The main motivation behind explicit NMPC is that an explicit state feedback law avoids the need for executing a numerical optimization algorithm in real time. The benefits of an explicit solution, in addition to the efficient on-line computations, include also verifiability of the implementation and the possibility to design embedded control systems with low software and hardware complexity. This book considers the multi-parametric Nonlinear Programming (mp-NLP) approaches to explicit approximate NMPC of constrained nonlinear systems, developed by the authors, as well as their applications to various NMPC problem formulations and several case studies. The following types of nonlinear systems are considered, resulting in different NMPC problem formulations: nonlinear systems described by first-principles models and nonlinear systems described by black-box models; ...

  18. Powder consolidation using cold spray process modeling and emerging applications

    CERN Document Server

    Moridi, Atieh

    2017-01-01

    This book first presents different approaches to modeling of the cold spray process with the aim of extending current understanding of its fundamental principles and then describes emerging applications of cold spray. In the coverage of modeling, careful attention is devoted to the assessment of critical and erosion velocities. In order to reveal the phenomenological characteristics of interface bonding, severe, localized plastic deformation and material jet formation are studied. Detailed consideration is also given to the effect of macroscopic defects such as interparticle boundaries and subsequent splat boundary cracking on the mechanical behavior of cold spray coatings. The discussion of applications focuses in particular on the repair of damaged parts and additive manufacturing in various disciplines from aerospace to biomedical engineering. Key aspects include a systematic study of defect shape and the ability of cold spray to fill the defect, examination of the fatigue behavior of coatings for structur...

  19. MODELLING AND SIMULATION OF HIGH FREQUENCY INVERTER FOR INDUCTION HEATING APPLICATION

    OpenAIRE

    SACHIN S. BANKAR; Dr. PRASAD M. JOSHI

    2016-01-01

    This paper presents the modelling and simulation of a high frequency inverter for induction heating applications. Induction heating has advantages such as higher efficiency, controlled heating, safety and freedom from pollution, and the technology is therefore used in industrial, domestic and medical applications. A high frequency full-bridge inverter is used for induction heating, with a MOSFET as the switching device and bipolar PWM control as the inverter control strategy. The size ...

  20. Dynamic fuel cell models and their application in hardware in the loop simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lemes, Zijad; Maencher, H. [MAGNUM Automatisierungstechnik GmbH, Bunsenstr. 22, D-64293 Darmstadt (Germany); Vath, Andreas; Hartkopf, Th. [Technische Universitaet Darmstadt/Institut fuer Elektrische Energiewandlung, Landgraf-Georg-Str. 4, D-64283 Darmstadt (Germany)

    2006-03-21

    Currently, fuel cell technology plays an important role in the development of alternative energy converters for mobile, portable and stationary applications. With the help of physics-based models of fuel cell systems and appropriate test benches it is possible to design different applications and investigate their stationary and dynamic behaviour. The polymer electrolyte membrane (PEM) fuel cell system model includes gas humidifier, air and hydrogen supply, current converter and a detailed stack model incorporating the physical characteristics of the different layers. In particular, the use of these models together with hardware-in-the-loop (HIL) capable test stands helps to decrease costs and accelerate the development of fuel cell systems. The interface program provides fast data exchange between the test bench and the physical model of the fuel cell, or of any other system, in real time. The flexibility and efficiency of the test bench thus increase fundamentally, because real components can be replaced with their mathematical models. (author)

  1. Modeling and experimental validation of a Hybridized Energy Storage System for automotive applications

    Science.gov (United States)

    Fiorenti, Simone; Guanetti, Jacopo; Guezennec, Yann; Onori, Simona

    2013-11-01

    This paper presents the development and experimental validation of a dynamic model of a Hybridized Energy Storage System (HESS) consisting of a parallel connection of a lead acid (PbA) battery and double layer capacitors (DLCs), for automotive applications. The dynamic modeling of both the PbA battery and the DLC has been tackled via the equivalent-electric-circuit approach. Experimental tests are designed for identification purposes. Parameters of the PbA battery model are identified as a function of state of charge and current direction, whereas parameters of the DLC model are identified for different temperatures. A physical HESS has been assembled at the Center for Automotive Research, The Ohio State University, and used as a test bench to validate the model against a typical current profile generated for Start&Stop applications. The HESS model is then integrated into a vehicle simulator to assess the effects of the battery hybridization on the vehicle fuel economy and mitigation of the battery stress.
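
    The equivalent-electric-circuit approach used for both storage devices can be sketched with a first-order model: an open-circuit voltage source behind a series resistance and one RC pair. Parameter values below are invented, not the identified ones from the paper:

```python
# First-order equivalent circuit of one HESS branch: OCV source, series
# resistance R0, and one RC pair (R1, C1). All values are illustrative.
R0, R1, C1 = 5e-3, 10e-3, 500.0   # ohm, ohm, farad
ocv, dt = 12.6, 0.1               # open-circuit voltage (V), step (s)

v_rc = 0.0                        # voltage across the RC pair
v_term = []                       # terminal-voltage trace
for _ in range(50):               # 5 s discharge pulse (Start&Stop-like)
    i_load = 100.0                # discharge current (A)
    v_rc += dt * (i_load / C1 - v_rc / (R1 * C1))  # forward-Euler RC state
    v_term.append(ocv - R0 * i_load - v_rc)        # sagging terminal voltage
```

    Identification then amounts to fitting R0, R1 and C1 (as functions of state of charge, current direction or temperature) so the simulated v_term matches measured pulse responses.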

  2. [Application of three compartment model and response surface model to clinical anesthesia using Microsoft Excel].

    Science.gov (United States)

    Abe, Eiji; Abe, Mari

    2011-08-01

    With the spread of total intravenous anesthesia, clinical pharmacology has become more important. We report a Microsoft Excel file applying a three-compartment model and a response surface model to clinical anesthesia. On the Microsoft Excel sheet, propofol, remifentanil and fentanyl effect-site concentrations are predicted (three-compartment model), and the probabilities of no response to prodding, shaking, surrogates of painful stimuli and laryngoscopy are calculated using the predicted effect-site drug concentrations. Time-dependent changes in these calculated values are shown graphically. Recent developments in anesthetic drug interaction studies are remarkable, and their application to clinical anesthesia with this Excel file is simple and helpful.
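
    A spreadsheet implementation of a three-compartment model with an effect site steps through the same four coupled differential equations row by row; a minimal Euler sketch (rate constants and volume are illustrative, not a published propofol parameter set):

```python
# Three-compartment pharmacokinetic model plus effect site, integrated
# with forward Euler. Constants below are illustrative placeholders.
k10, k12, k21 = 0.119, 0.112, 0.055   # elimination / distribution (1/min)
k13, k31, ke0 = 0.042, 0.0033, 0.26   # deep compartment, effect site (1/min)
V1, dt = 4.27, 0.01                   # central volume (L), step (min)

a1 = a2 = a3 = 0.0                    # drug amounts (mg) per compartment
ce = 0.0                              # effect-site concentration (mg/L)
for _ in range(int(10 / dt)):         # 10 min infusion at 40 mg/min
    c1 = a1 / V1                      # plasma (central) concentration
    a1 += dt * (40.0 - (k10 + k12 + k13) * a1 + k21 * a2 + k31 * a3)
    a2 += dt * (k12 * a1 - k21 * a2)
    a3 += dt * (k13 * a1 - k31 * a3)
    ce += dt * ke0 * (c1 - ce)        # effect site lags the plasma
```

    The response-surface step then maps the predicted effect-site concentrations of the co-administered drugs to a probability of no response to each stimulus.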

  3. Application of mechanistic models to fermentation and biocatalysis for next-generation processes

    DEFF Research Database (Denmark)

    Gernaey, Krist; Eliasson Lantz, Anna; Tufvesson, Pär

    2010-01-01

    Mechanistic models are based on deterministic principles, and recently, interest in them has grown substantially. Herein we present an overview of mechanistic models and their applications in biotechnology, including future perspectives. Model utility is highlighted with respect to selection of variables required for measurement, control and process design. In the near future, mechanistic models with a higher degree of detail will play key roles in the development of efficient next-generation fermentation and biocatalytic processes. Moreover, mechanistic models will be used increasingly.

  4. Application of online modeling to the operation of SLC

    International Nuclear Information System (INIS)

    Woodley, M.D.; Sanchez-Chopitea, L.; Shoaee, H.

    1987-02-01

    Online computer models of first order beam optics have been developed for the commissioning, control and operation of the entire SLC including Damping Rings, Linac, Positron Return Line and Collider Arcs. A generalized online environment utilizing these models provides the capability for interactive selection of a desired optics configuration and for the study of its properties. Automated procedures have been developed which calculate and load beamline component set-points and which can scale magnet strengths to achieve desired beam properties for any Linac energy profile. Graphic displays facilitate comparison of design, desired and actual optical characteristics of the beamlines. Measured beam properties, such as beam emittance and dispersion, can be incorporated interactively into the models and used for beamline matching and optimization of injection and extraction efficiencies and beam transmission. The online optics modeling facility also serves as the foundation for many model-driven applications such as autosteering, calculation of beam launch parameters, emittance measurement and dispersion correction


  6. Structural Equation Modeling: Theory and Applications in Forest Management

    Directory of Open Access Journals (Sweden)

    Tzeng Yih Lam

    2012-01-01

    Full Text Available Forest ecosystem dynamics are driven by a complex array of simultaneous cause-and-effect relationships. Understanding this complex web requires specialized analytical techniques such as Structural Equation Modeling (SEM). The SEM framework and implementation steps are outlined in this study, and we then demonstrate the technique by application to overstory-understory relationships in mature Douglas-fir forests in the northwestern USA. A SEM model was formulated with (1) a path model representing the effects of successively higher layers of vegetation on late-seral herbs through processes such as light attenuation and (2) a measurement model accounting for measurement errors. The fitted SEM model suggested a direct negative effect of light attenuation on late-seral herb cover but a direct positive effect of northern aspect. Moreover, many processes have indirect effects mediated through midstory vegetation. SEM is recommended as a forest management tool for designing silvicultural treatments and systems for attaining complex arrays of management objectives.

  7. A resilience-based model for performance evaluation of information systems: the case of a gas company

    Science.gov (United States)

    Azadeh, A.; Salehi, V.; Salehi, R.

    2017-10-01

    Information systems (IS) are strongly influenced by changes in new technology and should react swiftly in response to external conditions. Resilience engineering is a new method that can enable these systems to absorb changes. In this study, a new framework is presented for performance evaluation of IS that includes DeLone and McLean's factors of success in addition to resilience. Hence, this study is an attempt to evaluate the impact of resilience on IS by applying the proposed model in the Iranian Gas Engineering and Development Company, using data obtained from questionnaires and the Fuzzy Data Envelopment Analysis (FDEA) approach. First, the FDEA model with α-cut = 0.05 was identified as the most suitable for this application, by running both the Banker, Charnes and Cooper (BCC) and the Charnes, Cooper and Rhodes (CCR) forms of FDEA and selecting the appropriate model based on maximum mean efficiency. The factors were then ranked based on the results of a sensitivity analysis, which showed that resilience had a significantly higher impact on the proposed model than the other factors. The results of this study were then verified by conducting the related ANOVA test. This is the first study that examines the impact of resilience on IS by statistical and mathematical approaches.
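
    The BCC and CCR efficiency scores that FDEA fuzzifies are each obtained from a linear program. A crisp (non-fuzzy) input-oriented CCR sketch in envelopment form, on three hypothetical decision-making units; the fuzzy variant would repeat this over the α-cut intervals of the data:

```python
# Input-oriented CCR-DEA: minimize theta such that a convex combination
# of peers uses at most theta times the unit's inputs while producing at
# least its outputs. The three units below are hypothetical.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0], [4.0], [3.0]])   # inputs, one column per input
Y = np.array([[1.0], [1.0], [1.5]])   # outputs

def efficiency(j0):
    n = len(X)
    c = np.r_[1.0, np.zeros(n)]                    # variables: theta, lambdas
    # inputs:  X^T lambda - theta * x_j0 <= 0
    A_ub = np.c_[-X[j0].reshape(-1, 1), X.T]
    b_ub = np.zeros(X.shape[1])
    # outputs: Y^T lambda >= y_j0  ->  -Y^T lambda <= -y_j0
    A_ub = np.vstack([A_ub, np.c_[np.zeros((Y.shape[1], 1)), -Y.T]])
    b_ub = np.r_[b_ub, -Y[j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

print([round(efficiency(j), 3) for j in range(3)])  # -> [1.0, 0.5, 1.0]
```

    The BCC form adds a convexity constraint on the lambdas (sum equal to one), allowing variable returns to scale; ranking factors then proceeds from the resulting efficiency scores.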

  8. Rectangular amplitudes, conformal blocks, and applications to loop models

    Energy Technology Data Exchange (ETDEWEB)

    Bondesan, Roberto, E-mail: roberto.bondesan@cea.fr [LPTENS, Ecole Normale Superieure, 24 rue Lhomond, 75231 Paris (France); Institute de Physique Theorique, CEA Saclay, F-91191 Gif-sur-Yvette (France); Jacobsen, Jesper L. [LPTENS, Ecole Normale Superieure, 24 rue Lhomond, 75231 Paris (France); Universite Pierre et Marie Curie, 4 place Jussieu, 75252 Paris (France); Saleur, Hubert [Institute de Physique Theorique, CEA Saclay, F-91191 Gif-sur-Yvette (France); Physics Department, USC, Los Angeles, CA 90089-0484 (United States)

    2013-02-21

    In this paper we continue the investigation of partition functions of critical systems on a rectangle initiated in [R. Bondesan, et al., Nucl. Phys. B 862 (2012) 553-575]. Here we develop a general formalism of rectangle boundary states using conformal field theory, adapted to describe geometries supporting different boundary conditions. We discuss the computation of rectangular amplitudes and their modular properties, presenting explicit results for the case of free theories. In the second part of the paper we focus on applications to loop models, discussing lattice discretizations in detail using both numerical and analytical calculations. These results allow a geometric interpretation of conformal blocks, and as an application we derive new probability formulas for self-avoiding walks.

  9. Materials modelling - a possible design tool for advanced nuclear applications

    International Nuclear Information System (INIS)

    Hoffelner, W.; Samaras, M.; Bako, B.; Iglesias, R.

    2008-01-01

    The design of components for power plants is usually based on codes, standards and design rules or code cases. However, it is very difficult to obtain the experimental data necessary to prove these lifetime assessment procedures for long-term applications in environments where complex damage interactions (temperature, stress, environment, irradiation) can occur. The rules used are often very simple and do not have a basis that takes physical damage into consideration. The linear life-fraction rule for creep-fatigue interaction can be taken as a prominent example. Materials modelling based on a multi-scale approach in principle provides a tool to convert microstructural findings into mechanical response and therefore has the capability of providing a set of tools for the improvement of design life assessments. The strength of current multi-scale modelling efforts is the insight they offer as regards experimental phenomena. To obtain an understanding of these phenomena it is important to focus on issues relevant at the various time and length scales of the modelling code. In this presentation the multi-scale path is demonstrated with a few recent examples focusing on VHTR applications. (authors)

  10. Application of Z-Number Based Modeling in Psychological Research

    Directory of Open Access Journals (Sweden)

    Rafik Aliev

    2015-01-01

    Full Text Available Pilates exercises have been shown to have a beneficial impact on the physical, physiological, and mental characteristics of human beings. In this paper, a Z-number based fuzzy approach is applied to model the effect of Pilates exercises on motivation, attention, anxiety, and educational achievement. The psychological parameters are measured using internationally recognized instruments completed by the students: the Academic Motivation Scale (AMS), the Test of Attention (D2 Test), and Spielberger's Anxiety Test. The students' GPA was used as the measure of educational achievement. Application of Z-information modeling allows us to increase the precision and reliability of the data-processing results in the presence of uncertainty in the input data obtained from the completed questionnaires. The basic steps of Z-number based modeling, with numerical solutions, are presented.
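
A Z-number pairs a fuzzy restriction on a variable's value with a second fuzzy set expressing the reliability of that restriction. The sketch below only illustrates this representation with triangular membership functions; the names and numbers are invented for illustration, and the actual Z-number arithmetic used in the paper is considerably more involved:

```python
from dataclasses import dataclass
from typing import Callable

def triangular(a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

@dataclass
class ZNumber:
    restriction: Callable[[float], float]  # fuzzy value A, e.g. "high motivation"
    reliability: Callable[[float], float]  # fuzzy confidence B, e.g. "very sure"

# Hypothetical example: "motivation score is around 75, with high reliability".
z = ZNumber(restriction=triangular(60, 75, 90),
            reliability=triangular(0.7, 0.85, 1.0))
```

A questionnaire answer would then be carried through the model as the pair (A, B) rather than as a single crisp score, which is how the uncertainty of self-reported data is retained.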

  11. A review of visual MODFLOW applications in groundwater modelling

    Science.gov (United States)

    Hariharan, V.; Shankar, M. Uma

    2017-11-01

    Visual MODFLOW is a graphical user interface for the USGS MODFLOW. It is commercial software that is popular among hydrogeologists for its user-friendly features. The software is mainly used for groundwater flow and contaminant transport models under different conditions. This article reviews the versatility of its applications in groundwater modelling over the last 22 years. Agriculture, airfields, constructed wetlands, climate change, drought studies, Environmental Impact Assessment (EIA), landfills, mining operations, river and flood plain monitoring, salt water intrusion, soil profile surveys, watershed analyses, etc., are the areas where the software has reportedly been used to date. The review provides clarity on the scope of the software in groundwater modelling and research.

  12. Crop model usefulness in drylands of southern Africa: an application ...

    African Journals Online (AJOL)

    Data limitations in southern Africa frequently hinder adequate assessment of crop models before application. ... three locations to represent varying cropping and physical conditions in southern Africa, i.e. maize and sorghum (Mohale's Hoek, Lesotho and Big Bend, Swaziland) and maize and groundnut (Lilongwe, Malawi).

  13. Modelling accidental releases of tritium in the environment: application as an excel spreadsheet

    International Nuclear Information System (INIS)

    Le Dizes, S.; Tamponnet, C.

    2004-01-01

    An application as an Excel spreadsheet of the simplified modelling approach to tritium transfer in the environment developed by Tamponnet (2002) is presented. Based on the use of growth models of biological systems (plants, animals, etc.), the two-pool model (organic tritium and tritiated water) estimates the concentration of tritium within the different compartments of the food chain and, ultimately, the dose to man by ingestion in the case of a chronic or accidental release of tritium into a river or the atmosphere. Data and knowledge have been implemented in Excel using the object-oriented programming language Visual Basic (Microsoft Visual Basic 6.0). The structure of the conceptual model and the Excel sheet are first briefly described. A numerical application of the model under a scenario of an accidental release of tritium into the atmosphere is then presented. Simulation results and perspectives are discussed. (author)
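
The two-pool structure (a tritiated-water pool feeding an organic-tritium pool) can be illustrated with a forward-Euler integration of two coupled rate equations. The rate constants and their names here are placeholders chosen for illustration, not the parameterisation of Tamponnet (2002):

```python
def simulate_two_pool(hto0, obt0, k_in, k_conv, k_loss, dt, steps):
    """Forward-Euler sketch of a two-pool tritium model.

    hto0, obt0 -- initial tritiated-water (HTO) and organic tritium (OBT) pools
    k_in       -- HTO intake rate from the environment
    k_conv     -- conversion rate of HTO into organically bound tritium
    k_loss     -- first-order loss rate applied to both pools
    """
    hto, obt = hto0, obt0
    for _ in range(steps):
        d_hto = k_in - (k_conv + k_loss) * hto
        d_obt = k_conv * hto - k_loss * obt
        hto += d_hto * dt
        obt += d_obt * dt
    return hto, obt
```

The same marching scheme is straightforward to express cell-by-cell in a spreadsheet, which is presumably why the Excel/VBA implementation route was practical.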

  14. Language Modelling for Collaborative Filtering: Application to Job Applicant Matching

    OpenAIRE

    Schmitt, Thomas; Gonard, François; Caillou, Philippe; Sebag, Michèle

    2017-01-01

    International audience; This paper addresses a collaborative retrieval problem, the recommendation of job ads to applicants. Specifically, two proprietary databases are considered. The first one focuses on the context of unskilled, low-paid jobs and applicants; the second one focuses on highly qualified jobs and applicants. Each database includes the job ads and applicant resumes together with the collaborative filtering data recording the applicant clicks on job ads. The proposed approach, called LA...
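
As an illustration of the collaborative-filtering side of the problem, a minimal neighbourhood recommender over an applicant-by-ad click matrix might look as follows. This is a generic sketch of click-based filtering, not the approach proposed in the paper, and all names are invented:

```python
import math

def cosine(u, v):
    """Cosine similarity between two click vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(clicks, applicant, top_k=2):
    """clicks: dict mapping applicant id -> 0/1 click vector over job ads.

    Scores each ad the applicant has not clicked by the similarity-weighted
    clicks of the other applicants, and returns the top_k unseen ads.
    """
    me = clicks[applicant]
    scores = [0.0] * len(me)
    for other, vec in clicks.items():
        if other == applicant:
            continue
        sim = cosine(me, vec)
        for j, clicked in enumerate(vec):
            if me[j] == 0:
                scores[j] += sim * clicked
    ranked = sorted(range(len(me)), key=lambda j: -scores[j])
    return [j for j in ranked if me[j] == 0][:top_k]
```

The point of combining this with language modelling, as the title suggests, is that pure click data says nothing about ads or applicants with no interaction history, which the resume and ad texts can cover.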

  15. Computational spectrotemporal auditory model with applications to acoustical information processing

    Science.gov (United States)

    Chi, Tai-Shih

    A computational spectrotemporal auditory model based on neurophysiological findings in the early auditory and cortical stages is described. The model provides a unified multiresolution representation of the spectral and temporal features of sound that are likely critical in the perception of timbre. Several types of complex stimuli are used to demonstrate the spectrotemporal information preserved by the model. As shown by these examples, this two-stage model reflects the apparent progressive loss of temporal dynamics along the auditory pathway, from rapid phase-locking (several kHz in the auditory nerve), to moderate rates of synchrony (several hundred Hz in the midbrain), to much lower rates of modulation in the cortex (around 30 Hz). To complete the model, several projection-based reconstruction algorithms are implemented to resynthesize sound from the representations with reduced dynamics. One particular application of this model is to assess speech intelligibility. The spectrotemporal modulation transfer functions (MTFs) of the model are investigated and shown to be consistent with the salient trends in human MTFs (derived from human detection thresholds), which exhibit a lowpass function with respect to both the spectral and temporal dimensions, with 50% bandwidths of about 16 Hz and 2 cycles/octave. The model is therefore used to demonstrate the potential relevance of these MTFs to the assessment of speech intelligibility in noisy and reverberant conditions. Another useful feature is the phase singularity that emerges in the scale space generated by this multiscale auditory model. The singularity is shown to have certain robust properties and to carry crucial information about the spectral profile. This claim is justified by perceptually tolerable sounds resynthesized from the nonconvex singularity set. In addition, the singularity set is demonstrated to encode the pitch and formants at different scales. These properties make the singularity set very suitable for traditional
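
The lowpass character of the reported MTFs, with 50% bandwidths near 16 Hz (temporal rate) and 2 cycles/octave (spectral scale), can be mimicked by a separable first-order rolloff. The functional form below is purely an assumption chosen to reproduce those two bandwidth figures, not the model's actual transfer function:

```python
def mtf(rate_hz, scale_cyc_per_oct, rate_bw=16.0, scale_bw=2.0):
    """Separable lowpass sketch of a spectrotemporal MTF.

    Gain falls to 0.5 at rate_bw (Hz) and at scale_bw (cycles/octave),
    matching the 50% bandwidths quoted in the abstract.
    """
    temporal = 1.0 / (1.0 + (rate_hz / rate_bw) ** 2)
    spectral = 1.0 / (1.0 + (scale_cyc_per_oct / scale_bw) ** 2)
    return temporal * spectral
```

Fast temporal modulations or fine spectral ripples are attenuated by such a function, which is the intuition behind using it as an intelligibility predictor: noise and reverberation smear exactly the slow, coarse modulations the function passes.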

  16. Overview on available animal models for application in leukemia research

    International Nuclear Information System (INIS)

    Borkhardt, A.; Sanchez-Garcia, I.; Cobaleda, C.; Hauer, J.

    2015-01-01

    The term "leukemia" encompasses a group of diseases with a variable clinical and pathological presentation. Its cellular origin, its biology and the underlying molecular genetic alterations determine the highly variable and individual disease phenotype. The focus of this review is to discuss the most important guidelines to be taken into account when aiming to develop an "ideal" animal model for studying leukemia. The animal model should mimic all the clinical, histological and molecular genetic characteristics of the human phenotype and should be applicable as a clinically predictive model. It should meet all the requirements of a standardized model adaptable to basic research as well as to pharmaceutical practice. Furthermore, it should fulfill all the criteria for investigating environmental risk factors and the role of genomic mutations, and be applicable for therapeutic testing. These constraints limit the usefulness of some existing animal models, which are nevertheless very valuable for basic research. Hence, in this review we primarily focus on genetically engineered mouse models (GEMMs) for studying the most frequent types of childhood leukemia. GEMMs are robust models with relatively low site-specific variability which can, with the help of the latest gene-modulating tools, be adapted to individual clinical and research questions. Moreover, they offer the possibility to restrict oncogene expression to a defined target population and to regulate its expression level as well as the timing of its activity. Until recently it was only possible in individual cases to develop a murine model fulfilling the above-mentioned requirements. Hence, the development of new regulatory elements to control targeted oncogene expression should be a priority. Tightly controlled and cell-specific oncogene expression can then be combined with a knock-in approach, yielding a robust murine model that enables almost physiologic oncogene

  17. Application of heuristic and machine-learning approach to engine model calibration

    Science.gov (United States)

    Cheng, Jie; Ryu, Kwang R.; Newman, C. E.; Davis, George C.

    1993-03-01

    Automation of engine model calibration procedures is a very challenging task because (1) the calibration process searches for a goal state in a huge, continuous state space, (2) calibration is often a lengthy and frustrating task because of complicated mutual interference among the target parameters, and (3) the calibration problem is heuristic by nature, and often heuristic knowledge for constraining a search cannot be easily acquired from domain experts. A combined heuristic and machine learning approach has, therefore, been adopted to improve the efficiency of model calibration. We developed an intelligent calibration program called ICALIB. It has been used on a daily basis for engine model applications, and has reduced the time required for model calibrations from many hours to a few minutes on average. In this paper, we describe the heuristic control strategies employed in ICALIB such as a hill-climbing search based on a state distance estimation function, incremental problem solution refinement by using a dynamic tolerance window, and calibration target parameter ordering for guiding the search. In addition, we present the application of a machine learning program called GID3* for automatic acquisition of heuristic rules for ordering target parameters.
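
The hill-climbing search with a dynamic tolerance window described above can be sketched as follows. The parameter names, acceptance rule, and shrink factor are illustrative guesses, not ICALIB's actual implementation, and the target-parameter ordering heuristics learned by GID3* are replaced here by a random choice:

```python
import random

def calibrate(simulate, target, params, step=0.5, tol0=1.0, shrink=0.9, iters=200):
    """Hill-climbing calibration sketch.

    simulate -- callable mapping a parameter dict to a model output
    target   -- desired model output
    Perturbs one parameter at a time and accepts only moves that reduce
    the distance to the target; whenever the current best falls inside
    the tolerance window, the window and the step size are tightened so
    the search refines the solution incrementally.
    """
    def distance(p):
        return abs(simulate(p) - target)
    best = dict(params)
    tol = tol0
    for _ in range(iters):
        name = random.choice(list(best))   # stand-in for heuristic ordering
        trial = dict(best)
        trial[name] += random.uniform(-step, step)
        if distance(trial) < distance(best):
            best = trial
        if distance(best) <= tol:
            tol *= shrink                  # dynamic tolerance window
            step *= shrink                 # refine with smaller perturbations
        if distance(best) < 1e-6:
            break
    return best
```

The distance function plays the role of the state distance estimation mentioned in the abstract: it turns the goal state into a scalar the search can descend.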

  18. Ozone modeling within plasmas for ozone sensor applications

    OpenAIRE

    Arshak, Khalil; Forde, Edward; Guiney, Ivor

    2007-01-01

    Ozone (O3) is potentially hazardous to human health, and accurate prediction and measurement of this gas are essential in addressing its associated health risks. This paper presents theory to predict the levels of ozone concentration emitted from a dielectric barrier discharge (DBD) plasma for ozone sensing applications. This is done by postulating the kinetic model for ozone generation, with a DBD plasma at atmospheric pressure in air, in the form of a set of rate equations....
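
A rate-equation model of the kind postulated here can be reduced, for illustration, to two coupled species: atomic oxygen produced by the discharge, and ozone formed from it. The rate constants below are placeholders, not the paper's kinetic data:

```python
def simulate_ozone(g, k_form, k_loss, o2, dt, steps):
    """Forward-Euler sketch of a minimal ozone kinetics model.

    g      -- production rate of atomic O by the discharge
    k_form -- rate constant for O + O2 -> O3
    k_loss -- first-order destruction rate of O3
    o2     -- (assumed constant) O2 concentration
    Returns the O3 concentration after `steps` time steps of size dt.
    """
    o = o3 = 0.0
    for _ in range(steps):
        d_o = g - k_form * o * o2
        d_o3 = k_form * o * o2 - k_loss * o3
        o += d_o * dt
        o3 += d_o3 * dt
    return o3
```

At long times the sketch settles to a steady state where formation balances destruction, which is the quantity a sensor-oriented prediction would care about.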

  19. JUPITER: Joint Universal Parameter IdenTification and Evaluation of Reliability - An Application Programming Interface (API) for Model Analysis

    Science.gov (United States)

    Banta, Edward R.; Poeter, Eileen P.; Doherty, John E.; Hill, Mary C.

    2006-01-01

    The Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER API) improves the computer programming resources available to those developing applications (computer programs) for model analysis. The JUPITER API consists of eleven Fortran-90 modules that provide for encapsulation of data and of operations on that data. Each module contains one or more entities: data, data types, subroutines, functions, and generic interfaces. The modules do not constitute computer programs themselves; instead, they are used to construct computer programs. Such computer programs are called applications of the API. The API provides common modeling operations for use by a variety of computer applications. The models being analyzed are referred to here as process models and may, for example, represent the physics, chemistry, and(or) biology of a field or laboratory system. Process models commonly are constructed using published models such as MODFLOW (Harbaugh et al., 2000; Harbaugh, 2005), MT3DMS (Zheng and Wang, 1996), HSPF (Bicknell et al., 1997), PRMS (Leavesley and Stannard, 1995), and many others. The process model may be accessed by a JUPITER API application as an external program, or it may be implemented as a subroutine within a JUPITER API application. In either case, execution of the model takes place in a framework designed by the application programmer. This framework can be designed to take advantage of any parallel-processing capabilities possessed by the process model, as well as the parallel-processing capabilities of the JUPITER API itself. Model analyses for which the JUPITER API could be useful include, for example: comparing model results to observed values to determine how well the model reproduces system processes and characteristics; using sensitivity analysis to determine the information provided by observations to parameters and predictions of interest; and determining the additional data needed to improve selected model

  20. Application of a qualified RETRAN model to plant transient evaluation support

    International Nuclear Information System (INIS)

    Sedano, P.G.; Mata, P.; Alcantud, F.; Serra, J.; Castrillo, F.

    1989-01-01

    This paper presents the applicability and usefulness of a complete and well-qualified plant transient code and model for supporting in-depth evaluation of anomalous plant transients. Analyses of several operational and abnormal transients that occurred during the first three cycles of the Cofrentes (BWR-6) NPP are presented. This application demonstrated the need for a very detailed and well-adjusted simulation of the control systems, as well as the convenience of having as complete a data acquisition system as possible. (orig.)