WorldWideScience

Sample records for anova models application

  1. Application of ANOVA to Pulse Detonation Engine Dynamic Performance Measurements

    Science.gov (United States)

    Chander, Subhash; Kumar, Rakesh; Sandhu, Manmohan; Jindal, TK

    2017-10-01

    Applying ANOVA to pulse detonation engine (PDE) dynamic performance measurements made it possible to quantify engine functionality during various operations. After the performance was evaluated, improvement techniques were applied in several areas of interest; this produced encouraging results and significantly aided upscaling efforts. The current paper details the ANOVA implementation and the systematic identification of key areas for improvement. Improvements were carried out after careful selection of plans for the engine, ground rig, and instrumentation setup. The work also yielded better reproducibility of performance and optimization of the main PDE subsystems. If researchers continue to extract and address areas of concern, this approach should also shorten the development cycle and reduce trial complexity.

  2. Effect of fasting Ramadan in diabetes control status - application of extensive diabetes education, serum creatinine with HbA1c statistical ANOVA and regression models to prevent hypoglycemia.

    Science.gov (United States)

    Aziz, Kamran M A

    2013-09-01

    Ramadan fasting is an obligatory duty for Muslims. Unique physiologic and metabolic changes occur during fasting, which require adjustments of diabetes medications. Although challenging, successful fasting can be accomplished if extensive pre-Ramadan education is provided to the patients. The current research studied effective Ramadan fasting with different OHAs/insulins, without significant risk of hypoglycemia, in terms of HbA1c reductions after Ramadan. An ANOVA model was used to assess HbA1c levels across education statuses. Serum creatinine was used to measure renal function. Pre-Ramadan diabetes education with alteration of therapy and dosage adjustments for OHAs/insulin was carried out. Regression models for HbA1c before Ramadan against FBS before sunset were also synthesized as a tool to prevent hypoglycemia and enable successful Ramadan fasting in the future. Out of 1046 patients, 998 fasted successfully without any episodes of hypoglycemia; 48 patients (4.58%) experienced hypoglycemia. The χ² test for CRD/CKD with hypoglycemia was also significant (p-value … Ramadan diabetes management. Some relevant patents are also outlined in this paper.

  3. Biomarker Detection in Association Studies: Modeling SNPs Simultaneously via Logistic ANOVA

    KAUST Repository

    Jung, Yoonsuh

    2014-10-02

    In genome-wide association studies, the primary task is to detect biomarkers in the form of Single Nucleotide Polymorphisms (SNPs) that have nontrivial associations with a disease phenotype and other important clinical/environmental factors. However, the extremely large number of SNPs compared to the sample size inhibits application of classical methods such as multiple logistic regression. Currently the most commonly used approach is still to analyze one SNP at a time. In this paper, we propose to consider the genotypes of the SNPs simultaneously via a logistic analysis of variance (ANOVA) model, which expresses the logit-transformed mean of the SNP genotypes as the sum of the SNP effects, the effects of the disease phenotype and/or other clinical variables, and the interaction effects. We use a reduced-rank representation of the interaction-effect matrix for dimensionality reduction, and employ the L1-penalty in a penalized likelihood framework to filter out the SNPs that have no associations. We develop a Majorization-Minimization algorithm for computational implementation. In addition, we propose a modified BIC criterion to select the penalty parameters and determine the rank number. The proposed method is applied to a Multiple Sclerosis data set and simulated data sets and shows promise in biomarker detection.

  4. INFLUENCE OF TECHNOLOGICAL PARAMETERS ON AGROTEXTILES WATER ABSORBENCY USING ANOVA MODEL

    Directory of Open Access Journals (Sweden)

    LUPU Iuliana G.

    2016-05-01

    Agrotextiles are nowadays extensively used in horticulture, farming, and other agricultural activities. Agriculture and textiles are among the largest industries in the world, providing basic needs such as food and clothing. Agrotextiles play a significant role in controlling the environment for crop protection, buffering variations in climate and weather, and generating optimum conditions for plant growth. Water absorptive capacity is a very important property of needle-punched nonwovens used as irrigation substrates in horticulture: nonwovens used as watering substrates distribute water uniformly and act as a slight water buffer owing to their absorbent capacity. The paper analyzes the influence of needling process parameters on the water absorptive capacity of needle-punched nonwovens using an ANOVA model. The model allows optimal process parameters to be identified in a shorter time and with lower material expense than purely experimental research. Needle board frequency and needle penetration depth were used as the independent variables, and water absorptive capacity as the dependent variable, in the ANOVA regression model. Based on the ANOVA model we established that the needling parameters significantly influence water absorbent capacity: the greater the needle penetration depth and needle board frequency, the higher the compactness of the fabric, and a less porous structure has a lower water absorptive capacity.
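The two-variable ANOVA/regression model described above can be sketched with ordinary least squares in Python. The needling frequencies, penetration depths, and absorbency values below are invented for illustration; they are not the paper's data, only chosen to be consistent with its stated conclusion (absorbency falls as both parameters rise).

```python
import numpy as np

# Hypothetical data: needle board frequency (strokes/min), needle penetration
# depth (mm), and measured water absorptive capacity (%). Illustrative only.
freq   = np.array([200, 200, 300, 300, 400, 400, 500, 500], dtype=float)
depth  = np.array([  6,  10,   6,  10,   6,  10,   6,  10], dtype=float)
absorb = np.array([620, 590, 600, 560, 570, 530, 540, 500], dtype=float)

# Linear ANOVA/regression model: absorb = b0 + b1*freq + b2*depth + error
X = np.column_stack([np.ones_like(freq), freq, depth])
beta, *_ = np.linalg.lstsq(X, absorb, rcond=None)

fitted   = X @ beta
ss_total = np.sum((absorb - absorb.mean())**2)
ss_resid = np.sum((absorb - fitted)**2)
ss_model = ss_total - ss_resid

df_model, df_resid = X.shape[1] - 1, len(absorb) - X.shape[1]
F = (ss_model / df_model) / (ss_resid / df_resid)
print(f"b = {beta}, F({df_model},{df_resid}) = {F:.2f}")
```

With data of this shape, both fitted slopes come out negative, matching the abstract's conclusion that higher frequency and deeper penetration reduce absorbency.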

  5. Visualizing Experimental Designs for Balanced ANOVA Models using Lisp-Stat

    Directory of Open Access Journals (Sweden)

    Philip W. Iversen

    2004-12-01

    The structure, or Hasse, diagram described by Taylor and Hilton (1981, The American Statistician) provides a visual display of the relationships between factors for balanced complete experimental designs. Using the Hasse diagram, rules exist for determining the appropriate linear model, ANOVA table, expected mean squares, and F-tests in the case of balanced designs. This procedure has been implemented in Lisp-Stat using a software representation of the experimental design. The user can interact with the Hasse diagram to add, change, or delete factors and see the effect on the proposed analysis. The system has potential uses in teaching and consulting.

  6. Kerf modelling in abrasive waterjet milling using evolutionary computation and ANOVA techniques

    Science.gov (United States)

    Alberdi, A.; Rivero, A.; Carrascal, A.; Lamikiz, A.

    2012-04-01

    Many researchers have demonstrated the capability of Abrasive Waterjet (AWJ) technology for precision milling operations. However, the concurrence of several input parameters, along with the stochastic nature of this technology, leads to complex process control and requires work focused on process modelling. This research introduces a model to predict the kerf shape in AWJ slot milling of Aluminium 7075-T651 in terms of four important process parameters: pressure, abrasive flow rate, stand-off distance, and traverse feed rate. A hybrid evolutionary approach was employed for kerf shape modelling. This technique characterizes the profile through two parameters: the maximum cutting depth and the full width at half maximum. In parallel, based on ANOVA and regression techniques, these two parameters were also modelled as functions of the process parameters. Combining both models yields an adequate strategy for predicting the kerf shape under different machining conditions.

  7. An Adaptive ANOVA-based PCKF for High-Dimensional Nonlinear Inverse Modeling

    Energy Technology Data Exchange (ETDEWEB)

    LI, Weixuan; Lin, Guang; Zhang, Dongxiao

    2014-02-01

    The probabilistic collocation-based Kalman filter (PCKF) is a recently developed approach for solving inverse problems. It resembles the ensemble Kalman filter (EnKF) in every aspect, except that it represents and propagates model uncertainty by polynomial chaos expansion (PCE) instead of an ensemble of model realizations. Previous studies have shown PCKF is a more efficient alternative to EnKF for many data assimilation problems. However, the accuracy and efficiency of PCKF depend on an appropriate truncation of the PCE series. Having more polynomial chaos bases in the expansion helps to capture uncertainty more accurately but increases computational cost. Bases selection is particularly important for high-dimensional stochastic problems because the number of polynomial chaos bases required to represent model uncertainty grows dramatically as the number of input parameters (random dimensions) increases. In classic PCKF algorithms, the PCE bases are pre-set based on users' experience. Also, for sequential data assimilation problems, the bases kept in the PCE expression remain unchanged in different Kalman filter loops, which could limit the accuracy and computational efficiency of classic PCKF algorithms. To address this issue, we present a new algorithm that adaptively selects PCE bases for different problems and automatically adjusts the number of bases in different Kalman filter loops. The algorithm is based on adaptive functional ANOVA (analysis of variance) decomposition, which approximates a high-dimensional function with the summation of a set of low-dimensional functions. Thus, instead of expanding the original model into PCE, we implement the PCE expansion on these low-dimensional functions, which is much less costly. We also propose a new adaptive criterion for ANOVA that is more suited for solving inverse problems. The new algorithm is tested with different examples and demonstrates great effectiveness in comparison with non-adaptive PCKF and EnKF.

  8. An adaptive ANOVA-based PCKF for high-dimensional nonlinear inverse modeling

    Energy Technology Data Exchange (ETDEWEB)

    Li, Weixuan, E-mail: weixuan.li@usc.edu [Sonny Astani Department of Civil and Environmental Engineering, University of Southern California, Los Angeles, CA 90089 (United States); Lin, Guang, E-mail: guang.lin@pnnl.gov [Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Zhang, Dongxiao, E-mail: dxz@pku.edu.cn [Department of Energy and Resources Engineering, College of Engineering, Peking University, Beijing 100871 (China)

    2014-02-01

    The probabilistic collocation-based Kalman filter (PCKF) is a recently developed approach for solving inverse problems. It resembles the ensemble Kalman filter (EnKF) in every aspect, except that it represents and propagates model uncertainty by polynomial chaos expansion (PCE) instead of an ensemble of model realizations. Previous studies have shown PCKF is a more efficient alternative to EnKF for many data assimilation problems. However, the accuracy and efficiency of PCKF depend on an appropriate truncation of the PCE series. Having more polynomial chaos basis functions in the expansion helps to capture uncertainty more accurately but increases computational cost. Selection of basis functions is particularly important for high-dimensional stochastic problems because the number of polynomial chaos basis functions required to represent model uncertainty grows dramatically as the number of input parameters (random dimensions) increases. In classic PCKF algorithms, the PCE basis functions are pre-set based on users' experience. Also, for sequential data assimilation problems, the basis functions kept in the PCE expression remain unchanged in different Kalman filter loops, which could limit the accuracy and computational efficiency of classic PCKF algorithms. To address this issue, we present a new algorithm that adaptively selects PCE basis functions for different problems and automatically adjusts the number of basis functions in different Kalman filter loops. The algorithm is based on adaptive functional ANOVA (analysis of variance) decomposition, which approximates a high-dimensional function with the summation of a set of low-dimensional functions. Thus, instead of expanding the original model into PCE, we implement the PCE expansion on these low-dimensional functions, which is much less costly. We also propose a new adaptive criterion for ANOVA that is more suited for solving inverse problems. The new algorithm was tested with different examples and demonstrated great effectiveness in comparison with non-adaptive PCKF and EnKF.
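The functional ANOVA decomposition that drives the adaptive basis selection above can be illustrated on a toy function. This sketch estimates first-order variance contributions by brute-force Monte Carlo conditioning; it is not the PCKF algorithm itself, and the test function, sample sizes, and uniform inputs are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Toy model on [0,1]^3: x3 is inactive, so its ANOVA component should be ~0
    return x[:, 0] + 2.0 * x[:, 1]**2 + 0.1 * x[:, 0] * x[:, 1]

n_outer, n_inner, d = 200, 2000, 3
total_var = np.var(f(rng.random((100000, d))))

S = []  # first-order variance fractions, one per input dimension
for i in range(d):
    cond_means = []
    for xi in np.linspace(0.0, 1.0, n_outer):
        x = rng.random((n_inner, d))
        x[:, i] = xi                       # freeze dimension i at xi
        cond_means.append(f(x).mean())     # Monte Carlo estimate of E[f | x_i = xi]
    S.append(np.var(cond_means) / total_var)  # Var(E[f|x_i]) / Var(f)

print([round(s, 3) for s in S])
```

An adaptive scheme like the one in the abstract would keep expanding only the dimensions (and interactions) with non-negligible fractions, here x1 and x2, and drop the inactive x3.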

  9. Detecting variable responses in time-series using repeated measures ANOVA: Application to physiologic challenges [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Paul M. Macey

    2016-07-01

    We present an approach to analyzing physiologic time trends recorded during a stimulus by comparing means at each time point using repeated measures analysis of variance (RMANOVA). The approach allows temporal patterns to be examined without an a priori model of the expected timing or pattern of response. It was originally applied to signals recorded from functional magnetic resonance imaging (fMRI) volumes-of-interest (VOI) during a physiologic challenge, but we have used the same technique to analyze continuous recordings of other physiological signals such as heart rate, breathing rate, and pulse oximetry. For fMRI, the method serves as a complement to whole-brain voxel-based analyses, and is useful for detecting complex responses within pre-determined brain regions, or as a post-hoc analysis of regions of interest identified by whole-brain assessments. We illustrate an implementation of the technique in the statistical software packages R and SAS. VOI time trends are extracted from conventionally preprocessed fMRI images, and a time trend of average signal intensity across the VOI during the scanning period is calculated for each subject. The values are scaled relative to baseline periods, and time points are binned. In SAS, the procedure PROC MIXED implements the RMANOVA in a single step. In R, we present one option for implementing RMANOVA with the mixed model function "lme". Model diagnostics, and predicted means and differences, are best performed with additional libraries and commands in R; we present one example. The ensuing results allow determination of significant overall effects, and time-point-specific within- and between-group responses relative to baseline. We illustrate the technique using fMRI data from two groups of subjects who underwent a respiratory challenge. RMANOVA allows insight into the timing of responses and response differences between groups, and so is suited to physiologic testing paradigms eliciting complex responses.
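The core computation can be sketched in plain numpy for the simplest case: a complete, balanced subjects-by-time matrix with a single within-subject factor and no sphericity correction. This is only a simplified stand-in for the PROC MIXED / lme fits the abstract describes; the signal values below are invented, not study data.

```python
import numpy as np

# Hypothetical time trends: rows = subjects, columns = binned time points
# (e.g. baseline-scaled fMRI VOI signal). Values are illustrative only.
y = np.array([
    [0.0, 0.8, 1.5, 0.9, 0.2],
    [0.1, 0.9, 1.3, 0.7, 0.1],
    [0.0, 0.7, 1.6, 1.0, 0.3],
    [0.2, 1.0, 1.4, 0.8, 0.2],
])
n, k = y.shape
grand = y.mean()

ss_time    = n * np.sum((y.mean(axis=0) - grand)**2)   # within-subject factor
ss_subject = k * np.sum((y.mean(axis=1) - grand)**2)   # subjects as blocks
ss_total   = np.sum((y - grand)**2)
ss_error   = ss_total - ss_time - ss_subject           # subject-by-time residual

df_time, df_error = k - 1, (n - 1) * (k - 1)
F = (ss_time / df_time) / (ss_error / df_error)
print(f"F({df_time},{df_error}) = {F:.2f}")
```

A significant F here says only that the means differ across time points; locating which time points differ from baseline requires the follow-up contrasts the abstract mentions.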

  10. ANOVA and ANCOVA: A GLM Approach

    CERN Document Server

    Rutherford, Andrew

    2012-01-01

    Provides an in-depth treatment of ANOVA and ANCOVA techniques from a linear model perspective. ANOVA and ANCOVA: A GLM Approach provides a contemporary look at the general linear model (GLM) approach to the analysis of variance (ANOVA) of one- and two-factor psychological experiments. With its organized and comprehensive presentation, the book successfully guides readers through conventional statistical concepts and how to interpret them in GLM terms, treating the main single- and multi-factor designs as they relate to ANOVA and ANCOVA. The book begins with a brief history of the separate dev…

  11. Violation of the Sphericity Assumption and Its Effect on Type-I Error Rates in Repeated Measures ANOVA and Multi-Level Linear Models (MLM)

    Directory of Open Access Journals (Sweden)

    Nicolas Haverkamp

    2017-10-01

    We investigated the effects of violations of the sphericity assumption on Type I error rates for different methodical approaches to repeated measures analysis, using a simulation approach. In contrast to previous simulation studies on this topic, up to nine measurement occasions were considered, and the effects of the level of inter-correlation between measurement occasions on Type I error rates were considered for the first time. Two populations without violation of the sphericity assumption, one with uncorrelated measurement occasions and one with moderately correlated measurement occasions, were generated. One population with violation of the sphericity assumption combined uncorrelated with highly correlated measurement occasions; a second combined moderately correlated and highly correlated measurement occasions. From these four populations, with no between-group or within-subject effect, 5,000 random samples were drawn. Finally, mean Type I error rates were computed for multilevel linear models (MLM) with an unstructured covariance matrix (MLM-UN), MLM with compound symmetry (MLM-CS), and repeated measures analysis of variance (rANOVA) models (without correction, with Greenhouse-Geisser correction, and with Huynh-Feldt correction). To examine the effects of both sample size and number of measurement occasions, sample sizes of n = 20, 40, 60, 80, and 100 were considered, as well as m = 3, 6, and 9 measurement occasions. With respect to rANOVA, the results support the use of rANOVA with Huynh-Feldt correction, especially when the sphericity assumption is violated, the sample size is rather small, and the number of measurement occasions is large. For MLM-UN, the results show a massive progressive bias for small sample sizes (n = 20) and m = 6 or more measurement occasions. This effect was not found in previous simulation studies with a smaller number of measurement occasions.
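The Greenhouse-Geisser correction mentioned above rests on an epsilon computed from the covariance matrix of the repeated measures: epsilon equals 1 under sphericity and shrinks toward 1/(k-1) as the violation worsens. A small numpy sketch; the two covariance matrices are invented to mimic the populations the study describes (compound symmetry vs. a mix of nearly uncorrelated and highly correlated occasions).

```python
import numpy as np

def gg_epsilon(S):
    """Greenhouse-Geisser epsilon for a k x k covariance matrix of repeated
    measures; equals 1 under compound symmetry (sphericity holds)."""
    k = S.shape[0]
    H = np.eye(k) - np.ones((k, k)) / k    # centering matrix
    Sc = H @ S @ H                          # double-centered covariance
    return np.trace(Sc)**2 / ((k - 1) * np.sum(Sc * Sc))

# Compound symmetry: equal variances, equal correlations -> epsilon = 1
cs = 0.5 * np.ones((4, 4)) + 0.5 * np.eye(4)

# Mixing highly correlated and nearly uncorrelated occasions violates sphericity
ar = np.array([[1.0, 0.9, 0.0, 0.0],
               [0.9, 1.0, 0.0, 0.0],
               [0.0, 0.0, 1.0, 0.1],
               [0.0, 0.0, 0.1, 1.0]])

eps_cs, eps_ar = gg_epsilon(cs), gg_epsilon(ar)
print(round(eps_cs, 3), round(eps_ar, 3))
```

The rANOVA correction then multiplies both F-test degrees of freedom by epsilon, which is why the correction matters most in exactly the conditions the study flags (strong correlation mixing, many occasions).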

  12. Application of Taguchi method and ANOVA in the optimization of dyeing process on cotton knit fabric to reduce re-dyeing process

    Science.gov (United States)

    Wahyudin; Kharisma, Angel; Murphiyanto, Richard Dimas Julian; Perdana, Muhammad Kevin; Pirdo Kasih, Tota

    2017-12-01

    In the textile industry, tons of dyes are lost to effluents every year during dyeing and finishing operations because of inefficient processes. As the dyeing process produces tons of effluent, re-dyeing multiplies that amount. Re-dyeing is performed when the expected color is not reached, which is caused by improper parameter settings, and the waste from these processes can threaten the environment. In this paper, we use the Taguchi method and ANOVA to obtain the optimum conditions for a dyeing process at the XYZ company and to determine the percentage contribution of each parameter. To confirm the optimum conditions obtained with the Taguchi method, a verification test was carried out to inspect the agreement between the predicted output and five experiments under the optimal conditions, and the result was confirmed. The optimum conditions for the dyeing process are dye concentration 3.5%; Na2SO4 concentration 80 g/l; Na2CO3 concentration 5.8 g/l; and temperature 80°C.
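Taguchi analysis ranks factor levels by a signal-to-noise (S/N) ratio. A minimal sketch for one factor, assuming a larger-the-better response; the color-strength readings and the two dye-concentration levels below are hypothetical, not the paper's measurements.

```python
import math

# Hypothetical color-strength readings from three repeats at each of two
# dye-concentration levels; values are illustrative only.
trials = {
    "dye 3.0%": [18.2, 18.6, 18.1],
    "dye 3.5%": [19.4, 19.8, 19.5],
}

def sn_larger_the_better(ys):
    # Taguchi larger-the-better S/N ratio: SN = -10 * log10( mean(1 / y^2) )
    return -10.0 * math.log10(sum(1.0 / y**2 for y in ys) / len(ys))

best = max(trials, key=lambda name: sn_larger_the_better(trials[name]))
for name, ys in trials.items():
    print(name, round(sn_larger_the_better(ys), 2), "dB")
print("optimum level:", best)
```

In a full Taguchi study the same ranking is done per factor over an orthogonal array, and ANOVA on the S/N ratios yields the percentage contributions the abstract refers to.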

  13. Foliar Sprays of Citric Acid and Malic Acid Modify Growth, Flowering, and Root to Shoot Ratio of Gazania (Gazania rigens L.): A Comparative Analysis by ANOVA and Structural Equations Modeling

    Directory of Open Access Journals (Sweden)

    Majid Talebi

    2014-01-01

    Foliar application of two levels of citric acid and malic acid (100 or 300 mg L−1) was investigated for effects on flower stem height, plant height, flower performance, and yield indices (fresh yield, dry yield, and root to shoot ratio) of Gazania. Distilled water was applied as the control treatment. Multivariate analysis revealed that while the experimental treatments had no significant effect on fresh weight or flower count, plant dry weight was significantly increased by 300 mg L−1 malic acid. Citric acid at 100 and 300 mg L−1, and malic acid at 300 mg L−1, increased root fresh weight significantly. Both plant height and peduncle length were significantly increased at all applied levels of citric acid and malic acid. The display time of flowers on the plant increased in all treatments compared to the control. The root to shoot ratio was increased significantly by 300 mg L−1 citric acid compared to all other treatments. These findings confirm earlier reports that citric acid and malic acid, as environmentally sound chemicals, affect various aspects of crop growth and development. Structural equations modeling was used in parallel with ANOVA to assess the factor effects and the possible paths of effect.

  14. ANOVA with Summary Statistics: A STATA Macro

    Directory of Open Access Journals (Sweden)

    Nadeem Shafique Butt

    2006-01-01

    Almost all available statistical packages are capable of performing analysis of variance (ANOVA) from raw data. Some statistical packages can perform the independent-sample t-test and certain other tests of significance on summary data, but seldom does one come across software that can perform ANOVA directly on summary data, although some packages can perform one-way ANOVA after generating surrogate data from summary statistics. In this short note we give a STATA program that performs one-way ANOVA on summary data; in addition, the program performs Bartlett's test of equality of variances. The idea can be extended to two-way and higher-way ANOVAs. Examples are given for illustration.
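The idea behind such a macro, one-way ANOVA computed straight from per-group n, mean, and SD, fits in a few lines: the between-group sum of squares comes from the means and the within-group sum of squares from the SDs. A Python sketch with illustrative group summaries (not taken from the note):

```python
# One-way ANOVA from summary statistics (n, mean, SD per group), mirroring
# what a summary-data ANOVA macro computes. Numbers are illustrative only.
groups = [  # (n_i, mean_i, sd_i)
    (12, 5.2, 1.1),
    (15, 6.0, 1.3),
    (11, 4.8, 0.9),
]

N = sum(n for n, _, _ in groups)
k = len(groups)
grand = sum(n * m for n, m, _ in groups) / N          # grand mean

ss_between = sum(n * (m - grand)**2 for n, m, _ in groups)
ss_within  = sum((n - 1) * sd**2 for n, _, sd in groups)  # pooled from SDs

df_b, df_w = k - 1, N - k
F = (ss_between / df_b) / (ss_within / df_w)
print(f"F({df_b},{df_w}) = {F:.3f}")
```

No surrogate data are needed: the F statistic is identical to what raw-data ANOVA would give, since the group sums of squares are fully determined by the summaries.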

  15. ANOVA for the behavioral sciences researcher

    CERN Document Server

    Cardinal, Rudolf N

    2013-01-01

    This new book provides a theoretical and practical guide to analysis of variance (ANOVA) for those who have not had a formal course in this technique, but need to use this analysis as part of their research. From their experience in teaching this material and applying it to research problems, the authors have created a summary of the statistical theory underlying ANOVA, together with important issues, guidance, practical methods, references, and hints about using statistical software. These have been organized so that the student can learn the logic of the analytical techniques but also use the …

  16. Scaling in ANOVA-simultaneous component analysis

    NARCIS (Netherlands)

    Timmerman, M.E; Hoefsloot, H.C.J.; Smilde, A.K.; Ceulemans, E.

    2015-01-01

    In omics research, high-dimensional data are often collected according to an experimental design. Typically, the manipulations involved yield differential effects on subsets of variables. An effective approach to identifying those effects is ANOVA-simultaneous component analysis (ASCA), which combines …

  17. Permutation Tests for Stochastic Ordering and ANOVA

    CERN Document Server

    Basso, Dario; Salmaso, Luigi; Solari, Aldo

    2009-01-01

    Permutation testing for multivariate stochastic ordering and ANOVA designs is a fundamental issue in many scientific fields such as medicine, biology, pharmaceutical studies, engineering, economics, psychology, and social sciences. This book presents advanced methods and related R codes to perform complex multivariate analyses

  18. Modeling Applications.

    Science.gov (United States)

    McMEEKIN, Thomas A; Ross, Thomas

    1996-12-01

    The concept of predictive microbiology has developed rapidly through the initial phases of experimental design and model development and the subsequent phase of model validation. A fully validated model represents a general rule which may be brought to bear on particular cases. For some microorganism/food combinations, sufficient confidence now exists to indicate substantial benefits to the food industry from use of predictive models. Several types of devices are available to monitor and record environmental conditions (particularly temperature). These "environmental histories" can be interpreted, using predictive models, in terms of microbial proliferation. The current challenge is to provide systems for the collection and interpretation of environmental information which combine ease of use, reliability, and security, providing the industrial user with the ability to make informed and precise decisions regarding the quality and safety of foods. Many specific applications for predictive modeling can be developed from a basis of understanding the inherent qualities of a fully validated model. These include increased precision and confidence in predictions based on accumulation of quantitative data, objective and rapid assessment of the effect of environmental conditions on microbial proliferation, and flexibility in monitoring the relative contribution of component parts of processing, distribution, and storage systems for assurance of shelf life and safety.

  19. Inference for One-Way ANOVA with Equicorrelation Error Structure

    Directory of Open Access Journals (Sweden)

    Weiyan Mu

    2014-01-01

    We consider inference in a one-way ANOVA model with equicorrelated error structures, and discuss hypotheses of the equality of the means. A generalized F-test has been proposed in the literature to compare the means of all populations, but its performance was not examined there. We propose two methods, one based on generalized pivotal quantities and one a parametric bootstrap method, to test the hypothesis of equality of the means, and compare the empirical performance of the proposed tests with the generalized F-test. The simulation results show that the generalized F-test does not perform well in terms of Type I error rate, while the proposed tests perform much better. We also provide corresponding simultaneous confidence intervals for all pairwise differences of the means, whose coverage probabilities are close to the confidence level.
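A parametric bootstrap test along these lines can be sketched as follows. For simplicity the equicorrelation rho and variance are treated as known when resampling under the null, whereas the paper estimates the error structure; the group means, sizes, and rho below are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def f_stat(samples):
    # Ordinary one-way ANOVA F statistic over a list of 1-D arrays
    allx = np.concatenate(samples)
    grand = allx.mean()
    ssb = sum(len(s) * (s.mean() - grand)**2 for s in samples)
    ssw = sum(((s - s.mean())**2).sum() for s in samples)
    k, N = len(samples), len(allx)
    return (ssb / (k - 1)) / (ssw / (N - k))

def equicorrelated(n, rho, sigma, rng):
    # Errors with corr(e_i, e_j) = rho (rho >= 0) via a shared component
    common = rng.standard_normal()
    return sigma * (np.sqrt(rho) * common + np.sqrt(1 - rho) * rng.standard_normal(n))

# Hypothetical groups sharing an equicorrelated error structure
rho, sigma, sizes = 0.3, 1.0, [10, 12, 9]
samples = [m + equicorrelated(n, rho, sigma, rng)
           for m, n in zip([0.0, 0.4, 0.1], sizes)]
f_obs = f_stat(samples)

# Parametric bootstrap null distribution: resample with equal means
f_null = [f_stat([equicorrelated(n, rho, sigma, rng) for n in sizes])
          for _ in range(2000)]
p = np.mean([f >= f_obs for f in f_null])
print(f"F = {f_obs:.2f}, bootstrap p = {p:.3f}")
```

The point of bootstrapping here is that under equicorrelation the usual F reference distribution is wrong, which is exactly the Type I error problem the abstract reports for the generalized F-test.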

  20. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Qifeng, E-mail: liaoqf@shanghaitech.edu.cn [School of Information Science and Technology, ShanghaiTech University, Shanghai 200031 (China); Lin, Guang, E-mail: guanglin@purdue.edu [Department of Mathematics & School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907 (United States)

    2016-07-15

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space by decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.

  1. Kronecker Product Analytical Approach to ANOVA of Surface ...

    African Journals Online (AJOL)

    Kronecker Product Analytical Approach to ANOVA of Surface Roughness Optimization. ... Journal of the Nigerian Association of Mathematical Physics ... Using the new method, the combination of controllable variables that most optimized the surface finish of machined workpiece materials was determined with Kronecker ...

  2. A new research paradigm for bivariate allometry: combining ANOVA and non-linear regression.

    Science.gov (United States)

    Packard, Gary C

    2018-04-06

    A novel statistical routine is presented here for exploring and comparing patterns of allometric variation in two or more groups of subjects. The routine combines elements of the analysis of variance (ANOVA) with non-linear regression to achieve the equivalent of an analysis of covariance (ANCOVA) on curvilinear data. The starting point is a three-parameter power equation to which a categorical variable has been added to identify membership by each subject in a specific group or treatment. The protocol differs from earlier ones in that different assumptions can be made about the form for random error in the full statistical model (i.e. normal and homoscedastic, normal and heteroscedastic, lognormal and heteroscedastic). The general equation and several modifications thereof were used to study allometric variation in field metabolic rates of marsupial and placental mammals. The allometric equations for both marsupials and placentals have an explicit, non-zero intercept, but the allometric exponent is higher in the equation for placentals than in that for marsupials. The approach followed here is extraordinarily versatile, and it has wider application in allometry than standard ANCOVA performed on logarithmic transformations. © 2018. Published by The Company of Biologists Ltd.
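The core of the routine, a three-parameter power equation with a categorical term distinguishing groups, can be sketched with scipy's curve_fit under the simplest of the error assumptions listed (normal, homoscedastic). The data below are synthetic, and placing the group dummy only on the exponent is one illustrative choice; it mirrors the marsupial/placental comparison loosely, not the published fits.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

# Three-parameter power model with a group dummy on the exponent:
#   y = a + b * x^(c0 + c1*g),  g = 0 (group A) or 1 (group B)
def model(X, a, b, c0, c1):
    x, g = X
    return a + b * x**(c0 + c1 * g)

# Synthetic body-mass-style data for two groups (illustrative only)
x = np.tile(np.linspace(1.0, 50.0, 40), 2)
g = np.repeat([0.0, 1.0], 40)
y = model((x, g), 2.0, 1.5, 0.70, 0.08) + rng.normal(0.0, 0.5, x.size)

popt, pcov = curve_fit(model, (x, g), y, p0=[1.0, 1.0, 0.5, 0.0], maxfev=10000)
se = np.sqrt(np.diag(pcov))
print("a, b, c0, c1 =", np.round(popt, 3), " SE(c1) =", round(se[3], 3))
```

Testing whether c1 differs from zero plays the role of the ANCOVA-style group comparison, but on the curvilinear scale and with an explicit non-zero intercept, which log-transformation approaches cannot accommodate.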

  3. Modelling Foundations and Applications

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 8th European Conference on Modelling Foundations and Applications, held in Kgs. Lyngby, Denmark, in July 2012. The 20 revised full foundations track papers and 10 revised full applications track papers presented were carefully reviewed and selected; these papers, and the high quality of the results they present, demonstrate the maturity and vibrancy of the field.

  4. Extending the CLAST sequential rule to one-way ANOVA under group sampling.

    Science.gov (United States)

    Ximénez, Carmen; Revuelta, Javier

    2007-02-01

    Several studies have demonstrated that the fixed-sample stopping rule (FSR), in which the sample size is determined in advance, is less practical and efficient than are sequential-stopping rules. The composite limited adaptive sequential test (CLAST) is one such sequential-stopping rule. Previous research has shown that CLAST is more efficient in terms of sample size and power than are the FSR and other sequential rules and that it reflects more realistically the practice of experimental psychology researchers. The CLAST rule has been applied only to the t test of mean differences with two matched samples and to the chi-square independence test for twofold contingency tables. The present work extends previous research on the efficiency of CLAST to multiple group statistical tests. Simulation studies were conducted to test the efficiency of the CLAST rule for the one-way ANOVA for fixed effects models. The ANOVA general test and two linear contrasts of multiple comparisons among treatment means are considered. The article also introduces four rules for allocating N observations to J groups under the general null hypothesis and three allocation rules for the linear contrasts. Results show that the CLAST rule is generally more efficient than the FSR in terms of sample size and power for one-way ANOVA tests. However, the allocation rules vary in their optimality and have a differential impact on sample size and power. Thus, selecting an allocation rule depends on the cost of sampling and the intended precision.
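A composite-limit sequential one-way ANOVA in the spirit of CLAST can be sketched as below. The lower and upper p-value bounds, the equal-allocation rule (one observation per group per step), and all numeric settings are assumptions for illustration; the published rule uses calibrated bounds and also studies unequal allocation rules.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def sequential_anova(means, sigma, n0=5, step=1, n_max=50,
                     alpha_low=0.01, alpha_high=0.36):
    """Simplified CLAST-style sequential one-way ANOVA: stop early when the
    p-value crosses a lower (reject) or upper (accept) bound, else keep
    sampling up to n_max per group. Bounds here are illustrative, not the
    calibrated CLAST values."""
    groups = [list(m + sigma * rng.standard_normal(n0)) for m in means]
    while True:
        p = stats.f_oneway(*groups).pvalue
        n = len(groups[0])
        if p <= alpha_low:
            return "reject H0", n
        if p >= alpha_high or n >= n_max:
            return "accept H0" if p >= alpha_high else "inconclusive", n
        for g, m in zip(groups, means):   # equal allocation: `step` more per group
            g.extend(m + sigma * rng.standard_normal(step))

decision, n = sequential_anova([0.0, 0.0, 1.2], sigma=1.0)
print(decision, "with n =", n, "per group")
```

The efficiency gain the article reports comes from exactly this mechanism: with a real effect the lower bound tends to be crossed well before the fixed-sample n would be reached.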

  5. Fatigue of NiTi SMA-pulley system using Taguchi and ANOVA

    Science.gov (United States)

    Mohd Jani, Jaronie; Leary, Martin; Subic, Aleksandar

    2016-05-01

    Shape memory alloy (SMA) actuators can be integrated with a pulley system to provide mechanical advantage and to reduce packaging space; however, there appears to be no formal investigation of the effect of a pulley system on SMA structural or functional fatigue. In this work, cyclic testing was conducted on nickel-titanium (NiTi) SMA actuators on a pulley system and in a control experiment (without pulley). Both structural and functional fatigue were monitored until fracture, or until a maximum of 1E5 cycles was reached for each experimental condition. The Taguchi method and analysis of variance (ANOVA) were used to optimise the SMA-pulley system configurations. In general, one-way ANOVA at the 95% confidence level showed no significant difference between the structural or functional fatigue of SMA-pulley actuators and SMA actuators without a pulley. Within the sample of SMA-pulley actuators, activation duration had the greatest significance for both structural and functional fatigue, and the pulley configuration (angle of wrap and sheave diameter) had greater statistical significance than load magnitude for functional fatigue. This work identified that the structural and functional fatigue performance of SMA-pulley systems is optimised by maximising sheave diameter and using an intermediate wrap angle, with minimal load and activation duration. However, these parameters may not be compatible with commercial imperatives, so a test was also completed for a commercially optimal SMA-pulley configuration. This novel observation will be applicable to many areas of SMA-pulley application development.

  6. ANOVA parameters influence in LCF experimental data and simulation results

    Directory of Open Access Journals (Sweden)

    Vercelli A.

    2010-06-01

Full Text Available The virtual design of components undergoing thermomechanical fatigue (TMF) and plastic strains is usually run in several phases. The numerical finite element method provides a useful instrument which becomes increasingly effective as the geometrical and numerical modelling becomes more accurate. The constitutive model definition plays an important role in the effectiveness of the numerical simulation [1, 2], as shown, for example, in Figure 1, which illustrates how a good cyclic plasticity constitutive model can simulate a cyclic load experiment. Component life estimation is the subsequent phase, and it needs complex damage and life estimation models [3-5] which take into account the several parameters and phenomena contributing to damage and life duration. The calibration of these constitutive and damage models requires an accurate testing activity. The main topic of the present research is to investigate whether the parameters that prove influential in the experimental activity also influence the numerical simulations, thus defining the effectiveness of the models in capturing all the phenomena that actually influence the life of the component. To this end, a procedure to tune the parameters needed to estimate the life of mechanical components undergoing TMF and plastic strains is presented for a commercial steel. This procedure aims to be straightforward and to allow calibration of both the material constitutive model (for the numerical structural simulation) and the damage and life model (for life assessment). The procedure has been applied to specimens. The experimental activity comprised three sets of tests run at several temperatures: static tests, high cycle fatigue (HCF) tests and low cycle fatigue (LCF) tests. The numerical structural FEM simulations were run on a commercial nonlinear solver, ABAQUS® 6.8, and replicated the experimental tests. The stress, strain, thermal results from the thermo

  7. Constrained statistical inference: sample-size tables for ANOVA and regression.

    Science.gov (United States)

    Vanbrabant, Leonard; Van De Schoot, Rens; Rosseel, Yves

    2014-01-01

    Researchers in the social and behavioral sciences often have clear expectations about the order/direction of the parameters in their statistical model. For example, a researcher might expect that regression coefficient β1 is larger than β2 and β3. The corresponding hypothesis is H: β1 > {β2, β3} and this is known as an (order) constrained hypothesis. A major advantage of testing such a hypothesis is that power can be gained and inherently a smaller sample size is needed. This article discusses this gain in sample size reduction, when an increasing number of constraints is included into the hypothesis. The main goal is to present sample-size tables for constrained hypotheses. A sample-size table contains the necessary sample-size at a pre-specified power (say, 0.80) for an increasing number of constraints. To obtain sample-size tables, two Monte Carlo simulations were performed, one for ANOVA and one for multiple regression. Three results are salient. First, in an ANOVA the needed sample-size decreases with 30-50% when complete ordering of the parameters is taken into account. Second, small deviations from the imposed order have only a minor impact on the power. Third, at the maximum number of constraints, the linear regression results are comparable with the ANOVA results. However, in the case of fewer constraints, ordering the parameters (e.g., β1 > β2) results in a higher power than assigning a positive or a negative sign to the parameters (e.g., β1 > 0).
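The power gain from a directional constraint can be illustrated with a small Monte Carlo in the spirit of the simulations described above. The two-group setting, the effect size of 0.5, and the normal-approximation critical values are simplifying assumptions:

```python
import numpy as np

def mc_power(n, delta=0.5, one_sided=False, n_sim=4000, seed=1):
    # Monte Carlo power of a two-sample test of mu2 > mu1; n per group.
    # Normal approximations (1.645 / 1.96) stand in for t critical values.
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        a = rng.normal(0.0, 1.0, n)
        b = rng.normal(delta, 1.0, n)
        se = np.sqrt(a.var(ddof=1) / n + b.var(ddof=1) / n)
        t = (b.mean() - a.mean()) / se
        if one_sided:
            hits += t > 1.645      # alpha = .05 with the order constraint
        else:
            hits += abs(t) > 1.96  # alpha = .05, unconstrained two-sided
    return hits / n_sim

power_unconstrained = mc_power(50, one_sided=False)
power_constrained = mc_power(50, one_sided=True)
```

Because the constrained test attains higher power at the same n, the sample size needed to reach a target power such as 0.80 is correspondingly smaller, which is exactly the trade-off the sample-size tables quantify.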

  8. Prediction and Control of Cutting Tool Vibration in Cnc Lathe with Anova and Ann

    Directory of Open Access Journals (Sweden)

    S. S. Abuthakeer

    2011-06-01

Full Text Available Machining is a complex process in which many variables can degrade the desired results. Among them, cutting tool vibration is the most critical phenomenon, influencing the dimensional precision of the machined components, the functional behaviour of the machine tools and the life of the cutting tool. In a machining operation, cutting tool vibrations are mainly influenced by cutting parameters such as cutting speed, depth of cut and tool feed rate. In this work, cutting tool vibration is controlled using a damping pad made of neoprene. Experiments were conducted on a CNC lathe with the tool holder supported with and without the damping pad. The cutting tool vibration signals were collected through a data acquisition system supported by LabVIEW software. To increase the robustness and reliability of the experiments, a full factorial experimental design was used. The experimental data were tested with analysis of variance (ANOVA) to understand the influence of the cutting parameters, and empirical models were developed from the ANOVA results. Experimental studies and data analysis were performed to validate the proposed damping system. A multilayer perceptron neural network model was constructed with a feed-forward back-propagation algorithm using the acquired data. On completion of the experimental tests, the artificial neural network (ANN) was used to validate the results obtained and to predict the behaviour of the system under any cutting condition within the operating range. The on-site tests show that the proposed system reduces cutting tool vibration to a great extent.

  9. Statistically significant contrasts between EMG waveforms revealed using wavelet-based functional ANOVA

    Science.gov (United States)

    McKay, J. Lucas; Welch, Torrence D. J.; Vidakovic, Brani

    2013-01-01

We developed wavelet-based functional ANOVA (wfANOVA) as a novel approach for comparing neurophysiological signals that are functions of time. Temporal resolution is often sacrificed by analyzing such data in large time bins, increasing statistical power by reducing the number of comparisons. We performed ANOVA in the wavelet domain because differences between curves tend to be represented by a few temporally localized wavelets, which we transformed back to the time domain for visualization. We compared wfANOVA and ANOVA performed in the time domain (tANOVA) on both experimental electromyographic (EMG) signals from responses to perturbation during standing balance across changes in peak perturbation acceleration (3 levels) and velocity (4 levels) and on simulated data with known contrasts. In experimental EMG data, wfANOVA revealed the continuous shape and magnitude of significant differences over time without a priori selection of time bins. However, tANOVA revealed only the largest differences at discontinuous time points, resulting in features with later onsets and shorter durations than those identified using wfANOVA (P < 0.02). Furthermore, wfANOVA required significantly fewer (∼¼×; P < 0.015) significant F tests than tANOVA, resulting in post hoc tests with increased power. In simulated EMG data, wfANOVA identified known contrast curves with a high level of precision (r2 = 0.94 ± 0.08) and performed better than tANOVA across noise levels (P < 0.01). Therefore, wfANOVA may be useful for revealing differences in the shape and magnitude of neurophysiological signals (e.g., EMG, firing rates) across multiple conditions with both high temporal resolution and high statistical power. PMID:23100136
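A minimal numpy-only sketch of the wfANOVA idea follows, using a Haar transform and a per-coefficient two-sample t statistic in place of the full F-test machinery. The signal shapes, noise levels, and the threshold of 4 are illustrative assumptions, not the paper's procedure:

```python
import numpy as np

def haar(x):
    # Orthonormal full-depth Haar DWT; len(x) must be a power of 2.
    x = x.astype(float).copy()
    out = np.empty_like(x)
    n = len(x)
    while n > 1:
        half = n // 2
        out[half:n] = (x[0:n:2] - x[1:n:2]) / np.sqrt(2)  # details
        x[:half] = (x[0:n:2] + x[1:n:2]) / np.sqrt(2)     # approximations
        n = half
    out[0] = x[0]
    return out

def ihaar(c):
    # Inverse of haar(): rebuild level by level from coarse to fine.
    c = c.astype(float).copy()
    n = 1
    while n < len(c):
        avg, dif = c[:n].copy(), c[n:2 * n].copy()
        c[0:2 * n:2] = (avg + dif) / np.sqrt(2)
        c[1:2 * n:2] = (avg - dif) / np.sqrt(2)
        n *= 2
    return c

rng = np.random.default_rng(2)
T, n_per = 64, 20
t_axis = np.arange(T)
bump = np.where((t_axis >= 16) & (t_axis < 24), 1.0, 0.0)  # localized burst
A = rng.normal(0.0, 0.1, (n_per, T))             # condition 1: noise only
B = bump + rng.normal(0.0, 0.1, (n_per, T))      # condition 2: noise + burst
CA = np.array([haar(s) for s in A])
CB = np.array([haar(s) for s in B])
diff = CB.mean(axis=0) - CA.mean(axis=0)
se = np.sqrt(CA.var(axis=0, ddof=1) / n_per + CB.var(axis=0, ddof=1) / n_per)
tstat = diff / se
keep = np.abs(tstat) > 4.0        # crude per-coefficient threshold
contrast = ihaar(np.where(keep, diff, 0.0))  # significant contrast, time domain
```

The localized burst loads on only a handful of wavelet coefficients, so very few of the 64 coefficient-wise tests fire, yet the reconstructed contrast recovers the full shape of the difference; this is the statistical-power argument made in the abstract.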

  10. Mathematical modeling with multidisciplinary applications

    CERN Document Server

    Yang, Xin-She

    2013-01-01

    Features mathematical modeling techniques and real-world processes with applications in diverse fields Mathematical Modeling with Multidisciplinary Applications details the interdisciplinary nature of mathematical modeling and numerical algorithms. The book combines a variety of applications from diverse fields to illustrate how the methods can be used to model physical processes, design new products, find solutions to challenging problems, and increase competitiveness in international markets. Written by leading scholars and international experts in the field, the

  11. Finite mathematics models and applications

    CERN Document Server

    Morris, Carla C

    2015-01-01

    Features step-by-step examples based on actual data and connects fundamental mathematical modeling skills and decision making concepts to everyday applicability Featuring key linear programming, matrix, and probability concepts, Finite Mathematics: Models and Applications emphasizes cross-disciplinary applications that relate mathematics to everyday life. The book provides a unique combination of practical mathematical applications to illustrate the wide use of mathematics in fields ranging from business, economics, finance, management, operations research, and the life and social sciences.

  12. Multiple Regression as a Flexible Alternative to ANOVA in L2 Research

    Science.gov (United States)

    Plonsky, Luke; Oswald, Frederick L.

    2017-01-01

    Second language (L2) research relies heavily and increasingly on ANOVA (analysis of variance)-based results as a means to advance theory and practice. This fact alone should merit some reflection on the utility and value of ANOVA. It is possible that we could use this procedure more appropriately and, as argued here, other analyses such as…

  13. Multilevel models applications using SAS

    CERN Document Server

    Wang, Jichuan; Fisher, James F

    2011-01-01

    This book covers a broad range of topics about multilevel modeling. The goal is to help readers to understand the basic concepts, theoretical frameworks, and application methods of multilevel modeling. It is at a level also accessible to non-mathematicians, focusing on the methods and applications of various multilevel models and using the widely used statistical software SAS®. Examples are drawn from analysis of real-world research data.

  14. Atmospheric Models for Engineering Applications

    Science.gov (United States)

    Johnson, Dale L.; Roberts, Barry C.; Vaughan, William W.; Justus, C. G.

    2002-01-01

This paper will review the historical development of reference and standard atmosphere models and their applications. The evolution of the U.S. Standard Atmosphere will be addressed, along with the Range Reference Atmospheres and, in particular, the NASA Global Reference Atmospheric Model (GRAM). The extensive scope and content of the GRAM will be addressed since it represents the most extensive and complete 'reference' atmosphere model in use today. It originated in engineering applications, and that remains its principal use today.

  15. MARKETING MODELS APPLICATION EXPERIENCE

    Directory of Open Access Journals (Sweden)

    A. Yu. Rymanov

    2011-01-01

Full Text Available Marketing models are used for the assessment of such marketing elements as sales volume, market share, market attractiveness, advertising costs, product promotion and selling, profit and profitability. A classification of buying-decision models is presented. SWOT- and GAP-based models are best for sales assessments. Lately, there is a tendency to move from assessment on the basis of financial indices to assessment on the basis of non-financial ones. From the marketing viewpoint, the most important are models of long-term company activities and consumer attraction, as well as operative models of market attractiveness.

  16. Models for Dynamic Applications

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Morales Rodriguez, Ricardo; Heitzig, Martina

    2011-01-01

    This chapter covers aspects of the dynamic modelling and simulation of several complex operations that include a controlled blending tank, a direct methanol fuel cell that incorporates a multiscale model, a fluidised bed reactor, a standard chemical reactor and finally a polymerisation reactor...

  17. Engine Modelling for Control Applications

    DEFF Research Database (Denmark)

    Hendricks, Elbert

    1997-01-01

    In earlier work published by the author and co-authors, a dynamic engine model called a Mean Value Engine Model (MVEM) was developed. This model is physically based and is intended mainly for control applications. In its newer form, it is easy to fit to many different engines and requires little ...

  18. A Robust Design Applicability Model

    DEFF Research Database (Denmark)

Ebro, Martin; Krogstie, Lars; Howard, Thomas J.

    2015-01-01

This paper introduces a model for assessing the applicability of Robust Design (RD) in a project or organisation. The intention of the Robust Design Applicability Model (RDAM) is to provide support for decisions by engineering management considering the relevant level of RD activities… to be applicable in organisations assigning a high importance to one or more factors that are known to be impacted by RD, while also experiencing a high level of occurrence of this factor. The RDAM supplements existing maturity models and metrics to provide a comprehensive set of data to support management… The applicability assessment is based on two considerations: 1) Whether there is a correlation between the factors that are important to the project or organisation and the factors that impact from the use of RD and 2) What is the occurrence level of the given factor in the organisation. The RDAM defines RD…

  19. Human mobility: Models and applications

    Science.gov (United States)

    Barbosa, Hugo; Barthelemy, Marc; Ghoshal, Gourab; James, Charlotte R.; Lenormand, Maxime; Louail, Thomas; Menezes, Ronaldo; Ramasco, José J.; Simini, Filippo; Tomasini, Marcello

    2018-03-01

    Recent years have witnessed an explosion of extensive geolocated datasets related to human movement, enabling scientists to quantitatively study individual and collective mobility patterns, and to generate models that can capture and reproduce the spatiotemporal structures and regularities in human trajectories. The study of human mobility is especially important for applications such as estimating migratory flows, traffic forecasting, urban planning, and epidemic modeling. In this survey, we review the approaches developed to reproduce various mobility patterns, with the main focus on recent developments. This review can be used both as an introduction to the fundamental modeling principles of human mobility, and as a collection of technical methods applicable to specific mobility-related problems. The review organizes the subject by differentiating between individual and population mobility and also between short-range and long-range mobility. Throughout the text the description of the theory is intertwined with real-world applications.

  20. Optimization of Parameters for Manufacture Nanopowder Bioceramics at Machine Pulverisette 6 by Taguchi and ANOVA Method

    Science.gov (United States)

Van Hoten, Hendri; Gunawarman; Mulyadi, Ismet Hari; Kurniawan Mainil, Afdhal; Putra, Bismantoloa

    2018-02-01

This research concerns the manufacture of nanopowder bioceramics from local materials by ball milling for biomedical applications. Source materials for the manufacture of medicines are plants, animal tissues, microbial structures and engineered biomaterials, and the raw material takes powder form before mixing. In the case of medicines, the research goal is to find sources of biomedical materials that, as nanoscale powders, can be used as raw material for medicine. One such bioceramic material is chicken eggshell. This research develops methods for manufacturing nanopowder material from chicken eggshells by ball milling, using the Taguchi method and ANOVA. Eggshells were milled at rates of 150, 200 and 250 rpm, for 1, 2 and 3 hours, with grinding-ball to eggshell-powder weight ratios (BPR) of 1:6, 1:8 and 1:10. Before milling, the eggshells were crushed and calcined at a temperature of 900°C. After milling, the fine eggshell powder was characterized using SEM to determine its particle size. The result of this research is the optimum parameter set from the Taguchi design analysis: a milling rate of 250 rpm, a milling time of 3 hours and a BPR of 1:6, giving an average eggshell powder size of 1.305 μm. Milling speed, milling time and ball-to-powder weight ratio contribute 60.82%, 30.76% and 6.64% respectively, with an error term of 1.78%.
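The percent contributions reported above come from partitioning sums of squares over an orthogonal array. A sketch with a standard L9(3^3) array follows; the response values are hypothetical particle sizes made up for illustration, not the paper's measurements:

```python
import numpy as np

# Standard L9(3^3) orthogonal array: rows = runs, entries = factor levels 0..2
L9 = np.array([[0, 0, 0], [0, 1, 1], [0, 2, 2],
               [1, 0, 1], [1, 1, 2], [1, 2, 0],
               [2, 0, 2], [2, 1, 0], [2, 2, 1]])
# Hypothetical mean particle sizes (um) for the nine runs -- illustrative only
y = np.array([2.9, 2.4, 2.0, 2.5, 2.1, 2.6, 1.9, 2.3, 1.6])

grand = y.mean()
ss_total = ((y - grand) ** 2).sum()
contrib = {}
for j, name in enumerate(["speed", "time", "BPR"]):
    # Each level appears 3 times, so the factor SS sums 3*(level mean - grand)^2
    contrib[name] = sum(3 * (y[L9[:, j] == lev].mean() - grand) ** 2
                        for lev in range(3))
ss_error = ss_total - sum(contrib.values())
pct = {k: 100 * v / ss_total for k, v in contrib.items()}  # percent contribution
```

Because the array is orthogonal, the factor sums of squares and the error term add up exactly to the total, which is what makes the percent-contribution figures (60.82%, 30.76%, 6.64%, error 1.78% in the study) well defined.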

  1. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

    Survival analysis concerns sequential occurrences of events governed by probabilistic laws.  Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading onto some of the most advanced topics in survival analysis.Assumes only a minimal knowledge of SAS whilst enablin

  2. Behavior Modeling -- Foundations and Applications

    DEFF Research Database (Denmark)

    This book constitutes revised selected papers from the six International Workshops on Behavior Modelling - Foundations and Applications, BM-FA, which took place annually between 2009 and 2014. The 9 papers presented in this volume were carefully reviewed and selected from a total of 58 papers...

  3. A Priori Versus Post-Hoc: Comparing Statistical Power among ANOVA, Block Designs, and ANCOVA.

    Science.gov (United States)

    Wu, Yi-Cheng; McLean, James E.

    By employing a concomitant variable, block designs and analysis of covariance (ANCOVA) can be used to improve the power of traditional analysis of variance (ANOVA) by reducing error. If subjects are randomly assigned to treatments without considering the concomitant variable, an experiment uses a post-hoc approach. Otherwise, an a priori approach…

  4. Cautionary Note on Reporting Eta-Squared Values from Multifactor ANOVA Designs

    Science.gov (United States)

    Pierce, Charles A.; Block, Richard A.; Aguinis, Herman

    2004-01-01

    The authors provide a cautionary note on reporting accurate eta-squared values from multifactor analysis of variance (ANOVA) designs. They reinforce the distinction between classical and partial eta-squared as measures of strength of association. They provide examples from articles published in premier psychology journals in which the authors…
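The distinction the authors draw can be made concrete for a balanced two-way design. Classical eta-squared divides each effect's sum of squares by the total, while partial eta-squared divides by the effect plus error only, so partial values are never smaller and can sum past what classical values allow. The data below are simulated purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
a_levels, b_levels, n = 2, 3, 10
# Cell data with main effects of both factors (illustrative only)
y = np.array([[rng.normal(ai * 1.0 + bj * 0.5, 1.0, n)
               for bj in range(b_levels)] for ai in range(a_levels)])

grand = y.mean()
ss_total = ((y - grand) ** 2).sum()
mean_a = y.mean(axis=(1, 2))
mean_b = y.mean(axis=(0, 2))
cell = y.mean(axis=2)
ss_a = b_levels * n * ((mean_a - grand) ** 2).sum()
ss_b = a_levels * n * ((mean_b - grand) ** 2).sum()
ss_ab = n * ((cell - mean_a[:, None] - mean_b[None, :] + grand) ** 2).sum()
ss_err = ((y - cell[:, :, None]) ** 2).sum()

effects = {"A": ss_a, "B": ss_b, "AxB": ss_ab}
eta2 = {k: v / ss_total for k, v in effects.items()}            # classical
partial = {k: v / (v + ss_err) for k, v in effects.items()}     # partial
```

Reporting a partial value while calling it (classical) eta-squared inflates the apparent strength of association, which is precisely the error the cautionary note targets.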

  5. Combining ANOVA-PCA with POCHEMON to analyse micro-organism development in a polymicrobial environment.

    Science.gov (United States)

    Geurts, Brigitte P; Neerincx, Anne H; Bertrand, Samuel; Leemans, Manja A A P; Postma, Geert J; Wolfender, Jean-Luc; Cristescu, Simona M; Buydens, Lutgarde M C; Jansen, Jeroen J

    2017-04-22

Revealing the biochemistry associated with micro-organismal interspecies interactions is highly relevant for many purposes. Each pathogen has a characteristic metabolic fingerprint that allows identification based on its unique multivariate biochemistry. When pathogen species come into mutual contact, their co-culture will display a chemistry that may be attributed both to mixing of the characteristic chemistries of the mono-cultures and to competition between the pathogens. Therefore, investigating pathogen development in a polymicrobial environment requires dedicated chemometric methods to untangle and focus upon these sources of variation. The multivariate data analysis method Projected Orthogonalised Chemical Encounter Monitoring (POCHEMON) is dedicated to highlighting metabolites characteristic of the interaction of two micro-organisms in co-culture. However, this approach is currently limited to a single time point, while the development of polymicrobial interactions may be highly dynamic. A well-known multivariate implementation of analysis of variance (ANOVA) combines it with principal component analysis (ANOVA-PCA). This allows the overall dynamics to be separated from the pathogen-specific chemistry so that the contributions of both aspects can be analysed separately. For this reason, we propose to integrate ANOVA-PCA with the POCHEMON approach to disentangle the pathogen dynamics and the specific biochemistry in interspecies interactions. Two complementary case studies show great potential for both liquid and gas chromatography-mass spectrometry to reveal novel information on chemistry specific to interspecies interaction during pathogen development. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
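The decomposition described above can be sketched in a few lines: split the centred data matrix into per-factor effect matrices plus a residual, then run PCA on each effect-plus-residual matrix. The function names and the toy two-factor data are illustrative assumptions; the published POCHEMON/ANOVA-PCA pipelines involve more steps:

```python
import numpy as np

def effect_matrix(Xc, labels):
    # Replace each row by its group-mean profile for one factor.
    E = np.zeros_like(Xc)
    for g in np.unique(labels):
        idx = labels == g
        E[idx] = Xc[idx].mean(axis=0)
    return E

def anova_pca(X, labels_a, labels_b):
    # ANOVA-PCA sketch for two crossed factors (balanced design assumed):
    # centre, peel off each factor's effect matrix, then PCA (via SVD) on
    # each effect matrix with the residual added back.
    Xc = X - X.mean(axis=0)
    Ea = effect_matrix(Xc, labels_a)
    Eb = effect_matrix(Xc - Ea, labels_b)
    R = Xc - Ea - Eb
    scores = {}
    for name, E in (("A", Ea), ("B", Eb)):
        M = E + R
        _, _, Vt = np.linalg.svd(M, full_matrices=False)
        scores[name] = M @ Vt[0]          # sample scores on the first PC
    return scores

# Toy metabolomics-like data: factor A = species, factor B = time point
rng = np.random.default_rng(4)
labels_a = np.repeat([0, 1], 6)
labels_b = np.tile(np.repeat([0, 1], 3), 2)
X = rng.normal(0.0, 0.1, (12, 5))
X[:, 0] += 2.0 * labels_a        # species affects variable 0
X[:, 1] += 1.0 * labels_b        # time affects variable 1
scores = anova_pca(X, labels_a, labels_b)
sep_a = abs(scores["A"][labels_a == 1].mean() - scores["A"][labels_a == 0].mean())
sep_b = abs(scores["B"][labels_b == 1].mean() - scores["B"][labels_b == 0].mean())
```

Each factor's scores separate only along that factor's chemistry, which is the property that lets the dynamics be analysed apart from the pathogen-specific signal.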

  6. Association models for petroleum applications

    DEFF Research Database (Denmark)

    Kontogeorgis, Georgios

    2013-01-01

Thermodynamics plays an important role in many applications in the petroleum industry, both upstream and downstream, ranging from flow assurance and (enhanced) oil recovery to control of chemicals to meet production and environmental regulations. There are many different applications in the oil & gas… industry, thus thermodynamic data (phase behaviour, densities, speed of sound, etc.) are needed to study a very diverse range of compounds in addition to the petroleum ones (CO2, H2S, water, alcohols, glycols, mercaptans, mercury, asphaltenes, waxes, polymers, electrolytes, biofuels, etc.) within a very… extensive range of conditions, up to very high pressures. In fact, the petroleum industry was one of the first industrial sectors to use thermodynamic models extensively, and it even contributed to the development of several of the most popular and still widely used approaches. While traditional…

  7. Integration of design applications with building models

    DEFF Research Database (Denmark)

    Eastman, C. M.; Jeng, T. S.; Chowdbury, R.

    1997-01-01

This paper reviews various issues in the integration of applications with a building model... (Truncated.)

  8. Melvin Defleur's Information Communication Model: Its Application ...

    African Journals Online (AJOL)

    The paper discusses Melvin Defleur's information communication model and its application to archives administration. It provides relevant examples in which archives administration functions involve the communication process. Specific model elements and their application in archives administration are highlighted.

  9. Investigation of Significant Process Parameter in Manganese Phosphating of Piston Pin Material by Using ANOVA

    OpenAIRE

    Hemant V Chavan; Milind s Yadav

    2015-01-01

The aim of this study is to determine the most significant parameters, such as phosphating bath temperature, phosphating time and accelerator level, for the fatigue life of a piston pin material (40NiCr4Mo3) by analysis of variance (ANOVA). The three selected input parameters were studied at three different levels by conducting nine experiments based on the L9 orthogonal array of Taguchi's methodology. Phosphating bath temperature has a significant effect on fatigue strength, followed by phosph...

  10. Analysis of Main Economic Factors Influence on Romanian Tourists Number Accommodated in Romania, using Anova Method

    Directory of Open Access Journals (Sweden)

    Emilia Gabroveanu

    2009-05-01

Full Text Available The increasingly important role that tourism plays in economic and social development is now indisputable. It is reflected particularly in tourism flows, which, by generating revenue, contribute to GDP. The size of tourism flows can be expressed through the following indicators: the number of tourists, the number of overnight stays, the average length of holiday, the density of tourist movements, the relative preference for travel, the revenue and the average number of tourists. The aim of this paper is to identify the main influence of economic factors - total household income and consumer price indices, recorded in Romania in the period 2001-2007 - on the evolution of the number of Romanian tourists accommodated in Romania, using the ANOVA dispersion analysis method.

  11. ANOVA IN MARKETING RESEARCH OF CONSUMER BEHAVIOR OF DIFFERENT CATEGORIES IN GEORGIAN MARKET

    Directory of Open Access Journals (Sweden)

    NUGZAR TODUA

    2015-03-01

Full Text Available Consumer behavior research was conducted on bank services and non-alcoholic soft drinks. Based on four currencies and ten services, analyses were made of bank clients' distribution by bank services and currencies, percentage distribution by bank services, and percentage distribution of bank services by currencies. Similar results were obtained for ten soft drinks with five characteristics: consumer quantities split by types of soft drinks and attributes; attribute percentages split by types of soft drinks; and types of soft drinks split by attributes. Using ANOVA on the marketing research outcomes, it is concluded that for bank clients the unknown population mean scores of total quantities do not differ from each other, whereas in the soft drinks case the unknown population mean scores of consumers' total quantities vary by characteristic.

  12. Estimating shipper/receiver measurement error variances by use of ANOVA

    International Nuclear Information System (INIS)

    Lanning, B.M.

    1993-01-01

    Every measurement made on nuclear material items is subject to measurement errors which are inherent variations in the measurement process that cause the measured value to differ from the true value. In practice, it is important to know the variance (or standard deviation) in these measurement errors, because this indicates the precision in reported results. If a nuclear material facility is generating paired data (e.g., shipper/receiver) where party 1 and party 2 each make independent measurements on the same items, the measurement error variance associated with both parties can be extracted. This paper presents a straightforward method for the use of standard statistical computer packages, with analysis of variance (ANOVA), to obtain valid estimates of measurement variances. Also, with the help of the P-value, significant biases between the two parties can be directly detected without reference to an F-table
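A closely related moment-based alternative to the ANOVA route described here is the Grubbs-type estimator for paired data: both parties measure the same items, so the shipper/receiver covariance estimates the item-to-item variance, and subtracting it from each party's total variance leaves that party's measurement-error variance. The data below are simulated with known error variances purely for illustration:

```python
import numpy as np

def grubbs_error_variances(x, y):
    # Grubbs-type estimates from paired shipper (x) / receiver (y) data.
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]   # estimates the item-to-item variance
    return sxx - sxy, syy - sxy        # error variances of party 1, party 2

rng = np.random.default_rng(5)
items = rng.normal(100.0, 1.0, 500)        # true item values
x = items + rng.normal(0.0, 0.3, 500)      # shipper error, sd 0.3 (var 0.09)
y = items + rng.normal(0.0, 0.5, 500)      # receiver error, sd 0.5 (var 0.25)
v_shipper, v_receiver = grubbs_error_variances(x, y)

# A relative bias between the parties shows up as a nonzero mean difference
d = x - y
t_bias = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
```

With no systematic bias simulated, the t statistic on the paired differences stays small; a significant value would flag a shipper/receiver bias directly, as the abstract notes for the P-value route.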

  13. Chemistry Teachers' Knowledge and Application of Models

    Science.gov (United States)

    Wang, Zuhao; Chi, Shaohui; Hu, Kaiyan; Chen, Wenting

    2014-01-01

Teachers' knowledge and application of models play an important role in students' development of modeling ability and scientific literacy. In this study, we investigated Chinese chemistry teachers' knowledge and application of models. Data were collected through a test questionnaire and analyzed quantitatively and qualitatively. The result indicated…

  14. Association models for petroleum applications

    DEFF Research Database (Denmark)

    Kontogeorgis, Georgios

    2013-01-01

    Thermodynamics plays an important role in many applications in the petroleum industry, both upstream and downstream, ranging from flow assurance, (enhanced) oil recovery and control of chemicals to meet production and environmental regulations. There are many different applications in the oil & gas...

  15. A Classification of PLC Models and Applications

    NARCIS (Netherlands)

    Mader, Angelika H.; Boel, R.; Stremersch, G.

In recent years there has been increasing interest in analysing PLC applications with formal methods. The first step to this end is to obtain formal models of PLC applications. Meanwhile, various models for PLCs have already been introduced in the literature. In our paper we discuss several

  16. Association models for petroleum applications

    OpenAIRE

    Kontogeorgis, Georgios

    2013-01-01

Thermodynamics plays an important role in many applications in the petroleum industry, both upstream and downstream, ranging from flow assurance, (enhanced) oil recovery and control of chemicals to meet production and environmental regulations. There are many different applications in the oil & gas industry, thus thermodynamic data (phase behavior, densities, speed of sound, etc) are needed to study a very diverse range of compounds in addition to the petroleum ones (CO2, H2S, water, alcohol...

  17. Application of regression model on stream water quality parameters

    International Nuclear Information System (INIS)

    Suleman, M.; Maqbool, F.; Malik, A.H.; Bhatti, Z.A.

    2012-01-01

Statistical analysis was conducted to evaluate the effect of solid waste leachate from the open solid waste dumping site of Salhad on stream water quality. Five sites were selected along the stream: two upstream of the point where leachate mixes with the surface water, one of the leachate itself, and two affected by leachate. Samples were analyzed for pH, water temperature, electrical conductivity (EC), total dissolved solids (TDS), biological oxygen demand (BOD), chemical oxygen demand (COD), dissolved oxygen (DO) and total bacterial load (TBL). In this study the correlation coefficient r between pairs of water quality parameters at the various sites was calculated using the Pearson model, and the average of each correlation between two parameters was also calculated. This shows that TDS-EC and pH-BOD have significantly positive r values, while temperature-TDS, temperature-EC, DO-TBL and DO-COD have negative r values. Single-factor ANOVA at the 5% level of significance showed that EC, TDS, TBL and COD differ significantly among the sites. By the application of these two statistical approaches, TDS and EC show a strongly positive correlation because the ions from the dissolved solids in water influence the ability of that water to conduct an electrical current. These two parameters vary significantly among the five sites, which is further confirmed by linear regression. (author)
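The two approaches named above (Pearson correlation between parameter pairs and single-factor ANOVA across sites) can be sketched with hypothetical readings; the EC and TDS values below are made up for illustration, not the study's data:

```python
import numpy as np

# Hypothetical readings at five sites (n = 3 each): EC in uS/cm, TDS in mg/L
ec = np.array([250., 260., 255., 300., 310., 305.,
               700., 720., 710., 650., 640., 660.,
               500., 510., 505.])
tds = 0.64 * ec + np.array([3., -2., 1., -4., 2., 0.,
                            5., -3., 2., 1., -1., 3.,
                            -2., 4., 0.])
site = np.repeat(np.arange(5), 3)

r = np.corrcoef(ec, tds)[0, 1]            # Pearson correlation EC vs TDS

# Single-factor ANOVA: does TDS differ among the five sites?
groups = [tds[site == s] for s in range(5)]
grand = tds.mean()
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
F = (ss_between / 4) / (ss_within / 10)   # df_between = 4, df_within = 10
```

TDS tracks EC almost perfectly (dissolved ions carry the current), so r is near 1, and the large between-site spread relative to within-site noise yields an F far above the 5% critical value, mirroring the study's conclusions.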

  18. Structural equation modeling methods and applications

    CERN Document Server

    Wang, Jichuan

    2012-01-01

    A reference guide for applications of SEM using Mplus Structural Equation Modeling: Applications Using Mplus is intended as both a teaching resource and a reference guide. Written in non-mathematical terms, this book focuses on the conceptual and practical aspects of Structural Equation Modeling (SEM). Basic concepts and examples of various SEM models are demonstrated along with recently developed advanced methods, such as mixture modeling and model-based power analysis and sample size estimate for SEM. The statistical modeling program, Mplus, is also featured and provides researchers with a

  19. Applications and extensions of degradation modeling

    International Nuclear Information System (INIS)

    Hsu, F.; Subudhi, M.; Samanta, P.K.; Vesely, W.E.

    1991-01-01

Component degradation modeling, being developed to understand the aging process, can have many applications with potential advantages. Previous work focused on the basic concepts and mathematical development of a simple degradation model. Using this simple model, the times of degradation and failure occurrences were analyzed for standby components to detect indications of aging and to infer the effectiveness of maintenance in preventing age-related degradations from turning into failures. Degradation modeling approaches can have broader applications in aging studies, and in this paper we discuss some of the extensions and applications of degradation modeling. The applications and extensions presented in this paper cover two aspects: (1) application to a continuously operating component, and (2) extension of the approach to analyze the degradation-failure rate relationship. The application of the modeling approach to a continuously operating component (namely, air compressors) shows the usefulness of this approach in studying aging effects and the role of maintenance in this type of component. In this case, aging effects in air compressors are demonstrated by the increase in both the degradation and failure rates, and the faster increase in the failure rate compared to the degradation rate shows the ineffectiveness of the existing maintenance practices. The degradation-failure rate relationship was analyzed using data from residual heat removal system pumps. A simple linear model with a time lag between these two parameters was studied. The application in this case showed a time lag of 2 years for degradations to affect failure occurrences. 2 refs
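The lagged linear degradation-failure model can be sketched by fitting each candidate lag and keeping the one with the smallest residual sum of squares. The yearly rates below are simulated with a known 2-year lag; the padding rule for the first years is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(6)
n_years = 12
deg = 0.5 + 0.1 * np.arange(n_years) + rng.normal(0.0, 0.05, n_years)
true_lag = 2
fail = np.empty(n_years)
fail[true_lag:] = (0.2 + 0.8 * deg[:n_years - true_lag]
                   + rng.normal(0.0, 0.005, n_years - true_lag))
fail[:true_lag] = fail[true_lag]   # pad early years lacking a lagged predictor

def sse_for_lag(lag):
    # Fit failure rate on the degradation rate shifted back by `lag` years.
    x = deg[:n_years - lag] if lag else deg
    y = fail[lag:]
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(((y - X @ beta) ** 2).sum())

best_lag = min(range(5), key=sse_for_lag)  # scan lags 0..4 years
```

The correctly lagged regression leaves only the failure-rate noise in the residuals, while misaligned lags pick up the degradation-rate noise as well, so the fit itself recovers the 2-year delay analogous to the one reported for the residual heat removal pumps.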

  2. Investigation of flood pattern using ANOVA statistic and remote sensing in Malaysia

    Science.gov (United States)

    Ya'acob, Norsuzila; Syazwani Ismail, Nor; Mustafa, Norfazira; Laily Yusof, Azita

    2014-06-01

    A flood is an overflow or inundation from a river or other body of water that causes or threatens damage. Malaysia has no formal categorization of floods, but they are often broadly classed as monsoonal, flash or tidal floods. This project focuses on floods caused by the monsoon. Over the last few years a number of extreme floods have occurred, bringing great economic impact, with extreme weather patterns the main contributing factor. In 2010, several districts in the state of Kedah and neighbouring states were hit by floods caused by extreme weather. During this event the rainfall volume was not uniform across regions, and flooding occurred when the amount of water increased rapidly and began to overflow. This motivated the present study, in which data from August to October 2010 were analysed. The investigation sought correlation patterns among parameters related to the flood. ANOVA statistics were used to quantify the contribution of each parameter, regression and correlation analysis measured the strength of the relationships among the flood-related parameters, and a remote sensing image was used to validate the accuracy of the calculations. According to the results, the prediction is successful: the correlation coefficient for the flood event is 0.912, confirmed by a TerraSAR-X image of 4 November 2010. The rate of change in the weather pattern drives the impact of the flood.
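    The workflow in this record (one-way ANOVA across regions, then regression/correlation between parameters) can be sketched in a few lines with scipy. The rainfall and river-level numbers below are synthetic stand-ins, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical monthly rainfall readings (mm) at three gauging regions;
# the real study used August-October 2010 records for Kedah, Malaysia.
region_a = rng.normal(220, 30, 12)
region_b = rng.normal(250, 30, 12)
region_c = rng.normal(310, 30, 12)

# One-way ANOVA: do mean rainfall volumes differ across regions?
f_stat, p_value = stats.f_oneway(region_a, region_b, region_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Correlation between rainfall and a flood-related parameter (river
# level), mirroring the paper's regression step; values are synthetic.
rainfall = np.concatenate([region_a, region_b, region_c])
river_level = 0.01 * rainfall + rng.normal(0, 0.2, rainfall.size)
r, p_r = stats.pearsonr(rainfall, river_level)
print(f"r = {r:.3f}")
```

A strong coefficient (the paper reports 0.912) would then be cross-checked against the remote sensing imagery.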

  4. Photocell modelling for thermophotovoltaic applications

    Energy Technology Data Exchange (ETDEWEB)

    Mayor, J.-C.; Durisch, W.; Grob, B.; Panitz, J.-C. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

    The goal of the modelling described here is to extrapolate the performance characteristics of solar photocells to TPV working conditions. The model accounts for the higher radiation flux and for the higher temperatures reached in TPV converters. (author) 4 figs., 1 tab., 2 refs.

  5. Markov chains models, algorithms and applications

    CERN Document Server

    Ching, Wai-Ki; Ng, Michael K; Siu, Tak-Kuen

    2013-01-01

    This new edition of Markov Chains: Models, Algorithms and Applications has been completely reformatted as a text, complete with end-of-chapter exercises, a new focus on management science, new applications of the models, and new examples with applications in financial risk management and modeling of financial data. This book consists of eight chapters. Chapter 1 gives a brief introduction to the classical theory on both discrete and continuous time Markov chains. The relationship between Markov chains of finite states and matrix theory will also be highlighted. Some classical iterative methods
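    As a toy illustration of the Chapter 1 material, the stationary distribution of a finite discrete-time chain can be found by power iteration. The 3-state transition matrix below is invented for the example:

```python
import numpy as np

# Transition matrix of a hypothetical 3-state chain (rows sum to 1).
P = np.array([[0.9,  0.075, 0.025],
              [0.15, 0.8,   0.05],
              [0.25, 0.25,  0.5]])

# Power iteration: repeatedly apply P until pi satisfies pi @ P == pi.
pi = np.full(3, 1 / 3)
for _ in range(200):
    pi = pi @ P
pi /= pi.sum()
print(pi)  # stationary distribution (here [0.625, 0.3125, 0.0625])

# Invariance check.
assert np.allclose(pi @ P, pi)
```

For larger chains one would instead solve the eigenproblem or the linear system pi (I - P) = 0 with the normalization constraint.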

  6. Dynamic programming models and applications

    CERN Document Server

    Denardo, Eric V

    2003-01-01

    Introduction to sequential decision processes covers use of dynamic programming in studying models of resource allocation, methods for approximating solutions of control problems in continuous time, production control, more. 1982 edition.

  7. HTGR Application Economic Model Users' Manual

    International Nuclear Information System (INIS)

    Gandrik, A.M.

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Application Economic Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Application Economic Model calculates either the required selling price of power and/or heat for a given internal rate of return (IRR) or the IRR for power and/or heat being sold at the market price. The user can generate these economic results for a range of reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first-of-a-kind, or nth-of-a-kind project phases; for up to 16 reactor modules; and for module ratings of 200, 350, or 600 MWt. This users' manual contains the mathematical models and operating instructions for the HTGR Application Economic Model. Instructions, screenshots, and examples are provided to guide the user through the model. This model was designed for users who are familiar with the HTGR design, Excel, and engineering economics. Modification of the HTGR Application Economic Model should only be performed by users familiar with the HTGR and its applications, Excel, and Visual Basic.
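    The core IRR calculation such a model performs can be sketched as a root-find on net present value. This is a generic sketch, not the HTGR model's implementation, and the cash-flow figures are purely illustrative:

```python
def npv(rate, cashflows):
    """Net present value of annual cashflows; cashflows[0] is at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.9, hi=1.0, tol=1e-9):
    """Internal rate of return by bisection; assumes NPV changes sign
    exactly once on [lo, hi] and is decreasing in the rate."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical plant: one capital outlay, then level net revenue for
# 30 years (figures invented for the example, not HTGR values).
flows = [-1000.0] + [90.0] * 30
r_hat = irr(flows)
print(f"IRR = {r_hat:.4f}")  # ~0.081 for this cash-flow profile
```

Solving the inverse problem (required selling price for a target IRR) is the same idea with the roles of price and rate exchanged.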

  8. Computational nanophotonics modeling and applications

    CERN Document Server

    Musa, Sarhan M

    2013-01-01

    This reference offers tools for engineers, scientists, biologists, and others working with the computational techniques of nanophotonics. It introduces the key concepts of computational methods in a manner that is easily digestible for newcomers to the field. The book also examines future applications of nanophotonics in the technical industry and covers new developments and interdisciplinary research in engineering, science, and medicine. It provides an overview of the key computational nanophotonics methods and describes the technologies with an emphasis on how they work and their key benefits.

  9. Moving objects management models, techniques and applications

    CERN Document Server

    Meng, Xiaofeng; Xu, Jiajie

    2014-01-01

    This book describes the topics of moving objects modeling and location tracking, indexing and querying, clustering, location uncertainty, traffic aware navigation and privacy issues as well as the application to intelligent transportation systems.

  10. Measurement error models, methods, and applications

    CERN Document Server

    Buonaccorsi, John P

    2010-01-01

    Over the last 20 years, comprehensive strategies for treating measurement error in complex models and accounting for the use of extra data to estimate measurement error parameters have emerged. Focusing on both established and novel approaches, "Measurement Error: Models, Methods, and Applications" provides an overview of the main techniques and illustrates their application in various models. It describes the impacts of measurement errors on naive analyses that ignore them and presents ways to correct for them across a variety of statistical models, from simple one-sample problems to regres
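    The central phenomenon such corrections address can be shown in a few lines: when a covariate is observed with additive error, the naive regression slope is attenuated by the reliability ratio var(x) / (var(x) + var(u)). A simulated sketch with synthetic data and a known error variance:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# True covariate and outcome: y = 2x + noise.
x = rng.normal(0, 1, n)
y = 2.0 * x + rng.normal(0, 1, n)

# Observed covariate measured with additive error of variance 1.
w = x + rng.normal(0, 1, n)

# Naive OLS slope of y on w is attenuated by the reliability ratio
# 1 / (1 + 1) = 0.5, so it estimates about 1.0 instead of 2.0.
naive_slope = np.cov(w, y)[0, 1] / np.var(w)

# Method-of-moments correction when the error variance is known.
corrected = naive_slope / 0.5
print(f"naive = {naive_slope:.3f}, corrected = {corrected:.3f}")
```

In practice the error variance is itself estimated from replicate or validation data, which is where much of the book's material lies.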

  11. Global Reference Atmospheric Models for Aeroassist Applications

    Science.gov (United States)

    Duvall, Aleta; Justus, C. G.; Keller, Vernon W.

    2005-01-01

    Aeroassist is a broad category of advanced transportation technology encompassing aerocapture, aerobraking, aeroentry, precision landing, hazard detection and avoidance, and aerogravity assist. The eight destinations in the Solar System with sufficient atmosphere to enable aeroassist technology are Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Saturn's moon Titan. Engineering-level atmospheric models for five of these targets - Earth, Mars, Titan, Neptune, and Venus - have been developed at NASA's Marshall Space Flight Center. These models are useful as tools in mission planning and systems analysis studies associated with aeroassist applications. The series of models is collectively named the Global Reference Atmospheric Model or GRAM series. An important capability of all the models in the GRAM series is their ability to simulate quasi-random perturbations for Monte Carlo analysis in developing guidance, navigation and control algorithms, for aerothermal design, and for other applications sensitive to atmospheric variability. Recent example applications are discussed.
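    The Monte Carlo perturbation capability described above can be imitated in miniature: draw dispersed density profiles about a mean state and extract percentile bounds for trajectory analysis. The exponential atmosphere and the 5% lognormal dispersion below are stand-in assumptions for illustration, not GRAM values:

```python
import numpy as np

rng = np.random.default_rng(7)

# Mean density profile: a simple exponential atmosphere standing in
# for a GRAM mean state (Earth-like surface density and scale height).
h = np.linspace(0, 100e3, 101)          # altitude, m
scale_height = 8.5e3                    # m
rho_mean = 1.225 * np.exp(-h / scale_height)

# Quasi-random perturbations: 1000 Monte Carlo profiles with 5%
# (1-sigma) lognormal density dispersion about the mean.
n_runs = 1000
perturb = rng.lognormal(mean=0.0, sigma=0.05, size=(n_runs, h.size))
rho_samples = rho_mean * perturb

# A dispersion statistic an entry-trajectory analysis might consume:
# the 99th-percentile density at each altitude.
rho_p99 = np.percentile(rho_samples, 99, axis=0)
print(rho_p99[0] / rho_mean[0])  # roughly 12% above the mean at h = 0
```

Real GRAM perturbations are spatially correlated along the trajectory rather than independent per altitude, which this toy version omits.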

  12. A Transitive Model For Artificial Intelligence Applications

    Science.gov (United States)

    Dwyer, John

    1986-03-01

    A wide range of mathematical techniques have been applied to artificial intelligence problems and some techniques have proved more suitable than others for certain types of problem. We formally define a mathematical model which incorporates some of these successful techniques and we discuss its intrinsic properties. Universal applicability of the model is demonstrated through specific applications to problems drawn from rule-based systems, digital hardware design and constraint satisfaction networks. We also give indications of potential applications to other artificial intelligence problems, including knowledge engineering.

  13. Formal models, languages and applications

    CERN Document Server

    Rangarajan, K; Mukund, M

    2006-01-01

    A collection of articles by leading experts in theoretical computer science, this volume commemorates the 75th birthday of Professor Rani Siromoney, one of the pioneers in the field in India. The articles span the vast range of areas that Professor Siromoney has worked in or influenced, including grammar systems, picture languages and new models of computation. Sample Chapter(s). Chapter 1: Finite Array Automata and Regular Array Grammars (150 KB). Contents: Finite Array Automata and Regular Array Grammars (A Atanasiu et al.); Hexagonal Contextual Array P Systems (K S Dersanambika et al.); Con

  14. Homogeneity tests for variances and mean test under heterogeneity conditions in a single way ANOVA method

    International Nuclear Information System (INIS)

    Morales P, J.R.; Avila P, P.

    1996-01-01

    If we consider the maximum permissible levels shown for the case of oysters, it is forbidden to collect oysters at the four stations of the El Chijol Channel (Veracruz, Mexico), as well as along the channel itself, because the concentrations of the metals studied exceed these limits. In this case the application of Welch tests was not necessary. For the water hyacinth, the treatment means were unequal in Fe, Cu, Ni, and Zn. This case is more illustrative, as the conclusion was reached through the application of Welch tests to treatments with heterogeneous variances. (Author)
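    The workflow the record describes (test homogeneity of variances, then fall back to a Welch-type test when they are heterogeneous) might look like this with scipy; the concentration values are fabricated for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Fabricated metal concentrations (ppm) at two stations with clearly
# unequal variances, mimicking the heterogeneity case discussed above.
station_1 = rng.normal(10.0, 1.0, 40)
station_2 = rng.normal(15.0, 4.0, 40)

# Levene's test for homogeneity of variances.
_, p_levene = stats.levene(station_1, station_2)
print(f"Levene p = {p_levene:.4f}")  # small p: variances heterogeneous

# With heterogeneous variances, compare means with Welch's t-test
# (equal_var=False) rather than the classical pooled-variance test.
t_stat, p_welch = stats.ttest_ind(station_1, station_2, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_welch:.2e}")
```

The same logic extends to more than two groups via a Welch-corrected one-way ANOVA.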

  15. Applications of computer modeling to fusion research

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, J.M.

    1989-01-01

    Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of the dynamo and bootstrap current drive; and, in general, maintenance of our broad-based program in basic plasma physics and computer modeling.

  17. Generating Performance Models for Irregular Applications

    Energy Technology Data Exchange (ETDEWEB)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav; Kerbyson, Darren J.; Hoisie, Adolfy

    2017-05-30

    Many applications have irregular behavior (non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches) that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; PageRank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.

  18. Vacation queueing models theory and applications

    CERN Document Server

    Tian, Naishuo

    2006-01-01

    A classical queueing model consists of three parts: arrival process, service process, and queue discipline. A vacation queueing model has an additional part, the vacation process, which is governed by a vacation policy that can be characterized by three aspects: 1) the vacation start-up rule; 2) the vacation termination rule; and 3) the vacation duration distribution. Hence, vacation queueing models are an extension of classical queueing theory. Vacation Queueing Models: Theory and Applications discusses systematically and in detail the many variations of vacation policy. Allowing servers to take vacations makes queueing models more realistic and flexible in studying real-world waiting line systems. Integrated into the book's discussion are a variety of typical vacation model applications that include call centers with multi-task employees, customized manufacturing, telecommunication networks, maintenance activities, etc. Finally, contents are presented in a "theorem and proof" format and it is invaluabl...

  19. Polymer networks: Modeling and applications

    Science.gov (United States)

    Masoud, Hassan

    Polymer networks are an important class of materials that are ubiquitously found in natural, biological, and man-made systems. The complex mesoscale structure of these soft materials has made it difficult for researchers to fully explore their properties. In this dissertation, we introduce a coarse-grained computational model for permanently cross-linked polymer networks than can properly capture common properties of these materials. We use this model to study several practical problems involving dry and solvated networks. Specifically, we analyze the permeability and diffusivity of polymer networks under mechanical deformations, we examine the release of encapsulated solutes from microgel capsules during volume transitions, and we explore the complex tribological behavior of elastomers. Our simulations reveal that the network transport properties are defined by the network porosity and by the degree of network anisotropy due to mechanical deformations. In particular, the permeability of mechanically deformed networks can be predicted based on the alignment of network filaments that is characterized by a second order orientation tensor. Moreover, our numerical calculations demonstrate that responsive microcapsules can be effectively utilized for steady and pulsatile release of encapsulated solutes. We show that swollen gel capsules allow steady, diffusive release of nanoparticles and polymer chains, whereas gel deswelling causes burst-like discharge of solutes driven by an outward flow of the solvent initially enclosed within a shrinking capsule. We further demonstrate that this hydrodynamic release can be regulated by introducing rigid microscopic rods in the capsule interior. We also probe the effects of velocity, temperature, and normal load on the sliding of elastomers on smooth and corrugated substrates. Our friction simulations predict a bell-shaped curve for the dependence of the friction coefficient on the sliding velocity. Our simulations also illustrate

  20. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
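    The Erlangian queueing analysis the record refers to can be sketched with the Erlang C formula for an M/M/c service layer. The firewall-specific parameters below (number of service desks, arrival and service rates) are invented for the example:

```python
import math

def erlang_c(servers, offered_load):
    """Probability an arriving packet must wait (Erlang C formula)
    for an M/M/c queue with offered load a = lambda/mu in erlangs."""
    a, c = offered_load, servers
    terms = sum(a**k / math.factorial(k) for k in range(c))
    last = a**c / (math.factorial(c) * (1 - a / c))
    return last / (terms + last)

# Hypothetical firewall layer: 4 service desks (worker threads),
# arrival rate 30 pkt/s, per-desk service rate 10 pkt/s.
lam, mu, c = 30.0, 10.0, 4
a = lam / mu                      # offered load, erlangs
p_wait = erlang_c(c, a)
wq = p_wait / (c * mu - lam)      # mean queueing delay, seconds
print(f"P(wait) = {p_wait:.3f}, Wq = {wq * 1000:.1f} ms")
```

Sweeping `c` across the layers under a fixed resource budget gives the kind of allocation trade-off the paper optimizes.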

  2. Degenerate RFID Channel Modeling for Positioning Applications

    Directory of Open Access Journals (Sweden)

    A. Povalac

    2012-12-01

    This paper introduces the theory of channel modeling for positioning applications in UHF RFID. It explains basic parameters for channel characterization from both the narrowband and wideband points of view. More details are given about ranging and direction finding. Finally, several positioning scenarios are analyzed with the developed channel models. All the described models use a degenerate channel, i.e. combined signal propagation from the transmitter to the tag and from the tag to the receiver.

  3. Artificial Immune Networks: Models and Applications

    Directory of Open Access Journals (Sweden)

    Xian Shen

    2008-06-01

    Artificial Immune Systems (AIS), inspired by the natural immune system, have been applied to solving complex computational problems in classification, pattern recognition, and optimization. In this paper, the theory of the natural immune system is first briefly introduced. Next, we compare some well-known AIS and their applications. Several representative artificial immune network models are also discussed. Moreover, we demonstrate the applications of artificial immune networks in various engineering fields.

  4. The Influencing Factor Analysis on the Performance Evaluation of Assembly Line Balancing Problem Level 1 (SALBP-1) Based on ANOVA Method

    Science.gov (United States)

    Chen, Jie; Hu, Jiangnan

    2017-06-01

    Industry 4.0 and lean production have become the focus of manufacturing. A current issue is to analyse the performance of assembly line balancing. This study focuses on distinguishing the factors that influence assembly line balancing. The one-way ANOVA method is applied to explore the significance of the distinguished factors, and a regression model is built to find the key ones. The maximal task time (tmax), the number of tasks (n), and the degree of convergence of the precedence graph (conv) are critical for the performance of assembly line balancing. The conclusions will benefit lean production in manufacturing.
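    The one-way ANOVA underlying this kind of factor screening can be computed from first principles by partitioning the total variation into between-group and within-group sums of squares. The line-balancing efficiencies below are invented, grouped by one of the paper's factors (conv):

```python
import numpy as np

# Hypothetical line-balancing efficiencies (%) by degree of convergence
# of the precedence graph; the values are illustrative only.
groups = {
    "low":    np.array([78.0, 80.5, 79.2, 81.1, 77.8]),
    "medium": np.array([82.3, 84.0, 83.1, 85.2, 82.8]),
    "high":   np.array([88.1, 87.4, 89.0, 90.2, 88.6]),
}

all_vals = np.concatenate(list(groups.values()))
grand_mean = all_vals.mean()

# Between-group and within-group sums of squares.
ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups.values())
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups.values())

df_between = len(groups) - 1
df_within = all_vals.size - len(groups)
f_stat = (ss_between / df_between) / (ss_within / df_within)
print(f"F({df_between}, {df_within}) = {f_stat:.1f}")  # F(2, 12) = 72.0
```

A large F relative to the F(2, 12) reference distribution flags the factor as significant for line performance.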

  5. Application Note: Power Grid Modeling With Xyce.

    Energy Technology Data Exchange (ETDEWEB)

    Sholander, Peter E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    This application note describes how to model steady-state power flows and transient events in electric power grids with the SPICE-compatible Xyce™ Parallel Electronic Simulator developed at Sandia National Labs. This application note provides a brief tutorial on the basic devices (branches, bus shunts, transformers, and generators) found in power grids. The focus is on the features supported and assumptions made by the Xyce models for power grid elements. It then provides a detailed explanation, including working Xyce netlists, for simulating some simple power grid examples such as the IEEE 14-bus test case.

  6. Advances and applications of occupancy models

    Science.gov (United States)

    Bailey, Larissa; MacKenzie, Darry I.; Nichols, James D.

    2013-01-01

    Summary: The past decade has seen an explosion in the development and application of models aimed at estimating species occurrence and occupancy dynamics while accounting for possible non-detection or species misidentification. We discuss some recent occupancy estimation methods and the biological systems that motivated their development. Collectively, these models offer tremendous flexibility, but simultaneously place added demands on the investigator. Unlike many mark–recapture scenarios, investigators utilizing occupancy models have the ability, and responsibility, to define their sample units (i.e. sites), replicate sampling occasions, time period over which species occurrence is assumed to be static and even the criteria that constitute ‘detection’ of a target species. Subsequent biological inference and interpretation of model parameters depend on these definitions and the ability to meet model assumptions. We demonstrate the relevance of these definitions by highlighting applications from a single biological system (an amphibian–pathogen system) and discuss situations where the use of occupancy models has been criticized. Finally, we use these applications to suggest future research and model development.
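    A minimal single-season occupancy model (constant occupancy probability psi and per-visit detection probability p) can be fit by maximum likelihood on simulated detection histories. This sketch omits the covariates, dynamics, and misidentification extensions the review discusses:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Simulate a basic design: S sites, K repeat visits, true occupancy
# psi and per-visit detection p (all values illustrative).
S, K, psi_true, p_true = 500, 5, 0.6, 0.4
z = rng.random(S) < psi_true                     # latent occupancy state
y = (rng.random((S, K)) < p_true) & z[:, None]   # detection histories
d = y.sum(axis=1)                                # detections per site

def nll(theta):
    """Negative log-likelihood of the zero-inflated binomial model
    (binomial coefficients omitted; they do not affect the MLE)."""
    psi, p = 1 / (1 + np.exp(-np.asarray(theta)))  # logit -> probability
    lik_occ = psi * p**d * (1 - p)**(K - d)        # occupied sites
    lik_empty = (d == 0) * (1 - psi)               # never-detected mixture term
    return -np.log(lik_occ + lik_empty).sum()

fit = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
print(psi_hat, p_hat)  # estimates near the true 0.6 and 0.4
```

The mixture term for all-zero histories is exactly what separates occupancy models from naive presence/absence summaries.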

  7. Characterizing and modeling web sessions with applications

    OpenAIRE

    Chiarandini, Luca

    2014-01-01

    This thesis focuses on the analysis and modeling of web sessions, groups of requests made by a single user for a single navigation purpose. Understanding how people browse through websites is important, helping us to improve interfaces and provide better content. After first conducting a statistical analysis of web sessions, we go on to present algorithms to summarize and model web sessions. Finally, we describe applications that use novel browsing methods, in particular parallel...

  8. Deformation Models Tracking, Animation and Applications

    CERN Document Server

    Torres, Arnau; Gómez, Javier

    2013-01-01

    The computational modelling of deformations has been actively studied for the last thirty years. This is mainly due to its large range of applications that include computer animation, medical imaging, shape estimation, face deformation as well as other parts of the human body, and object tracking. In addition, these advances have been supported by the evolution of computer processing capabilities, enabling realism in a more sophisticated way. This book encompasses relevant works of expert researchers in the field of deformation models and their applications.  The book is divided into two main parts. The first part presents recent object deformation techniques from the point of view of computer graphics and computer animation. The second part of this book presents six works that study deformations from a computer vision point of view with a common characteristic: deformations are applied in real world applications. The primary audience for this work is researchers from different multidisciplinary fields, s

  9. Ionospheric Modeling for Precise GNSS Applications

    NARCIS (Netherlands)

    Memarzadeh, Y.

    2009-01-01

    The main objective of this thesis is to develop a procedure for modeling and predicting ionospheric Total Electron Content (TEC) for high precision differential GNSS applications. As the ionosphere is a highly dynamic medium, we believe that to have a reliable procedure it is necessary to transfer

  10. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  11. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  12. Mixed models theory and applications with R

    CERN Document Server

    Demidenko, Eugene

    2013-01-01

    Mixed modeling is one of the most promising and exciting areas of statistical analysis, enabling the analysis of nontraditional, clustered data that may come in the form of shapes or images. This book provides in-depth mathematical coverage of mixed models' statistical properties and numerical algorithms, as well as applications such as the analysis of tumor regrowth, shape, and image. The new edition includes significant updating, over 300 exercises, stimulating chapter projects and model simulations, inclusion of R subroutines, and a revised text format. The target audience continues to be g

  13. Multilevel modelling: Beyond the basic applications.

    Science.gov (United States)

    Wright, Daniel B; London, Kamala

    2009-05-01

    Over the last 30 years statistical algorithms have been developed to analyse datasets that have a hierarchical/multilevel structure. Particularly within developmental and educational psychology these techniques have become common where the sample has an obvious hierarchical structure, like pupils nested within a classroom. We describe two areas beyond the basic applications of multilevel modelling that are important to psychology: modelling the covariance structure in longitudinal designs and using generalized linear multilevel modelling as an alternative to methods from signal detection theory (SDT). Detailed code for all analyses is described using packages for the freeware R.
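
The SDT alternative mentioned above rests on the probit link: a generalized linear (multilevel) model with a probit link estimates, per condition, the same sensitivity index d′ that classical SDT computes from hit and false-alarm rates. A minimal stdlib-Python sketch of that quantity (the rates below are made-up illustration values, not data from the paper):

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Sensitivity index d' = z(H) - z(FA): the difference of inverse-normal
    transforms that a probit generalized linear model estimates per condition."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Illustrative rates only (hypothetical data).
print(round(d_prime(0.85, 0.20), 3))
```

In a multilevel formulation, the same transform is applied while letting intercepts and slopes vary by participant, which is what the generalized linear multilevel approach adds over the classical per-subject computation.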

  14. Link mining models, algorithms, and applications

    CERN Document Server

    Yu, Philip S; Faloutsos, Christos

    2010-01-01

    This book presents in-depth surveys and systematic discussions on models, algorithms and applications for link mining. Link mining is an important field of data mining. Traditional data mining focuses on 'flat' data in which each data object is represented as a fixed-length attribute vector. However, many real-world data sets are much richer in structure, involving objects of multiple types that are related to each other. Hence, recently link mining has become an emerging field of data mining, which has a high impact in various important applications such as text mining, social network analysi

  15. Application of Improved Radiation Modeling to General Circulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Michael J Iacono

    2011-04-07

    This research has accomplished its primary objectives of developing accurate and efficient radiation codes, validating them with measurements and higher resolution models, and providing these advancements to the global modeling community to enhance the treatment of cloud and radiative processes in weather and climate prediction models. A critical component of this research has been the development of the longwave and shortwave broadband radiative transfer code for general circulation model (GCM) applications, RRTMG, which is based on the single-column reference code, RRTM, also developed at AER. RRTMG is a rigorously tested radiation model that retains a considerable level of accuracy relative to higher resolution models and measurements despite the performance enhancements that have made it possible to apply this radiation code successfully to global dynamical models. This model includes the radiative effects of all significant atmospheric gases, and it treats the absorption and scattering from liquid and ice clouds and aerosols. RRTMG also includes a statistical technique for representing small-scale cloud variability, such as cloud fraction and the vertical overlap of clouds, which has been shown to improve cloud radiative forcing in global models. This development approach has provided a direct link from observations to the enhanced radiative transfer provided by RRTMG for application to GCMs. Recent comparison of existing climate model radiation codes with high resolution models has documented the improved radiative forcing capability provided by RRTMG, especially at the surface, relative to other GCM radiation models. Due to its high accuracy, its connection to observations, and its computational efficiency, RRTMG has been implemented operationally in many national and international dynamical models to provide validated radiative transfer for improving weather forecasts and enhancing the prediction of global climate change.

  16. Modeling and Optimization : Theory and Applications Conference

    CERN Document Server

    Terlaky, Tamás

    2015-01-01

    This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 13-15, 2014. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, healthcare, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.

  17. Modeling and Optimization : Theory and Applications Conference

    CERN Document Server

    Terlaky, Tamás

    2017-01-01

    This volume contains a selection of contributions that were presented at the Modeling and Optimization: Theory and Applications Conference (MOPTA) held at Lehigh University in Bethlehem, Pennsylvania, USA on August 17-19, 2016. The conference brought together a diverse group of researchers and practitioners, working on both theoretical and practical aspects of continuous or discrete optimization. Topics presented included algorithms for solving convex, network, mixed-integer, nonlinear, and global optimization problems, and addressed the application of deterministic and stochastic optimization techniques in energy, finance, logistics, analytics, health, and other important fields. The contributions contained in this volume represent a sample of these topics and applications and illustrate the broad diversity of ideas discussed at the meeting.

  18. Stochastic biomathematical models with applications to neuronal modeling

    CERN Document Server

    Batzel, Jerry; Ditlevsen, Susanne

    2013-01-01

    Stochastic biomathematical models are becoming increasingly important as new light is shed on the role of noise in living systems. In certain biological systems, stochastic effects may even enhance a signal, thus providing a biological motivation for the noise observed in living systems. Recent advances in stochastic analysis and increasing computing power facilitate the analysis of more biophysically realistic models, and this book provides researchers in computational neuroscience and stochastic systems with an overview of recent developments. Key concepts are developed in chapters written by experts in their respective fields. Topics include: one-dimensional homogeneous diffusions and their boundary behavior, large deviation theory and its application in stochastic neurobiological models, a review of mathematical methods for stochastic neuronal integrate-and-fire models, stochastic partial differential equation models in neurobiology, and stochastic modeling of spreading cortical depression.

  19. Optimum Combination and Effect Analysis of Piezoresistor Dimensions in Micro Piezoresistive Pressure Sensor Using Design of Experiments and ANOVA: a Taguchi Approach

    Directory of Open Access Journals (Sweden)

    Kirankumar B. Balavalad

    2017-04-01

    Full Text Available Piezoresistive (PZR) pressure sensors have gained importance because of their robust construction, high sensitivity and good linearity. The conventional PZR pressure sensor consists of 4 piezoresistors placed on a diaphragm and connected in the form of a Wheatstone bridge. These sensors convert the stress applied to them into a change in resistance, which is quantified into voltage by the Wheatstone bridge. It is observed from the literature that the dimensions of the piezoresistors are crucial to the performance of the piezoresistive pressure sensor. This paper presents a novel mechanism for finding the best combination, and the effect, of the individual piezoresistor dimensions, viz. length, width and thickness, using DoE and ANOVA (analysis of variance), following the Taguchi experimentation approach. The paper presents a unique method to find the optimum combination of piezoresistor dimensions and clearly illustrates the effect of the dimensions on the output of the sensor. The optimum combinations and the output response of the sensor are predicted using DoE, and a validation simulation is performed. The result of the validation simulation is compared with the predicted value of the sensor response, V: the predicted value of V is 1.074 V, and the validation simulation gave 1.19 V. This validates that the model (DoE and ANOVA) is adequate in describing V in terms of the variables defined.
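
The effect analysis described above reduces, for each factor, to a one-way ANOVA F-ratio: between-level variance divided by residual variance. A self-contained sketch with hypothetical sensor outputs at three levels of one factor (e.g. piezoresistor length; not data from the paper):

```python
from statistics import mean

def anova_f(groups):
    """One-way ANOVA F-statistic for a list of groups of observations."""
    grand = mean(v for g in groups for v in g)
    k = len(groups)                              # number of factor levels
    n = sum(len(g) for g in groups)              # total observations
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    ms_between = ss_between / (k - 1)            # between-level variance
    ms_within = ss_within / (n - k)              # residual variance
    return ms_between / ms_within

# Hypothetical output voltages (V) at three levels of one factor.
levels = [[1.02, 1.05, 1.03], [1.10, 1.12, 1.11], [1.19, 1.17, 1.18]]
print(round(anova_f(levels), 1))
```

A large F relative to the critical value for (k−1, n−k) degrees of freedom indicates the factor level has a significant effect on the response, which is how ANOVA ranks the influence of length, width and thickness.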

  20. Chapter 5: Summary of model application

    International Nuclear Information System (INIS)

    1995-01-01

    This chapter provides a brief summary of the model applications described in Volume III of the Final Report. This chapter dealt with the selected water management regimes; ground water flow regimes; agriculture; ground water quality; hydrodynamics, sediment transport and water quality in the Danube; hydrodynamics, sediment transport and water quality in the river branch system; hydrodynamics, sediment transport and water quality in the Hrusov reservoir and with ecology in this Danube area

  1. The influence of speed abilities and technical skills in early adolescence on adult success in soccer: A long-term prospective analysis using ANOVA and SEM approaches

    Science.gov (United States)

    2017-01-01

    Several talent development programs in youth soccer have implemented motor diagnostics measuring performance factors. However, the predictive value of such tests for adult success is a controversial topic in talent research. This prospective cohort study evaluated the long-term predictive value of 1) motor tests and 2) players’ speed abilities (SA) and technical skills (TS) in early adolescence. The sample consisted of 14,178 U12 players from the German talent development program. Five tests (sprint, agility, dribbling, ball control, shooting) were conducted and players’ height, weight as well as relative age were assessed at nationwide diagnostics between 2004 and 2006. In the 2014/15 season, the players were then categorized as professional (n = 89), semi-professional (n = 913), or non-professional players (n = 13,176), indicating their adult performance level (APL). The motor tests’ prognostic relevance was determined using ANOVAs. Players’ future success was predicted by a logistic regression threshold model. This structural equation model comprised a measurement model with the motor tests and two correlated latent factors, SA and TS, with simultaneous consideration for the manifest covariates height, weight and relative age. Each motor predictor and anthropometric characteristic discriminated significantly between the APL (p < .001; η2 ≤ .02). The threshold model significantly predicted the APL (R2 = 24.8%), and in early adolescence the factor TS (p < .001) seems to have a stronger effect on adult performance than SA (p < .05). Both approaches (ANOVA, SEM) verified the diagnostics’ predictive validity over a long-term period (≈ 9 years). However, because of the limited effect sizes, the motor tests’ prognostic relevance remains ambiguous. A challenge for future research lies in the integration of different (e.g., person-oriented or multilevel) multivariate approaches that expand beyond the “traditional” topic of single tests’ predictive

  2. THE CONNECTION IDENTIFICATION BETWEEN THE NET INVESTMENTS IN HOTELS AND RESTORANTS AND TOURISTIC ACCOMODATION CAPACITY BY USING THE ANOVA METHOD

    Directory of Open Access Journals (Sweden)

    Elena STAN

    2009-12-01

    Full Text Available To answer customers' increasingly demanding requirements, the development of Romanian tourism has to take into account, in particular, the "accommodation" component. The size of the technical and material base of accommodation can be expressed through three indicators: number of units, number of rooms and number of places, of which the "number of places" indicator is the most used. There are currently particular concerns regarding Romanian tourism investment, driven by specific determinants. The aim of the study is to identify whether a connection exists between net investments in hotels and restaurants and tourism accommodation capacity, registered in Romania over the 2002-2007 period, by using the ANOVA method of dispersion analysis.

  3. Modelling and application of stochastic processes

    CERN Document Server

    1986-01-01

    The subject of modelling and application of stochastic processes is too vast to be exhausted in a single volume. In this book, attention is focused on a small subset of this vast subject. The primary emphasis is on realization and approximation of stochastic systems. Recently there has been considerable interest in the stochastic realization problem, and hence, an attempt has been made here to collect in one place some of the more recent approaches and algorithms for solving the stochastic realization problem. Various different approaches for realizing linear minimum-phase systems, linear nonminimum-phase systems, and bilinear systems are presented. These approaches range from time-domain methods to spectral-domain methods. An overview of the chapter contents briefly describes these approaches. Also, in most of these chapters special attention is given to the problem of developing numerically efficient algorithms for obtaining reduced-order (approximate) stochastic realizations. On the application side,...

  4. Application of Chaos Theory to Psychological Models

    Science.gov (United States)

    Blackerby, Rae Fortunato

    This dissertation shows that an alternative theoretical approach from physics--chaos theory--offers a viable basis for improved understanding of human beings and their behavior. Chaos theory provides achievable frameworks for potential identification, assessment, and adjustment of human behavior patterns. Most current psychological models fail to address the metaphysical conditions inherent in the human system, thus bringing deep errors to psychological practice and empirical research. Freudian, Jungian and behavioristic perspectives are inadequate psychological models because they assume, either implicitly or explicitly, that the human psychological system is a closed, linear system. On the other hand, Adlerian models that require open systems are likely to be empirically tenable. Logically, models will hold only if the model's assumptions hold. The innovative application of chaotic dynamics to psychological behavior is a promising theoretical development because the application asserts that human systems are open, nonlinear and self-organizing. Chaotic dynamics use nonlinear mathematical relationships among factors that influence human systems. This dissertation explores these mathematical relationships in the context of a sample model of moral behavior using simulated data. Mathematical equations with nonlinear feedback loops describe chaotic systems. Feedback loops govern the equations' value in subsequent calculation iterations. For example, changes in moral behavior are affected by an individual's own self-centeredness, family and community influences, and previous moral behavior choices that feed back to influence future choices. When applying these factors to the chaos equations, the model behaves like other chaotic systems. For example, changes in moral behavior fluctuate in regular patterns, as determined by the values of the individual, family and community factors. 
In some cases, these fluctuations converge to one value; in other cases, they diverge in
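
The nonlinear feedback loop described above can be sketched with a logistic-map-style iteration, the canonical example of a chaotic system in which the previous state feeds back into the next. The state variable and parameter values below are purely illustrative stand-ins for the dissertation's moral-behavior factors, not its actual equations:

```python
def iterate(x0, r, steps):
    """Iterate x_{t+1} = r * x_t * (1 - x_t). The feedback loop
    (x feeding back into the next value) is what produces either
    convergence or chaos, depending on r."""
    x = x0
    history = []
    for _ in range(steps):
        x = r * x * (1 - x)
        history.append(x)
    return history

# r = 2.8: the trajectory converges to the fixed point 1 - 1/r.
settled = iterate(0.3, 2.8, 200)[-1]
print(round(settled, 4))
# r = 3.9: the trajectory remains aperiodic (chaotic regime),
# fluctuating in regular bounded patterns without settling.
chaotic = iterate(0.3, 3.9, 300)
```

This mirrors the abstract's observation: for some parameter values the fluctuations converge to one value, while for others they diverge into sustained aperiodic behavior.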

  5. Genetic demographic networks: Mathematical model and applications.

    Science.gov (United States)

    Kimmel, Marek; Wojdyła, Tomasz

    2016-10-01

    Recent improvements in the quality of genetic data obtained from extinct human populations and their ancestors encourage the search for answers to basic questions regarding human population history. The most common and successful are model-based approaches, in which genetic data are compared to data obtained from an assumed demography model. Using such an approach, it is possible to either validate or adjust the assumed demography. Model fit to data can be obtained based on reverse-time coalescent simulations or forward-time simulations. In this paper we introduce a computational method, based on a mathematical equation, that allows obtaining joint distributions of pairs of individuals under a specified demography model, each of them characterized by a genetic variant at a chosen locus. The two individuals are randomly sampled from either the same or two different populations. The model assumes three types of demographic events (split, merge and migration). Populations evolve according to the time-continuous Moran model with drift and Markov-process mutation. This latter process is described by the Lyapunov-type equation introduced by O'Brien and generalized in our previous works. Application of this equation constitutes an original contribution. In the results section of the paper we present sample applications of our model to both simulated and literature-based demographies. Among others, we include a study of the Slavs-Balts-Finns genetic relationship, in which we model splits and migrations between the Balts and Slavs. We also include another example that involves the migration rates between farmers and hunter-gatherers, based on modern and ancient DNA samples. This latter process was previously studied using coalescent simulations. Our results are in general agreement with the previous method, which provides validation of our approach. Although our model is not an alternative to simulation methods in the practical sense, it provides an algorithm to compute pairwise
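
The Moran model underlying the approach can be illustrated with a minimal forward-time simulation: at each event one individual reproduces (possibly mutating) and replaces another, producing genetic drift. This is only an illustrative sketch of the neutral two-allele Moran process with symmetric mutation, not the paper's Lyapunov-equation method:

```python
import random

def moran(n_a, pop_size, mu, steps, rng):
    """Neutral two-allele Moran model: per event, one offspring is born
    (copying a parent drawn by allele frequency, with mutation rate mu)
    and one individual dies, keeping the population size constant."""
    for _ in range(steps):
        child_is_a = rng.random() < n_a / pop_size   # parent drawn by frequency
        if rng.random() < mu:                        # symmetric mutation flips allele
            child_is_a = not child_is_a
        dies_is_a = rng.random() < n_a / pop_size    # death drawn by frequency (drift)
        n_a += int(child_is_a) - int(dies_is_a)
        n_a = max(0, min(pop_size, n_a))
    return n_a

# Illustrative run: 100 individuals, allele A starting at frequency 0.5.
rng = random.Random(42)
print(moran(50, 100, 0.01, 10_000, rng))
```

The paper's contribution is precisely to compute pairwise distributions under such dynamics analytically, avoiding the sampling noise inherent in simulations like this one.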

  6. Artery Buckling: New Phenotypes, Models, and Applications

    Science.gov (United States)

    Han, Hai-Chao; Chesnutt, Jennifer K. W.; Garcia, Justin R.; Liu, Qin; Wen, Qi

    2012-01-01

    Arteries are under significant mechanical loads from blood pressure, flow, tissue tethering, and body movement. It is critical that arteries remain patent and stable under these loads. This review summarizes the common forms of buckling that occur in blood vessels including cross-sectional collapse, longitudinal twist buckling, and bent buckling. The phenomena, model analyses, experimental measurements, effects on blood flow, and clinical relevance are discussed. It is concluded that mechanical buckling is an important issue for vasculature, in addition to wall stiffness and strength, and requires further studies to address the challenges. Studies of vessel buckling not only enrich vascular biomechanics but also have important clinical applications. PMID:23192265

  7. A conceptual holding model for veterinary applications

    Directory of Open Access Journals (Sweden)

    Nicola Ferrè

    2014-05-01

    Full Text Available Spatial references are required when geographical information systems (GIS are used for the collection, storage and management of data. In the veterinary domain, the spatial component of a holding (of animals is usually defined by coordinates, and no other relevant information needs to be interpreted or used for manipulation of the data in the GIS environment provided. Users trying to integrate or reuse spatial data organised in such a way, frequently face the problem of data incompatibility and inconsistency. The root of the problem lies in differences with respect to syntax as well as variations in the semantic, spatial and temporal representations of the geographic features. To overcome these problems and to facilitate the inter-operability of different GIS, spatial data must be defined according to a “schema” that includes the definition, acquisition, analysis, access, presentation and transfer of such data between different users and systems. We propose an application “schema” of holdings for GIS applications in the veterinary domain according to the European directive framework (directive 2007/2/EC - INSPIRE. The conceptual model put forward has been developed at two specific levels to produce the essential and the abstract model, respectively. The former establishes the conceptual linkage of the system design to the real world, while the latter describes how the system or software works. The result is an application “schema” that formalises and unifies the information-theoretic foundations of how to spatially represent a holding in order to ensure straightforward information-sharing within the veterinary community.

  8. Generalized data stacking programming model with applications

    Directory of Open Access Journals (Sweden)

    Hala Samir Elhadidy

    2016-09-01

    Full Text Available Recent research has shown that, across various sciences, systems follow stack-based storage-of-change behaviour when subjected to events or varying environments "on and above" their normal situations. This paper presents a generalized data stack programming (GDSP) model developed to describe system changes under a varying environment. These changes, captured in different ways such as sensor readings, are stored in matrices. An extraction algorithm and an identification technique are proposed to extract the different layers between images and to identify the stack class the object follows, respectively. A general multi-stacking network is presented, including the interaction between the various stack-based layerings of several applications. The experiments show that the stack-matrix concept gives an average accuracy of 99.45%.

  9. Automatic Performance Model Generation for Java Enterprise Edition (EE) Applications

    OpenAIRE

    Brunnert, Andreas; Vögele, Christian; Krcmar, Helmut

    2015-01-01

    The effort required to create performance models for enterprise applications is often out of proportion compared to their benefits. This work aims to reduce this effort by introducing an approach to automatically generate component-based performance models for running Java EE applications. The approach is applicable for all Java EE server products as it relies on standardized component types and interfaces to gather the required data for modeling an application. The feasibility of the approac...

  10. Python-Based Applications for Hydrogeological Modeling

    Science.gov (United States)

    Khambhammettu, P.

    2013-12-01

    Python is a general-purpose, high-level programming language whose design philosophy emphasizes code readability. Add-on packages supporting fast array computation (numpy), plotting (matplotlib), scientific /mathematical Functions (scipy), have resulted in a powerful ecosystem for scientists interested in exploratory data analysis, high-performance computing and data visualization. Three examples are provided to demonstrate the applicability of the Python environment in hydrogeological applications. Python programs were used to model an aquifer test and estimate aquifer parameters at a Superfund site. The aquifer test conducted at a Groundwater Circulation Well was modeled with the Python/FORTRAN-based TTIM Analytic Element Code. The aquifer parameters were estimated with PEST such that a good match was produced between the simulated and observed drawdowns. Python scripts were written to interface with PEST and visualize the results. A convolution-based approach was used to estimate source concentration histories based on observed concentrations at receptor locations. Unit Response Functions (URFs) that relate the receptor concentrations to a unit release at the source were derived with the ATRANS code. The impact of any releases at the source could then be estimated by convolving the source release history with the URFs. Python scripts were written to compute and visualize receptor concentrations for user-specified source histories. The framework provided a simple and elegant way to test various hypotheses about the site. A Python/FORTRAN-based program TYPECURVEGRID-Py was developed to compute and visualize groundwater elevations and drawdown through time in response to a regional uniform hydraulic gradient and the influence of pumping wells using either the Theis solution for a fully-confined aquifer or the Hantush-Jacob solution for a leaky confined aquifer. The program supports an arbitrary number of wells that can operate according to arbitrary schedules. 
The
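
The convolution step described in the second example above is straightforward to sketch: receptor concentration is the superposition of unit response functions, each scaled by the release in its period. The URF values and release history below are hypothetical illustration values, not output from the ATRANS code:

```python
def convolve(source_history, urf):
    """Receptor concentration by discrete convolution of a source
    release history with a unit response function (URF): each past
    release contributes urf[t - tau] per unit released at time tau."""
    n = len(source_history)
    out = [0.0] * n
    for t in range(n):
        for tau in range(t + 1):
            if t - tau < len(urf):
                out[t] += source_history[tau] * urf[t - tau]
    return out

# Hypothetical URF (concentration per unit release) and release history.
urf = [0.0, 0.5, 0.3, 0.1]
releases = [2.0, 0.0, 1.0]
print(convolve(releases, urf))
```

Because convolution is linear, the impact of any candidate source history can be evaluated against observed receptor concentrations without re-running the transport model, which is what makes the hypothesis-testing workflow described above cheap.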

  11. Genetic model compensation: Theory and applications

    Science.gov (United States)

    Cruickshank, David Raymond

    1998-12-01

    The adaptive filtering algorithm known as Genetic Model Compensation (GMC) was originally presented in the author's Master's Thesis. The current work extends this earlier work. GMC uses a genetic algorithm to optimize filter process noise parameters in parallel with the estimation of the state and based only on the observational information available to the filter. The original stochastic state model underlying GMC was inherited from the antecedent, non-adaptive Dynamic Model Compensation (DMC) algorithm. The current work develops the stochastic state model from a linear system viewpoint, avoiding the simplifications and approximations of the earlier development, and establishes Riemann sums as unbiased estimators of the stochastic integrals which describe the evolution of the random state components. These are significant developments which provide GMC with a solid theoretical foundation. Orbit determination is the area of application in this work, and two types of problems are studied: real-time autonomous filtering using absolute GPS measurements and precise post-processed filtering using differential GPS measurements. The first type is studied in a satellite navigation simulation in which pseudorange and pseudorange rate measurements are processed by an Extended Kalman Filter which incorporates both DMC and GMC. Both estimators are initialized by a geometric point solution algorithm. Using measurements corrupted by simulated Selective Availability errors, GMC reduces mean RSS position error by 6.4 percent, reduces mean clock bias error by 46 percent, and displays a marked improvement in covariance consistency relative to DMC. To study the second type of problem, GMC is integrated with NASA Jet Propulsion Laboratory's Gipsy/Oasis-II (GOA-II) precision orbit determination program creating an adaptive version of GOA-II's Reduced Dynamic Tracking (RDT) process noise formulation. 
When run as a sequential estimator with GPS measurements from the TOPEX satellite and
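
The core idea above, using a genetic algorithm to tune a filter's process-noise parameters by fitness, can be sketched with a toy one-parameter search. This is an illustration of the selection-and-mutation mechanism only, not the actual GMC algorithm; the fitness function is a hypothetical stand-in for the filter's innovation-based likelihood:

```python
import random

def genetic_search(fitness, bounds, pop=20, gens=40, rng=None):
    """Toy genetic search over one scalar parameter: tournament
    selection of two followed by Gaussian mutation, clamped to bounds."""
    rng = rng or random.Random(0)
    lo, hi = bounds
    population = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a, b = rng.sample(population, 2)           # tournament of two
            parent = a if fitness(a) > fitness(b) else b
            child = parent + rng.gauss(0, 0.05 * (hi - lo))
            nxt.append(min(hi, max(lo, child)))        # keep within bounds
        population = nxt
    return max(population, key=fitness)

# Hypothetical fitness peaking at q = 0.3 (stand-in for the process-noise
# level that best explains the filter innovations).
best = genetic_search(lambda q: -(q - 0.3) ** 2, (0.0, 1.0))
print(round(best, 2))
```

In GMC proper, the population evolves in parallel with the state estimation, so the process-noise level adapts online using only the observational information available to the filter.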

  12. Semi-empirical prediction of moisture build-up in an electronic enclosure using analysis of variance (ANOVA)

    DEFF Research Database (Denmark)

    Shojaee Nasirabadi, Parizad; Conseil, Helene; Mohanty, Sankhya

    2016-01-01

    Electronic systems are exposed to harsh environmental conditions, such as high humidity, in many applications. Moisture transfer into electronic enclosures and condensation can cause several problems, such as material degradation and corrosion. It is therefore important to control the moisture content and the relative humidity inside electronic enclosures. In this work, moisture transfer into a typical polycarbonate electronic enclosure with a cylindrical-shaped opening is studied. The effects of four influential parameters, namely the initial relative humidity inside the enclosure, the radius and length of the opening, and temperature, are studied. A set of experiments is carried out based on a fractional factorial design in order to estimate the time constant for moisture transfer into the enclosure by fitting the experimental data to an analytical quasi-steady-state model. According to the statistical analysis, temperature...
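
A quasi-steady-state model of the kind fitted above reduces to a single time constant: internal humidity relaxes exponentially toward ambient. A minimal sketch assuming first-order behavior (the numbers are illustrative, not the study's measured values):

```python
import math

def internal_rh(t, rh0, rh_ambient, tau):
    """First-order quasi-steady-state response: internal relative
    humidity relaxes exponentially toward ambient with time constant tau."""
    return rh_ambient + (rh0 - rh_ambient) * math.exp(-t / tau)

# Illustrative values: enclosure starts at 30 % RH in a 90 % RH
# environment with a hypothetical 12-hour time constant.
print(round(internal_rh(12.0, 30.0, 90.0, 12.0), 1))
```

Fitting measured RH curves to this form yields tau for each parameter combination, and the ANOVA then quantifies how strongly opening radius, opening length, initial RH and temperature influence that time constant.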

  13. Optimization of tensile strength of friction welded AISI 1040 and AISI 304L steels according to statistics analysis (ANOVA)

    Energy Technology Data Exchange (ETDEWEB)

    Kirik, Ihsan [Batman Univ. (Turkey); Ozdemir, Niyazi; Firat, Emrah Hanifi; Caligulu, Ugur [Firat Univ., Elazig (Turkey)

    2013-06-01

    Materials that are difficult to weld by fusion welding processes can be successfully joined by friction welding. The strength of friction welded joints is strongly affected by the process parameters (rotation speed, friction time, friction pressure, forging time, and forging pressure). In this study, the tensile strength of friction welded AISI 1040 and AISI 304L joints was investigated statistically in terms of rotation speed, friction time, and friction pressure. The tensile test results were then analyzed by analysis of variance (ANOVA) at a confidence level of 95 % to find out whether statistically significant differences occur. As a result of this study, a maximum tensile strength very close to that of the AISI 1040 parent metal (637 MPa) could be obtained for joints fabricated under the welding conditions of a rotation speed of 1700 rpm, a friction pressure of 50 MPa, a forging pressure of 100 MPa, a friction time of 4 s, and a forging time of 2 s. Rotation speed, friction time, and friction pressure were statistically significant with respect to the tensile strength values of the friction welded AISI 1040 and AISI 304L joints. (orig.)

  14. Building adaptable and reusable XML applications with model transformations

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas

    2005-01-01

    We present an approach in which the semantics of an XML language is defined by means of a transformation from an XML document model (an XML schema) to an application specific model. The application specific model implements the intended behavior of documents written in the language. A transformation

  15. GOCE Exploitation for Moho Modeling and Applications

    Science.gov (United States)

    Sampierto, D.

    2011-07-01

    New ESA missions dedicated to the observation of the Earth from space, like the gravity-gradiometry mission GOCE and the radar altimetry mission CRYOSAT 2, foster research on, among other subjects, inverse gravimetric problems and the description of the nature and geographical location of gravimetric signals. In this framework the GEMMA project (GOCE Exploitation for Moho Modeling and Applications), funded by the European Space Agency and Politecnico di Milano, aims at estimating the boundary between the Earth's crust and mantle (the so-called Mohorovičić discontinuity, or Moho) from GOCE data in key regions of the world. In the project, a solution based on a simple two-layer model in spherical approximation is proposed. This inversion problem, based on the linearization of Newton's gravitational law around an approximate mean Moho surface, will be solved by exploiting Wiener-Kolmogorov theory in the frequency domain, where the depth of the Moho discontinuity is treated as a random signal with zero mean and its own covariance function. The algorithm can be applied in a numerically efficient way by using the Fast Fourier Transform. As for the gravity observations, we will consider grids of the anomalous gravitational potential and its second radial derivative at satellite altitude. In particular, this will require first elaborating GOCE data to obtain a local grid of the gravitational potential field and its second radial derivative, and then separating the gravimetric signal due to the considered discontinuity from the gravitational effects of other geological structures present in the observations. The first problem can be solved by applying the so-called space-wise approach to GOCE observations, while the second one can be achieved by considering a priori models and geophysical information by means of an appropriate Bayesian technique. Moreover, other data such as ground gravity anomalies or seismic profiles can be combined, in an

  16. Business model framework applications in health care: A systematic review.

    Science.gov (United States)

    Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl

    2017-11-01

    It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.

  17. Computational nanotechnology modeling and applications with MATLAB

    National Research Council Canada - National Science Library

    Musa, Sarhan M

    2012-01-01

    .... Offering thought-provoking perspective on the developments that are poised to revolutionize the field, the author explores both existing and future nanotechnology applications, which hold great...

  18. Application of multidimensional IRT models to longitudinal data

    NARCIS (Netherlands)

    te Marvelde, J.M.; Glas, Cornelis A.W.; Van Landeghem, Georges; Van Damme, Jan

    2006-01-01

    The application of multidimensional item response theory (IRT) models to longitudinal educational surveys where students are repeatedly measured is discussed and exemplified. A marginal maximum likelihood (MML) method to estimate the parameters of a multidimensional generalized partial credit model

  19. Computer-aided modelling template: Concept and application

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2015-01-01

    Modelling is an important enabling technology in modern chemical engineering applications. A template-based approach is presented in this work to facilitate the construction and documentation of the models and enable their maintenance for reuse in a wider application range. Based on a model...... information and comments on model construction, storage and future use/reuse. The application of the tool is highlighted with a multi-scale modelling case study involving a catalytic membrane fixed bed reactor and a two-phase system for oxidation of unsaturated acid with hydrogen peroxide. Both case studies...

  20. An Integrated Model of Application, Admission, Enrollment, and Financial Aid

    Science.gov (United States)

    DesJardins, Stephen L.; Ahlburg, Dennis A.; McCall, Brian Patrick

    2006-01-01

    We jointly model the application, admission, financial aid determination, and enrollment decision process. We find that expectations of admission affect application probabilities, financial aid expectations affect enrollment and application behavior, and deviations from aid expectations are strongly related to enrollment. We also conduct…

  1. Optimizing Injection Molding Parameters of Different Halloysites Type-Reinforced Thermoplastic Polyurethane Nanocomposites via Taguchi Complemented with ANOVA

    Directory of Open Access Journals (Sweden)

    Tayser Sumer Gaaz

    2016-11-01

    Taguchi and ANOVA approaches. Seemingly, mHNTs has shown its very important role in the resulting product.

  2. Optimizing Injection Molding Parameters of Different Halloysites Type-Reinforced Thermoplastic Polyurethane Nanocomposites via Taguchi Complemented with ANOVA.

    Science.gov (United States)

    Gaaz, Tayser Sumer; Sulong, Abu Bakar; Kadhum, Abdul Amir H; Nassir, Mohamed H; Al-Amiery, Ahmed A

    2016-11-22

    out by coordinating Taguchi and ANOVA approaches. Seemingly, mHNTs has shown its very important role in the resulting product.

  3. Surface Flux Modeling for Air Quality Applications

    Directory of Open Access Journals (Sweden)

    Limei Ran

    2011-08-01

    Full Text Available For many gases and aerosols, dry deposition is an important sink of atmospheric mass. Dry deposition fluxes are also important sources of pollutants to terrestrial and aquatic ecosystems. The surface fluxes of some gases, such as ammonia, mercury, and certain volatile organic compounds, can be upward into the air as well as downward to the surface and therefore should be modeled as bi-directional fluxes. Model parameterizations of dry deposition in air quality models have been represented by simple electrical resistance analogs for almost 30 years. Uncertainties in surface flux modeling in global to mesoscale models are being slowly reduced as more field measurements provide constraints on parameterizations. However, at the same time, more chemical species are being added to surface flux models as air quality models are expanded to include more complex chemistry and are being applied to a wider array of environmental issues. Since surface flux measurements of many of these chemicals are still lacking, resistances are usually parameterized using simple scaling by water or lipid solubility and reactivity. Advances in recent years have included bi-directional flux algorithms that require a shift from pre-computation of deposition velocities to fully integrated surface flux calculations within air quality models. Improved modeling of the stomatal component of chemical surface fluxes has resulted from improved evapotranspiration modeling in land surface models and closer integration between meteorology and air quality models. Satellite-derived land use characterization and vegetation products and indices are improving model representation of spatial and temporal variations in surface flux processes. This review describes the current state of chemical dry deposition modeling, recent progress in bi-directional flux modeling, synergistic model development research with field measurements, and coupling with meteorological land surface models.
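
    In its simplest form, the electrical-resistance analogy mentioned above reduces to a deposition velocity computed as the inverse of resistances in series. A minimal sketch with illustrative values (the resistance names follow the common aerodynamic/boundary-layer/surface split; the numbers are assumptions, not from any specific model):

    ```python
    def deposition_velocity(r_a, r_b, r_c):
        """Dry deposition velocity (m/s) from three resistances in series:
        aerodynamic (r_a), quasi-laminar boundary layer (r_b) and
        surface/canopy (r_c), each in s/m."""
        return 1.0 / (r_a + r_b + r_c)

    # Illustrative daytime values over vegetation
    v_d = deposition_velocity(r_a=30.0, r_b=10.0, r_c=60.0)  # -> 0.01 m/s
    ```

    Bi-directional species such as ammonia replace the fixed surface resistance with a compensation-point term, which is why they cannot be handled by a pre-computed deposition velocity alone.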

  4. Part 7: Application of the IAWQ model

    African Journals Online (AJOL)

    drinie

    The IAWQ Activated Sludge Model (ASM) No. 2 is a kinetic-based model and incorporates two simple processes for chemical precipitation and redissolution that are readily integrated with biological processes for carbon, nitrogen and phosphorus removal. This model was applied to experimental data collected as part of this ...

  5. Mixture Modeling: Applications in Educational Psychology

    Science.gov (United States)

    Harring, Jeffrey R.; Hodis, Flaviu A.

    2016-01-01

    Model-based clustering methods, commonly referred to as finite mixture modeling, have been applied to a wide variety of cross-sectional and longitudinal data to account for heterogeneity in population characteristics. In this article, we elucidate 2 such approaches: growth mixture modeling and latent profile analysis. Both techniques are…

  6. Pinna Model for Hearing Instrument Applications

    DEFF Research Database (Denmark)

    Kammersgaard, Nikolaj Peter Iversen; Kvist, Søren Helstrup; Thaysen, Jesper

    2014-01-01

    A novel model of the pinna (outer ear) is presented. This is to increase the understanding of the effect of the pinna on the on-body radiation pattern of an antenna placed inside the ear. Simulations of the model and of a realistically shaped ear are compared to validate the model. The radiation ...

  7. Sparse Multivariate Modeling: Priors and Applications

    DEFF Research Database (Denmark)

    Henao, Ricardo

    This thesis presents a collection of statistical models that attempt to take advantage of every piece of prior knowledge available to provide the models with as much structure as possible. The main motivation for introducing these models is interpretability since in practice we want to be able to...

  8. Structural Equation Modeling with Mplus Basic Concepts, Applications, and Programming

    CERN Document Server

    Byrne, Barbara M

    2011-01-01

    Modeled after Barbara Byrne's other best-selling structural equation modeling (SEM) books, this practical guide reviews the basic concepts and applications of SEM using Mplus Versions 5 & 6. The author reviews SEM applications based on actual data taken from her own research. Using non-mathematical language, it is written for the novice SEM user. With each application chapter, the author "walks" the reader through all steps involved in testing the SEM model including: an explanation of the issues addressed illustrated and annotated testing of the hypothesized and post hoc models expl

  9. Application of autoregressive moving average model in reactor noise analysis

    International Nuclear Information System (INIS)

    Tran Dinh Tri

    1993-01-01

    The application of an autoregressive (AR) model to estimating noise measurements has achieved many successes in reactor noise analysis in the last ten years. The physical processes that take place in the nuclear reactor, however, are described by an autoregressive moving average (ARMA) model rather than by an AR model. Consequently more correct results could be obtained by applying the ARMA model instead of the AR model to reactor noise analysis. In this paper the system of the generalised Yule-Walker equations is derived from the equation of an ARMA model, then a method for its solution is given. Numerical results show the applications of the method proposed. (author)
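
    As a hedged illustration of the AR special case underlying the generalised Yule-Walker system discussed here (the full ARMA system in the paper is more involved), AR coefficients can be estimated from sample autocovariances:

    ```python
    import numpy as np

    # Simulate an AR(2) "noise record" with known coefficients
    rng = np.random.default_rng(1)
    n = 20000
    a1, a2 = 0.6, -0.3  # true AR coefficients
    x = np.zeros(n)
    e = rng.standard_normal(n)
    for t in range(2, n):
        x[t] = a1 * x[t - 1] + a2 * x[t - 2] + e[t]

    # Sample autocovariances r(0..p)
    p = 2
    r = np.array([np.dot(x[: n - k], x[k:]) / n for k in range(p + 1)])

    # Yule-Walker equations: R a = [r(1), ..., r(p)]^T,
    # with R the Toeplitz matrix of autocovariances
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a_hat = np.linalg.solve(R, r[1 : p + 1])
    ```

    With a long record the estimates land close to the true (0.6, -0.3); the ARMA case adds moving-average terms, which is why the generalised system in the paper requires its own solution method.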

  10. Markov and mixed models with applications

    DEFF Research Database (Denmark)

    Mortensen, Stig Bousgaard

    the individual in almost any thinkable way. This project focuses on measuring the effects on sleep in both humans and animals. The sleep process is usually analyzed by categorizing small time segments into a number of sleep states and this can be modelled using a Markov process. For this purpose new methods...... for non-parametric estimation of Markov processes are proposed to give a detailed description of the sleep process during the night. Statistically the Markov models considered for sleep states are closely related to the PK models based on SDEs as both models share the Markov property. When the models...

  11. Nonlinear dynamics new directions models and applications

    CERN Document Server

    Ugalde, Edgardo

    2015-01-01

    This book, along with its companion volume, Nonlinear Dynamics New Directions: Theoretical Aspects, covers topics ranging from fractal analysis to very specific applications of the theory of dynamical systems to biology. This second volume contains mostly new applications of the theory of dynamical systems to both engineering and biology. The first volume is devoted to fundamental aspects and includes a number of important new contributions as well as some review articles that emphasize new development prospects. The topics addressed in the two volumes include a rigorous treatment of fluctuations in dynamical systems, topics in fractal analysis, studies of the transient dynamics in biological networks, synchronization in lasers, and control of chaotic systems, among others. This book also: ·         Develops applications of nonlinear dynamics on a diversity of topics such as patterns of synchrony in neuronal networks, laser synchronization, control of chaotic systems, and the study of transient dynam...

  12. Application of lumped-parameter models

    Energy Technology Data Exchange (ETDEWEB)

    Ibsen, Lars Bo; Liingaard, M.

    2006-12-15

    This technical report concerns the lumped-parameter models for a suction caisson with a ratio between skirt length and foundation diameter equal to 1/2, embedded in a viscoelastic soil. The models are presented for three different values of the shear modulus of the subsoil. Subsequently, the assembly of the dynamic stiffness matrix for the foundation is considered, and the solution for obtaining the steady state response, when using lumped-parameter models is given. (au)

  13. Application of Actuarial Modelling in Insurance Industry

    OpenAIRE

    Burcã Ana-Maria; Bãtrînca Ghiorghe

    2011-01-01

    In insurance industry, the financial stability of insurance companies represents an issue of vital importance. In order to maintain the financial stability and meet minimum regulatory requirements, actuaries apply actuarial modeling. Modeling has been at the center of actuarial science and of all the sciences from the beginning of their journey. In insurance industry, actuarial modeling creates a framework that allows actuaries to identify, understand, quantify and manage a wide range of risk...

  14. Application of lumped-parameter models

    DEFF Research Database (Denmark)

    Ibsen, Lars Bo; Liingaard, Morten

    This technical report concerns the lumped-parameter models for a suction caisson with a ratio between skirt length and foundation diameter equal to 1/2, embedded in a viscoelastic soil. The models are presented for three different values of the shear modulus of the subsoil (section 1.1)...

  15. New advances in statistical modeling and applications

    CERN Document Server

    Santos, Rui; Oliveira, Maria; Paulino, Carlos

    2014-01-01

    This volume presents selected papers from the XIXth Congress of the Portuguese Statistical Society, held in the town of Nazaré, Portugal, from September 28 to October 1, 2011. All contributions were selected after a thorough peer-review process. It covers a broad range of papers in the areas of statistical science, probability and stochastic processes, extremes and statistical applications.

  16. Nuclear structure models: Applications and development

    International Nuclear Information System (INIS)

    Semmes, P.B.

    1992-07-01

    This report discusses the following topics: Studies of Superdeformed States; Signature Inversion in Odd-Odd Nuclei: A Fingerprint of Triaxiality; Signature Inversion in 120Cs - Evidence for a Residual p-n Interaction; Signatures of γ Deformation in Nuclei and an Application to 125Xe; Nuclear Spins and Moments: Fundamental Structural Information; and Electromagnetic Properties of 181Ir: Evidence of β Stretching

  17. Applications: simple models and difficult theorems

    NARCIS (Netherlands)

    Litvak, Nelli; van de Geer, Sara; Wegkamp, Marten

    2012-01-01

    In this short article I will discuss three papers written by Willem van Zwet with three different co-authors: Mathisca de Gunst, Marta Fiocco, and myself. Each of the papers focuses on one particular application: growth of the number of biological cells [3], spreading of an infection [7], and the

  18. A model for assessment of telemedicine applications

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Ekeland, Anne Granstrøm; Jensen, Lise Kvistgaard

    2012-01-01

    Telemedicine applications could potentially solve many of the challenges faced by the healthcare sectors in Europe. However, a framework for assessment of these technologies is needed by decision makers to assist them in choosing the most efficient and cost-effective technologies. Therefore in 2009...

  19. Asteroid thermal modeling: recent developments and applications

    NARCIS (Netherlands)

    Harris, A. W.; Mueller, M.

    2006-01-01

    A variety of thermal models are used for the derivation of asteroid physical parameters from thermal-infrared observations. Simple models based on spherical geometry are often adequate for obtaining sizes and albedos when very little information about an object is available. However, sophisticated

  20. Optical Coherence Tomography: Modeling and Applications

    DEFF Research Database (Denmark)

    Thrane, Lars

    An analytical model is presented that is able to describe the performance of OCT systems in both the single and multiple scattering regimes simultaneously. This model inherently includes the shower curtain effect, well-known for light propagation through the atmosphere. This effect has been omitted...... in previous theoretical models of OCT systems. It is demonstrated that the shower curtain effect is of utmost importance in the theoretical description of an OCT system. The analytical model, together with proper noise analysis of the OCT system, enables calculation of the SNR, where the optical properties...... geometry, i.e., reflection geometry, is developed. As in the new OCT model, multiply scattered photons have been taken into account together with multiple scattering effects. As an important result, a novel method of creating images based on measurements of the momentum width of the Wigner phase...

  1. Development and application of air quality models at the US ...

    Science.gov (United States)

    Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computation Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation will provide a simple overview of air quality model development and application geared toward a non-technical student audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  2. Linear and Generalized Linear Mixed Models and Their Applications

    CERN Document Server

    Jiang, Jiming

    2007-01-01

    This book covers two major classes of mixed effects models, linear mixed models and generalized linear mixed models, and it presents an up-to-date account of theory and methods in analysis of these models as well as their applications in various fields. The book offers a systematic approach to inference about non-Gaussian linear mixed models. Furthermore, it has included recently developed methods, such as mixed model diagnostics, mixed model selection, and jackknife method in the context of mixed models. The book is aimed at students, researchers and other practitioners who are interested

  3. Fuzzy modeling and control theory and applications

    CERN Document Server

    Matía, Fernando; Jiménez, Emilio

    2014-01-01

    Much work on fuzzy control, covering research, development and applications, has been developed in Europe since the 90's. Nevertheless, the existing books in the field are compilations of articles without interconnection or logical structure or they express the personal point of view of the author. This book compiles the developments of researchers with demonstrated experience in the field of fuzzy control following a logic structure and a unified the style. The first chapters of the book are dedicated to the introduction of the main fuzzy logic techniques, where the following chapters focus on concrete applications. This book is supported by the EUSFLAT and CEA-IFAC societies, which include a large number of researchers in the field of fuzzy logic and control. The central topic of the book, Fuzzy Control, is one of the main research and development lines covered by these associations.

  4. Some Issues of Biological Shape Modelling with Applications

    DEFF Research Database (Denmark)

    Larsen, Rasmus; Hilger, Klaus Baggesen; Skoglund, Karl

    2003-01-01

    This paper illustrates current research at Informatics and Mathematical Modelling at the Technical University of Denmark within biological shape modelling. We illustrate a series of generalizations to, modifications to, and applications of the elements of constructing models of shape or appearanc...

  5. The application of a hierarchical Bayesian spatiotemporal model for ...

    Indian Academy of Sciences (India)

    2005.09.070. Sahu S K and Bakar K S 2012 Hierarchical Bayesian autoregressive models for large space-time data with application to ozone concentration modeling; Appl. Stochastic Models Bus. Ind. 28 395–415, doi: 10.1002/asmb.1951.

  6. Simple mathematical models of symmetry breaking. Application to particle physics

    International Nuclear Information System (INIS)

    Michel, L.

    1976-01-01

    Some mathematical facts relevant to symmetry breaking are presented. A first mathematical model deals with the smooth action of compact Lie groups on real manifolds, a second model considers linear action of any group on real or complex finite dimensional vector spaces. Application of the mathematical models to particle physics is considered. (B.R.H.)

  7. Modeling Students' Memory for Application in Adaptive Educational Systems

    Science.gov (United States)

    Pelánek, Radek

    2015-01-01

    Human memory has been thoroughly studied and modeled in psychology, but mainly in laboratory setting under simplified conditions. For application in practical adaptive educational systems we need simple and robust models which can cope with aspects like varied prior knowledge or multiple-choice questions. We discuss and evaluate several models of…

  8. Four Applications of the TIGRIS Model in the Netherlands

    NARCIS (Netherlands)

    Eradus, P.; Schoenmakers, A.; van der Hoorn, A.I.J.M.

    2002-01-01

    This paper presents the land-use transportation interaction model TIGRIS for the Netherlands. Four studies have been conducted in the past few years using increasingly sophisticated versions of the model. The paper places the model applications in their geographical context, provides an overview of

  9. Mathematical modeling and applications in nonlinear dynamics

    CERN Document Server

    Merdan, Hüseyin

    2016-01-01

    The book covers nonlinear physical problems and mathematical modeling, including molecular biology, genetics, neurosciences, artificial intelligence with classical problems in mechanics and astronomy and physics. The chapters present nonlinear mathematical modeling in life science and physics through nonlinear differential equations, nonlinear discrete equations and hybrid equations. Such modeling can be effectively applied to the wide spectrum of nonlinear physical problems, including the KAM (Kolmogorov-Arnold-Moser (KAM)) theory, singular differential equations, impulsive dichotomous linear systems, analytical bifurcation trees of periodic motions, and almost or pseudo- almost periodic solutions in nonlinear dynamical systems. Provides methods for mathematical models with switching, thresholds, and impulses, each of particular importance for discontinuous processes Includes qualitative analysis of behaviors on Tumor-Immune Systems and methods of analysis for DNA, neural networks and epidemiology Introduces...

  10. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  11. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  12. Model Order Reduction: Application to Electromagnetic Problems

    OpenAIRE

    Paquay, Yannick

    2017-01-01

    With the increase in computational resources, numerical modeling has grown exponentially these last two decades. From structural analysis to combustion modeling and electromagnetics, discretization methods, in particular the finite element method, have had a tremendous impact. Their main advantage consists in a correct representation of dynamical and nonlinear behaviors by solving equations at local scale; however, the spatial discretization inherent to such approaches is also its main drawbac...

  13. A Component-based Programming Model for Composite, Distributed Applications

    Science.gov (United States)

    Eidson, Thomas M.; Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    The nature of scientific programming is evolving to larger, composite applications that are composed of smaller element applications. These composite applications are more frequently being targeted for distributed, heterogeneous networks of computers. They are most likely programmed by a group of developers. Software component technology and computational frameworks are being proposed and developed to meet the programming requirements of these new applications. Historically, programming systems have had a hard time being accepted by the scientific programming community. In this paper, a programming model is outlined that attempts to organize the software component concepts and fundamental programming entities into programming abstractions that will be better understood by the application developers. The programming model is designed to support computational frameworks that manage many of the tedious programming details, but also that allow sufficient programmer control to design an accurate, high-performance application.

  14. HTGR Application Economic Model Users' Manual

    Energy Technology Data Exchange (ETDEWEB)

    A.M. Gandrik

    2012-01-01

    The High Temperature Gas-Cooled Reactor (HTGR) Application Economic Model was developed at the Idaho National Laboratory for the Next Generation Nuclear Plant Project. The HTGR Application Economic Model calculates either the required selling price of power and/or heat for a given internal rate of return (IRR) or the IRR for power and/or heat being sold at the market price. The user can generate these economic results for a range of reactor outlet temperatures; with and without power cycles, including either a Brayton or Rankine cycle; for the demonstration plant, first of a kind, or nth of a kind project phases; for up to 16 reactor modules; and for module ratings of 200, 350, or 600 MWt. This user's manual contains the mathematical models and operating instructions for the HTGR Application Economic Model. Instructions, screenshots, and examples are provided to guide the user through the HTGR Application Economic Model. This model was designed for users who are familiar with the HTGR design, Excel, and engineering economics. Modification of the HTGR Application Economic Model should only be performed by users familiar with the HTGR and its applications, Excel, and Visual Basic.
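
    The IRR side of such a calculation can be illustrated with a minimal sketch. The cash-flow numbers are invented, and solving for the zero-NPV rate by bisection is one generic approach, not the model's actual Excel/Visual Basic implementation:

    ```python
    def npv(rate, cash_flows):
        """Net present value of cash flows indexed by year, at a discount rate."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

    def irr(cash_flows, lo=-0.9, hi=10.0):
        """Discount rate at which NPV is zero, found by bisection
        (assumes a single sign change of NPV on [lo, hi])."""
        for _ in range(200):
            mid = (lo + hi) / 2.0
            if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2.0

    # Illustrative project: initial outlay, then equal annual revenues
    flows = [-1000.0] + [300.0] * 5
    rate = irr(flows)  # ≈ 0.152 (about a 15.2 % internal rate of return)
    ```

    The converse mode described in the abstract (required selling price for a target IRR) amounts to solving the same NPV equation for the revenue term instead of the rate.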

  15. Recognizing textual entailment models and applications

    CERN Document Server

    Dagan, Ido; Sammons, Mark

    2013-01-01

    In the last few years, a number of NLP researchers have developed and participated in the task of Recognizing Textual Entailment (RTE). This task encapsulates Natural Language Understanding capabilities within a very simple interface: recognizing when the meaning of a text snippet is contained in the meaning of a second piece of text. This simple abstraction of an exceedingly complex problem has broad appeal partly because it can be conceived also as a component in other NLP applications, from Machine Translation to Semantic Search to Information Extraction. It also avoids commitment to any sp

  16. Specification, Model Generation, and Verification of Distributed Applications

    OpenAIRE

    Madelaine, Eric

    2011-01-01

    Since 2001, in the Oasis team, I have developed research on the semantics of applications based on distributed objects, applying my previous research in the field of process algebras in the context of a real language and of applications of realistic size. The various aspects of this work naturally include behavioral semantics and the definition of procedures for model generation, taking into account the different concepts of distributed applications, but also, upstream, static code analysis a...

  17. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes
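
    The multiplicative-chain result mentioned above has a simple closed form: a product of independent lognormal inputs is itself lognormal, with log-scale mean and variance equal to the sums of the inputs' parameters. A numerical check of that fact (the parameter values are illustrative, not from the report):

    ```python
    import numpy as np

    # Three independent lognormal factors with (mu, sigma) on the log scale
    rng = np.random.default_rng(2)
    params = [(0.0, 0.3), (0.5, 0.4), (-0.2, 0.2)]

    # Multiplicative chain model: output = product of the factors
    samples = np.prod(
        [rng.lognormal(mu, s, size=200_000) for mu, s in params], axis=0
    )

    # Analytic result: log(output) ~ Normal(sum of mus, sum of sigma^2)
    mu_sum = sum(mu for mu, _ in params)       # 0.3
    var_sum = sum(s ** 2 for _, s in params)   # 0.29

    log_out = np.log(samples)
    ```

    This is what makes uncertainty propagation through such chain models tractable analytically, with Latin hypercube sampling reserved for the sensitivity analyses where no closed form exists.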

  18. Simulation models generator. Applications in scheduling

    Directory of Open Access Journals (Sweden)

    Omar Danilo Castrillón

    2013-08-01

    Rev. Mate. Teor. Aplic. (ISSN 1409-2433) Vol. 20(2): 231–241, July 2013. ... in order to approach reality and thus evaluate decisions more assertively. To test the prototype, a production system with 9 machines and 5 jobs in a job shop configuration was used as the modeling example, with stochastic processing times and machine stops, measuring machine utilization rates and the average time of jobs in the system as measures of system performance. This test shows the goodness of the prototype, saving the user the building of the simulation model.

  19. Sparse modeling theory, algorithms, and applications

    CERN Document Server

    Rish, Irina

    2014-01-01

    ""A comprehensive, clear, and well-articulated book on sparse modeling. This book will stand as a prime reference to the research community for many years to come.""-Ricardo Vilalta, Department of Computer Science, University of Houston""This book provides a modern introduction to sparse methods for machine learning and signal processing, with a comprehensive treatment of both theory and algorithms. Sparse Modeling is an ideal book for a first-year graduate course.""-Francis Bach, INRIA - École Normale Supřieure, Paris

  20. Co-clustering models, algorithms and applications

    CERN Document Server

    Govaert, Gérard

    2013-01-01

    Cluster or co-cluster analyses are important tools in a variety of scientific areas. The introduction of this book presents a state of the art of already well-established, as well as more recent methods of co-clustering. The authors mainly deal with the two-mode partitioning under different approaches, but pay particular attention to a probabilistic approach. Chapter 1 concerns clustering in general and the model-based clustering in particular. The authors briefly review the classical clustering methods and focus on the mixture model. They present and discuss the use of different mixture

  1. A cutting force model for micromilling applications

    DEFF Research Database (Denmark)

    Bissacco, Giuliano; Hansen, Hans Nørgaard; De Chiffre, Leonardo

    2006-01-01

    In micro milling the maximum uncut chip thickness is often smaller than the cutting edge radius. This paper introduces a new cutting force model for ball nose micro milling that is capable of taking into account the effect of the edge radius.

  2. Model castings with composite surface layer - application

    Directory of Open Access Journals (Sweden)

    J. Szajnar

    2008-10-01

    The paper presents a method of improving the usable properties of surface layers of cast carbon steel 200–450 by applying, directly in the founding process, a composite surface layer based on an Fe-Cr-C alloy. The composite surface layer technology mainly guarantees increased hardness and abrasive wear resistance of cast steel castings for machine elements. This technology can compete with the generally applied welding technologies (surfacing by welding and thermal spraying). In the course of the study, cast steel test castings with a composite surface layer were made, and their usability for industrial applications was estimated by the criteria of hardness, abrasive wear resistance of the metal-mineral type, and quality of the cast steel–(Fe-Cr-C) joint. Based on the conducted studies, the thesis that the composite surface layer arises from the liquid state was formulated. Moreover, the thickness and hardness of the composite layer can be controlled by suitable selection of parameters, i.e., insert thickness, pouring temperature, and solidification modulus of the casting. The possibility of applying the composite surface layer technology in the manufacture of cast steel slide bushes for combined cutter-loaders is presented.

  3. Possibilities of the Statistical Scoring Models' Application at Lithuanian Banks

    OpenAIRE

    Dzidzevičiūtė, Laima

    2013-01-01

    The goal of this dissertation is to develop the rating system of Lithuanian companies based on the statistical scoring model and assess the possibilities of this system's application at Lithuanian banks. The dissertation consists of three Chapters. Development and application peculiarities of rating systems based on statistical scoring models are described in the first Chapter. In the second Chapter the results of the survey of commercial banks and foreign bank branches, operating in the coun...

  4. An ocean scatter propagation model for aeronautical satellite communication applications

    Science.gov (United States)

    Moreland, K. W.

    1990-01-01

    In this paper an ocean scattering propagation model, developed for aircraft-to-satellite (aeronautical) applications, is described. The purpose of the propagation model is to characterize the behavior of sea reflected multipath as a function of physical propagation path parameters. An accurate validation against the theoretical far field solution for a perfectly conducting sinusoidal surface is provided. Simulation results for typical L band aeronautical applications with low complexity antennas are presented.

  5. Handbook of mixed membership models and their applications

    CERN Document Server

    Airoldi, Edoardo M; Erosheva, Elena A; Fienberg, Stephen E

    2014-01-01

    In response to scientific needs for more diverse and structured explanations of statistical data, researchers have discovered how to model individual data points as belonging to multiple groups. Handbook of Mixed Membership Models and Their Applications shows you how to use these flexible modeling tools to uncover hidden patterns in modern high-dimensional multivariate data. It explores the use of the models in various application settings, including survey data, population genetics, text analysis, image processing and annotation, and molecular biology.Through examples using real data sets, yo

  6. Venusian Applications of 3D Convection Modeling

    Science.gov (United States)

    Bonaccorso, Timary Annie

    2011-01-01

    This study models mantle convection on Venus using the 'cubed sphere' code OEDIPUS, which models one-sixth of the planet in spherical geometry. We are attempting to balance internal heating, bottom mantle viscosity, and the temperature difference across Venus' mantle in order to create a realistic model that matches current planetary observations. We have also begun to run both lower and upper mantle simulations to determine whether layered (as opposed to whole-mantle) convection might produce more efficient heat transfer, as well as to model coronae formation in the upper mantle. Upper mantle simulations are completed using OEDIPUS' Cartesian counterpart, JOCASTA. This summer's central question has been how to define a mantle plume. Traditionally, we have defined a hot plume as the region with temperature at or above 40% of the difference between the maximum and horizontally averaged temperature, and a cold plume as the region within 40% of the difference between the minimum and average temperature. For less viscous cases (10^20 Pa·s), the plumes generated by that definition lacked vigor, displaying buoyancies 1/100th of those found in previous, higher-viscosity simulations (10^21 Pa·s). As mantle plumes with large buoyancy flux are most likely to produce topographic uplift and volcanism, the low-viscosity cases' plumes may not produce observable deformation. In an effort to eliminate the smallest plumes, we experimented with different lower-bound parameters and temperature percentages.
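The plume definition quoted in the abstract is a simple threshold on the temperature field: a cell is a hot plume if T >= T_avg + 0.4*(T_max - T_avg), and a cold plume if T <= T_avg - 0.4*(T_avg - T_min). A sketch of those masks (the flat-list layout of the temperature field is a hypothetical simplification of the 3D grid):

```python
def plume_masks(temps, frac=0.4):
    """Flag hot/cold plume cells against the horizontally averaged temperature.

    hot:  T >= T_avg + frac * (T_max - T_avg)
    cold: T <= T_avg - frac * (T_avg - T_min)
    """
    t_avg = sum(temps) / len(temps)
    t_max, t_min = max(temps), min(temps)
    hot_cut = t_avg + frac * (t_max - t_avg)
    cold_cut = t_avg - frac * (t_avg - t_min)
    hot = [t >= hot_cut for t in temps]
    cold = [t <= cold_cut for t in temps]
    return hot, cold

# a single hot anomaly in an otherwise cold layer
hot, cold = plume_masks([0.0, 0.0, 0.0, 0.0, 10.0])
```

Lowering `frac` is the kind of "temperature percentage" experiment the abstract mentions: it widens the masks and admits weaker plumes.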

  7. Application of reflection in model transformation languages

    NARCIS (Netherlands)

    Ivanov, Ivan

    Computational reflection is a well known technique applied in many existing programming languages ranging from functional to object-oriented languages. In this paper we study the possibilities and benefits of introducing and using reflection in rule-based model transformation languages. The paper

  8. The application of an empowerment model

    NARCIS (Netherlands)

    Molleman, E; van Delft, B; Slomp, J

    2001-01-01

    In this study we applied an empowerment model that focuses on (a) the need for empowerment in light of organizational strategy, (b) job design issues such as job enlargement and job enrichment that facilitate empowerment, and (c) the abilities, and (d) the attitudes of workers that make empowerment

  9. System identification application using Hammerstein model

    Indian Academy of Sciences (India)

    Saban Ozer


  10. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

  11. System identification application using Hammerstein model

    Indian Academy of Sciences (India)

    Saban Ozer

    because of its advanced theoretical background [3–5, 10]. However, many systems in real life have nonlinear beha- ... To describe a polynomial non-linear system with memory, the Volterra series expansion has been the ... suppression and adaptive noise suppression [19]. 2.3 Hammerstein model. Many systems can be ...

  12. Some studies on cutting force and temperature in machining Ti-6Al-4V alloy using regression analysis and ANOVA

    Directory of Open Access Journals (Sweden)

    K.Satyanarayana

    2013-06-01

    The present work deals with the cutting forces and cutting temperature produced during turning of the titanium alloy Ti-6Al-4V with PVD TiN-coated tungsten carbide inserts in a dry environment. First-order mathematical models are developed using multiple regression analysis, and the process parameters are optimized using contour plots. The models present high determination coefficients (R² = 0.964 and 0.989), explaining 96.4% and 98.9% of the variability in cutting force and cutting temperature respectively, which indicates the goodness of fit and high significance of the models. The developed mathematical models correlate cutting force and temperature with the process parameters to a good degree of approximation. From the contour plots, the optimal parametric combination for the lowest cutting force is v3 (75 m/min) – f1 (0.25 mm/rev); similarly, the optimal combination for minimum temperature is v1 (45 m/min) – f1 (0.25 mm/rev). Cutting speed is found to be the most significant parameter for cutting force, followed by feed; for cutting temperature, feed is the most influential parameter, followed by cutting speed.
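A first-order model of this kind is a multiple linear regression, e.g. F = b0 + b1*v + b2*f, fitted by least squares, with R² = 1 - SS_res/SS_tot measuring goodness of fit. A self-contained sketch using the normal equations (the data below are hypothetical, not the paper's measurements):

```python
def fit_first_order(xs, ys):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 via the normal equations."""
    rows = [[1.0, x1, x2] for x1, x2 in xs]          # design matrix [1, x1, x2]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    # Gauss-Jordan elimination with partial pivoting
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(xtx[r][c]))
        xtx[c], xtx[p] = xtx[p], xtx[c]
        xty[c], xty[p] = xty[p], xty[c]
        for r in range(3):
            if r != c:
                m = xtx[r][c] / xtx[c][c]
                xtx[r] = [a - m * b for a, b in zip(xtx[r], xtx[c])]
                xty[r] -= m * xty[c]
    return [xty[i] / xtx[i][i] for i in range(3)]

def r_squared(xs, ys, b):
    """Determination coefficient R^2 = 1 - SS_res / SS_tot."""
    preds = [b[0] + b[1] * x1 + b[2] * x2 for x1, x2 in xs]
    ybar = sum(ys) / len(ys)
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    ss_tot = sum((y - ybar) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# hypothetical (speed, feed) -> force data generated from 500 - 3*v + 800*f
xs = [(v, f) for v in (45.0, 60.0, 75.0) for f in (0.25, 0.30, 0.35)]
ys = [500.0 - 3.0 * v + 800.0 * f for v, f in xs]
b = fit_first_order(xs, ys)       # recovers approximately (500, -3, 800)
r2 = r_squared(xs, ys, b)
```

With noisy measurements, R² drops below 1 and its magnitude plays the role the abstract's 0.964 and 0.989 values play.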

  13. Application of Prognostic Mesoscale Modeling in the Southeast United States

    International Nuclear Information System (INIS)

    Buckley, R.L.

    1999-01-01

    A prognostic model is being used to provide regional forecasts for a variety of applications at the Savannah River Site (SRS). Emergency response dispersion models available at SRS use the space and time-dependent meteorological data provided by this model to supplement local and regional observations. Output from the model is also used locally to aid in forecasting at SRS, and regionally in providing forecasts of the potential time and location of hurricane landfall within the southeast United States

  14. Desublimation process: verification and applications of a theoretical model

    International Nuclear Information System (INIS)

    Eby, R.S.

    1979-01-01

    A theoretical model simulating the simultaneous heat and mass transfer which takes place during the desublimation of a gas to a solid is presented. Desublimer column loading profiles to experimentally verify the model were obtained using a gamma scintillation technique. The data indicate that, if the physical parameters of the desublimed frost material are known, the model can accurately predict the desublimation phenomenon. The usefulness of the model in different engineering applications is also addressed

  15. Potential model application and planning issues

    Directory of Open Access Journals (Sweden)

    Christiane Weber

    2000-03-01

    The potential model has been, and remains, a spatial interaction model used for various problems in the human sciences; however, its use by Donnay (1997, 1995, 1994) and Binard (1995), who introduced image-processing results as an application support, opened the way to innovative applications, for example for determining the urban boundary or local hinterlands. The possible articulations between applying the potential model to imagery and using Geographic Information System layers have allowed the temporal evaluation of urban development trends (Weber, 1998). Taking up this idea, the proposed study attempts to identify the forms of urban development of the Urban Community of Strasbourg (CUS), taking into account land use, the characteristics of communication networks, urban regulations, and the environmental constraints bearing on the study area. The initial land-use state, obtained by statistical processing, is used as input to the potential model in order to obtain potential surfaces associated with specific spatial characteristics: the extension of the urban form, the preservation of natural or agricultural zones, or regulations. The results are then combined and classified. This application was carried out to confront the method with the actual development of the CUS, determined by a diachronic study comparing satellite images (SPOT 1986–SPOT 1998). To verify the interest and accuracy of the method, the satellite results were compared with those resulting from the classification of the potential surfaces. The development zones identified with the potential model were confirmed by the results of the temporal analysis performed on the images. A differentiation of zones in ...

  16. Adaptive Networks Theory, Models and Applications

    CERN Document Server

    Gross, Thilo

    2009-01-01

    With adaptive, complex networks, the evolution of the network topology and the dynamical processes on the network are equally important and often fundamentally entangled. Recent research has shown that such networks can exhibit a plethora of new phenomena which are ultimately required to describe many real-world networks. Some of those phenomena include robust self-organization towards dynamical criticality, formation of complex global topologies based on simple, local rules, and the spontaneous division of "labor" in which an initially homogenous population of network nodes self-organizes into functionally distinct classes. These are just a few. This book is a state-of-the-art survey of those unique networks. In it, leading researchers set out to define the future scope and direction of some of the most advanced developments in the vast field of complex network science and its applications.

  17. Phenomenological BRDF modeling for engineering applications

    Science.gov (United States)

    Jafolla, James C.; Stokes, Jeffrey A.; Sullivan, Robert J.

    1997-09-01

    The application of analytical light scattering techniques for virtual prototyping of the optical performance of paint coatings provides an effective tool for optimizing paint design for specific optical requirements. This paper describes the phenomenological basis for the scattering coatings computer aided design (ScatCad) code. The ScatCad code predicts the bidirectional reflectance distribution function (BRDF) and the hemispherical directional reflectance (HDR) of pigmented paint coatings for the purpose of coating design optimization. The code uses techniques for computing the pigment single-scattering phase function, multiple-scattering radiative transfer, and rough surface scattering to calculate the BRDF and HDR based on the fundamental optical properties of the pigment(s) and binder, pigment number density and size distribution, and surface roughness of the binder interface and substrate. This is a significant enhancement to the two-flux, Kubelka-Munk analysis that has traditionally been used in the coatings industry. Example calculations and comparison with measurements are also presented.

  18. Dual energy CT. Physical models and applications

    International Nuclear Information System (INIS)

    Sedlmair, Martin Ulrich

    2010-01-01

    Computed tomography (CT) is today a very important non-invasive imaging tool for medical diagnostics. Despite the non-negligible radiation doses to patients and medical personnel, certain diagnostic questions can only be answered using CT methods. Recent developments adding a second radiation source and a second detector (dual-source CT) allow imaging of the beating heart thanks to an improved acquisition time. Operation of the X-ray tubes with different voltages (dual-energy) and appropriate data processing methods allows extended information on tissue composition and pathological structures and improved visualization of lesions. The contribution covers the basic physical background of this technology and is focused on applications, for instance CT-guided angiography.

  19. Update on GOCART Model Development and Applications

    Science.gov (United States)

    Kim, Dongchul

    2013-01-01

    Recent results from the GOCART and GMI models are reported. They include: updated emission inventories for anthropogenic and volcano sources, satellite-derived vegetation index for seasonal variations of dust emission, MODIS-derived smoke AOT for assessing uncertainties of biomass-burning emissions, long-range transport of aerosol across the Pacific Ocean, and model studies on the multi-decadal trend of regional and global aerosol distributions from 1980 to 2010, volcanic aerosols, and nitrate aerosols. The document was presented at the 2013 AEROCENTER Annual Meeting held at the GSFC Visitors Center, May 31, 2013. The organizers of the meeting are posting the talks to the public Aerocenter website after the meeting.

  20. A clinical application of the training model.

    Science.gov (United States)

    Gianotti, Patricia

    2010-03-01

    This article offers a perspective and a summary of Jack Danielian's (2010) Horneyan training model, highlighting the benefits of a meta-psychological approach for analysts in training and seasoned practitioners alike. To help illustrate the complexity of Karen Horney's views of character structure and character pathology, this article presents a model that reflects the dynamic tensions at play within individuals with narcissistic issues. It suggests that therapeutic listening can be tracked and that thematic material unfolds in a somewhat predictable, sequential, yet altogether systemic manner. Listening is not just art or intuition, nor is it merely interpretation of content based on a theoretical framework. It represents a way of holding the dialectic tension between conscious and unconscious, syntonic and dystonic. If we can better track these dynamic tensions, we can better anticipate and hopefully avoid clinical ruptures through the acting out of negative transference.

  1. Computational hemodynamics theory, modelling and applications

    CERN Document Server

    Tu, Jiyuan; Wong, Kelvin Kian Loong

    2015-01-01

    This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system.  Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...

  2. Fuzzy Stochastic Optimization Theory, Models and Applications

    CERN Document Server

    Wang, Shuming

    2012-01-01

    Covering in detail both theoretical and practical perspectives, this book is a self-contained and systematic depiction of current fuzzy stochastic optimization that deploys the fuzzy random variable as a core mathematical tool to model the integrated fuzzy random uncertainty. It proceeds in an orderly fashion from the requisite theoretical aspects of the fuzzy random variable to fuzzy stochastic optimization models and their real-life case studies.   The volume reflects the fact that randomness and fuzziness (or vagueness) are two major sources of uncertainty in the real world, with significant implications in a number of settings. In industrial engineering, management and economics, the chances are high that decision makers will be confronted with information that is simultaneously probabilistically uncertain and fuzzily imprecise, and optimization in the form of a decision must be made in an environment that is doubly uncertain, characterized by a co-occurrence of randomness and fuzziness. This book begins...

  3. Atmospheric dispersion models for environmental pollution applications

    International Nuclear Information System (INIS)

    Gifford, F.A.

    1976-01-01

    Pollutants are introduced into the air by many of man's activities. The potentially harmful effects these can cause are, broadly speaking, of two kinds: long-term, possibly large-scale and widespread chronic effects, including long-term effects on the earth's climate; and acute, short-term effects such as those associated with urban air pollution. This section is concerned with mathematical cloud or plume models describing the role of the atmosphere, primarily in relation to the second of these, the acute effects of air pollution, i.e., those arising from comparatively high concentration levels. The need for such air pollution modeling studies has increased spectacularly as a result of the National Environmental Policy Act of 1969 and, especially, two key court decisions: the Calvert Cliffs decision and the Sierra Club ruling on environmental non-degradation.

  4. Application of pyrolysis models in COCOSYS

    International Nuclear Information System (INIS)

    Klein-Hessling, W.; Roewekamp, M.; Allelein, H.J.

    2001-01-01

    For the assessment of the efficiency of severe accident management measures the simulation of severe accident development, progression and potential consequences in containments of nuclear power plants is required under conditions as realistic as possible. Therefore, the containment code item (COCOSYS) has been developed by GRS. The main objective is to provide a code system on the basis of mechanistic models for the comprehensive simulation of all relevant processes and plant states during severe accidents in the containment of light water reactors also covering the design basis accidents. In this context the simulation of oil and cable fires is of high priority. These processes strongly depend on the thermal hydraulic boundary conditions. An input-definition of the pyrolysis rate by the user is not consistent with the philosophy of COCOSYS. Therefore, a first attempt has been made for the code internal simulation of the pyrolysis rate and the following combustion process for oil and cable fires. The oil fire model used has been tested against the HDR E41.7 experiment. Because the cable fire model is still under development, a so-called 'simplified cable burning' model has been implemented in COCOSYS and tested against the HDR E42 cable fire experiments. Furthermore, in the frame of the bilateral (between German and Ukrainian government) project INT9131 in the field of fire safety at nuclear power plants (NPP), an exemplary fire hazard analysis (FHA) has been carried out for the cable spreading rooms below the unit control room of a VVER-1000/W-320 type reference plant. (authors)

  5. Generalized data stacking programming model with applications

    OpenAIRE

    Hala Samir Elhadidy; Rawya Yehia Rizk; Hassen Taher Dorrah

    2016-01-01

    Recent researches have shown that, everywhere in various sciences the systems are following stacked-based stored change behavior when subjected to events or varying environments “on and above” their normal situations. This paper presents a generalized data stack programming (GDSP) model which is developed to describe the system changes under varying environment. These changes which are captured with different ways such as sensor reading are stored in matrices. Extraction algorithm and identif...

  6. Application of Bayesian Model Selection for Metal Yield Models using ALEGRA and Dakota.

    Energy Technology Data Exchange (ETDEWEB)

    Portone, Teresa; Niederhaus, John Henry; Sanchez, Jason James; Swiler, Laura Painton

    2018-02-01

    This report introduces the concepts of Bayesian model selection, which provides a systematic means of calibrating and selecting an optimal model to represent a phenomenon. This has many potential applications, including for comparing constitutive models. The ideas described herein are applied to a model selection problem between different yield models for hardened steel under extreme loading conditions.

  7. Initiating Events Modeling for On-Line Risk Monitoring Application

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.

    1998-01-01

    In order to make the on-line risk monitoring application of Probabilistic Risk Assessment more complete and realistic, special attention needs to be dedicated to initiating events modeling. Two different issues are of special importance: one is how to model initiating event frequencies according to current plant configuration (equipment alignment and out-of-service status) and operating conditions (weather and various activities), and the second is how to preserve dependencies between the initiating events model and the rest of the PRA model. First, the paper discusses how initiating events can be treated in an on-line risk monitoring application. Second, a practical example of initiating events modeling in EPRI's Equipment Out of Service on-line monitoring tool is presented. Gains from the application and possible improvements are discussed in the conclusion. (author)

  8. Functional Modelling for Fault Diagnosis and its application for NPP

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2014-01-01

    The paper presents functional modelling and its application for diagnosis in nuclear power plants. Functional modelling is defined, and its relevance for coping with the complexity of diagnosis in large-scale systems like nuclear plants is explained. The diagnosis task is analyzed. The use of MFM for reasoning about causes and consequences is explained in detail and demonstrated using the reasoning tool, the MFM Suite. MFM applications in nuclear power systems are described by two examples, a PWR and an FBR reactor. The PWR example shows how MFM can be used to model and reason about

  9. An introduction to queueing theory modeling and analysis in applications

    CERN Document Server

    Bhat, U Narayan

    2015-01-01

    This introductory textbook is designed for a one-semester course on queueing theory that does not require a course on stochastic processes as a prerequisite. By integrating the necessary background on stochastic processes with the analysis of models, the work provides a sound foundational introduction to the modeling and analysis of queueing systems for a wide interdisciplinary audience of students in mathematics, statistics, and applied disciplines such as computer science, operations research, and engineering. This edition includes additional topics in methodology and applications. Key features: • An introductory chapter including a historical account of the growth of queueing theory in more than 100 years. • A modeling-based approach with emphasis on identification of models. • Rigorous treatment of the foundations of basic models commonly used in applications with appropriate references for advanced topics. • Applications in manufacturing, and computer and communication systems. • A chapter on ...

  10. Nonlinear Inertia Classification Model and Application

    Directory of Open Access Journals (Sweden)

    Mei Wang

    2014-01-01

    The classification model of the support vector machine (SVM) overcomes the problem of a large number of samples, but the kernel parameter and the punishment factor have great influence on the quality of the SVM model. Particle swarm optimization (PSO) is an evolutionary search algorithm based on swarm intelligence, which is suitable for parameter optimization. Accordingly, a nonlinear inertia convergence classification model (NICCM) is proposed after the nonlinear inertia convergence PSO (NICPSO) is developed in this paper. The velocity of NICPSO is first defined as the weighted velocity of the inertia PSO, and the inertia factor is selected to be a nonlinear function. NICPSO is used to optimize the kernel parameter and punishment factor of the SVM. Then, the NICCM classifier is trained using the optimal punishment factor and optimal kernel parameter that come from the optimal particle. Finally, NICCM is applied to the classification of the normal and fault states of an online power cable. It is experimentally proved that the number of iterations for the proposed NICPSO to reach the optimal position decreases from 15 to 5 compared with PSO; the training duration is decreased by 0.0052 s and the recognition precision is increased by 4.12% compared with SVM.
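The core mechanism here is inertia-weight PSO searching over the SVM's two hyperparameters (penalty factor C and kernel parameter gamma). The sketch below implements a basic PSO with a nonlinearly (quadratically) decaying inertia weight; the paper's specific nonlinear inertia function and the SVM cross-validation objective are not reproduced, so a convex quadratic stands in for the validation error:

```python
import random

def pso(objective, bounds, n_particles=20, n_iter=60, seed=1):
    """Basic PSO minimizing `objective` over box `bounds`.

    Inertia weight decays quadratically from ~0.9 to 0.4 (illustrative
    stand-in for the paper's nonlinear inertia function).
    """
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for t in range(n_iter):
        w = 0.4 + 0.5 * (1.0 - t / n_iter) ** 2     # nonlinear inertia decay
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + 2.0 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 2.0 * rng.random() * (gbest[d] - pos[i][d]))
                # clamp position to the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# hypothetical stand-in for SVM cross-validation error over (C, gamma),
# with a minimum at C=10, gamma=0.5
err = lambda p: (p[0] - 10.0) ** 2 + 100.0 * (p[1] - 0.5) ** 2
best, val = pso(err, bounds=[(0.1, 100.0), (0.001, 10.0)])
```

In the actual NICCM setup, `err` would be replaced by the SVM's validation error, and the best particle's coordinates would become the trained classifier's C and gamma.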

  11. Modeling protein structures: construction and their applications.

    Science.gov (United States)

    Ring, C S; Cohen, F E

    1993-06-01

    Although no general solution to the protein folding problem exists, the three-dimensional structures of proteins are being successfully predicted when experimentally derived constraints are used in conjunction with heuristic methods. In the case of interleukin-4, mutagenesis data and CD spectroscopy were instrumental in the accurate assignment of secondary structure. In addition, the tertiary structure was highly constrained by six cysteines separated by many residues that formed three disulfide bridges. Although the correct structure was a member of a short list of plausible structures, the "best" structure was the topological enantiomer of the experimentally determined conformation. For many proteases, other experimentally derived structures can be used as templates to identify the secondary structure elements. In a procedure called modeling by homology, the structure of a known protein is used as a scaffold to predict the structure of another related protein. This method has been used to model a serine and a cysteine protease that are important in the schistosome and malarial life cycles, respectively. The model structures were then used to identify putative small molecule enzyme inhibitors computationally. Experiments confirm that some of these nonpeptidic compounds are active at concentrations of less than 10 microM.

  12. Applications of GARCH models to energy commodities

    Science.gov (United States)

    Humphreys, H. Brett

    This thesis uses GARCH methods to examine different aspects of the energy markets. The first part of the thesis examines seasonality in the variance. This study modifies the standard univariate GARCH models to test for seasonal components in both the constant and the persistence in natural gas, heating oil and soybeans. These commodities exhibit seasonal price movements and, therefore, may exhibit seasonal variances. In addition, the heating oil model is tested for a structural change in variance during the Gulf War. The results indicate the presence of an annual seasonal component in the persistence for all commodities. Out-of-sample volatility forecasting for natural gas outperforms standard forecasts. The second part of this thesis uses a multivariate GARCH model to examine volatility spillovers within the crude oil forward curve and between the London and New York crude oil futures markets. Using these results the effect of spillovers on dynamic hedging is examined. In addition, this research examines cointegration within the oil markets using investable returns rather than fixed prices. The results indicate the presence of strong volatility spillovers between both markets, weak spillovers from the front of the forward curve to the rest of the curve, and cointegration between the long term oil price on the two markets. The spillover dynamic hedge models lead to a marginal benefit in terms of variance reduction, but a substantial decrease in the variability of the dynamic hedge; thereby decreasing the transactions costs associated with the hedge. The final portion of the thesis uses portfolio theory to demonstrate how the energy mix consumed in the United States could be chosen given a national goal to reduce the risks to the domestic macroeconomy of unanticipated energy price shocks. An efficient portfolio frontier of U.S. energy consumption is constructed using a covariance matrix estimated with GARCH models. The results indicate that while the electric
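
    The seasonal-variance idea tested in the first part of the thesis can be sketched with a GARCH(1,1) recursion whose constant term carries an annual seasonal component. The parameter values below are illustrative, not estimates from the thesis, and a monthly (period-12) cycle stands in for the annual seasonality.

```python
import math
import random

random.seed(1)

# Illustrative GARCH(1,1) parameters (not estimates from the thesis)
OMEGA, ALPHA, BETA = 0.05, 0.10, 0.85
SEASONAL_AMP = 0.03  # amplitude of the seasonal term in the constant

def simulate(n=240):
    """Simulate returns under GARCH(1,1) with a seasonal constant term."""
    returns, variances = [], []
    sigma2 = OMEGA / (1.0 - ALPHA - BETA)  # start at unconditional variance
    eps = 0.0
    for t in range(n):
        # annual seasonal component in the constant, period 12 (months)
        season = SEASONAL_AMP * math.cos(2.0 * math.pi * (t % 12) / 12.0)
        sigma2 = (OMEGA + season) + ALPHA * eps ** 2 + BETA * sigma2
        eps = math.sqrt(sigma2) * random.gauss(0.0, 1.0)
        returns.append(eps)
        variances.append(sigma2)
    return returns, variances

rets, vars_ = simulate()
```

    Testing for the seasonal component then amounts to comparing this specification against a constant-omega GARCH(1,1) on the commodity return series.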

  13. Model-Driven Approach for Body Area Network Application Development

    Directory of Open Access Journals (Sweden)

    Algimantas Venčkauskas

    2016-05-01

    Full Text Available This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming) as the solution domain (SD) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain an adequate measure of QoS efficiently through the interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application.

  14. Mobile Cloud Application Models Facilitated by the CPA†

    Directory of Open Access Journals (Sweden)

    Michael J. O’Sullivan

    2015-02-01

    Full Text Available This paper describes implementations of three mobile cloud applications, file synchronisation, intensive data processing, and group-based collaboration, using the Context Aware Mobile Cloud Services middleware and the Cloud Personal Assistant. Both are part of the same mobile cloud project, actively developed and currently at the second version. We describe recent changes to the middleware, along with our experimental results for the three application models. We discuss challenges faced during the development of the middleware and their implications. The paper includes performance analysis of the CPA support for applications with respect to existing solutions where appropriate, and highlights the advantages of these applications with use-cases.

  15. Molecular modeling and multiscaling issues for electronic material applications

    CERN Document Server

    Iwamoto, Nancy; Yuen, Matthew; Fan, Haibo

    Volume 1: Molecular Modeling and Multiscaling Issues for Electronic Material Applications provides a snapshot of the progression of molecular modeling in the electronics industry and how molecular modeling is currently being used to understand material performance and solve relevant issues in this field. This book is intended to introduce the reader to the evolving role of molecular modeling, especially seen through the eyes of the IEEE community involved in material modeling for electronic applications. Part I presents the role that quantum mechanics can play in performance prediction, such as properties dependent upon electronic structure, but also shows examples of how molecular models may be used in performance diagnostics, especially when chemistry is part of the performance issue. Part II gives examples of large-scale atomistic methods in material failure and shows several examples of transitioning between grain boundary simulations (on the atomistic level) and large-scale models including an example ...

  16. Management Model Applicable to Metallic Materials Industry

    Directory of Open Access Journals (Sweden)

    Adrian Ioana

    2013-02-01

    Full Text Available This paper presents an algorithmic analysis of the marketing mix in metallurgy. It also analyzes the main correlations and the possibilities of optimizing them through efficient management. Thus, both the effect and the importance of the marketing mix components (the four "P's") are analyzed for the materials industry, as well as their correlations, with the goal of optimizing the specific management. The main correlations between the four marketing mix components (the four "P's") for a product within the materials industry are briefly presented, including aspects regarding specific management. Keywords: Management Model, Materials Industry, Marketing Mix, Correlations.

  17. Applications of species distribution modeling to paleobiology

    DEFF Research Database (Denmark)

    Svenning, Jens-Christian; Fløjgaard, Camilla; Marske, Katharine Ann

    2011-01-01

    Species distribution modeling (SDM: statistical and/or mechanistic approaches to the assessment of range determinants and prediction of species occurrence) offers new possibilities for estimating and studying past organism distributions. SDM complements fossil and genetic evidence by providing (i...... the role of Pleistocene glacial refugia in biogeography and evolution, especially in Europe, but also in many other regions. SDM-based approaches are also beginning to contribute to a suite of other research questions, such as historical constraints on current distributions and diversity patterns, the end...

  18. Watershed modeling applications in south Texas

    Science.gov (United States)

    Pedraza, Diana E.; Ockerman, Darwin J.

    2012-01-01

    Watershed models can be used to simulate natural and human-altered processes including the flow of water and associated transport of sediment, chemicals, nutrients, and microbial organisms within a watershed. Simulation of these processes is useful for addressing a wide range of water-resource challenges, such as quantifying changes in water availability over time, understanding the effects of development and land-use changes on water resources, quantifying changes in constituent loads and yields over time, and quantifying aquifer recharge temporally and spatially throughout a watershed.

  19. Application of model search to lattice theory.

    Energy Technology Data Exchange (ETDEWEB)

    Rose, M.; Wilkinson, K.; Mathematics and Computer Science

    2001-08-01

    We have used the first-order model-searching programs MACE and SEM to study various problems in lattice theory. First, we present a case study in which the two programs are used to examine the differences between the stages along the way from lattice theory to Boolean algebra. Second, we answer several questions posed by Norman Megill and Mladen Pavicic on ortholattices and orthomodular lattices. The questions from Megill and Pavicic arose in their study of quantum logics, which are being investigated in connection with proposed computing devices based on quantum mechanics. Previous questions of a similar nature were answered by McCune and MACE in [2].

  20. [Watershed water environment pollution models and their applications: a review].

    Science.gov (United States)

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

    Watershed water environment pollution models are important tools for studying watershed environmental problems. Through the quantitative description of the complicated pollution processes of the whole watershed system and its parts, such a model can identify the main sources and migration pathways of pollutants, estimate the pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied at home and abroad, with a focus on models of pollutant loading (GWLF and PLOAD), models of the water quality of receiving water bodies (QUAL2E and WASP), and watershed models that integrate pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced their structures, principles, and main characteristics, as well as their limitations in practical applications. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analyses of the applications of single and integrated models, the development trend and application prospects of the watershed water environment pollution models were discussed.

  1. Application of the radtran 5 stop model

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Kanipe, R.L.; Weiner, R.F.

    1998-01-01

    A number of environmental impact analyses with the RADTRAN computer code have shown that dose to persons at stops is one of the largest components of incident-free dose during overland carriage of spent fuel and other radioactive materials. The input data used in these analyses were taken from a 1983 study that reports actual observations of spent fuel shipments by truck. Early RADTRAN stop models, however, were insufficiently flexible to take advantage of the detailed information in the study. A more recent study of gasoline service stations that specialize in servicing large trucks, which are the most likely stop locations for shipments of Type B packages in the United States, has provided additional, detailed data on refueling/meal stops. The RADTRAN 5 computer code for transportation risk analysis allows exposures at stops to be more fully modelled than have previous releases of the code and is able to take advantage of detailed data. It is the intent of this paper first to compare results from RADTRAN 4 and RADTRAN 5 for the old, low-resolution form of input data, and then to demonstrate what effect the new data and input format have on stop-dose estimates for an individual stop and for a hypothetical shipment route. Finally, these estimated public doses will be contrasted with doses calculated for a special population group-inspectors. (authors)

  2. Application of the RADTRAN 5 stop model

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Kanipe, R.L.; Weiner, R.F.

    1997-01-01

    A number of environmental impact analyses with the RADTRAN computer code have shown that dose to persons at stops is one of the largest components of incident-free dose during overland carriage of spent fuel and other radioactive materials (e.g., USDOE, 1994). The input data used in these analyses were taken from a 1983 study that reports actual observations of spent fuel shipments by truck. Early RADTRAN stop models, however, were insufficiently flexible to take advantage of the detailed information in the study. A more recent study of gasoline service stations that specialize in servicing large trucks, which are the most likely stop locations for shipments of Type B packages in the United States, has provided additional, detailed data on refueling/meal stops. The RADTRAN 5 computer code for transportation risk analysis allows exposures at stops to be more fully modeled than have previous releases of the code and is able to take advantage of detailed data. It is the intent of this paper first to compare results from RADTRAN 4 and RADTRAN 5 for the old, low-resolution form of input data, and then to demonstrate what effect the new data and input format have on stop-dose estimates for an individual stop and for a hypothetical shipment route. Finally, these estimated public doses will be contrasted with doses calculated for a special population group -- inspectors.

  3. Simulation and Modeling Application in Agricultural Mechanization

    Directory of Open Access Journals (Sweden)

    R. M. Hudzari

    2012-01-01

    Full Text Available This experiment was conducted to determine equations relating the Hue digital values of the oil palm fruit surface to the maturity stage of the fruit in the plantation. The FFB images were zoomed and captured using a Nikon digital camera, and the Hue was calculated using the highest-frequency values of the R, G, and B color components obtained from histogram analysis software. A new procedure for monitoring the image pixel values of the oil palm fruit surface color during real-time growth to maturity was developed. The predicted harvest day was estimated from the developed model relating Hue values to mesocarp oil content. The regressed simulation model predicts the day of harvesting, or the number of days before harvest, of the FFB. The experimental results on mesocarp oil content can be used for real-time oil content determination with the MPOB color meter. A graph for determining the day of harvesting the FFB is presented in this research. The oil was found to start developing in the mesocarp 65 days before the ripe maturity stage, at which the oil content reaches 75% of dry mesocarp.
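
    The Hue computation the abstract describes, taking the highest-frequency R, G and B values from the histogram and converting them to a Hue angle, can be sketched as follows. The pixel sample is hypothetical; a real workflow would read camera pixels from the FFB image.

```python
import colorsys
from collections import Counter

# Hypothetical pixel sample standing in for a captured FFB image region
pixels = [(200, 60, 40)] * 50 + [(180, 80, 50)] * 30 + [(90, 120, 40)] * 20

def dominant_hue(pixels):
    # take the modal (highest-frequency) value of each channel from the
    # histogram, as the abstract describes
    r = Counter(p[0] for p in pixels).most_common(1)[0][0]
    g = Counter(p[1] for p in pixels).most_common(1)[0][0]
    b = Counter(p[2] for p in pixels).most_common(1)[0][0]
    h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0  # hue in degrees

hue = dominant_hue(pixels)
```

    The resulting Hue angle (here a reddish value near 7.5 degrees for the modal triple (200, 60, 40)) would then be regressed against maturity stage or mesocarp oil content.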

  4. A review of thermoelectric cooling: Materials, modeling and applications

    International Nuclear Information System (INIS)

    Zhao, Dongliang; Tan, Gang

    2014-01-01

    This study reviews the recent advances of thermoelectric materials, modeling approaches, and applications. Thermoelectric cooling systems have advantages over conventional cooling devices, including compact size, light weight, high reliability, no mechanical moving parts, no working fluid, direct-current operation, and easy switching between cooling and heating modes. In this study, the historical development of thermoelectric cooling has been briefly introduced first. Next, the development of thermoelectric materials has been given and the achievements of the past decade have been summarized. To improve the performance of thermoelectric cooling systems, modeling techniques have been described for both thermoelement modeling and thermoelectric cooler (TEC) modeling, including the standard simplified energy equilibrium model, one-dimensional and three-dimensional models, and a numerical compact model. Finally, thermoelectric cooling applications have been reviewed in the areas of domestic refrigeration, electronic cooling, scientific application, and automobile air conditioning and seat temperature control, with summaries of the commercially available thermoelectric modules and thermoelectric refrigerators. It is expected that this study will be beneficial to thermoelectric cooling system design, simulation, and analysis. - Highlights: •Thermoelectric cooling has great prospects with thermoelectric material's advances. •Modeling techniques for both thermoelement and TEC have been reviewed. •Principle thermoelectric cooling applications have been reviewed and summarized
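
    The standard simplified energy equilibrium model the review mentions reduces, for a single thermoelectric module, to a balance of Peltier cooling, Joule heating, and thermal conduction. A minimal sketch with illustrative module-level constants (not data from the paper):

```python
# Illustrative module-level constants (assumptions, not values from the review)
ALPHA = 0.05   # effective Seebeck coefficient, V/K
R = 2.0        # electrical resistance, ohm
K = 0.5        # thermal conductance, W/K

def tec(current, t_cold, t_hot):
    """Cooling power and COP from the simplified energy-equilibrium model."""
    dt = t_hot - t_cold
    # Peltier cooling minus half the Joule heating minus back-conduction
    q_cold = ALPHA * current * t_cold - 0.5 * current ** 2 * R - K * dt
    # electrical input: Seebeck back-emf work plus Joule dissipation
    power = ALPHA * current * dt + current ** 2 * R
    return q_cold, (q_cold / power if power > 0 else 0.0)

q, cop = tec(current=2.0, t_cold=285.0, t_hot=300.0)
```

    Sweeping the current in this model reproduces the familiar trade-off between maximum cooling power and maximum COP that drives TEC operating-point selection.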

  5. Application of the numerical modelling techniques to the simulation ...

    African Journals Online (AJOL)

    The aquifer was modelled by the application of Finite Element Method (F.E.M), with appropriate initial and boundary conditions. The matrix solver technique adopted for the F.E.M. was that of the Conjugate Gradient Method. After the steady state calibration and transient verification, the model was used to predict the effect of ...

  6. Application of Multilevel Logistic Model to Identify Correlates of ...

    African Journals Online (AJOL)

    Implementation of multilevel model is becoming a common analytic technique over a wide range of disciplines including social and economic sciences. In this paper, an attempt has been made to assess the application of multilevel logistic model for the purpose of identifying the effect of household characteristics on poverty ...

  7. Application of Generic Disposal System Models

    Energy Technology Data Exchange (ETDEWEB)

    Mariner, Paul [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hammond, Glenn Edward [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sevougian, S. David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stein, Emily [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This report describes specific GDSA activities in fiscal year 2015 (FY2015) toward the development of the enhanced disposal system modeling and analysis capability for geologic disposal of nuclear waste. The GDSA framework employs the PFLOTRAN thermal-hydrologic-chemical multi-physics code (Hammond et al., 2011) and the Dakota uncertainty sampling and propagation code (Adams et al., 2013). Each code is designed for massively-parallel processing in a high-performance computing (HPC) environment. Multi-physics representations in PFLOTRAN are used to simulate various coupled processes including heat flow, fluid flow, waste dissolution, radionuclide release, radionuclide decay and ingrowth, precipitation and dissolution of secondary phases, and radionuclide transport through the engineered barriers and natural geologic barriers to a well location in an overlying or underlying aquifer. Dakota is used to generate sets of representative realizations and to analyze parameter sensitivity.

  8. Generalized Linear Models with Applications in Engineering and the Sciences

    CERN Document Server

    Myers, Raymond H; Vining, G Geoffrey; Robinson, Timothy J

    2012-01-01

    Praise for the First Edition "The obvious enthusiasm of Myers, Montgomery, and Vining and their reliance on their many examples as a major focus of their pedagogy make Generalized Linear Models a joy to read. Every statistician working in any area of applied science should buy it and experience the excitement of these new approaches to familiar activities."-Technometrics Generalized Linear Models: With Applications in Engineering and the Sciences, Second Edition continues to provide a clear introduction to the theoretical foundations and key applications of generalized linear models (GLMs). Ma

  9. Solutions manual to accompany finite mathematics models and applications

    CERN Document Server

    Morris, Carla C

    2015-01-01

    A solutions manual to accompany Finite Mathematics: Models and Applications In order to emphasize the main concepts of each chapter, Finite Mathematics: Models and Applications features plentiful pedagogical elements throughout such as special exercises, end notes, hints, select solutions, biographies of key mathematicians, boxed key principles, a glossary of important terms and topics, and an overview of use of technology. The book encourages the modeling of linear programs and their solutions and uses common computer software programs such as LINDO. In addition to extensive chapters on pr

  10. Optimization of Process Parameters During Drilling of Glass-Fiber Polyester Reinforced Composites Using DOE and ANOVA

    Directory of Open Access Journals (Sweden)

    N.S. Mohan

    2010-09-01

    Full Text Available Polymer-based composite materials possess superior properties such as a high strength-to-weight ratio, a high stiffness-to-weight ratio and good corrosion resistance, and are therefore attractive for high-performance applications such as in the aerospace, defense and sporting goods industries. Drilling is one of the indispensable methods for building products with composite panels. Surface quality and dimensional accuracy play an important role in the performance of a machined component. In machining processes, however, the quality of the component is greatly influenced by the cutting conditions, tool geometry, tool material, machining process, chip formation, workpiece material, tool wear and vibration during cutting. Drilling tests were conducted on glass fiber reinforced plastic composite (GFRP) laminates using an instrumented CNC milling center. A series of experiments was conducted using a TRIAC VMC CNC machining center to correlate the cutting parameters and material parameters with the cutting thrust, torque and surface roughness. The measured results were collected and analyzed with the help of the commercial software packages MINITAB14 and Taly Profile. The surface roughness of the drilled holes was measured using a Rank Taylor Hobson Surtronic 3+ instrument. The method could be useful in predicting the thrust, torque and surface roughness parameters as a function of the process variables. The main objective is to optimize the process parameters to achieve low cutting thrust, low torque and good surface roughness. From the analysis it is evident that, among all the significant parameters, speed and drill size have a significant influence on the cutting thrust, while drill size and specimen thickness significantly influence the torque and surface roughness. It was also found that feed rate does not have a significant influence on the characteristic output of the drilling process.
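
    The significance screening reported above rests on ANOVA of each measured response against the process factors. A minimal one-way sketch with hypothetical thrust readings at three spindle-speed levels (the DOE data themselves are not given in the abstract):

```python
# Hypothetical thrust-force readings (N) at three spindle speeds;
# a real study would use the measured DOE responses.
thrust = {
    "low":    [118.0, 121.0, 119.5, 120.5],
    "medium": [109.0, 111.5, 110.0, 108.5],
    "high":   [101.0, 99.5, 100.5, 102.0],
}

def one_way_anova(groups):
    """F statistic from the classic between/within sum-of-squares split."""
    all_vals = [v for g in groups.values() for v in g]
    grand = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2
                     for g in groups.values())
    ss_within = sum((v - sum(g) / len(g)) ** 2
                    for g in groups.values() for v in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

f_stat = one_way_anova(thrust)
```

    A large F relative to the F(df_between, df_within) critical value marks the factor as significant, which is how speed and drill size emerge as the dominant influences on thrust.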

  11. Elastic models application for thorax image registration

    International Nuclear Information System (INIS)

    Correa Prado, Lorena S; Diaz, E Andres Valdez; Romo, Raul

    2007-01-01

    This work consists of the implementation and evaluation of elastic alignment algorithms for biomedical images, which were taken at thorax level and simulated with the 4D NCAT digital phantom. Radial Basis Function (RBF) spatial transformations, a kind of spline that allows carrying out not only global rigid deformations but also local elastic ones, were applied using a point-matching method. The applied functions were Thin Plate Spline (TPS), Multiquadric (MQ), Gaussian and B-Spline, which were evaluated and compared by calculating the Target Registration Error and similarity measures between the registered images (the sum of squared intensity differences (SSD) and the correlation coefficient (CC)). To assess the error incurred by the user in the point-matching and segmentation tasks, two algorithms that calculate the Fiducial Localization Error were also designed. TPS and MQ were demonstrated to perform better than the others. RBFs were proved to be an adequate model for approximating the deformable behaviour of the thorax. The validation algorithms showed that the user error was not significant.
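
    The RBF point-matching idea can be sketched as follows: solve a small kernel system so that the interpolated displacement field reproduces the landmark displacements exactly. For brevity a Gaussian kernel stands in for TPS or MQ, the landmarks are hypothetical, and only the x-displacement is interpolated.

```python
import math

# Hypothetical landmark pairs: source points and their x-displacement
# after deformation; a real registration would use anatomical landmarks.
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
dx = [0.1, -0.05, 0.0, 0.2]

def gauss_kernel(p, q, s=1.0):
    return math.exp(-((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) / (2 * s * s))

def solve(a, b):
    # plain Gaussian elimination with partial pivoting
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[piv] = m[piv], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n + 1):
                m[r][k] -= f * m[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][k] * x[k] for k in range(r + 1, n))) / m[r][r]
    return x

# weights chosen so the warp reproduces the landmark displacements
K = [[gauss_kernel(p, q) for q in src] for p in src]
w = solve(K, dx)

def warp_x(p):
    # interpolated x-displacement at an arbitrary point
    return sum(wi * gauss_kernel(p, q) for wi, q in zip(w, src))
```

    A full TPS transform adds an affine polynomial term to this kernel expansion; the interpolation condition at the landmarks is what the Target Registration Error then probes at held-out points.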

  12. Code Development for Control Design Applications: Phase I: Structural Modeling

    International Nuclear Information System (INIS)

    Bir, G. S.; Robinson, M.

    1998-01-01

    The design of integrated controls for a complex system like a wind turbine relies on a system model in an explicit format, e.g., state-space format. Current wind turbine codes focus on turbine simulation and not on system characterization, which is desired for controls design as well as applications like operating turbine model analysis, optimal design, and aeroelastic stability analysis. This paper reviews structural modeling that comprises three major steps: formation of component equations, assembly into system equations, and linearization

  13. Predictive modeling: potential application in prevention services.

    Science.gov (United States)

    Wilson, Moira L; Tumen, Sarah; Ota, Rissa; Simmers, Anthony G

    2015-05-01

    In 2012, the New Zealand Government announced a proposal to introduce predictive risk models (PRMs) to help professionals identify and assess children at risk of abuse or neglect as part of a preventive early intervention strategy, subject to further feasibility study and trialing. The purpose of this study is to examine technical feasibility and predictive validity of the proposal, focusing on a PRM that would draw on population-wide linked administrative data to identify newborn children who are at high priority for intensive preventive services. Data analysis was conducted in 2013 based on data collected in 2000-2012. A PRM was developed using data for children born in 2010 and externally validated for children born in 2007, examining outcomes to age 5 years. Performance of the PRM in predicting administratively recorded substantiations of maltreatment was good compared to the performance of other tools reviewed in the literature, both overall, and for indigenous Māori children. Some, but not all, of the children who go on to have recorded substantiations of maltreatment could be identified early using PRMs. PRMs should be considered as a potential complement to, rather than a replacement for, professional judgment. Trials are needed to establish whether risks can be mitigated and PRMs can make a positive contribution to frontline practice, engagement in preventive services, and outcomes for children. Deciding whether to proceed to trial requires balancing a range of considerations, including ethical and privacy risks and the risk of compounding surveillance bias. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.

  14. Sequential Sampling Models in Cognitive Neuroscience: Advantages, Applications, and Extensions.

    Science.gov (United States)

    Forstmann, B U; Ratcliff, R; Wagenmakers, E-J

    2016-01-01

    Sequential sampling models assume that people make speeded decisions by gradually accumulating noisy information until a threshold of evidence is reached. In cognitive science, one such model--the diffusion decision model--is now regularly used to decompose task performance into underlying processes such as the quality of information processing, response caution, and a priori bias. In the cognitive neurosciences, the diffusion decision model has recently been adopted as a quantitative tool to study the neural basis of decision making under time pressure. We present a selective overview of several recent applications and extensions of the diffusion decision model in the cognitive neurosciences.
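
    A minimal sketch of the diffusion decision model described above: noisy evidence accumulates with a drift toward one of two bounds, and the bound hit and the hitting time give the choice and the decision time. The parameter values are illustrative, and non-decision time is omitted.

```python
import random

random.seed(2)

def ddm_trial(v=0.3, a=1.0, z=0.5, dt=0.001, noise=1.0):
    """One trial: drift v, bounds 0 and a, relative start point z."""
    x, t = z * a, 0.0
    while 0.0 < x < a:
        # Euler step of the diffusion: drift plus scaled Gaussian noise
        x += v * dt + noise * random.gauss(0.0, 1.0) * dt ** 0.5
        t += dt
    return (x >= a), t  # (hit upper bound?, decision time)

trials = [ddm_trial() for _ in range(500)]
p_upper = sum(hit for hit, _ in trials) / len(trials)
mean_rt = sum(t for _, t in trials) / len(trials)
```

    Fitting the model reverses this simulation: drift, bound separation, and start point are adjusted until the predicted choice proportions and response-time distributions match the data, which is how task performance is decomposed into processing quality, caution, and bias.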

  15. Solar radiation practical modeling for renewable energy applications

    CERN Document Server

    Myers, Daryl Ronald

    2013-01-01

    Written by a leading scientist with over 35 years of experience working at the National Renewable Energy Laboratory (NREL), Solar Radiation: Practical Modeling for Renewable Energy Applications brings together the most widely used, easily implemented concepts and models for estimating broadband and spectral solar radiation data. The author addresses various technical and practical questions about the accuracy of solar radiation measurements and modeling. While the focus is on engineering models and results, the book does review the fundamentals of solar radiation modeling and solar radiation m

  16. Top-Down Enterprise Application Integration with Reference Models

    Directory of Open Access Journals (Sweden)

    Willem-Jan van den Heuvel

    2000-11-01

    Full Text Available For Enterprise Resource Planning (ERP) systems such as SAP R/3 or IBM SanFrancisco, the tailoring of reference models for customizing the ERP systems to specific organizational contexts is an established approach. In this paper, we present a methodology that uses such reference models as a starting point for a top-down integration of enterprise applications. The re-engineered models of legacy systems are individually linked via cross-mapping specifications to the forward-engineered reference model's specification. The actual linking of reference and legacy models is done with a methodology for connecting (new) business objects with (old) legacy systems.

  17. An investigation of modelling and design for software service applications.

    Science.gov (United States)

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  18. Nuclear model developments in FLUKA for present and future applications

    Directory of Open Access Journals (Sweden)

    Cerutti Francesco

    2017-01-01

    Full Text Available The FLUKA code [1–3] is used in research laboratories all around the world for challenging applications spanning a very wide range of energies, projectiles and targets. FLUKA is also extensively used in hadrontherapy research studies and clinical planning systems. In this paper some of the recent developments in the FLUKA nuclear physics models of relevance to very different application fields, including medical physics, are presented. A few examples are shown demonstrating the effectiveness of the upgraded code.

  19. Nuclear model developments in FLUKA for present and future applications

    Science.gov (United States)

    Cerutti, Francesco; Empl, Anton; Fedynitch, Anatoli; Ferrari, Alfredo; Ruben, GarciaAlia; Sala, Paola R.; Smirnov, George; Vlachoudis, Vasilis

    2017-09-01

    The FLUKA code [1-3] is used in research laboratories all around the world for challenging applications spanning a very wide range of energies, projectiles and targets. FLUKA is also extensively used in hadrontherapy research studies and clinical planning systems. In this paper some of the recent developments in the FLUKA nuclear physics models of relevance to very different application fields, including medical physics, are presented. A few examples are shown demonstrating the effectiveness of the upgraded code.

  20. Systems modeling and simulation applications for critical care medicine.

    Science.gov (United States)

    Dong, Yue; Chbat, Nicolas W; Gupta, Ashish; Hadzikadic, Mirsad; Gajic, Ognjen

    2012-06-15

Critical care delivery is a complex, expensive, and error-prone medical specialty, and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques, including a) a pathophysiological model of acute lung injury, b) process modeling of critical care delivery, and c) an agent-based model to study interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area.

  1. Algebraic Modeling of Topological and Computational Structures and Applications

    CERN Document Server

    Theodorou, Doros; Stefaneas, Petros; Kauffman, Louis

    2017-01-01

    This interdisciplinary book covers a wide range of subjects, from pure mathematics (knots, braids, homotopy theory, number theory) to more applied mathematics (cryptography, algebraic specification of algorithms, dynamical systems) and concrete applications (modeling of polymers and ionic liquids, video, music and medical imaging). The main mathematical focus throughout the book is on algebraic modeling with particular emphasis on braid groups. The research methods include algebraic modeling using topological structures, such as knots, 3-manifolds, classical homotopy groups, and braid groups. The applications address the simulation of polymer chains and ionic liquids, as well as the modeling of natural phenomena via topological surgery. The treatment of computational structures, including finite fields and cryptography, focuses on the development of novel techniques. These techniques can be applied to the design of algebraic specifications for systems modeling and verification. This book is the outcome of a w...

  2. Atmospheric disturbance modelling requirements for flying qualities applications

    Science.gov (United States)

    Moorhouse, D. J.

    1978-01-01

    Flying qualities are defined as those airplane characteristics which govern the ease or precision with which the pilot can accomplish the mission. Some atmospheric disturbance modelling requirements for aircraft flying qualities applications are reviewed. It is concluded that some simplifications are justified in identifying the primary influence on aircraft response and pilot control. It is recommended that a universal environmental model be developed, which could form the reference for different applications. This model should include the latest information on winds, turbulence, gusts, visibility, icing and precipitation. A chosen model would be kept by a national agency and updated regularly by feedback from users. A user manual is believed to be an essential part of such a model.

  3. Reviewing model application to support animal health decision making.

    Science.gov (United States)

    Singer, Alexander; Salman, Mo; Thulke, Hans-Hermann

    2011-04-01

Animal health is of societal importance as it affects human welfare, and anthropogenic interests shape decision making to assure animal health. Scientific advice to support decision making is manifold. Modelling, as one piece of the scientific toolbox, is appreciated for its ability to describe and structure data, to give insight into complex processes and to predict future outcomes. In this paper we study the application of scientific modelling to support practical animal health decisions. We reviewed the 35 animal health related scientific opinions adopted by the Animal Health and Animal Welfare Panel of the European Food Safety Authority (EFSA). Thirteen of these documents were based on the application of models. The review took two viewpoints, the decision maker's need and the modeller's approach. In the reviewed material three types of modelling questions were addressed by four specific model types. The correspondence between tasks and models underpinned the importance of the modelling question in triggering the modelling approach. End point quantifications were the dominating request from decision makers, implying that prediction of risk is a major need. However, due to knowledge gaps, corresponding modelling studies often shied away from providing exact numbers. Instead, comparative scenario analyses were performed, furthering the understanding of the decision problem and the effects of alternative management options. In conclusion, the most adequate scientific support for decision making - including available modelling capacity - might be expected if the required advice is clearly stated. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Hydraulic modeling development and application in water resources engineering

    Science.gov (United States)

    Simoes, Francisco J.; Yang, Chih Ted; Wang, Lawrence K.

    2015-01-01

    The use of modeling has become widespread in water resources engineering and science to study rivers, lakes, estuaries, and coastal regions. For example, computer models are commonly used to forecast anthropogenic effects on the environment, and to help provide advanced mitigation measures against catastrophic events such as natural and dam-break floods. Linking hydraulic models to vegetation and habitat models has expanded their use in multidisciplinary applications to the riparian corridor. Implementation of these models in software packages on personal desktop computers has made them accessible to the general engineering community, and their use has been popularized by the need of minimal training due to intuitive graphical user interface front ends. Models are, however, complex and nontrivial, to the extent that even common terminology is sometimes ambiguous and often applied incorrectly. In fact, many efforts are currently under way in order to standardize terminology and offer guidelines for good practice, but none has yet reached unanimous acceptance. This chapter provides a view of the elements involved in modeling surface flows for the application in environmental water resources engineering. It presents the concepts and steps necessary for rational model development and use by starting with the exploration of the ideas involved in defining a model. Tangible form of those ideas is provided by the development of a mathematical and corresponding numerical hydraulic model, which is given with a substantial amount of detail. The issues of model deployment in a practical and productive work environment are also addressed. The chapter ends by presenting a few model applications highlighting the need for good quality control in model validation.

  5. A complex autoregressive model and application to monthly temperature forecasts

    Directory of Open Access Journals (Sweden)

    X. Gu

    2005-11-01

Full Text Available A complex autoregressive model was established based on the mathematical derivation of least squares in the complex number domain, referred to as complex least squares. The model differs from the conventional approach, in which the real and imaginary parts are calculated separately. An application of this new model shows better forecasts than those from other conventional statistical models in predicting monthly temperature anomalies in July at 160 meteorological stations in mainland China. The conventional statistical models include an autoregressive model in which the real and imaginary parts are treated separately, an autoregressive model in the real number domain, and a persistence-forecast model.
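
The complex least squares idea can be sketched numerically: NumPy's least-squares solver works directly in the complex domain, so the real and imaginary parts are fitted jointly rather than separately. The AR order, the simulated series, and the coefficient below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Sketch of complex least squares on a toy AR(1) process; the true
# coefficient and noise level are hypothetical illustration values.
def fit_complex_ar(z, p=1):
    """Estimate AR(p) coefficients of a complex series z by complex least squares."""
    X = np.array([z[i:i + p][::-1] for i in range(len(z) - p)])  # lagged predictors
    y = z[p:]                                                    # one-step targets
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)                 # complex-domain solve
    return coef

rng = np.random.default_rng(0)
true_a = 0.8 + 0.1j
z = np.zeros(500, dtype=complex)
for t in range(1, 500):
    z[t] = true_a * z[t - 1] + rng.normal(scale=0.1) + 1j * rng.normal(scale=0.1)

a_hat = fit_complex_ar(z)[0]   # recovered coefficient, close to true_a
```

Fitting in the complex domain keeps the coupling between real and imaginary parts, which is exactly what the separate-fit conventional models discard.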

  6. Database application for changing data models in environmental engineering

    Energy Technology Data Exchange (ETDEWEB)

    Hussels, Ulrich; Camarinopoulos, Stephanos; Luedtke, Torsten; Pampoukis, Georgios [RISA Sicherheitsanalysen GmbH, Berlin-Charlottenburg (Germany)

    2013-07-01

Whenever a technical task is to be solved with the help of a database application and uncertainties regarding the structure, scope or level of detail of the data model exist (either currently or in the future), the use of a generic database application can considerably reduce the cost of implementation and maintenance. Simultaneously, the approach described in this contribution permits operating with different views on the data, and even finding and defining new views which had not been considered before. The prerequisite for this is that the preliminary information (structure as well as data) stored in the generic application matches the intended use. In this case, parts of the model developed with the generic approach can be reused, and the corresponding effort of a major rebuild can be saved. This significantly reduces the development time. At the same time, flexibility is achieved concerning the environmental data model, which is not given in the context of conventional developments. (orig.)

  7. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with applications in several fields of engineering, such as automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, in a multidisciplinary approach. Authors from five countries and 16 different research centers contribute their expertise in both the fundamentals and real-world applications, based upon their strong background in modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation where tools such as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  8. Applications of Nonlinear Dynamics Model and Design of Complex Systems

    CERN Document Server

    In, Visarath; Palacios, Antonio

    2009-01-01

This edited book is aimed at interdisciplinary, device-oriented, applications of nonlinear science theory and methods in complex systems. In particular, applications directed to nonlinear phenomena with space and time characteristics. Examples include: complex networks of magnetic sensor systems, coupled nano-mechanical oscillators, nano-detectors, microscale devices, stochastic resonance in multi-dimensional chaotic systems, biosensors, and stochastic signal quantization. "Applications of Nonlinear Dynamics: Model and Design of Complex Systems" brings together the work of scientists and engineers that are applying ideas and methods from nonlinear dynamics to design and fabricate complex systems.

  9. Instruction-level performance modeling and characterization of multimedia applications

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Y. [Los Alamos National Lab., NM (United States). Scientific Computing Group; Cameron, K.W. [Louisiana State Univ., Baton Rouge, LA (United States). Dept. of Computer Science

    1999-06-01

One of the challenges for characterizing and modeling realistic multimedia applications is the lack of access to source codes. On-chip performance counters effectively resolve this problem by monitoring run-time behaviors at the instruction level. This paper presents a novel technique for characterizing and modeling workloads at the instruction level for realistic multimedia applications using hardware performance counters. A variety of instruction counts are collected from some multimedia applications, such as RealPlayer, GSM Vocoder, MPEG encoder/decoder, and speech synthesizer. These instruction counts can be used to form a set of abstract characteristic parameters directly related to a processor's architectural features. Based on microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvements for certain workloads. The biggest advantage of this new characterization technique is a better understanding of processor utilization efficiency and the architectural bottleneck for each application. This technique also provides predictive insight into future architectural enhancements and their effect on current codes. In this paper the authors also attempt to model architectural effects on processor utilization without memory influence. They derive formulas for calculating CPI_0, the CPI without memory effect, and they quantify the utilization of architectural parameters. These equations are architecturally diagnostic and predictive in nature. Results provide promise in code characterization and empirical/analytical modeling.
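
The counter-based bookkeeping described above can be illustrated with a toy calculation of CPI_0 (CPI without memory effect) from an instruction mix; the instruction classes, counts, per-class cycle costs, and issue width below are invented, not the paper's measured values.

```python
# Hypothetical instruction counts per class, as read from hardware counters.
counts = {"int_alu": 6.0e9, "fp": 1.5e9, "branch": 1.0e9, "load_store": 2.5e9}
# Assumed average cycles per instruction class, ignoring memory stalls.
cycles = {"int_alu": 1.0, "fp": 3.0, "branch": 1.5, "load_store": 1.0}

total = sum(counts.values())
mix = {k: v / total for k, v in counts.items()}    # abstract characteristic parameters
cpi0 = sum(mix[k] * cycles[k] for k in counts)     # CPI_0: CPI without memory effect

ISSUE_WIDTH = 4                                    # assumed superscalar issue slots
utilization = (1.0 / ISSUE_WIDTH) / cpi0           # fraction of peak throughput achieved
```

Comparing `cpi0` against the machine's ideal CPI (1/issue width) is one simple way to locate the architectural bottleneck for a given workload.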

  10. Klaim-DB: A Modeling Language for Distributed Database Applications

    DEFF Research Database (Denmark)

    Wu, Xi; Li, Ximeng; Lluch Lafuente, Alberto

    2015-01-01

We present the modelling language Klaim-DB for distributed database applications. Klaim-DB borrows the distributed nets of the coordination language Klaim but essentially re-incarnates the tuple spaces of Klaim as databases, and provides high-level language abstractions for the access and manipulation ...

  11. Animal models of enterovirus 71 infection: applications and limitations

    Science.gov (United States)

    2014-01-01

Human enterovirus 71 (EV71) has emerged as a neuroinvasive virus that is responsible for several outbreaks in the Asia-Pacific region over the past 15 years. Appropriate animal models are needed to better understand EV71 neuropathogenesis and to facilitate the development of effective vaccines and drugs. Non-human primate models have been used to characterize and evaluate the neurovirulence of EV71 after the early outbreaks in the late 1990s. However, these models were not suitable for assessing the neurovirulence level of the virus and were associated with ethical and economic difficulties in terms of broad application. Several strategies have been applied to develop mouse models of EV71 infection, including strategies that employ virus adaptation and immunodeficient hosts. Although these mouse models do not closely mimic human disease, they have been applied to determine the pathogenesis of the disease and its treatment and prevention. EV71 receptor-transgenic mouse models have recently been developed and have significantly advanced our understanding of the biological features of the virus and the host-parasite interactions. Overall, each of these models has advantages and disadvantages, and these models are differentially suited for studies of EV71 pathogenesis and/or the pre-clinical testing of antiviral drugs and vaccines. In this paper, we review the characteristics, applications and limitations of these EV71 animal models, including non-human primate and mouse models. PMID:24742252

  12. Modelling of a Hybrid Energy System for Autonomous Application

    Directory of Open Access Journals (Sweden)

    Yang He

    2013-10-01

Full Text Available A hybrid energy system (HES) is a trending power supply solution for autonomous devices. With the help of an accurate system model, HES development can be efficient and well-directed. Yet despite the availability of various precise unit models, complete HES system models are rarely developed. This paper proposes a system modelling approach which applies power flux conservation as the governing equation and adapts and modifies unit models of solar cells, piezoelectric generators, a Li-ion battery and a super-capacitor. A generalized power harvest, storage and management strategy is also suggested to adapt to various application scenarios.
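
The power-flux-conservation idea can be sketched as a single bookkeeping step: harvested power minus the load is absorbed by (or drawn from) storage. All figures below are invented placeholders, not the paper's unit models.

```python
# Minimal power-balance step for a hypothetical HES with solar and piezo
# harvesting, a battery, and a load; values are illustrative only.
dt = 1.0                  # time step, s
cap_battery = 3600.0      # battery capacity, J (1 Wh)
e_batt = 1800.0           # initial stored energy, J

p_solar, p_piezo, p_load = 0.5, 0.05, 0.3       # harvested and consumed power, W
p_net = p_solar + p_piezo - p_load              # power flux conservation: net into storage
e_batt = min(cap_battery, e_batt + p_net * dt)  # battery absorbs the surplus
```

Iterating this balance over a harvesting profile is the simplest form of the system-level model the abstract describes; a fuller model would split `p_net` between battery and super-capacitor according to a management strategy.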

  13. Electromagnetic Modelling of MMIC CPWs for High Frequency Applications

    Science.gov (United States)

    Sinulingga, E. P.; Kyabaggu, P. B. K.; Rezazadeh, A. A.

    2018-02-01

    Realising the theoretical electrical characteristics of components through modelling can be carried out using computer-aided design (CAD) simulation tools. If the simulation model provides the expected characteristics, the fabrication process of Monolithic Microwave Integrated Circuit (MMIC) can be performed for experimental verification purposes. Therefore improvements can be suggested before mass fabrication takes place. This research concentrates on development of MMIC technology by providing accurate predictions of the characteristics of MMIC components using an improved Electromagnetic (EM) modelling technique. The knowledge acquired from the modelling and characterisation process in this work can be adopted by circuit designers for various high frequency applications.

  14. Constitutive Modeling of Geomaterials Advances and New Applications

    CERN Document Server

    Zhang, Jian-Min; Zheng, Hong; Yao, Yangping

    2013-01-01

    The Second International Symposium on Constitutive Modeling of Geomaterials: Advances and New Applications (IS-Model 2012), is to be held in Beijing, China, during October 15-16, 2012. The symposium is organized by Tsinghua University, the International Association for Computer Methods and Advances in Geomechanics (IACMAG), the Committee of Numerical and Physical Modeling of Rock Mass, Chinese Society for Rock Mechanics and Engineering, and the Committee of Constitutive Relations and Strength Theory, China Institution of Soil Mechanics and Geotechnical Engineering, China Civil Engineering Society. This Symposium follows the first successful International Workshop on Constitutive Modeling held in Hong Kong, which was organized by Prof. JH Yin in 2007.   Constitutive modeling of geomaterials has been an active research area for a long period of time. Different approaches have been used in the development of various constitutive models. A number of models have been implemented in the numerical analyses of geote...

  15. Mathematical and numerical foundations of turbulence models and applications

    CERN Document Server

    Chacón Rebollo, Tomás

    2014-01-01

    With applications to climate, technology, and industry, the modeling and numerical simulation of turbulent flows are rich with history and modern relevance. The complexity of the problems that arise in the study of turbulence requires tools from various scientific disciplines, including mathematics, physics, engineering, and computer science. Authored by two experts in the area with a long history of collaboration, this monograph provides a current, detailed look at several turbulence models from both the theoretical and numerical perspectives. The k-epsilon, large-eddy simulation, and other models are rigorously derived and their performance is analyzed using benchmark simulations for real-world turbulent flows. Mathematical and Numerical Foundations of Turbulence Models and Applications is an ideal reference for students in applied mathematics and engineering, as well as researchers in mathematical and numerical fluid dynamics. It is also a valuable resource for advanced graduate students in fluid dynamics,...

  16. Distributionally Robust Return-Risk Optimization Models and Their Applications

    Directory of Open Access Journals (Sweden)

    Li Yang

    2014-01-01

Full Text Available Based on the risk control of conditional value-at-risk, distributionally robust return-risk optimization models with box constraints on the random vector are proposed. They describe uncertainty in both the distribution form and the moments (mean and covariance matrix) of the random vector. It is difficult to solve them directly. Using the conic duality theory and the minimax theorem, the models are reformulated as semidefinite programming problems, which can be solved by interior point algorithms in polynomial time. An important theoretical basis is therefore provided for applications of the models. Moreover, an application of the models to a practical example of portfolio selection is considered, and the example is evaluated using a historical data set of four stocks. Numerical results show that the proposed methods are robust and the investment strategy is safe.

  17. Application of a procedure oriented crew model to modelling nuclear plant operation

    International Nuclear Information System (INIS)

    Baron, S.

    1986-01-01

    PROCRU (PROCEDURE-ORIENTED CREW MODEL) is a model developed to analyze flight crew procedures in a commercial ILS approach-to-landing. The model builds on earlier, validated control-theoretic models for human estimation and control behavior, but incorporates features appropriate to analyzing supervisory control in multi-task environments. In this paper, the basic ideas underlying the PROCRU model, and the generalization of these ideas to provide a supervisory control model of wider applicability are discussed. The potential application of this supervisory control model to nuclear power plant operations is considered. The range of problems that can be addressed, the kinds of data that will be needed and the nature of the results that might be expected from such an application are indicated

  18. Applications of system dynamics modelling to support health policy.

    Science.gov (United States)

    Atkinson, Jo-An M; Wells, Robert; Page, Andrew; Dominello, Amanda; Haines, Mary; Wilson, Andrew

    2015-07-09

The value of systems science modelling methods in the health sector is increasingly being recognised. Of particular promise is the potential of these methods to improve operational aspects of healthcare capacity and delivery, analyse policy options for health system reform and guide investments to address complex public health problems. Because it lends itself to a participatory approach, system dynamics modelling has been a particularly appealing method that aims to align stakeholder understanding of the underlying causes of a problem and achieve consensus for action. The aim of this review is to determine the effectiveness of system dynamics modelling for health policy, and explore the range and nature of its application. A systematic search was conducted to identify articles published up to April 2015 from the PubMed, Web of Knowledge, Embase, ScienceDirect and Google Scholar databases. The grey literature was also searched. Papers eligible for inclusion were those that described applications of system dynamics modelling to support health policy at any level of government. Six papers were identified, comprising eight case studies of the application of system dynamics modelling to support health policy. No analytic studies were found that examined the effectiveness of this type of modelling. Only three examples engaged multidisciplinary stakeholders in collective model building. Stakeholder participation in model building reportedly facilitated development of a common 'mental map' of the health problem, resulting in consensus about optimal policy strategy and garnering support for collaborative action. The paucity of relevant papers indicates that, although the volume of descriptive literature advocating the value of system dynamics modelling is considerable, its practical application to inform health policy making is yet to be routinely applied and rigorously evaluated. Advances in software are allowing the participatory model building approach to be extended to ...

  19. Applications of spatial statistical network models to stream data

    Science.gov (United States)

    Isaak, Daniel J.; Peterson, Erin E.; Ver Hoef, Jay M.; Wenger, Seth J.; Falke, Jeffrey A.; Torgersen, Christian E.; Sowder, Colin; Steel, E. Ashley; Fortin, Marie-Josée; Jordan, Chris E.; Ruesch, Aaron S.; Som, Nicholas; Monestiez, Pascal

    2014-01-01

    Streams and rivers host a significant portion of Earth's biodiversity and provide important ecosystem services for human populations. Accurate information regarding the status and trends of stream resources is vital for their effective conservation and management. Most statistical techniques applied to data measured on stream networks were developed for terrestrial applications and are not optimized for streams. A new class of spatial statistical model, based on valid covariance structures for stream networks, can be used with many common types of stream data (e.g., water quality attributes, habitat conditions, biological surveys) through application of appropriate distributions (e.g., Gaussian, binomial, Poisson). The spatial statistical network models account for spatial autocorrelation (i.e., nonindependence) among measurements, which allows their application to databases with clustered measurement locations. Large amounts of stream data exist in many areas where spatial statistical analyses could be used to develop novel insights, improve predictions at unsampled sites, and aid in the design of efficient monitoring strategies at relatively low cost. We review the topic of spatial autocorrelation and its effects on statistical inference, demonstrate the use of spatial statistics with stream datasets relevant to common research and management questions, and discuss additional applications and development potential for spatial statistics on stream networks. Free software for implementing the spatial statistical network models has been developed that enables custom applications with many stream databases.
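
The review's central notion of spatial autocorrelation can be illustrated with Moran's I on a toy transect. This is a generic illustration of autocorrelated measurements, not the stream-network covariance models the paper develops; the data and neighbour weights are invented.

```python
import numpy as np

# Moran's I for a smoothly varying attribute along a 1-D transect;
# neighbouring sites get weight 1, all others 0.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # attribute values at 6 sites
n = len(x)
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0            # adjacency along the transect

z = x - x.mean()
moran_I = (n / W.sum()) * (z @ W @ z) / (z @ z)
print(round(moran_I, 3))                       # → 0.6, strong positive autocorrelation
```

Positive values of Moran's I indicate that nearby measurements are similar, i.e. non-independent, which is why naive statistical inference on clustered stream data is biased.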

  20. How Participatory Should Environmental Governance Be? Testing the Applicability of the Vroom-Yetton-Jago Model in Public Environmental Decision-Making

    Science.gov (United States)

    Lührs, Nikolas; Jager, Nicolas W.; Challies, Edward; Newig, Jens

    2018-02-01

    Public participation is potentially useful to improve public environmental decision-making and management processes. In corporate management, the Vroom-Yetton-Jago normative decision-making model has served as a tool to help managers choose appropriate degrees of subordinate participation for effective decision-making given varying decision-making contexts. But does the model recommend participatory mechanisms that would actually benefit environmental management? This study empirically tests the improved Vroom-Jago version of the model in the public environmental decision-making context. To this end, the key variables of the Vroom-Jago model are operationalized and adapted to a public environmental governance context. The model is tested using data from a meta-analysis of 241 published cases of public environmental decision-making, yielding three main sets of findings: (1) The Vroom-Jago model proves limited in its applicability to public environmental governance due to limited variance in its recommendations. We show that adjustments to key model equations make it more likely to produce meaningful recommendations. (2) We find that in most of the studied cases, public environmental managers (implicitly) employ levels of participation close to those that would have been recommended by the model. (3) An ANOVA revealed that such cases, which conform to model recommendations, generally perform better on stakeholder acceptance and environmental standards of outputs than those that diverge from the model. Public environmental management thus benefits from carefully selected and context-sensitive modes of participation.
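
The comparison in finding (3) is a standard one-way ANOVA; a minimal sketch follows, with invented outcome scores for model-conforming versus model-diverging cases (the real study used 241 coded cases, not these numbers).

```python
import numpy as np

# One-way ANOVA by hand: does an outcome score differ between cases that
# conform to vs. diverge from the model's recommendation? Scores are invented.
conform = np.array([4.1, 3.8, 4.5, 4.2, 3.9, 4.4])
diverge = np.array([3.2, 3.5, 2.9, 3.6, 3.1, 3.4])

groups = [conform, diverge]
grand = np.concatenate(groups).mean()
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
df_between = len(groups) - 1
df_within = sum(len(g) for g in groups) - len(groups)
F = (ss_between / df_between) / (ss_within / df_within)
# F ≈ 31.15, well above the 5% critical value of ≈ 4.96 for (1, 10) df,
# so the group means differ significantly in this toy example.
```

With two groups this is equivalent to a two-sample t-test (F = t²), but the same decomposition extends directly to more than two participation modes.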

  1. Nonlinear Mathematical Modeling in Pneumatic Servo Position Applications

    Directory of Open Access Journals (Sweden)

    Antonio Carlos Valdiero

    2011-01-01

Full Text Available This paper addresses a new methodology for servo pneumatic actuator mathematical modeling and selection, based on the study of dynamic behavior in engineering applications. The pneumatic actuator is very common in industrial applications because it has the following advantages: its maintenance is easy and simple, with relatively low cost, self-cooling properties, good power density (power/dimension ratio), fast acting with high accelerations, and installation flexibility. The proposed fifth-order nonlinear mathematical model represents the main characteristics of this nonlinear dynamic system, such as servo valve dead zone, the air flow-pressure relationship through the valve orifice, air compressibility, and friction effects between contact surfaces in actuator seals. Simulation results show the dynamic performance for different pneumatic cylinders in order to see which features contribute to a better behavior of the system. Knowledge of this behavior allows an appropriate choice of pneumatic actuator, contributing to the success of its precise control in several applications.

  2. Application of the rainfall infiltration breakthrough (RIB) model for ...

    African Journals Online (AJOL)

    Application of the rainfall infiltration breakthrough (RIB) model for groundwater recharge estimation in west coastal South Africa. ... the data from Oudebosch with different rainfall and groundwater abstraction inputs are simulated to explore individual effects on water levels as well as recharge rate estimated on a daily basis.

  3. The Applicability of Selected Evaluation Models to Evolving Investigative Designs.

    Science.gov (United States)

    Smith, Nick L.; Hauer, Diane M.

    1990-01-01

    Ten evaluation models are examined in terms of their applicability to investigative, emergent design programs: Stake's portrayal, Wolf's adversary, Patton's utilization, Guba's investigative journalism, Scriven's goal-free, Scriven's modus operandi, Eisner's connoisseurial, Stufflebeam's CIPP, Tyler's objective based, and Levin's cost…

  4. Credibilistic programming an introduction to models and applications

    CERN Document Server

    2014-01-01

It provides a fuzzy programming approach to solve real-life decision problems in a fuzzy environment. Within the framework of credibility theory, it provides a self-contained, comprehensive and up-to-date presentation of fuzzy programming models, algorithms and applications in portfolio analysis.

  5. Modelling primate control of grasping for robotics applications

    CSIR Research Space (South Africa)

    Kleinhans, A

    2014-09-01

Full Text Available European Conference on Computer Vision (ECCV) Workshops, Zurich, Switzerland, 7 September 2014. Modelling primate control of grasping for robotics applications. Ashley Kleinhans, Serge Thill, Benjamin Rosman, Renaud Detry & Bryan Tripp.

  6. Application of multilinear regression analysis in modeling of soil ...

    African Journals Online (AJOL)

The application of the Multi-Linear Regression Analysis (MLRA) model for predicting soil properties in Calabar South offers a technical guide and solution to foundation design problems in the area. Forty-five soil samples were collected from fifteen different boreholes at different depths, and 270 tests were carried out for CBR, ...
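
An MLRA of the kind described reduces to ordinary least squares with several predictors. The sketch below fits a soil property to two predictors on invented borehole data; the predictor names, values, and target are hypothetical, not the study's dataset.

```python
import numpy as np

# Multi-linear regression sketch: predict one soil property (say CBR)
# from two hypothetical predictors (e.g. moisture content, fines fraction).
X_raw = np.array([[12.0, 30.0],
                  [15.0, 28.0],
                  [10.0, 35.0],
                  [18.0, 25.0],
                  [14.0, 31.0]])
y = np.array([8.5, 10.2, 7.1, 12.0, 9.3])          # invented CBR measurements

X = np.column_stack([np.ones(len(X_raw)), X_raw])  # prepend intercept column
beta, *_ = np.linalg.lstsq(X, y, rcond=None)       # [intercept, b1, b2]
pred = X @ beta
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()  # goodness of fit
```

In practice one would hold out samples to validate the fitted coefficients before using them for design predictions.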

  7. Application of Logic Models in a Large Scientific Research Program

    Science.gov (United States)

    O'Keefe, Christine M.; Head, Richard J.

    2011-01-01

    It is the purpose of this article to discuss the development and application of a logic model in the context of a large scientific research program within the Commonwealth Scientific and Industrial Research Organisation (CSIRO). CSIRO is Australia's national science agency and is a publicly funded part of Australia's innovation system. It conducts…

  8. Application of Markovian model to school enrolment projection ...

    African Journals Online (AJOL)

    Application of Markovian model to school enrolment projection process. VU Ekhosuehi, AA Osagiede. Abstract. No Abstract. Global Journal of Mathematical Sciences Vol. 5(1) 2006: 9-16. Full Text: EMAIL FREE FULL TEXT EMAIL FREE FULL TEXT · DOWNLOAD FULL TEXT DOWNLOAD FULL TEXT.
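A Markovian enrolment projection of the kind this record describes treats grade levels as states and projects head counts with a transition matrix. The states and all transition probabilities below are hypothetical, for illustration only.

```python
import numpy as np

# Rows are current states, columns are next-year states.
# States: grade 1, grade 2, absorbing (graduated/left).
P = np.array([
    [0.10, 0.85, 0.05],   # grade 1: repeat, promote, drop out
    [0.00, 0.12, 0.88],   # grade 2: repeat, graduate/leave
    [0.00, 0.00, 1.00],   # absorbing state
])

enrolment = np.array([1000.0, 800.0, 0.0])  # current head counts by state

# One-step projection: n_{t+1} = n_t P
next_year = enrolment @ P
```

Because each row of P sums to 1, the total head count is conserved across the projection, which is a quick consistency check on any hand-built transition matrix.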

  9. WEPP Model applications for evaluations of best management practices

    Science.gov (United States)

    D. C. Flanagan; W. J. Elliott; J. R. Frankenberger; C. Huang

    2010-01-01

    The Water Erosion Prediction Project (WEPP) model is a process-based erosion prediction technology for application to small watersheds and hillslope profiles, under agricultural, forested, rangeland, and other land management conditions. Developed by the United States Department of Agriculture (USDA) over the past 25 years, WEPP simulates many of the physical processes...

  10. Mathematical annuity models application in cash flow analysis ...

    African Journals Online (AJOL)

    Mathematical annuity models application in cash flow analysis. ... We also compare the cost efficiency between Amortisation and Sinking fund loan repayment as prevalent in financial institutions. Keywords: Annuity, Amortisation, Sinking Fund, Present and Future Value Annuity, Maturity date and Redemption value.
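The amortisation-versus-sinking-fund comparison mentioned in this record can be sketched with the standard annuity formulas. The principal and rates below are hypothetical; when the sinking fund earns the loan rate the two schemes cost the same per period, and a lower fund rate makes the sinking-fund scheme more expensive.

```python
def amortisation_payment(principal, i, n):
    """Level payment that repays principal plus interest over n periods."""
    return principal * i / (1 - (1 + i) ** -n)

def sinking_fund_outlay(principal, loan_rate, fund_rate, n):
    """Periodic interest on the loan plus the sinking-fund deposit that
    accumulates to the principal by period n."""
    deposit = principal * fund_rate / ((1 + fund_rate) ** n - 1)
    return principal * loan_rate + deposit

P, i, n = 100_000, 0.10, 10
amort = amortisation_payment(P, i, n)
equal = sinking_fund_outlay(P, 0.10, 0.10, n)  # fund earns the loan rate
worse = sinking_fund_outlay(P, 0.10, 0.08, n)  # fund earns less than loan rate
```

The algebraic identity P·i + P·i/((1+i)^n − 1) = P·i/(1 − (1+i)^−n) is what makes `amort` and `equal` coincide.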

  11. An application of artificial intelligence for rainfall–runoff modeling

    Indian Academy of Sciences (India)

    This study proposes an application of two techniques of artificial intelligence (AI) for rainfall–runoff modeling: the artificial neural networks (ANN) and .... conventional mathematical analysis does not, or cannot, provide analytical solutions, .... very simple where there exist one-to-one relation- ships between the symbols of the ...

  12. Application of wildfire simulation models for risk analysis

    Science.gov (United States)

    Alan A. Ager; Mark A. Finney

    2009-01-01

    Wildfire simulation models are being widely used by fire and fuels specialists in the U.S. to support tactical and strategic decisions related to the mitigation of wildfire risk. Much of this application has resulted from the development of a minimum travel time (MTT) fire spread algorithm (M. Finney) that makes it computationally feasible to simulate thousands of...

  13. Modelling for Bio-,Agro- and Pharma-Applications

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Singh, Ravendra; Cameron, Ian

    2011-01-01

    such as mixers, fermenter as well as air compression and filtration. Milk pasteurisation is another application considered in this chapter. The intention is to look at the temperature profile of milk through the process, which has 4 distinct phases. Other case studies in this chapter include a dynamic model...

  14. An Application Of Receptor Modeling To Identify Airborne Particulate ...

    African Journals Online (AJOL)

An Application Of Receptor Modeling To Identify Airborne Particulate Sources In Lagos, Nigeria. FS Olise, OK Owoade, HB Olaniyi. Abstract. There have been no clear demarcations between the industrial and residential areas of Lagos, with industry in focus as the major source. There is a need to identify potential source types in ...

  15. An application of artificial intelligence for rainfall–runoff modeling

    Indian Academy of Sciences (India)

This study proposes an application of two techniques of artificial intelligence (AI) for rainfall–runoff modeling: the artificial neural networks (ANN) and the evolutionary computation (EC). Two different ANN techniques, the feed forward back propagation (FFBP) and generalized regression neural network (GRNN) methods ...
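The feed forward back propagation (FFBP) technique named in this record can be sketched as a one-hidden-layer network trained by gradient descent. The data, architecture and learning rate below are all hypothetical stand-ins, not the study's actual rainfall–runoff inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(200, 2))        # e.g. rainfall, antecedent flow
y = np.sin(X[:, :1] * 3.0) + 0.5 * X[:, 1:]     # synthetic "runoff" target

W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1

def forward(X):
    h = np.tanh(X @ W1 + b1)       # hidden layer
    return h, h @ W2 + b2          # linear output

_, pred0 = forward(X)
loss0 = float(np.mean((pred0 - y) ** 2))

for _ in range(500):
    h, pred = forward(X)
    err = (pred - y) / len(X)              # dLoss/dpred (MSE; factor 2 folded into lr)
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = (err @ W2.T) * (1 - h ** 2)       # back-propagate through tanh
    gW1 = X.T @ dh; gb1 = dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred1 = forward(X)
loss1 = float(np.mean((pred1 - y) ** 2))
```

After training, the mean squared error drops well below its initial value, which is the minimal behaviour any back-propagation implementation should exhibit before tuning.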

  16. application of multilinear regression analysis in modeling of soil

    African Journals Online (AJOL)

    Windows User

    APPLICATION OF MULTILINEAR REGRESSION ANALYSIS IN MODELING OF. SOIL PROPERTIES FOR GEOTECHNICAL CIVIL ENGINEERING WORKS. IN CALABAR SOUTH. J. G. Egbe1, D. E. Ewa2, S. E. Ubi3, G. B. Ikwa4 and O. O. Tumenayo5. 1, 2, 3, 4, DEPT. OF CIVIL ENGINEERING, CROSS RIVER UNIV.

  17. Application of a stochastic modelling framework to characterize the ...

    Indian Academy of Sciences (India)

    Home; Journals; Sadhana; Volume 36; Issue 4. Application of a stochastic modelling framework to characterize the influence of different oxide scales on the solid particle erosion behaviour of boiler grade steel. S K Das. Volume 36 Issue 4 August 2011 pp 425-440 ...

  18. Microwave applicator for hyperthermia treatment on in vivo melanoma model

    Czech Academy of Sciences Publication Activity Database

    Togni, P.; Vrba, J.; Vannucci, Luca

    2010-01-01

Vol. 48, No. 3 (2010), pp. 285-292 ISSN 0140-0118 R&D Projects: GA AV ČR(CZ) IAA500200510 Institutional research plan: CEZ:AV0Z50200510 Keywords: Melanoma in vivo model * Superficial hyperthermia * Microwave applicator Subject RIV: EC - Immunology Impact factor: 1.791, year: 2010

  19. A Case Study Application Of Time Study Model In Paint ...

    African Journals Online (AJOL)

    This paper presents a case study in the development and application of a time study model in a paint manufacturing company. The organization specializes in the production of different grades of paint and paint containers. The paint production activities include; weighing of raw materials, drying of raw materials, dissolving ...

  20. Crop model usefulness in drylands of southern Africa: an application ...

    African Journals Online (AJOL)

    Data limitations in southern Africa frequently hinder adequate assessment of crop models before application. ... three locations to represent varying cropping and physical conditions in southern Africa, i.e. maize and sorghum (Mohale's Hoek, Lesotho and Big Bend, Swaziland) and maize and groundnut (Lilongwe, Malawi).

  1. A framework for development and application of hydrological models

    Directory of Open Access Journals (Sweden)

    T. Wagener

    2001-01-01

    Full Text Available Many existing hydrological modelling procedures do not make best use of available information, resulting in non-minimal uncertainties in model structure and parameters, and a lack of detailed information regarding model behaviour. A framework is required that balances the level of model complexity supported by the available data with the level of performance suitable for the desired application. Tools are needed that make optimal use of the information available in the data to identify model structure and parameters, and that allow a detailed analysis of model behaviour. This should result in appropriate levels of model complexity as a function of available data, hydrological system characteristics and modelling purpose. This paper introduces an analytical framework to achieve this, and tools to use within it, based on a multi-objective approach to model calibration and analysis. The utility of the framework is demonstrated with an example from the field of rainfall-runoff modelling. Keywords: hydrological modelling, multi-objective calibration, model complexity, parameter identifiability

  2. Development and Application of Nonlinear Land-Use Regression Models

    Science.gov (United States)

    Champendal, Alexandre; Kanevski, Mikhail; Huguenot, Pierre-Emmanuel

    2014-05-01

The problem of air pollution modelling in urban zones is of great importance both from scientific and applied points of view. At present there are several fundamental approaches either based on science-based modelling (air pollution dispersion) or on the application of space-time geostatistical methods (e.g. family of kriging models or conditional stochastic simulations). Recently, there have been important developments in so-called Land Use Regression (LUR) models. These models take into account geospatial information (e.g. traffic network, sources of pollution, average traffic, population census, land use, etc.) at different scales, for example, using buffering operations. Usually the dimension of the input space (number of independent variables) is within the range of (10-100). It was shown that LUR models have some potential to model complex and highly variable patterns of air pollution in urban zones. Most LUR models currently used are linear models. In the present research, nonlinear LUR models are developed and applied to the city of Geneva. Mainly two nonlinear data-driven models were elaborated: multilayer perceptron and random forest. An important part of the research also deals with a comprehensive exploratory data analysis using statistical, geostatistical and time series tools. Unsupervised self-organizing maps were applied to better understand space-time patterns of the pollution. The real data case study deals with spatial-temporal air pollution data of Geneva (2002-2011). Nitrogen dioxide (NO2) has caught our attention. It has effects on human health and on plants; NO2 contributes to the phenomenon of acid rain. The negative effects of nitrogen dioxide on plants include reduced growth, reduced production and reduced pesticide resistance. Finally, regarding effects on materials, nitrogen dioxide increases corrosion. The data used for this study consist of a set of 106 NO2 passive sensors.
80 were used to build the models and the remaining 36 have constituted

  3. Degradation modeling with application to aging and maintenance effectiveness evaluations

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.; Hsu, F.; Subudhi, M.

    1991-01-01

    This paper describes a modeling approach to analyze light water reactor component degradation and failure data to understand the aging process of components. As used here, degradation modeling is the analysis of information on component degradation in order to develop models of the process and its implications. This particular modeling focuses on the analysis of the times of component degradations, to model how the rate of degradation changes with the age of the component. The methodology presented also discusses the effectiveness of maintenance as applicable to aging evaluations. The specific applications which are performed show quantitative models of component degradation rates and component failure rates from plant-specific data. The statistical techniques which are developed and applied allow aging trends to be effectively identified in the degradation data, and in the failure data. Initial estimates of the effectiveness of maintenance in limiting degradations from becoming failures also are developed. These results are important first steps in degradation modeling, and show that degradation can be modeled to identify aging trends
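One simple way to look for the aging trend this record describes is to bin recorded degradation times by component age and compare the rate per unit exposure in each bin; an increasing sequence of rates suggests aging. This is only an illustrative sketch of the idea, not the paper's statistical technique, and the times and exposure below are hypothetical (assuming all 10 components remain in service throughout, with no censoring).

```python
# Hypothetical degradation times (years) pooled from 10 components
degradation_times = [1.2, 2.8, 4.1, 4.9, 5.3, 5.6, 6.1, 6.4, 6.7, 6.9, 7.2]

bins = [(0.0, 3.0), (3.0, 6.0), (6.0, 9.0)]   # age intervals
exposure = 10 * 3.0                            # component-years per 3-year bin

rates = []
for lo, hi in bins:
    n = sum(lo <= t < hi for t in degradation_times)
    rates.append(n / exposure)                 # degradations per component-year
```

Here the binned rates rise with age, which is the qualitative pattern the paper's degradation models quantify with plant-specific data.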

  4. Degradation modeling with application to aging and maintenance effectiveness evaluations

    International Nuclear Information System (INIS)

Samanta, P.K.; Hsu, F.; Subudhi, M.; Vesely, W.E.

    1990-01-01

    This paper describes a modeling approach to analyze component degradation and failure data to understand the aging process of components. As used here, degradation modeling is the analysis of information on component degradation in order to develop models of the process and its implications. This particular modeling focuses on the analysis of the times of component degradations, to model how the rate of degradation changes with the age of the component. The methodology presented also discusses the effectiveness of maintenance as applicable to aging evaluations. The specific applications which are performed show quantitative models of component degradation rates and component failure rates from plant-specific data. The statistical techniques which are developed and applied allow aging trends to be effectively identified in the degradation data, and in the failure data. Initial estimates of the effectiveness of maintenance in limiting degradations from becoming failures also are developed. These results are important first steps in degradation modeling, and show that degradation can be modeled to identify aging trends. 2 refs., 8 figs

  5. Semantic Model Driven Architecture Based Method for Enterprise Application Development

    Science.gov (United States)

    Wu, Minghui; Ying, Jing; Yan, Hui

Enterprise applications must support dynamic business processes and adopt the latest technologies flexibly, while solving the problems caused by their heterogeneous nature. Service-Oriented Architecture (SOA) is becoming a leading paradigm for business process integration. This research work focuses on business process modeling and proposes a semantic model-driven development method named SMDA that combines Ontology and Model-Driven Architecture (MDA) technologies. The architecture of SMDA is presented in three orthogonal perspectives. (1) The vertical axis covers the four MDA layers; the focus is UML profiles in M2 (the meta-model layer) for ontology modeling and the three abstraction levels: CIM, PIM and PSM modeling, respectively. (2) The horizontal axis covers the different concerns involved in development: Process, Application, Information, Organization, and Technology. (3) The traversal axis refers to aspects that influence models on the other axes: Architecture, Semantics, Aspect, and Pattern. The paper also introduces the modeling and transformation process in SMDA, and briefly describes support for dynamic service composition.

  6. FUNCTIONAL MODELLING FOR FAULT DIAGNOSIS AND ITS APPLICATION FOR NPP

    Directory of Open Access Journals (Sweden)

    MORTEN LIND

    2014-12-01

    Full Text Available The paper presents functional modelling and its application for diagnosis in nuclear power plants. Functional modelling is defined and its relevance for coping with the complexity of diagnosis in large scale systems like nuclear plants is explained. The diagnosis task is analyzed and it is demonstrated that the levels of abstraction in models for diagnosis must reflect plant knowledge about goals and functions which is represented in functional modelling. Multilevel flow modelling (MFM, which is a method for functional modelling, is introduced briefly and illustrated with a cooling system example. The use of MFM for reasoning about causes and consequences is explained in detail and demonstrated using the reasoning tool, the MFMSuite. MFM applications in nuclear power systems are described by two examples: a PWR; and an FBR reactor. The PWR example show how MFM can be used to model and reason about operating modes. The FBR example illustrates how the modelling development effort can be managed by proper strategies including decomposition and reuse.

  7. Cellular potts models multiscale extensions and biological applications

    CERN Document Server

    Scianna, Marco

    2013-01-01

    A flexible, cell-level, and lattice-based technique, the cellular Potts model accurately describes the phenomenological mechanisms involved in many biological processes. Cellular Potts Models: Multiscale Extensions and Biological Applications gives an interdisciplinary, accessible treatment of these models, from the original methodologies to the latest developments. The book first explains the biophysical bases, main merits, and limitations of the cellular Potts model. It then proposes several innovative extensions, focusing on ways to integrate and interface the basic cellular Potts model at the mesoscopic scale with approaches that accurately model microscopic dynamics. These extensions are designed to create a nested and hybrid environment, where the evolution of a biological system is realistically driven by the constant interplay and flux of information between the different levels of description. Through several biological examples, the authors demonstrate a qualitative and quantitative agreement with t...

  8. Application of Simple CFD Models in Smoke Ventilation Design

    DEFF Research Database (Denmark)

    Brohus, Henrik; Nielsen, Peter Vilhelm; la Cour-Harbo, Hans

    2004-01-01

    The paper examines the possibilities of using simple CFD models in practical smoke ventilation design. The aim is to assess if it is possible with a reasonable accuracy to predict the behaviour of smoke transport in case of a fire. A CFD code mainly applicable for “ordinary” ventilation design...... uses a standard k-ε turbulence model. Simulations comprise both steady-state and dynamic approaches. Several boundary conditions are tested. Finally, the paper discusses the prospects of simple CFD models in smoke ventilation design including the inherent limitations....

  9. Modelling of a cross flow evaporator for CSP application

    DEFF Research Database (Denmark)

    Sørensen, Kim; Franco, Alessandro; Pelagotti, Leonardo

    2016-01-01

    ) applications. Heat transfer and pressure drop prediction methods are an important tool for design and modelling of diabatic, two-phase, shell-side flow over a horizontal plain tubes bundle for a vertical up-flow evaporator. With the objective of developing a model for a specific type of cross flow evaporator...... the available correlations for the definition of two-phase flow heat transfer, void fraction and pressure drop in connection with the operation of steam generators, focuses attention on a comparison of the results obtained using several different models resulting by different combination of correlations...

  10. Modelling and simulation of diffusive processes methods and applications

    CERN Document Server

    Basu, SK

    2014-01-01

    This book addresses the key issues in the modeling and simulation of diffusive processes from a wide spectrum of different applications across a broad range of disciplines. Features: discusses diffusion and molecular transport in living cells and suspended sediment in open channels; examines the modeling of peristaltic transport of nanofluids, and isotachophoretic separation of ionic samples in microfluidics; reviews thermal characterization of non-homogeneous media and scale-dependent porous dispersion resulting from velocity fluctuations; describes the modeling of nitrogen fate and transport

  11. Costs equations for cost modeling: application of ABC Matrix

    Directory of Open Access Journals (Sweden)

    Alex Fabiano Bertollo Santana

    2016-03-01

Full Text Available This article aimed to provide an application of the ABC Matrix model - a management tool that models processes and activities. The ABC Matrix is based on matrix multiplication, using a fast algorithm for the development of costing systems and the subsequent translation of the costs into cost equations and systems. The research methodology is classified as a case study, using simulation data to validate the model. The conclusion of the research is that the algorithm presented is an important development, because it is an effective approach to calculating product costs and because it provides simple and flexible algorithm design software for controlling the cost of products
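The matrix-multiplication idea behind activity-based costing can be sketched in a few lines: resource costs are allocated to activities, then activity costs to products. The resources, activities, shares and figures below are hypothetical, not the article's case data.

```python
import numpy as np

resource_cost = np.array([500.0, 300.0])   # e.g. salaries, utilities

# Rows: resources; columns: activities (each row's shares sum to 1)
R_to_A = np.array([
    [0.6, 0.4],
    [0.2, 0.8],
])

# Rows: activities; columns: products (activity-driver shares)
A_to_P = np.array([
    [0.5, 0.5],
    [0.3, 0.7],
])

activity_cost = resource_cost @ R_to_A     # cost pooled by activity
product_cost = activity_cost @ A_to_P      # cost traced to products
```

Because every allocation row sums to 1, total cost is conserved from resources through activities to products, which is the basic audit check on an ABC matrix.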

  12. Statistical modelling for recurrent events: an application to sports injuries.

    Science.gov (United States)

    Ullah, Shahid; Gabbett, Tim J; Finch, Caroline F

    2014-09-01

Injuries are often recurrent, with subsequent injuries influenced by previous occurrences and hence correlation between events needs to be taken into account when analysing such data. This paper compares five different survival models (Cox proportional hazards (CoxPH) model and the following generalisations to recurrent event data: Andersen-Gill (A-G), frailty, Wei-Lin-Weissfeld total time (WLW-TT) marginal, Prentice-Williams-Peterson gap time (PWP-GT) conditional models) for the analysis of recurrent injury data. Empirical evaluation and comparison of different models were performed using model selection criteria and goodness-of-fit statistics. Simulation studies assessed the size and power of each model fit. The modelling approach is demonstrated through direct application to Australian National Rugby League recurrent injury data collected over the 2008 playing season. Of the 35 players analysed, 14 (40%) players had more than 1 injury and 47 contact injuries were sustained over 29 matches. The CoxPH model provided the poorest fit to the recurrent sports injury data. The fit was improved with the A-G and frailty models, compared to WLW-TT and PWP-GT models. Despite little difference in model fit between the A-G and frailty models, in the interest of fewer statistical assumptions it is recommended that, where relevant, future studies involving modelling of recurrent sports injury data use the frailty model in preference to the CoxPH model or its other generalisations. The paper provides a rationale for future statistical modelling approaches for recurrent sports injury.
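The recurrent-event models this record compares (Andersen-Gill, frailty, and the marginal/conditional variants) all consume data in a counting-process layout: one (start, stop, event) interval per injury spell per player. The sketch below restructures made-up injury records into that layout; the player names, injury match numbers and season length are hypothetical, not the study's data.

```python
# Hypothetical season records: matches at which each player was injured,
# plus the last match observed (censoring time).
players = {
    "player_1": {"injuries": [5, 12], "last_match": 29},
    "player_2": {"injuries": [], "last_match": 29},
}

rows = []  # (player_id, start, stop, event) counting-process rows
for pid, rec in players.items():
    start = 0
    for match in rec["injuries"]:
        rows.append((pid, start, match, 1))          # interval ending in an injury
        start = match
    rows.append((pid, start, rec["last_match"], 0))  # censored final interval
```

Each injured player contributes one row per injury plus one censored row, so correlation between a player's events can be modelled via shared frailty or robust variance on these rows.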

  13. Genome Editing and Its Applications in Model Organisms

    Directory of Open Access Journals (Sweden)

    Dongyuan Ma

    2015-12-01

    Full Text Available Technological advances are important for innovative biological research. Development of molecular tools for DNA manipulation, such as zinc finger nucleases (ZFNs, transcription activator-like effector nucleases (TALENs, and the clustered regularly-interspaced short palindromic repeat (CRISPR/CRISPR-associated (Cas, has revolutionized genome editing. These approaches can be used to develop potential therapeutic strategies to effectively treat heritable diseases. In the last few years, substantial progress has been made in CRISPR/Cas technology, including technical improvements and wide application in many model systems. This review describes recent advancements in genome editing with a particular focus on CRISPR/Cas, covering the underlying principles, technological optimization, and its application in zebrafish and other model organisms, disease modeling, and gene therapy used for personalized medicine.

  14. Genome Editing and Its Applications in Model Organisms.

    Science.gov (United States)

    Ma, Dongyuan; Liu, Feng

    2015-12-01

Technological advances are important for innovative biological research. Development of molecular tools for DNA manipulation, such as zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the clustered regularly-interspaced short palindromic repeat (CRISPR)/CRISPR-associated (Cas), has revolutionized genome editing. These approaches can be used to develop potential therapeutic strategies to effectively treat heritable diseases. In the last few years, substantial progress has been made in CRISPR/Cas technology, including technical improvements and wide application in many model systems. This review describes recent advancements in genome editing with a particular focus on CRISPR/Cas, covering the underlying principles, technological optimization, and its application in zebrafish and other model organisms, disease modeling, and gene therapy used for personalized medicine.

  15. Semantic Information Modeling for Emerging Applications in Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi; Natarajan, Sreedhar; Simmhan, Yogesh; Prasanna, Viktor

    2012-04-16

    Smart Grid modernizes power grid by integrating digital and information technologies. Millions of smart meters, intelligent appliances and communication infrastructures are under deployment allowing advanced IT applications to be developed to secure and manage power grid operations. Demand response (DR) is one such emerging application to optimize electricity demand by curtailing/shifting power load when peak load occurs. Existing DR approaches are mostly based on static plans such as pricing policies and load shedding schedules. However, improvements to power management applications rely on data emanating from existing and new information sources with the growth of Smart Grid information space. In particular, dynamic DR algorithms depend on information from smart meters that report interval-based power consumption measurement, HVAC systems that monitor buildings heat and humidity, and even weather forecast services. In order for emerging Smart Grid applications to take advantage of the diverse data influx, extensible information integration is required. In this paper, we develop an integrated Smart Grid information model using Semantic Web techniques and present case studies of using semantic information for dynamic DR. We show the semantic model facilitates information integration and knowledge representation for developing the next generation Smart Grid applications.

  16. Challenges of Microgrids in Remote Communities: A STEEP Model Application

    Directory of Open Access Journals (Sweden)

    Daniel Akinyele

    2018-02-01

Full Text Available There is a growing interest in the application of microgrids around the world because of their potential for achieving a flexible, reliable, efficient and smart electrical grid system and supplying energy to off-grid communities, including their economic benefits. Several research studies have examined the application issues of microgrids. However, a lack of in-depth consideration for the enabling planning conditions has been identified as a major reason why microgrids fail in several off-grid communities. This development requires research efforts that consider better strategies and frameworks for sustainable microgrids in remote communities. This paper first presents a comprehensive review of microgrid technologies and their applications. It then proposes the STEEP model to critically examine the failure factors based on the social, technical, economic, environmental and policy (STEEP) perspectives. The model details the key dimensions and actions necessary for addressing the challenge of microgrid failure in remote communities. The study uses remote communities within Nigeria, West Africa, as case studies and demonstrates the need for the STEEP approach for better understanding of microgrid planning and development. Better insights into microgrid systems are expected to address the drawbacks and improve the situation that can lead to widespread and sustainable applications in off-grid communities around the world in the future. The paper introduces the sustainable planning framework (SPF) based on the STEEP model, which can form a general basis for planning microgrids in any remote location.

  17. Application distribution model and related security attacks in VANET

    Science.gov (United States)

    Nikaein, Navid; Kanti Datta, Soumya; Marecar, Irshad; Bonnet, Christian

    2013-03-01

In this paper, we present a model for application distribution and related security attacks in dense vehicular ad hoc networks (VANET) and sparse VANET which forms a delay tolerant network (DTN). We study the vulnerabilities of VANET to evaluate the attack scenarios and introduce a new attacker's model as an extension to the work done in [6]. Then a VANET model has been proposed that supports the application distribution through proxy app stores on top of mobile platforms installed in vehicles. The steps of application distribution have been studied in detail. We have identified key attacks (e.g. malware, spamming and phishing, software attack and threat to location privacy) for dense VANET and two attack scenarios for sparse VANET. It has been shown that attacks can be launched by distributing malicious applications and injecting malicious code into the On Board Unit (OBU) by exploiting OBU software security holes. Consequences of such security attacks have been described. Finally, countermeasures including the concepts of sandbox have also been presented in depth.

  18. Human eye modelling for ophthalmic simulators project for clinic applications

    International Nuclear Information System (INIS)

    Sanchez, Andrea; Santos, Adimir dos; Yoriyaz, Helio

    2002-01-01

Most eye tumors are treated by surgical means, which involves the enucleation of the affected eye. For treatment and control of the disease there is also brachytherapy, which often utilizes small applicators of Co-60, I-125, Ru-106, Ir-192, etc. These methods are shown to be very efficient but highly costly. The objective of this work is to propose a detailed simulator model for eye characterization. Additionally, this study can contribute to the design and construction of a new applicator in order to reduce the cost and allow more patients to be treated

  19. Application of product modelling - seen from a work preparation viewpoint

    DEFF Research Database (Denmark)

    Hvam, Lars

the specification work. The theoretical foundation of the project includes four elements. The first element (work preparation) considers methods for analysing and preparing the direct work in production, pointing to an analogy between analysing the direct work in production and the work in the planning systems..., over building a model, and to the final programming of an application. It was stressed that all the phases in the outline of procedure be carried out in the empirical work, one of the reasons being to prove that it is possible, with a reasonable consumption of resources, to build an application

  20. Application of online modeling to the operation of SLC

    International Nuclear Information System (INIS)

    Woodley, M.D.; Sanchez-Chopitea, L.; Shoaee, H.

    1987-01-01

Online computer models of first order beam optics have been developed for the commissioning, control and operation of the entire SLC including Damping Rings, Linac, Positron Return Line and Collider Arcs. A generalized online environment utilizing these models provides the capability for interactive selection of a desired optics configuration and for the study of its properties. Automated procedures have been developed which calculate and load beamline component set-points and which can scale magnet strengths to achieve desired beam properties for any Linac energy profile. Graphic displays facilitate comparison of design, desired and actual optical characteristics of the beamlines. Measured beam properties, such as beam emittance and dispersion, can be incorporated interactively into the models and used for beam matching and optimization of injection and extraction efficiencies and beam transmissions. The online optics modeling facility also serves as the foundation for many model-driven applications such as autosteering, calculation of beam launch parameters, emittance measurement and dispersion correction
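First order beam optics of the kind this record describes reduces, in one transverse plane, to 2x2 transfer matrices acting on a ray (x, x'). The sketch below composes a thin-lens quadrupole with a drift; the focal length and ray offset are illustrative values, not SLC parameters.

```python
import numpy as np

def drift(L):
    """Field-free drift of length L."""
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_lens(f):
    """Thin-lens approximation of a focusing quadrupole, focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

f = 2.0
# Beamline is traversed right-to-left in the matrix product:
# lens first, then a drift of one focal length.
M = drift(f) @ thin_lens(f)

ray_in = np.array([0.01, 0.0])   # parallel ray offset 10 mm from the axis
ray_out = M @ ray_in
```

A ray entering parallel to the axis crosses the axis one focal length downstream of the lens, which is the textbook check that the matrices are composed in the right order.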

  1. Structural Equation Modeling: Theory and Applications in Forest Management

    Directory of Open Access Journals (Sweden)

    Tzeng Yih Lam

    2012-01-01

    Full Text Available Forest ecosystem dynamics are driven by a complex array of simultaneous cause-and-effect relationships. Understanding this complex web requires specialized analytical techniques such as Structural Equation Modeling (SEM. The SEM framework and implementation steps are outlined in this study, and we then demonstrate the technique by application to overstory-understory relationships in mature Douglas-fir forests in the northwestern USA. A SEM model was formulated with (1 a path model representing the effects of successively higher layers of vegetation on late-seral herbs through processes such as light attenuation and (2 a measurement model accounting for measurement errors. The fitted SEM model suggested a direct negative effect of light attenuation on late-seral herbs cover but a direct positive effect of northern aspect. Moreover, many processes have indirect effects mediated through midstory vegetation. SEM is recommended as a forest management tool for designing silvicultural treatments and systems for attaining complex arrays of management objectives.

  2. Application of 3-dimensional CAD modeling system in nuclear plants

    International Nuclear Information System (INIS)

    Suwa, Minoru; Saito, Shunji; Nobuhiro, Minoru

    1990-01-01

    Until now, preliminary coordination work for the components of nuclear plants was carried out using plastic models. Recently, with the development of computer graphics techniques, the components can be displayed on a graphics terminal more effectively than with plastic models or the actual plant. The computer model can be viewed both telescopically and microscopically. A computer technique called the 3-dimensional CAD modeling system was used for this preliminary work and design. Through application of this system, a database for nuclear plants is completed at the arrangement step. The data can be used for piping design, stress analysis, shop production, testing and site construction, at all steps. In addition, the data can be used for various planning tasks even after the plant begins operation. This paper describes the outline of the 3-dimensional CAD modeling system. (author)

  3. Road Assessment Model and Pilot Application in China

    Directory of Open Access Journals (Sweden)

    Tiejun Zhang

    2014-01-01

    Full Text Available Risk assessment of roads is an effective approach for road agencies to determine safety improvement investments. It can increase the cost-effective returns in crash and injury reductions. To develop a powerful Chinese risk assessment model, the Research Institute of Highway (RIOH) is developing the China Road Assessment Programme (ChinaRAP) model in partnership with the International Road Assessment Programme (iRAP) to address traffic crashes in China. The ChinaRAP model is based upon RIOH's achievements and iRAP models. This paper documents part of ChinaRAP's research work, mainly the RIOH model and its pilot application in a province in China.

  4. Modelling of transport processes in porous media for energy applications

    Energy Technology Data Exchange (ETDEWEB)

    Kangas, M.

    1996-12-31

    Flows in porous media are encountered in many branches of technology. In these phenomena, a fluid of some sort flows through the porous matrix of a solid medium. Examples of the fluid are water, air, gas and oil. The solid matrix can be soil, fissured rock, ceramics, filter paper, etc. The flow is in many cases accompanied by transfer of heat or solute within the fluid or between the fluid and the surrounding solid matrix. Chemical reactions or microbiological processes may also be taking place in the system. In this thesis, a 3-dimensional computer simulation model THETA for the coupled transport of fluid, heat, and solute in porous media has been developed and applied to various problems in the field of energy research. Although also applicable to porous medium applications in general, the version of the model described and used in this work is intended for studying the transport processes in aquifers, which are geological formations containing groundwater. The model highlights include versatile input and output routines, as well as modularity which, for example, enables easy adaptation of the model for use as a subroutine in large energy system simulations. Special attention in the model development has been paid to high flow conditions, which may be present in Nordic esker aquifers located close to the ground surface. The simulation model has been written in the FORTRAN 77 programming language, enabling seamless operation in both PC and mainframe environments. For PC simulation, a special graphical user interface has been developed. The model has been used with success in a wide variety of applications, ranging from basic thermal analyses to thermal energy storage system evaluations and nuclear waste disposal simulations. The studies have shown that thermal energy storage is feasible also in Nordic high flow aquifers, although at the cost of a lower recovery temperature level, usually necessitating the use of heat pumps. In the nuclear waste studies, it
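The kind of transport process such a model solves can be illustrated with a minimal 1D advection-dispersion solver for a solute pulse moving with groundwater flow. This is an illustrative toy, not the THETA code; all parameter values are invented:

```python
import numpy as np

# 1D advection-dispersion of a solute pulse in an aquifer
# (explicit upwind finite differences; illustrative parameters only)
L, nx = 100.0, 200           # domain length [m], grid cells
v, D = 1.0, 0.5              # pore velocity [m/d], dispersion coeff [m^2/d]
dx = L / nx
dt = 0.4 * min(dx / v, dx**2 / (2 * D))   # stable explicit time step

c = np.zeros(nx)
c[10] = 1.0                  # initial concentration pulse at x = 5 m
for _ in range(500):         # advance 50 days
    adv = -v * (c - np.roll(c, 1)) / dx                       # upwind advection
    dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2  # dispersion
    c = c + dt * (adv + dif)
    c[0] = c[-1] = 0.0       # open boundaries

print(f"peak at x = {np.argmax(c) * dx:.1f} m")  # pulse advected ~v*t downstream
```

A full aquifer model adds the second and third dimensions, heat transport, and fluid-solid exchange terms, but the discretization idea is the same.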

  5. Computational spectrotemporal auditory model with applications to acoustical information processing

    Science.gov (United States)

    Chi, Tai-Shih

    A computational spectrotemporal auditory model based on neurophysiological findings in early auditory and cortical stages is described. The model provides a unified multiresolution representation of the spectral and temporal features of sound likely critical in the perception of timbre. Several types of complex stimuli are used to demonstrate the spectrotemporal information preserved by the model. As shown by these examples, this two-stage model reflects the apparent progressive loss of temporal dynamics along the auditory pathway, from rapid phase-locking (several kHz in the auditory nerve), to moderate rates of synchrony (several hundred Hz in the midbrain), to much lower rates of modulation in the cortex (around 30 Hz). To complete this model, several projection-based reconstruction algorithms are implemented to resynthesize the sound from the representations with reduced dynamics. One particular application of this model is to assess speech intelligibility. The spectro-temporal Modulation Transfer Functions (MTFs) of this model are investigated and shown to be consistent with the salient trends in the human MTFs (derived from human detection thresholds), which exhibit a lowpass function with respect to both spectral and temporal dimensions, with 50% bandwidths of about 16 Hz and 2 cycles/octave. The model is therefore used to demonstrate the potential relevance of these MTFs to the assessment of speech intelligibility in noise and reverberant conditions. Another useful feature is the phase singularity that emerges in the scale space generated by this multiscale auditory model. The singularity is shown to have certain robust properties and to carry crucial information about the spectral profile. This claim is justified by perceptually tolerable resynthesized sounds from the nonconvex singularity set. In addition, the singularity set is demonstrated to encode the pitch and formants at different scales. These properties make the singularity set very suitable for traditional

  6. Overview on available animal models for application in leukemia research

    International Nuclear Information System (INIS)

    Borkhardt, A.; Sanchez-Garcia, I.; Cobaleda, C.; Hauer, J.

    2015-01-01

    The term 'leukemia' encompasses a group of diseases with a variable clinical and pathological presentation. Its cellular origin, its biology and the underlying molecular genetic alterations determine the very variable and individual disease phenotype. The focus of this review is to discuss the most important guidelines to be taken into account when we aim at developing an 'ideal' animal model to study leukemia. The animal model should mimic all the clinical, histological and molecular genetic characteristics of the human phenotype and should be applicable as a clinically predictive model. It should meet all the requirements to be used as a standardized model adaptable to basic research as well as to pharmaceutical practice. Furthermore, it should fulfill all the criteria to investigate environmental risk factors and the role of genomic mutations, and be applicable for therapeutic testing. These constraints limit the usefulness of some existing animal models, which are however very valuable for basic research. Hence in this review we will primarily focus on genetically engineered mouse models (GEMMs) to study the most frequent types of childhood leukemia. GEMMs are robust models with relatively low site-specific variability which can, with the help of the latest gene-modulating tools, be adapted to individual clinical and research questions. Moreover, they offer the possibility to restrict oncogene expression to a defined target population and to regulate its expression level as well as the timing of its activity. Until recently it was only possible in individual cases to develop a murine model which fulfills the above-mentioned requirements. Hence the development of new regulatory elements to control targeted oncogene expression should be a priority. Tightly controlled and cell-specific oncogene expression can then be combined with a knock-in approach and will yield a robust murine model which enables almost physiologic oncogene

  7. High-fidelity geometric modeling for biomedical applications

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Zeyun [Univ. of California, San Diego, CA (United States). Dept. of Mathematics; Holst, Michael J. [Univ. of California, San Diego, CA (United States). Dept. of Mathematics; Andrew McCammon, J. [Univ. of California, San Diego, CA (United States). Dept. of Chemistry and Biochemistry; Univ. of California, San Diego, CA (United States). Dept. of Pharmacology

    2008-05-19

    In this paper, we describe a combination of algorithms for high-fidelity geometric modeling and mesh generation. Although our methods and implementations are application-neutral, our primary target application is multiscale biomedical models that range in scale across the molecular, cellular, and organ levels. Our software toolchain implementing these algorithms is general in the sense that it can take as input a molecule in PDB/PQR form, a 3D scalar volume, or a user-defined triangular surface mesh that may have very low quality. The main goal of the work presented here is to generate high-quality, smooth surface triangulations from the aforementioned inputs, and to reduce mesh sizes by mesh coarsening. Tetrahedral meshes are also generated for finite element analysis in biomedical applications. Experiments on a number of bio-structures are presented, showing that our approach possesses several desirable properties: feature preservation, local adaptivity, high quality, and smoothness (for surface meshes). Finally, the availability of this software toolchain will give researchers in computational biomedicine and other modeling areas access to higher-fidelity geometric models.

  8. Risk Measurement and Risk Modelling Using Applications of Vine Copulas

    Directory of Open Access Journals (Sweden)

    David E. Allen

    2017-09-01

    Full Text Available This paper features an application of Regular Vine copulas, a recently developed statistical and mathematical tool which can be applied in the assessment of composite financial risk. Copula-based dependence modelling is a popular tool in financial applications, but is usually applied to pairs of securities. By contrast, Vine copulas provide greater flexibility and permit the modelling of complex dependency patterns using the rich variety of bivariate copulas, which may be arranged and analysed in a tree structure to explore multiple dependencies. The paper features the use of Regular Vine copulas in an analysis of the co-dependencies of 10 major European stock markets, as represented by individual market indices and the composite STOXX 50 index. The sample runs from 2005 to the end of 2013 to permit an exploration of how correlations change in different economic circumstances, using three different sample periods: pre-GFC (January 2005–July 2007), GFC (July 2007–September 2009), and post-GFC (September 2009–December 2013). The empirical results suggest that the dependencies change in a complex manner and are subject to change in different economic circumstances. One of the attractions of this approach to risk modelling is the flexibility in the choice of distributions used to model co-dependencies. The practical application of Regular Vine metrics is demonstrated via an example of the calculation of the VaR of a portfolio made up of the indices.
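A drastically simplified sketch of copula-based VaR estimation: a single Gaussian copula joining heavy-tailed Student-t marginals, Monte-Carloed for an equally weighted two-index portfolio. The correlation, marginal scale and degrees of freedom are invented for illustration; a Regular Vine would instead chain fitted bivariate copulas across all ten indices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rho = 0.7                                   # assumed dependence between two indices
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)
u = stats.norm.cdf(z)                       # Gaussian copula: map to uniforms
r = stats.t.ppf(u, df=4) * 0.01             # t(4) marginal daily returns, 1% scale
port = r.mean(axis=1)                       # equal-weight portfolio return
var99 = -np.quantile(port, 0.01)            # 99% one-day Value-at-Risk
print(f"99% one-day VaR: {var99:.3%}")
```

The separation of marginal choice (here Student-t) from dependence structure (here the Gaussian copula) is exactly the flexibility the abstract highlights.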

  9. Effects of measurement errors on psychometric measurements in ergonomics studies: Implications for correlations, ANOVA, linear regression, factor analysis, and linear discriminant analysis.

    Science.gov (United States)

    Liu, Yan; Salvendy, Gavriel

    2009-05-01

    This paper aims to demonstrate the effects of measurement errors on psychometric measurements in ergonomics studies. A variety of sources can cause random measurement errors in ergonomics studies, and these errors can distort virtually every statistic computed and lead investigators to erroneous conclusions. The effects of measurement errors on the five most widely used statistical analysis tools have been discussed and illustrated: correlation; ANOVA; linear regression; factor analysis; linear discriminant analysis. It has been shown that measurement errors can greatly attenuate correlations between variables, reduce the statistical power of ANOVA, distort (overestimate, underestimate or even change the sign of) regression coefficients, understate the explanatory contributions of the most important factors in factor analysis, and diminish the significance of the discriminant function and the discrimination abilities of individual variables in discriminant analysis. The discussion is restricted to subjective scales and survey methods and their reliability estimates. Other methods applied in ergonomics research, such as physical and electrophysiological measurements and chemical and biomedical analysis methods, also have issues of measurement errors, but they are beyond the scope of this paper. As there has been increasing interest in the development and testing of theories in ergonomics research, it has become very important for ergonomics researchers to understand the effects of measurement errors on their experimental results, which the authors believe is critical to research progress in theory development and cumulative knowledge in the ergonomics field.
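The attenuation of correlations described above is easy to reproduce by simulation: an observed correlation shrinks toward zero by the square root of the product of the two measures' reliabilities. Reliabilities and effect sizes here are illustrative, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_x = rng.normal(size=n)
true_y = 0.8 * true_x + rng.normal(scale=0.6, size=n)  # true r = 0.8, unit variance
rel = 0.6                                              # assumed reliability of each scale
noise_sd = np.sqrt((1 - rel) / rel)                    # error SD giving that reliability
obs_x = true_x + rng.normal(scale=noise_sd, size=n)
obs_y = true_y / true_y.std() + rng.normal(scale=noise_sd, size=n)

r_true = np.corrcoef(true_x, true_y)[0, 1]
r_obs = np.corrcoef(obs_x, obs_y)[0, 1]
# observed r ≈ r_true * sqrt(rel_x * rel_y) = 0.8 * 0.6 = 0.48
print(f"true r = {r_true:.2f}, observed r = {r_obs:.2f}")
```

The same mechanism drives the loss of ANOVA power and the distortion of regression coefficients the paper discusses: the error variance inflates the denominator of every standardized statistic.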

  10. Language Model Applications to Spelling with Brain-Computer Interfaces

    Directory of Open Access Journals (Sweden)

    Anderson Mora-Cortes

    2014-03-01

    Full Text Available Within the Ambient Assisted Living (AAL) community, Brain-Computer Interfaces (BCIs) have raised great hopes as they provide alternative communication means for persons with disabilities, bypassing the need for speech and other motor activities. Although significant advancements have been realized in the last decade, applications of language models (e.g., word prediction, completion) have only recently started to appear in BCI systems. The main goal of this article is to review the language model applications that supplement non-invasive BCI-based communication systems by discussing their potential and limitations, and to discern future trends. First, a brief overview of the most prominent BCI spelling systems is given, followed by an in-depth discussion of the language models applied to them. These language models are classified according to their functionality in the context of BCI-based spelling: the static/dynamic nature of the user interface, the use of error correction and predictive spelling, and the potential to improve their classification performance by using language models. To conclude, the review offers an overview of the advantages and challenges of implementing language models in BCI-based communication systems in conjunction with other AAL technologies.
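The word-completion idea behind such spellers can be sketched in a few lines: rank completions of the user's current prefix by corpus frequency, so the interface can offer likely words before every letter is spelled. The tiny corpus and the function name are invented for illustration; real BCI spellers use far larger n-gram or neural language models:

```python
from collections import Counter

# Hypothetical training corpus (illustrative only)
corpus = ("the patient can communicate with the interface "
          "the patient spells words with brain signals").split()
freq = Counter(corpus)

def complete(prefix, k=3):
    """Return up to k most frequent corpus words starting with prefix."""
    cands = [w for w in freq if w.startswith(prefix)]
    return sorted(cands, key=lambda w: -freq[w])[:k]

print(complete("pa"))   # → ['patient']
print(complete("th"))   # → ['the']
```

In a speller, accepting a completion replaces several slow letter-selection steps, which is where the communication-rate gains reviewed in the article come from.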

  11. Language model applications to spelling with Brain-Computer Interfaces.

    Science.gov (United States)

    Mora-Cortes, Anderson; Manyakov, Nikolay V; Chumerin, Nikolay; Van Hulle, Marc M

    2014-03-26

    Within the Ambient Assisted Living (AAL) community, Brain-Computer Interfaces (BCIs) have raised great hopes as they provide alternative communication means for persons with disabilities, bypassing the need for speech and other motor activities. Although significant advancements have been realized in the last decade, applications of language models (e.g., word prediction, completion) have only recently started to appear in BCI systems. The main goal of this article is to review the language model applications that supplement non-invasive BCI-based communication systems by discussing their potential and limitations, and to discern future trends. First, a brief overview of the most prominent BCI spelling systems is given, followed by an in-depth discussion of the language models applied to them. These language models are classified according to their functionality in the context of BCI-based spelling: the static/dynamic nature of the user interface, the use of error correction and predictive spelling, and the potential to improve their classification performance by using language models. To conclude, the review offers an overview of the advantages and challenges of implementing language models in BCI-based communication systems in conjunction with other AAL technologies.

  12. A review of toxicity models for realistic atmospheric applications

    Science.gov (United States)

    Gunatilaka, Ajith; Skvortsov, Alex; Gailis, Ralph

    2014-02-01

    There are many applications that need to study human health effects caused by exposure to toxic chemicals. Risk analysis for industrial sites, study of population health impacts of atmospheric pollutants, and operations research for assessing the potential impacts of chemical releases in military contexts are some examples. Because of safety risks and the high cost of field trials involving hazardous chemical releases, computer simulations are widely used for such studies. Modelling of atmospheric transport and dispersion of chemicals released into the atmosphere to determine the toxic chemical concentrations to which individuals will be exposed is one main component of these simulations, and there are well established atmospheric dispersion models for this purpose. Estimating the human health effects caused by the exposure to these predicted toxic chemical concentrations is the other main component. A number of different toxicity models for assessing the health effects of toxic chemical exposure are found in the literature. Because these different models have been developed based on different assumptions about the plume characteristics, chemical properties, and physiological response, there is a need to review and compare these models to understand their applicability. This paper reviews several toxicity models described in the literature. The paper also presents results of applying different toxicity models to simulated concentration time series data. These results show that the use of ensemble mean concentrations, which are what atmospheric dispersion models typically provide, to estimate human health effects of exposure to hazardous chemical releases may underestimate their impact when the toxic exponent, n, of the chemical is greater than one; the opposite phenomenon appears to hold when n is less than one. Models that treat biological recovery processes only implicitly may predict greater toxicity than the explicitly parameterised models. Despite the wide variety of models of varying degrees of complexity that is
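The effect of the toxic exponent can be illustrated with the standard toxic-load formulation, L = ∫ C(t)^n dt. With fluctuating concentrations (here synthetic lognormal samples, an assumption for illustration), the load computed from the ensemble-mean concentration diverges from the true load in the direction the abstract describes, by Jensen's inequality:

```python
import numpy as np

rng = np.random.default_rng(0)
c = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # fluctuating concentration samples
dt = 1.0                                             # nominal sampling interval

for n in (0.5, 1.0, 2.0):
    load_fluct = np.sum(c**n) * dt            # toxic load from the full time series
    load_mean = len(c) * c.mean()**n * dt     # load computed from the ensemble mean
    # ratio > 1 means the ensemble-mean approach underestimates the load
    print(f"n={n}: ratio fluct/mean = {load_fluct / load_mean:.2f}")
```

For lognormal fluctuations the ratio is exp((n² - n)/2): above one for n > 1 (underestimation from using the mean) and below one for n < 1, matching the paper's qualitative finding.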

  13. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    Science.gov (United States)

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for the transition density function of the observable variables and checking whether the parametric density estimate is contained within this envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable to continuous-time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  14. Beginning SQL Server Modeling Model-driven Application Development in SQL Server

    CERN Document Server

    Weller, Bart

    2010-01-01

    Get ready for model-driven application development with SQL Server Modeling! This book covers Microsoft's SQL Server Modeling (formerly known under the code name "Oslo") in detail and contains the information you need to be successful with designing and implementing workflow modeling. Beginning SQL Server Modeling will help you gain a comprehensive understanding of how to apply DSLs and other modeling components in the development of SQL Server implementations. Most importantly, after reading the book and working through the examples, you will have considerable experience using SQL M

  15. Modelling and application of the inactivation of microorganism

    International Nuclear Information System (INIS)

    Oğuzhan, P.; Yangılar, F.

    2013-01-01

    Prevention of the consumption of food contaminated with toxic, infection-causing microorganisms, together with attention to food protection and new microbial inactivation methods, is obligatory. Food microbiology is mainly concerned with unwanted microorganisms that spoil food during processing and transport and cause disease. Detection of pathogenic microorganisms is important for human health, in order to identify and prevent hazards and to extend shelf life. Inactivation of pathogenic microorganisms can provide food security and reduce nutrient losses. Microbial inactivation underlies food-protection methods that keep food safe and fresh. To this end, various methods are used, such as classical thermal processes (pasteurisation, sterilisation), pulsed electric fields (PEF), ionising radiation, high pressure, ultrasonic waves and plasma sterilisation. Microbial inactivation modelling is a secure and effective method in food production. A new microbiological application can give useful results for risk assessment in food, inactivation of microorganisms and improvement of shelf life. Application and control methods should be developed and supported by scientific research and industrial applications.
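A minimal sketch of classical inactivation modelling is the log-linear (first-order) survivor curve, in which the population falls one log10 cycle for every D minutes of treatment at a given temperature. The D-value and initial load below are invented for illustration:

```python
# Log-linear thermal inactivation: N(t) = N0 * 10**(-t / D)
def survivors(n0, t, d_value):
    """Population after t minutes of treatment (first-order model)."""
    return n0 * 10 ** (-t / d_value)

n0, d = 1e6, 1.5   # illustrative initial load [CFU/mL] and D-value [min]
for t in (0, 3, 6, 9):
    print(f"t={t} min: N = {survivors(n0, t, d):.0f} CFU/mL")
```

More elaborate models (Weibull, biphasic) relax the log-linearity assumption, but the D-value remains the basic currency for comparing thermal and non-thermal processes such as PEF.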

  16. Handbook of EOQ inventory problems stochastic and deterministic models and applications

    CERN Document Server

    Choi, Tsan-Ming

    2013-01-01

    This book explores deterministic and stochastic EOQ-model based problems and applications, presenting technical analyses of single-echelon EOQ model based inventory problems, and applications of the EOQ model for multi-echelon supply chain inventory analysis.
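The single-echelon EOQ model at the core of the book can be stated in a few lines: the order quantity Q* = sqrt(2DS/H) minimizes the sum of annual ordering and holding costs. Parameter values here are illustrative:

```python
import math

def eoq(demand, order_cost, holding_cost):
    """Economic order quantity Q* = sqrt(2*D*S / H)."""
    return math.sqrt(2 * demand * order_cost / holding_cost)

D, S, H = 12_000, 100.0, 3.0     # units/yr, $/order, $/unit/yr (illustrative)
q = eoq(D, S, H)
total = D / q * S + q / 2 * H    # annual ordering + holding cost at Q*
print(f"Q* = {q:.0f} units, annual cost = ${total:.0f}")
```

At Q* the two cost components are equal, a property the stochastic and multi-echelon extensions in the book build on.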

  17. Environmental Impacts of Large Scale Biochar Application Through Spatial Modeling

    Science.gov (United States)

    Huber, I.; Archontoulis, S.

    2017-12-01

    In an effort to study the environmental (emissions, soil quality) and production (yield) impacts of biochar application at regional scales, we coupled the APSIM-Biochar model with the pSIMS parallel platform. So far the majority of biochar research has concentrated on lab-to-field studies to advance scientific knowledge. Regional scale assessments are highly needed to assist decision making. The overall objective of this simulation study was to identify the areas in the USA that gain the most environmentally from biochar application, as well as the areas for which our model predicts a notable yield increase due to the addition of biochar. We present the modifications to both the APSIM biochar and pSIMS components that were necessary to facilitate these large-scale model runs across several regions in the United States at a resolution of 5 arcminutes. This study uses the AgMERRA global climate data set (1980-2010) and the Global Soil Dataset for Earth Systems modeling as a basis for creating its simulations, as well as local management operations for maize and soybean cropping systems and different biochar application rates. The regional scale simulation analysis is in progress. Preliminary results showed that the model predicts that high quality soils (particularly those common to Iowa cropping systems) do not receive much, if any, production benefit from biochar. However, soils with low soil organic matter (below 0.5%) do get a noteworthy yield increase of around 5-10% in the best cases. We also found N2O emissions to be spatially and temporally specific, increasing in some areas and decreasing in others due to biochar application. In contrast, we found increases in soil organic carbon and plant available water in all soils (top 30 cm) due to biochar application. The magnitude of these increases (% change from the control) was larger in soils with low organic matter (below 1.5%) and smaller in soils with high organic matter (above 3%), and also dependent on biochar

  18. Knowledge governance model and its applications in OTRIs: two cases

    International Nuclear Information System (INIS)

    Bueno Campos, E.; Plaz Landela, R.; Albert Berenguer, J.

    2007-01-01

    The importance of R&D and knowledge transfer in European economies, and of technology and innovation in Spain, is a key issue in achieving the Europe 2010 target of becoming the European knowledge society for growth. This article shows, in some detail, the structure and functions of the MTT model used as a reference for the processes needed to fulfil the mission of an OTRI and its knowledge transfer function. Two concrete applications show the effectiveness and functionality of the model; the CARTA and PRISMA applications are case studies of the MTT implementation process. They represent a first step towards new developments that are being carried out in other OTRIs. (Author) 35 refs

  19. Instructional Storytelling: Application of the Clinical Judgment Model in Nursing.

    Science.gov (United States)

    Timbrell, Jessica

    2017-05-01

    Little is known about the teaching and learning implications of instructional storytelling (IST) in nursing education or its potential connection to nursing theory. The literature establishes storytelling as a powerful teaching-learning method in the educational, business, humanities, and health sectors, but little exploration exists that is specific to nursing. An example of a story demonstrating application of the domains of Tanner's clinical judgment model links storytelling with learning outcomes appropriate for the novice nursing student. Application of Tanner's clinical judgment model offers consistency of learning experience while preserving the creativity inherent in IST. Further research into student learning outcomes achievement using IST is warranted as a step toward establishing best practices with IST in nursing education. [J Nurs Educ. 2017;56(5):305-308.]. Copyright 2017, SLACK Incorporated.

  20. Powder consolidation using cold spray process modeling and emerging applications

    CERN Document Server

    Moridi, Atieh

    2017-01-01

    This book first presents different approaches to modeling of the cold spray process with the aim of extending current understanding of its fundamental principles and then describes emerging applications of cold spray. In the coverage of modeling, careful attention is devoted to the assessment of critical and erosion velocities. In order to reveal the phenomenological characteristics of interface bonding, severe, localized plastic deformation and material jet formation are studied. Detailed consideration is also given to the effect of macroscopic defects such as interparticle boundaries and subsequent splat boundary cracking on the mechanical behavior of cold spray coatings. The discussion of applications focuses in particular on the repair of damaged parts and additive manufacturing in various disciplines from aerospace to biomedical engineering. Key aspects include a systematic study of defect shape and the ability of cold spray to fill the defect, examination of the fatigue behavior of coatings for structur...

  1. Virtual 3d City Modeling: Techniques and Applications

    Science.gov (United States)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2013-08-01

    3D city model is a digital representation of the Earth's surface and it's related objects such as Building, Tree, Vegetation, and some manmade feature belonging to urban area. There are various terms used for 3D city models such as "Cybertown", "Cybercity", "Virtual City", or "Digital City". 3D city models are basically a computerized or digital model of a city contains the graphic representation of buildings and other objects in 2.5 or 3D. Generally three main Geomatics approach are using for Virtual 3-D City models generation, in first approach, researcher are using Conventional techniques such as Vector Map data, DEM, Aerial images, second approach are based on High resolution satellite images with LASER scanning, In third method, many researcher are using Terrestrial images by using Close Range Photogrammetry with DSM & Texture mapping. We start this paper from the introduction of various Geomatics techniques for 3D City modeling. These techniques divided in to two main categories: one is based on Automation (Automatic, Semi-automatic and Manual methods), and another is Based on Data input techniques (one is Photogrammetry, another is Laser Techniques). After details study of this, finally in short, we are trying to give the conclusions of this study. In the last, we are trying to give the conclusions of this research paper and also giving a short view for justification and analysis, and present trend for 3D City modeling. This paper gives an overview about the Techniques related with "Generation of Virtual 3-D City models using Geomatics Techniques" and the Applications of Virtual 3D City models. Photogrammetry, (Close range, Aerial, Satellite), Lasergrammetry, GPS, or combination of these modern Geomatics techniques play a major role to create a virtual 3-D City model. Each and every techniques and method has some advantages and some drawbacks. Point cloud model is a modern trend for virtual 3-D city model. 
Photo-realistic, Scalable, Geo-referenced virtual 3

  2. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Science.gov (United States)

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  3. Dependencies between models in the model-driven design of distributed applications

    NARCIS (Netherlands)

    Andrade Almeida, João; Bevinoppa, S.; Ferreira Pires, Luis; van Sinderen, Marten J.; Hammoudi, S.

    2005-01-01

    In our previous work, we have defined a model-driven design approach based on the organization of models of a distributed application according to different levels of platform-independence. In our approach, the design process is structured into a preparation and an execution phase. In the

  4. The DO ART Model: An Ethical Decision-Making Model Applicable to Art Therapy

    Science.gov (United States)

    Hauck, Jessica; Ling, Thomson

    2016-01-01

    Although art therapists have discussed the importance of taking a positive stance in terms of ethical decision making (Hinz, 2011), an ethical decision-making model applicable for the field of art therapy has yet to emerge. As the field of art therapy continues to grow, an accessible, theoretically grounded, and logical decision-making model is…

  5. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  6. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  7. Systems Engineering Model and Training Application for Desktop Environment

    Science.gov (United States)

    May, Jeffrey T.

    2010-01-01

    Provides a graphical user interface-based simulator for desktop training, operations, procedure development and system reference. This simulator allows engineers to train and further understand the dynamics of their system from their local desktops. It allows users to train on and evaluate their system at a pace and skill level matched to their competency, and from a perspective based on their needs. The simulator does not require any special resources to execute and should generally be available for use. The interface is based on the concept of presenting the model of the system in the ways that best suit the user's application or training needs. The three levels of views are the Component View, the System View (overall system), and the Console View (monitor). These views are portals into a single model, so changing the model from one view or from a model manager Graphical User Interface will be reflected in all other views.

  8. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method

  9. Overview of the EPRI CONTRACTMIX model for natural gas applications

    International Nuclear Information System (INIS)

    1993-09-01

    The Contract Mix Model (CONTRACTMIX) is designed to assist gas supply planners in analyzing the costs and risks of alternative supply strategies. By explicitly incorporating uncertainty about gas demand and market conditions into the analysis, the methodology permits the analyst to compare contracting strategies on the basis of cost and risk and to assess the value of flexible strategies and contracts. The model is applicable to purchase decisions for natural gas and other fuels. CONTRACTMIX may be used at all phases of supply decision-making, from broad strategy formulation to detailed contract design and evaluation. This document introduces the prospective user to the model's capability for analysis of gas supply contracting decisions. The document describes the types of problems CONTRACTMIX is designed to address as well as the model's structure, inputs, outputs, and unique features

  10. A sample application of nuclear power human resources model

    International Nuclear Information System (INIS)

    Gurgen, A.; Ergun, S.

    2016-01-01

    One of the most important issues for a newcomer country initiating nuclear power plant projects is to have both quantitative and qualitative models for human resources development. For the quantitative model of human resources development for Turkey, the “Nuclear Power Human Resources (NPHR) Model” developed by the Los Alamos National Laboratory was used to determine the number of people that will be required from different professional or occupational fields in the planning of human resources for the Akkuyu, Sinop and third nuclear power plant projects. The number of people required for different professions was calculated for the Nuclear Energy Project Implementation Department, the regulatory authority, project companies, construction, the nuclear power plants and the academy. In this study, a sample application of the human resources model is presented, giving the results of the first attempts to calculate the human resources needs of Turkey. Keywords: Human Resources Development, Newcomer Country, NPHR Model

  11. Application of PSO based ann model for STLF

    International Nuclear Information System (INIS)

    Hassnain, S.R.U.; Asar, A.U.; Khan, A.

    2008-01-01

    This paper presents a new approach to modeling STLF (Short Term Load Forecasting) in which an STLF-ANN forecaster is trained using swarm intelligence. ANNs (Artificial Neural Networks) have been used successfully for STLF. However, ANN-based STLF models use the BP (Backward Propagation) algorithm for training, which does not ensure convergence and often gets stuck in local optima. Moreover, BP requires much longer training times, which makes it difficult for real-time application. In this paper, we propose smaller ANN models for STLF based on hourly load data and train them using the PSO (Particle Swarm Optimization) algorithm. The approach gives better-trained models capable of performing well over a time-varying window and results in fairly accurate forecasts. (author)
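
    The idea of replacing gradient-based BP with PSO for weight training can be sketched generically. The following is a minimal, illustrative PSO in Python, not the authors' code; `loss` stands in for the ANN's forecast error as a function of its flattened weight vector, and all hyperparameter values are illustrative assumptions:

```python
import random

def pso_minimize(loss, dim, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-1.0, 1.0)):
    """Minimal particle swarm optimizer. In the STLF setting, `loss`
    would evaluate the ANN's forecast error for a given weight vector;
    here it is any callable taking a list of floats."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal bests
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity update: inertia + cognitive + social terms
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = loss(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

    Unlike BP, this requires no gradient of the loss, which is why it avoids some of BP's local-optima behavior at the cost of more loss evaluations per update.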

  12. Ionocovalency and Applications 1. Ionocovalency Model and Orbital Hybrid Scales

    Directory of Open Access Journals (Sweden)

    Yonghe Zhang

    2010-11-01

    Ionocovalency (IC), a quantitative dual nature of the atom, is defined and correlated with quantum-mechanical potential to describe quantitatively the dual properties of the bond. An orbital hybrid IC model scale, IC, and an IC electronegativity scale, XIC, are proposed, wherein the ionicity and the covalent radius are determined by spectroscopy. Being composed of the ionic function I and the covalent function C, the model describes quantitatively the dual properties of bond strengths, charge density and ionic potential. Based on the atomic electron configuration and various quantum-mechanically built-up dual parameters, the model forms a Dual Method of multiple-functional prediction, which has far more versatile and exceptional applications than traditional electronegativity scales and molecular properties. Hydrogen has unconventional values of IC and XIC, lower than those of boron. The IC model agrees fairly well with data on bond properties and satisfactorily explains chemical observations of elements throughout the Periodic Table.

  13. Mathematical modeling for surface hardness in investment casting applications

    International Nuclear Information System (INIS)

    Singh, Rupinder

    2012-01-01

    Investment casting (IC) has many potential engineering applications, yet little work has hitherto been reported on modeling the surface hardness (SH) of industrial components produced by IC. In the present study, the outcome of a Taguchi-based macro model has been used for developing a mathematical model for SH, using Buckingham's π theorem. Three input parameters, namely the volume/surface area (V/A) ratio of the cast components, the slurry layer combination (LC) and the molten metal pouring temperature, were selected to give output in the form of SH. This study provides the main effects of these variables on SH and sheds light on the SH mechanism in IC. The comparison with experimental results also serves as further validation of the model
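
    Buckingham's π theorem, used above to build the SH model, states that a relation among n dimensional variables can be reduced to one among n − r dimensionless groups, where r is the rank of the dimensional matrix. A small Python sketch of that counting step, using classic pipe-flow variables rather than the paper's casting parameters (which would require the original dimensional data):

```python
from fractions import Fraction

def matrix_rank(rows):
    """Rank of a rational matrix via Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    rank = 0
    for col in range(len(m[0])):
        pivot = next((r for r in range(rank, len(m)) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        for r in range(len(m)):
            if r != rank and m[r][col] != 0:
                f = m[r][col] / m[rank][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# Dimensional matrix: rows = variables, columns = exponents of (M, L, T).
variables = [
    [1, -1, -2],   # pressure drop  [M L^-1 T^-2]
    [1, -3,  0],   # density        [M L^-3]
    [0,  1, -1],   # velocity       [L T^-1]
    [0,  1,  0],   # diameter       [L]
    [1, -1, -1],   # viscosity      [M L^-1 T^-1]
]
# π theorem: number of independent dimensionless groups = n - rank
n_groups = len(variables) - matrix_rank(variables)
```

    Here the five variables collapse to two dimensionless groups (the Euler and Reynolds numbers); the same counting applies to any set of casting variables once their dimensions are tabulated.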

  14. Quantummechanical multi-step direct models for nuclear data applications

    International Nuclear Information System (INIS)

    Koning, A.J.

    1992-10-01

    Various multi-step direct models have been derived and compared on a theoretical level. Subsequently, these models have been implemented in the computer code system KAPSIES, enabling a consistent comparison on the basis of the same set of nuclear parameters and same set of numerical techniques. Continuum cross sections in the energy region between 10 and several hundreds of MeV have successfully been analysed. Both angular distributions and energy spectra can be predicted in an essentially parameter-free manner. It is demonstrated that the quantum-mechanical MSD models (in particular the FKK model) give an improved prediction of pre-equilibrium angular distributions as compared to the experiment-based systematics of Kalbach. This makes KAPSIES a reliable tool for nuclear data applications in the afore-mentioned energy region. (author). 10 refs., 2 figs

  15. Mathematical modelling of anaerobic digestion processes: applications and future needs

    DEFF Research Database (Denmark)

    Batstone, Damien J.; Puyol, Daniel; Flores Alsina, Xavier

    2015-01-01

    of the role of the central carbon catabolic metabolism in anaerobic digestion, with an increased importance of phosphorous, sulfur, and metals as electron source and sink, and consideration of hydrogen and methane as potential electron sources. The paradigm of anaerobic digestion is challenged by anoxygenic...... phototrophism, where energy is relatively cheap, but electron transfer is expensive. These new processes are commonly not compatible with the existing structure of anaerobic digestion models. These core issues extend to application of anaerobic digestion in domestic plant-wide modelling, with the need......Anaerobic process modelling is a mature and well-established field, largely guided by a mechanistic model structure that is defined by our understanding of underlying processes. This led to publication of the IWA ADM1, and strong supporting, analytical, and extension research in the 15 years since...

  16. Force modeling for incision surgery into tissue with haptic application

    Science.gov (United States)

    Kim, Pyunghwa; Kim, Soomin; Choi, Seung-Hyun; Oh, Jong-Seok; Choi, Seung-Bok

    2015-04-01

    This paper presents a novel force model for incision surgery into tissue and its haptic application for the surgeon. During robot-assisted incision surgery, a haptic system that restores the sense of touch in the surgical area is highly desirable, because the surgeon cannot otherwise feel the tissue. To achieve this goal, a force model for the reaction force of biological tissue is proposed from an energy perspective. The model describes the reaction force arising from the elastic behavior of tissue during incision. Furthermore, the force computed from the model is rendered by a haptic device using magnetorheological fluid (MRF). The performance of the rendered force, regulated by a PID controller in an open-loop configuration, is evaluated.

  17. Anatomical models for space radiation applications: an overview.

    Science.gov (United States)

    Atwell, W

    1994-10-01

    Extremely detailed computerized anatomical male (CAM) and female (CAF) models that have been developed for use in space radiation analyses are discussed and reviewed. Recognizing that the level of detail may currently be inadequate for certain radiological applications, one of the purposes of this paper is to elicit specific model improvements or requirements from the scientific user-community. Methods and rationale are presented which describe the approach used in the Space Shuttle program to extrapolate dosimetry measurements (skin doses) to realistic astronaut body organ doses. Several mission scenarios are presented which demonstrate the utility of the anatomical models for obtaining specific body organ exposure estimates and can be used for establishing cancer morbidity and mortality risk assessments. These exposure estimates are based on the trapped Van Allen belt and galactic cosmic radiation environment models and data from the major historical solar particle events.

  18. Advances in Intelligent Modelling and Simulation: Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  19. Antibody Modeling and Structure Analysis. Application to biomedical problems.

    OpenAIRE

    Chailyan, Anna

    2013-01-01

    Background The usefulness of antibodies and antibody derived artificial constructs in various medical and biochemical applications has made them a prime target for protein engineering, modelling, and structure analysis. The huge number of known antibody sequences, that far outpaces the number of solved structures, raises the need for reliable automatic methods of antibody structure prediction. Antibodies have a very characteristic molecular structure that is reflected in their modelli...

  20. Ozone modeling within plasmas for ozone sensor applications

    OpenAIRE

    Arshak, Khalil; Forde, Edward; Guiney, Ivor

    2007-01-01

    Ozone (O3) is potentially hazardous to human health, and accurate prediction and measurement of this gas are essential in addressing its associated health risks. This paper presents theory to predict the levels of ozone concentration emitted from a dielectric barrier discharge (DBD) plasma for ozone sensing applications. This is done by postulating the kinetic model for ozone generation, with a DBD plasma at atmospheric pressure in air, in the form of a set of rate equations....

  1. An overview of recent applications of computational modelling in neonatology

    Science.gov (United States)

    Wrobel, Luiz C.; Ginalski, Maciej K.; Nowak, Andrzej J.; Ingham, Derek B.; Fic, Anna M.

    2010-01-01

    This paper reviews some of our recent applications of computational fluid dynamics (CFD) to model heat and mass transfer problems in neonatology and investigates the major heat and mass-transfer mechanisms taking place in medical devices, such as incubators, radiant warmers and oxygen hoods. It is shown that CFD simulations are very flexible tools that can take into account all modes of heat transfer in assisting neonatal care and improving the design of medical devices. PMID:20439275

  2. An Application of a Mathematical Blood Flow Model

    Science.gov (United States)

    2001-07-01

    ...resolution of the heat transfer processes in the body. It should be applicable to different-size neonates, taking into account aspects such as the anatomy and the thermal maturity of the infant. [The remainder of the abstract is fragmentary; recoverable reference fragments include a PhD thesis on a model for the thermoregulation of premature infants and neonates under consideration of the thermal maturity, and E.C. Mallard et al., "Neuronal damage in the developing brain following intrauterine asphyxia," Reprod. Fertil. Dev. 7.]

  3. Generalisation of geographic information cartographic modelling and applications

    CERN Document Server

    Mackaness, William A; Sarjakoski, L Tiina

    2011-01-01

    Theoretical and Applied Solutions in Multi-Scale Mapping. Users have come to expect instant access to up-to-date geographical information with global coverage, presented at widely varying levels of detail as digital and paper products, with customisable data that can readily be combined with other geographic information. These requirements present an immense challenge to those supporting the delivery of such services (National Mapping Agencies (NMAs), Government Departments, and private businesses). Generalisation of Geographic Information: Cartographic Modelling and Applications provides a detailed review

  4. Large area application of a corn hazard model. [Soviet Union

    Science.gov (United States)

    Ashburn, P.; Taylor, T. W. (Principal Investigator)

    1981-01-01

    An application test of the crop calendar portion of a corn (maize) stress indicator model, developed by the early warning, crop condition assessment component of AgRISTARS, was performed over the corn-for-grain producing regions of the U.S.S.R. during the 1980 crop year using real data. Performance of the crop calendar submodel was favorable; efficiency gains in meteorological data analysis time were of the order of 85 to 90 percent.

  5. Modelling in waters geochemistry. Concepts and applications in environment

    International Nuclear Information System (INIS)

    Windt, L. de; Lee, J.V.D.; Schmitt, J.M.

    2005-01-01

    The aim of this work is to present the essentials of the physico-chemical concepts and the mathematical laws on which the geochemical modelling of waters is based, while presenting concrete and typical examples of applications to problems of the environment and of water resources management. A table (Doc. AF 6530) gathers the distribution sources of software packages and of thermodynamic data banks. (O.M.)

  6. Sensors advancements in modeling, design issues, fabrication and practical applications

    CERN Document Server

    Mukhopadhyay, Subhash Chandra

    2008-01-01

    Sensors are the most important component in any system, and engineers in any field need to understand the fundamentals of how these components work, how to select them properly and how to integrate them into an overall system. This book outlines the fundamentals, analytical concepts, modelling and design issues, technical details and practical applications of different types of sensors: electromagnetic, capacitive, ultrasonic, vision, terahertz, displacement, fibre-optic and so on. The book addresses the identification, modeling, selection, operation and integration of a wide variety of se

  7. Modelling of Electrokinetic Processes in Civil and Environmental Engineering Applications

    DEFF Research Database (Denmark)

    Paz-Garcia, Juan Manuel; Johannesson, Björn; Ottosen, Lisbeth M.

    2011-01-01

    A mathematical model for the electrokinetic phenomena is described. Numerical simulations of different applications of electrokinetic techniques to the fields of civil and environmental engineering are included, showing the versatility and consistency of the model. The electrokinetics phenomena c...... to hinder the acid penetration; and an acid-enhanced electrokinetic soil remediation process, where the basic front is neutralized in order to avoid the precipitation of hydroxides of the target heavy metal....... the porous materials undergoes an electroosmotic flow subject to externally applied electric fields. Electroosmotic transport makes electrokinetic techniques suitable for the mobilization of non-charged particles within the pore structure, such as the organic contaminants in soil. Chemical equilibrium...

  8. Models and applications of chaos theory in modern sciences

    CERN Document Server

    Zeraoulia, Elhadj

    2011-01-01

    This book presents a select group of papers that provide a comprehensive view of the models and applications of chaos theory in medicine, biology, ecology, economics, electronics, mechanical engineering, and the human sciences. Covering both the experimental and theoretical aspects of the subject, it examines a range of current topics of interest. It considers the problems arising in the study of discrete- and continuous-time chaotic dynamical systems modeling phenomena in nature and society, highlighting powerful techniques being developed to meet these challenges that stem from the area of nonli

  9. Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models

    Science.gov (United States)

    Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.

    2017-12-01

    Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicates improved prediction accuracies (median of 10-50%) but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially-heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients. Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream
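
    The hierarchical idea behind regionally varying coefficients, partial pooling of per-region estimates toward a shared distribution, can be illustrated with a much simpler empirical-Bayes shrinkage estimator. This is a generic Python sketch, not the SPARROW model or the full Bayesian machinery described above:

```python
from statistics import mean, pvariance

def partial_pool(group_means, group_vars, n_per_group):
    """Shrink per-region sample means toward the grand mean, weighting
    by the ratio of within-region sampling variance to the estimated
    between-region variance (the simplest form of hierarchical pooling)."""
    grand = mean(group_means)
    within = [v / n for v, n in zip(group_vars, n_per_group)]
    # method-of-moments estimate of between-group variance, floored at 0
    tau2 = max(pvariance(group_means) - mean(within), 0.0)
    shrunk = []
    for y, w in zip(group_means, within):
        b = w / (w + tau2) if (w + tau2) > 0 else 1.0  # shrinkage factor
        shrunk.append(b * grand + (1 - b) * y)
    return shrunk
```

    Regions with noisy estimates (large within-region variance) are pulled strongly toward the grand mean, while well-measured regions keep their own estimates, the same behavior that hierarchical Bayesian coefficients exhibit, here obtained in closed form.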

  10. 2D modelling and its applications in engineering

    International Nuclear Information System (INIS)

    Altinbalik, M. Tahir; İRSEL, Gürkan

    2013-01-01

    A model in computer-aided engineering applications may be created using either a two-dimensional or a three-dimensional design, depending on the purpose of the design. What matters most in this regard is the selection of the right method to meet system solution requirements in the most economical way. Manufacturability of a design developed using computer-aided engineering is important, but the usability in production of the data obtained in the course of the design work is equally important. In applications involving production operations such as CNC or plasma cutting, two-dimensional designs can be used directly in production: these machines are equipped with interfaces which convert two-dimensional drawings into codes. In this way, a design can be transferred directly to production, and any adjustments during the production process can be evaluated synchronously. As a result, investment expenses will be lowered and costs reduced to some extent. In the presented study, we have examined two-dimensional design applications and requirements. We created a two-dimensional design for a part for which a three-dimensional model had previously been generated, and then transferred this design to a plasma cutting machine; the operation has thus been realized experimentally. Key words: Plasma Cutting, 2D modelling, flexibility

  11. Applications of the k – ω Model in Stellar Evolutionary Models

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yan, E-mail: ly@ynao.ac.cn [Yunnan Observatories, Chinese Academy of Sciences, Kunming 650216 (China)

    2017-05-20

    The k – ω model for turbulence was first proposed by Kolmogorov. A new k – ω model for stellar convection was developed by Li, which could reasonably describe turbulent convection not only in the convectively unstable zone, but also in the overshooting regions. We revised the k – ω model by improving several model assumptions (including the macro-length of turbulence, convective heat flux, and turbulent mixing diffusivity, etc.), making it applicable not only to convective envelopes, but also to convective cores. Eight parameters are introduced in the revised k – ω model. It should be noted that the Reynolds stress (turbulent pressure) is neglected in the equation of hydrostatic support. We applied the model to solar models and 5 M⊙ stellar models to calibrate the eight model parameters, as well as to investigate the effects of convective overshooting on the Sun and intermediate-mass stellar models.

  12. Optimizing a gap conductance model applicable to VVER-1000 thermal–hydraulic model

    International Nuclear Information System (INIS)

    Rahgoshay, M.; Hashemi-Tilehnoee, M.

    2012-01-01

    Highlights: ► Two known gap conductance models for application in a VVER-1000 thermal–hydraulic code are examined. ► An optimized gap conductance model is developed which predicts the gap conductance in good agreement with FSAR data. ► The licensed thermal–hydraulic code is externally coupled with the gap conductance predictor. -- Abstract: The modeling of gap conductance for application in VVER-1000 thermal–hydraulic codes is addressed. Two known models, namely the CALZA-BINI and RELAP5 gap conductance models, are examined. By externally linking the gap conductance models with the COBRA-EN thermal–hydraulic code, the acceptable range of each model is specified. The result of each gap conductance model versus linear heat rate has been compared with FSAR data. A linear heat rate of about 9 kW/m is the boundary for the optimization process. Since each gap conductance model has its own advantages and limitations, the optimized gap conductance model can predict the gap conductance better than either of the two models individually.

  13. Reliability models applicable to space telescope solar array assembly system

    Science.gov (United States)

    Patil, S. A.

    1986-01-01

    A complex system may consist of a number of subsystems with several components in series, in parallel, or in a combination of both. In order to predict how well the system will perform, it is necessary to know the reliabilities of the subsystems and the reliability of the whole system. The objective of the present study is to develop mathematical models of reliability which are applicable to complex systems. The models are determined by assuming k failures out of n components in a subsystem. By taking k = 1 and k = n, these models reduce to parallel and series models; hence, the models can be specialized to parallel, series, and combination systems. The models are developed by assuming the failure rates of the components to be functions of time and, as such, can be applied to processes with or without aging effects. The reliability models are further specialized to the Space Telescope Solar Array (STSA) System. The STSA consists of 20 identical solar panel assemblies (SPAs). The reliabilities of the SPAs are determined by the reliabilities of solar cell strings, interconnects, and diodes. Estimates of the reliability of the system for one to five years are calculated using the reliability estimates of solar cells and interconnects given in ESA documents. Aging effects in relation to breaks in interconnects are discussed.
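
    Under one common reading of this construction (a subsystem of n identical, independent components that fails once k of them have failed, with per-component reliability p), the k-out-of-n reliability and its series/parallel limits can be sketched as follows; this is a generic illustration, not the paper's time-dependent model:

```python
from math import comb

def subsystem_reliability(n, k, p):
    """Probability that fewer than k of n identical, independent
    components (each with reliability p) have failed, i.e. that the
    subsystem survives under the stated failure criterion."""
    return sum(comb(n, j) * (1 - p) ** j * p ** (n - j) for j in range(k))
```

    With k = 1 this collapses to the series formula p**n (any single failure fails the subsystem), and with k = n to the parallel formula 1 - (1 - p)**n (all components must fail), which is the reduction the abstract describes. Time-dependent failure rates enter by replacing the constant p with a survival function p(t).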

  14. AGR core models and their application to HTRs and RBMKs

    International Nuclear Information System (INIS)

    Baylis, Samuel

    2014-01-01

    EDF Energy operates 14 AGRs, commissioned between 1976 and 1989. The graphite moderators of these gas cooled reactors are subjected to a number of ageing processes under fast neutron irradiation in a high temperature CO2 environment. As the graphite ages, continued safe operation requires an advanced whole-core modeling capability to enable accurate assessments of the core’s ability to fulfil fundamental nuclear safety requirements. This is also essential in evaluating the reactor's remaining economic lifetime, and similar assessments are useful for HTRs in the design stage. A number of computational and physical models of AGR graphite cores have been developed or are in development, allowing simulation of the reactors in normal, fault and seismic conditions. Many of the techniques developed are applicable to other graphite moderated reactors. Modeling of the RBMK allows validation against a core in a more advanced state of ageing than the AGRs, while there is also an opportunity to adapt the models for high temperature reactors. As an example, a finite element model of the HTR-PM side reflector based on rigid bodies and nonlinear springs is developed, allowing rapid assessments of distortion in the structure to be made. A model of the RBMK moderator has also been produced using an established AGR code based on similar methods. In addition, this paper discusses the limitations of these techniques and the development of more complex core models that address these limitations, along with the lessons that can be applied to HTRs. (author)

  15. Case studies of computer model applications in consulting practice

    Science.gov (United States)

    Siebein, Gary; Paek, Hyun; Lorang, Mark; McGuinnes, Courtney

    2002-05-01

    Six case studies of computer model applications in a consulting practice will be presented, both to illustrate the range of issues that can be studied with computer models and to understand the limitations of the technique at the present time. Case studies of elliptical conference rooms demonstrate basic acoustic ray principles and suggest remediation strategies. Models of a large themed entertainment venue with multiple amplified sound sources show how visualization of the acoustic ray paths can assist a consultant and client in value-engineering the locations and amounts of acoustic materials. The acoustic problems of an angled ceiling and a large rear wall were studied when an historic church was converted to a music performance hall. The computer model of an historic hall did not provide enough detailed information and was supplemented with physical model studies and full-size mock-up tests of the insertion of an elevator door that would open directly into the concert room. Studies to determine the amount of room model detail needed to obtain realistic auralizations were also conducted. The integration of architectural acoustic design and audio system design was studied in computer models of a large church sanctuary.

  16. New Trends in Model Coupling Theory, Numerics and Applications

    International Nuclear Information System (INIS)

    Coquel, F.; Godlewski, E.; Herard, J. M.; Segre, J.

    2010-01-01

This special issue comprises selected papers from the workshop New Trends in Model Coupling, Theory, Numerics and Applications (NTMC'09), which took place in Paris, September 2-4, 2009. The search for optimal technological solutions in a large number of industrial systems requires numerical simulations of complex phenomena which are often characterized by the coupling of models related to various space and/or time scales. Thus, so-called multi-scale modelling has been a thriving scientific activity which connects applied mathematics with other disciplines such as physics, chemistry, biology or even the social sciences. To illustrate the variety of fields in which model coupling arises naturally, we may quote: meteorology, where it is required to take into account several turbulence scales or the interaction between oceans and atmosphere, and also to embed regional models in a global description; solid mechanics, where a thorough understanding of complex phenomena such as crack propagation requires coupling models from the atomistic level to the macroscopic level; plasma physics for fusion energy, where dense plasmas and collisionless plasmas coexist; multiphase fluid dynamics, when several types of flow corresponding to several types of models are present simultaneously in complex circuits; and social behaviour analysis, with interaction between individual actions and collective behaviour. (authors)

  17. Thermoregulatory modeling use and application in the military workforce.

    Science.gov (United States)

    Yokota, Miyo; Berglund, Larry G; Xu, Xiaojiang

    2014-05-01

Thermoregulatory models have been used in the military to quantify the probability of individuals' thermal-related illness/injury. The uses of the models have diversified over the past decade. This paper revisits an overall view of selected thermoregulatory models used in the U.S. military and provides examples of actual practical military applications: 1) the latest military vehicle, designed with armor and blast/bulletproof windows, was assessed to predict crews' thermal strain levels inside vehicles in hot environments (air temperature [Ta]: 29-43 °C, dew point: 13 °C); 2) a military working dog (MWD) model was developed by modifying existing human thermoregulatory models with canine physical appearance and physiological mechanisms; 3) the thermal tolerance range of individuals from a large military group (n = 100) exposed to 35 °C/40% relative humidity was examined using thermoregulatory modeling and multivariate statistical analyses. Model simulation results assist decisions in strategic planning and the prevention of heat stress. Published by Elsevier Ltd.

  18. Equivalent-Continuum Modeling With Application to Carbon Nanotubes

    Science.gov (United States)

    Odegard, Gregory M.; Gates, Thomas S.; Nicholson, Lee M.; Wise, Kristopher E.

    2002-01-01

A method has been proposed for developing structure-property relationships of nano-structured materials. This method serves as a link between computational chemistry and solid mechanics by substituting discrete molecular structures with equivalent-continuum models. It has been shown that this substitution may be accomplished by equating the vibrational potential energy of a nano-structured material with the strain energy of representative truss and continuum models. As important examples with direct application to the development and characterization of single-walled carbon nanotubes and the design of nanotube-based devices, the modeling technique has been applied to determine the effective-continuum geometry and bending rigidity of a graphene sheet. A representative volume element of the chemical structure of graphene has been substituted with equivalent-truss and equivalent-continuum models. As a result, an effective thickness of the continuum model has been determined. This effective thickness has been shown to be significantly larger than the interatomic spacing of graphite. The effective bending rigidity of the equivalent-continuum model of a graphene sheet was determined by equating the vibrational potential energy of the molecular model of a graphene sheet subjected to cylindrical bending with the strain energy of an equivalent continuum plate subjected to cylindrical bending.

  19. Functional dynamic factor models with application to yield curve forecasting

    KAUST Repository

    Hays, Spencer

    2012-09-01

Accurate forecasting of zero coupon bond yields for a continuum of maturities is paramount to bond portfolio management and derivative security pricing. Yet a universal model for yield curve forecasting has been elusive, and prior attempts often resulted in a trade-off between goodness of fit and consistency with economic theory. To address this, herein we propose a novel formulation which connects the dynamic factor model (DFM) framework with concepts from functional data analysis: a DFM with functional factor loading curves. This results in a model capable of forecasting functional time series. Further, in the yield curve context we show that the model retains economic interpretation. Model estimation is achieved through an expectation-maximization algorithm, in which the time series parameters and factor loading curves are simultaneously estimated in a single step. Efficient computation is implemented and a data-driven smoothing parameter is incorporated. We show that our model performs very well in forecasting actual yield data compared with existing approaches, especially with regard to profit-based assessment in an innovative trading exercise. We further illustrate the viability of our model for applications outside of yield forecasting.
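The record above estimates factor loadings and factor dynamics jointly via EM; as a rough illustration of the underlying idea only, the sketch below uses a simpler two-step scheme (PCA for loadings, then per-factor AR(1) fits) on a synthetic yield-like panel. All numbers and the factor structure are invented for the example; this is not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "yield curve" panel: T months x M maturities, driven by 2 latent factors
T, M, K = 200, 10, 2
maturities = np.linspace(1, 120, M)                      # months to maturity
true_load = np.column_stack([np.ones(M),                 # level-like loading
                             np.exp(-maturities / 36)])  # slope-like loading
f = np.zeros((T, K))
for t in range(1, T):                                    # AR(1) factor dynamics
    f[t] = 0.95 * f[t - 1] + rng.normal(0, 0.1, K)
Y = f @ true_load.T + rng.normal(0, 0.02, (T, M))

# Step 1: estimate factor loading "curves" by PCA (SVD of the demeaned panel)
mu = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - mu, full_matrices=False)
loadings = Vt[:K].T                                      # M x K
factors = (Y - mu) @ loadings                            # T x K factor scores

# Step 2: fit AR(1) to each factor by least squares and forecast one step ahead
phi = np.array([np.linalg.lstsq(factors[:-1, k:k + 1], factors[1:, k],
                                rcond=None)[0][0] for k in range(K)])
f_next = phi * factors[-1]

# Step 3: map the factor forecast back to a full curve forecast
y_next = mu + loadings @ f_next
print(y_next.shape)   # one forecast value per maturity
```

The joint EM of the paper replaces steps 1-2 with a single likelihood-based iteration, but the factor-then-forecast logic is the same.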

  20. DSC, FT-IR, NIR, NIR-PCA and NIR-ANOVA for determination of chemical stability of diuretic drugs: impact of excipients

    Directory of Open Access Journals (Sweden)

    Gumieniczek Anna

    2018-03-01

Full Text Available It is well known that drugs can directly react with excipients. In addition, excipients can be a source of impurities that either directly react with drugs or catalyze their degradation. Thus, binary mixtures of three diuretics, torasemide, furosemide and amiloride, with different excipients, i.e. citric acid anhydrous, povidone K25 (PVP), magnesium stearate (Mg stearate), lactose, D-mannitol, glycine, calcium hydrogen phosphate anhydrous (CaHPO4) and starch, were examined to detect interactions. High temperature and humidity or UV/VIS irradiation were applied as stress conditions. Differential scanning calorimetry (DSC), FT-IR and NIR were used to collect adequate information. In addition, chemometric assessments of NIR signals with principal component analysis (PCA) and ANOVA were applied.
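As a toy illustration of the chemometric step described above (PCA on NIR signals to separate stressed from unstressed samples), the following sketch builds synthetic spectra with an invented band shift and projects them onto the first principal components. The spectra, band positions, and noise levels are all assumptions for the example, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "NIR spectra": 20 fresh and 20 stressed samples over 100 wavelengths.
# Stress shifts a broad absorption band — the kind of change PCA should pick up.
wl = np.linspace(0, 1, 100)
band = np.exp(-((wl - 0.50) / 0.1) ** 2)
shifted = np.exp(-((wl - 0.55) / 0.1) ** 2)
fresh = band + rng.normal(0, 0.01, (20, 100))
stressed = shifted + rng.normal(0, 0.01, (20, 100))
X = np.vstack([fresh, stressed])

# PCA via SVD of the mean-centered spectra
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T          # project onto the first two principal components

# The two groups separate along PC1: compare group means on the first score
gap = abs(scores[:20, 0].mean() - scores[20:, 0].mean())
spread = scores[:, 0].std()
print(gap > spread)             # clear group separation on PC1
```

On real spectra the same scores would then be fed to ANOVA to test whether the group separation is statistically significant.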

  1. The use of the barbell cluster ANOVA design for the assessment of environmental pollution: a case study, Wigierski National Park, NE Poland.

    Science.gov (United States)

    Migaszewski, Zdzisław M; Gałuszka, Agnieszka; Pasławski, Piotr

    2005-01-01

This report presents an assessment of chemical variability in natural ecosystems of Wigierski National Park (NE Poland), derived from the calculation of geochemical baselines using a barbell cluster ANOVA design. This method enabled us to obtain statistically valid information with a minimum number of samples collected. Results of summary statistics are presented for elemental concentrations in the soil O (Ol + Ofh), A and B horizons, 1- and 2-year-old Pinus sylvestris L. (Scots pine) needles, pine bark and Hypogymnia physodes (L.) Nyl. (lichen) thalli, as well as for pH and TOC. The scope of this study also encompassed S and C stable isotope determinations and SEM examinations of Scots pine needles. The variability of S and trace metals in soils and plant bioindicators is primarily governed by parent material lithology and to a lesser extent by anthropogenic factors. This fact enabled us to study concentrations that are close to regional background levels.
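Hierarchical ANOVA designs of this kind partition total variance into between-site and within-site (sampling) components. The sketch below shows the classical method-of-moments calculation for a balanced one-way nested layout with duplicate samples per site; the site counts and variance magnitudes are invented for illustration, not taken from the Wigierski study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Balanced nested design: g sites, n duplicate samples per site.
# True components: between-site sd = 2.0, within-site (sampling) sd = 0.5.
g, n = 8, 2
site_effect = rng.normal(0, 2.0, g)
data = site_effect[:, None] + rng.normal(0, 0.5, (g, n))

site_means = data.mean(axis=1)
grand_mean = data.mean()

# One-way ANOVA sums of squares and mean squares
ss_between = n * ((site_means - grand_mean) ** 2).sum()
ss_within = ((data - site_means[:, None]) ** 2).sum()
ms_between = ss_between / (g - 1)
ms_within = ss_within / (g * (n - 1))

# Method-of-moments variance components: E[MS_between] = sigma2_w + n*sigma2_b
var_within = ms_within
var_between = max((ms_between - ms_within) / n, 0.0)
print(var_between > var_within)   # site-to-site (lithology-scale) variance dominates
```

A barbell cluster design extends this idea to several nested spatial scales, so the same mean-square subtraction is applied level by level.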

  2. Modeling dermal exposure--an illustration for spray painting applications.

    Science.gov (United States)

    Flynn, Michael R; Koto, Yoshi; Fent, Kenneth; Nylander-French, Leena A

    2006-09-01

This article presents a conceptual, mathematical model of dermal exposure resulting from aerosol deposition on human forearm hair. The model is applicable to exposure scenarios where dermal deposition is governed by aerosol impaction, interception, and diffusion mechanisms. The model employs filtration theory, single-fiber efficiency equations, and a modified potential airflow approximation. The results are extended, using previously published data, for application to dermal deposition on the forearm during spray painting. The average (N = 8) predicted dermal deposition of 1,6-hexamethylene diisocyanate as collected on a 10-cm(2) tape strip is 108.9 (+/- 70.3) pmol, whereas field measurements indicated an average of 168.6 (+/- 82.0) pmol per strip. The corresponding measured average dermal flux was 3.63 (+/- 1.34) pg/cm(2)s; the prediction was 2.24 (+/- 1.25) pg/cm(2)s. The study calls attention to the importance of body hair for both modeling and measuring dermal exposures.

  3. Application of model based control to robotic manipulators

    Science.gov (United States)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1988-01-01

A robot that can duplicate human motion capabilities in such activities as balancing, reaching, lifting, and moving has been built and tested. These capabilities are achieved through the use of real-time Model-Based Control (MBC) techniques which have recently been demonstrated. MBC accounts for all manipulator inertial forces and provides stable manipulator motion control even at high speeds. To effectively demonstrate the unique capabilities of MBC, an experimental robotic manipulator was constructed, which stands upright, balancing on a two-wheel base. The mathematical modeling of dynamics inherent in MBC permits the control system to perform functions that are impossible with conventional non-model-based methods. These capabilities include: (1) stable control at all speeds of operation; (2) operations requiring dynamic stability, such as balancing; (3) detection and monitoring of applied forces without the use of load sensors; (4) manipulator safing via detection of abnormal loads. The full potential of MBC has yet to be realized. The experiments performed for this research are only an indication of the potential applications. MBC has no inherent stability limitations, and its range of applicability is limited only by the attainable sampling rate, modeling accuracy, and sensor resolution. Manipulators could be designed to operate at the highest speed mechanically attainable without being limited by control inadequacies. Manipulators capable of operating many times faster than current machines would certainly increase productivity for many tasks.
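A common form of model-based control is computed torque: the controller uses the manipulator's dynamic model to cancel inertial and gravity terms, leaving simple linear error dynamics. The sketch below applies this to a hypothetical single pendulum link (all masses, gains, and the plant model are invented for illustration; this is the general technique, not the experimental robot of the record).

```python
import numpy as np

# Computed-torque (model-based) control of one pendulum link:
# tau = I * (kp*e - kd*qd) + gravity term, which cancels the plant's gravity
# exactly and imposes critically damped error dynamics.
m, l, gacc = 1.0, 0.5, 9.81
I = m * l ** 2                    # point-mass inertia about the joint
kp, kd = 100.0, 20.0              # poles at s = -10 (double), ~0.5 s settling
dt, q, qd = 0.001, 0.5, 0.0       # start 0.5 rad from the target posture
q_des = 0.0

for _ in range(5000):             # 5 s of simulated motion
    # Model-based control law: cancel gravity, shape the error dynamics
    tau = I * (kp * (q_des - q) - kd * qd) + m * gacc * l * np.sin(q)
    # Plant dynamics: I * qdd = tau - m*g*l*sin(q)
    qdd = (tau - m * gacc * l * np.sin(q)) / I
    qd += qdd * dt
    q += qd * dt

print(abs(q) < 1e-3)              # converged to the target posture
```

Because the model cancels the nonlinearity exactly here, the closed loop is stable at any speed; on real hardware the cancellation is only as good as the model, which is why the abstract stresses modeling accuracy and sampling rate.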

4. Bayesian statistical methods and their application in probabilistic simulation models

    Directory of Open Access Journals (Sweden)

    Sergio Iannazzo

    2007-03-01

Full Text Available Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success are probably to be found in the theoretical foundations of the discipline, which make these techniques more appealing for decision analysis. To this should be added modern IT progress, which has produced several flexible and powerful statistical software frameworks. Among them, probably one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach reside in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs to the economic model.
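The combination described above — Bayesian inference feeding a probabilistic Markov model — can be sketched without WinBUGS: draw an uncertain transition probability from its Beta posterior, run the Markov cohort forward, and repeat. Everything below (states, evidence counts, other transition probabilities) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Probabilistic 3-state Markov cohort model (Well, Sick, Dead), monthly cycles.
# The Well->Sick probability is uncertain: Beta(12, 88) posterior from
# hypothetical evidence of 12 transitions in 100 patient-months.
n_sims, n_cycles = 1000, 120
life_months = np.empty(n_sims)

for i in range(n_sims):
    p_ws = rng.beta(12, 88)                        # sampled Well->Sick probability
    P = np.array([[1 - p_ws - 0.01, p_ws, 0.01],   # Well
                  [0.0,             0.90, 0.10],   # Sick
                  [0.0,             0.00, 1.00]])  # Dead (absorbing)
    cohort = np.array([1.0, 0.0, 0.0])
    alive = 0.0
    for _ in range(n_cycles):
        cohort = cohort @ P
        alive += cohort[:2].sum()                  # person-months alive this cycle
    life_months[i] = alive

print(life_months.mean())   # expected survival with parameter uncertainty
```

The spread of `life_months` across simulations is exactly the probabilistic sensitivity output that the paper obtains by integrating the inference and the economic model in one WinBUGS program.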

  5. Application of a leakage model to assess exfiltration from sewers.

    Science.gov (United States)

    Karpf, C; Krebs, P

    2005-01-01

The exfiltration of wastewater from sewer systems in urban areas causes a deterioration of soil and possibly groundwater quality. Besides the simulation of transport and degradation processes in the unsaturated zone and in the aquifer, the analysis of the potential impact requires the estimation of the quantity and temporal variation of wastewater exfiltration. Exfiltration can be assessed by the application of a leakage model. This hydrological approach was originally developed to simulate the interactions between groundwater and surface water; here it is adapted to model the interactions between groundwater and the sewer system. In order to approximate the exfiltration-specific model parameters, infiltration-specific parameters were used as a basis. Scenario analysis of exfiltration in the City of Dresden from 1997 to 1999 and during the flood event in August 2002 shows the variation and the extent of exfiltration rates.
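The core of a hydrological leakage approach is a flux proportional to the head difference across the pipe wall. The sketch below illustrates that relation with invented parameter values (leakage coefficient, wetted area, water levels) — it is not the Dresden calibration.

```python
# Leakage-type exchange flux: positive head difference (sewer above groundwater)
# drives exfiltration; a flood-raised groundwater table reverses the sign.
def exchange_rate(h_sewer, h_gw, leakage_coeff, wetted_area):
    """Flux out of the sewer [m3/s]; negative values mean infiltration."""
    return leakage_coeff * wetted_area * (h_sewer - h_gw)

# Dry weather vs. a flood event raising the groundwater table above the pipe
q_dry = exchange_rate(h_sewer=101.2, h_gw=100.4,
                      leakage_coeff=1e-6, wetted_area=50.0)
q_flood = exchange_rate(h_sewer=101.2, h_gw=102.0,
                        leakage_coeff=1e-6, wetted_area=50.0)
print(q_dry > 0 > q_flood)   # exfiltration in dry weather, infiltration in flood
```

This sign reversal is why the August 2002 flood event is an informative scenario: it probes the same leakage parameters from the infiltration side.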

  6. Application of Z-Number Based Modeling in Psychological Research

    Directory of Open Access Journals (Sweden)

    Rafik Aliev

    2015-01-01

Full Text Available Pilates exercises have been shown to have a beneficial impact on the physical, physiological, and mental characteristics of human beings. In this paper, a Z-number based fuzzy approach is applied for modeling the effect of Pilates exercises on motivation, attention, anxiety, and educational achievement. The measurement of psychological parameters is performed using internationally recognized instruments: the Academic Motivation Scale (AMS), the Test of Attention (D2 Test), and Spielberger's Anxiety Test completed by students. The GPA of students was used as the measure of educational achievement. Application of Z-information modeling allows us to increase the precision and reliability of data processing results in the presence of uncertainty in the input data created from completed questionnaires. The basic steps of Z-number based modeling with numerical solutions are presented.

  7. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

Temporal and spatiotemporal data form an inherent fabric of society, as we are faced with streams of data coming from numerous sensors, data feeds, and recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge to develop models that are both accurate and user-friendly (interpretable). The volume aims to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  8. Clinical application of the five-factor model.

    Science.gov (United States)

    Widiger, Thomas A; Presnall, Jennifer Ruth

    2013-12-01

The Five-Factor Model (FFM) has become the predominant dimensional model of general personality structure. The purpose of this paper is to suggest a clinical application. A substantial body of research indicates that the personality disorders included within the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM) can be understood as extreme and/or maladaptive variants of the FFM (the acronym "DSM" refers to any particular edition of the APA DSM). In addition, the current proposal for the forthcoming fifth edition of the DSM (i.e., DSM-5) is shifting toward an FFM dimensional trait model of personality disorder. Advantages of this shifting conceptualization are discussed, including treatment planning. © 2012 Wiley Periodicals, Inc.

  9. Applications of modeling in polymer-property prediction

    Science.gov (United States)

    Case, F. H.

    1996-08-01

A number of molecular modeling techniques have been applied to the prediction of polymer properties and behavior. Five examples illustrate the range of methodologies used. A simple atomistic simulation of small polymer fragments is used to estimate drug compatibility with a polymer matrix. The analysis of molecular dynamics results from a more complex model of a swollen hydrogel system is used to study gas diffusion in contact lenses. Statistical mechanics is used to predict conformation-dependent properties — an example is the prediction of liquid-crystal formation. The effect of the molecular weight distribution on phase separation in polyalkanes is predicted using thermodynamic models. In some cases, the properties of interest cannot be directly predicted using simulation methods or polymer theory. Correlation methods may be used to bridge the gap between molecular structure and macroscopic properties. The final example shows how connectivity-indices-based quantitative structure-property relationships were used to predict properties for candidate polyimides in an electronics application.

  10. A review of visual MODFLOW applications in groundwater modelling

    Science.gov (United States)

    Hariharan, V.; Shankar, M. Uma

    2017-11-01

Visual MODFLOW is a graphical user interface for the USGS MODFLOW. It is commercial software that is popular among hydrogeologists for its user-friendly features. The software is mainly used for groundwater flow and contaminant transport models under different conditions. This article is intended to review the versatility of its applications in groundwater modelling over the last 22 years. Agriculture, airfields, constructed wetlands, climate change, drought studies, Environmental Impact Assessment (EIA), landfills, mining operations, river and flood plain monitoring, salt water intrusion, soil profile surveys, watershed analyses, etc., are the areas where the software has reportedly been used to date. The review will provide clarity on the scope of the software in groundwater modelling and research.

  11. A model of moral identity: applications for education.

    Science.gov (United States)

    Matsuba, M Kyle; Murzyn, Theresa; Hart, Daniel

    2011-01-01

    The purpose of this chapter is to build an intellectual bridge between moral psychology and education. Our hope is that the findings from moral psychology will inform and explain best practices in moral education. With that end in mind, we briefly and selectively review the moral education and character education literature highlighting some of the challenges these domains have faced. Next, we review the moral identity literature and offer our own model of moral identity formation emphasizing the "characteristic adaptations" (i.e., moral orientation, moral self, moral emotions, and social relationships and opportunities) of the model. Finally, we illustrate and explain how some of these "characteristic adaptations" have been or could be used in the development of successful moral education programs, and provide specific examples for application of our model in the domain of sex education.

  12. On the application of copula in modeling maintenance contract

    Science.gov (United States)

    Iskandar, B. P.; Husniah, H.

    2016-02-01

This paper deals with the application of copula in maintenance contracts for a repairable item. Failures of the item are modeled using a two-dimensional approach involving the age and usage of the item, which requires a bivariate distribution to model failures. When the item fails, corrective maintenance (CM) is carried out as minimal repair. CM can be outsourced to an external agent or done in house. The decision problem for the owner is to find the maximum total profit, whilst that for the agent is to determine the optimal price of the contract. We obtain mathematical models of the decision problems for the owner and the agent using a Nash game theory formulation.
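A copula is what ties the two failure dimensions (age and usage) into one bivariate distribution. The sketch below joins two Weibull marginals with a Clayton copula and estimates, by Monte Carlo, the probability that the item fails inside a contract window; the marginals, dependence parameter, and window are invented for illustration, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Time-to-failure by age (years) and by usage (thousand hours), Weibull
# marginals tied together by a Clayton copula with parameter theta > 0.
theta = 2.0
n = 100_000

# Sample (u, v) from the Clayton copula via the conditional-inverse method
u = rng.uniform(size=n)
w = rng.uniform(size=n)
v = (u ** -theta * (w ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)

# Inverse-Weibull marginal transforms: T = scale * (-ln(1 - u))^(1/shape)
age = 5.0 * (-np.log1p(-u)) ** (1 / 2.0)     # scale 5 yr, shape 2
usage = 8.0 * (-np.log1p(-v)) ** (1 / 1.5)   # scale 8 kh, shape 1.5

# Probability of failure within the contract window (2 yr AND 3 kh of use)
p_joint = np.mean((age <= 2.0) & (usage <= 3.0))
p_indep = np.mean(age <= 2.0) * np.mean(usage <= 3.0)
print(p_joint > p_indep)   # positive dependence raises the joint failure risk
```

The gap between `p_joint` and `p_indep` is precisely what an independence assumption would misprice in the contract, which is the motivation for the copula formulation.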

  13. Explicit Nonlinear Model Predictive Control Theory and Applications

    CERN Document Server

    Grancharova, Alexandra

    2012-01-01

Nonlinear Model Predictive Control (NMPC) has become the accepted methodology for solving complex control problems related to the process industries. The main motivation behind explicit NMPC is that an explicit state feedback law avoids the need to execute a numerical optimization algorithm in real time. The benefits of an explicit solution, in addition to efficient on-line computation, include verifiability of the implementation and the possibility to design embedded control systems with low software and hardware complexity. This book considers the multi-parametric Nonlinear Programming (mp-NLP) approaches to explicit approximate NMPC of constrained nonlinear systems, developed by the authors, as well as their applications to various NMPC problem formulations and several case studies. The following types of nonlinear systems are considered, resulting in different NMPC problem formulations: nonlinear systems described by first-principles models and nonlinear systems described by black-box models; ...

  14. The Logistic Maturity Model: Application to a Fashion Company

    Directory of Open Access Journals (Sweden)

    Claudia Battista

    2013-08-01

Full Text Available This paper describes the structure of the Logistic Maturity Model (LMM) in detail and shows the possible improvements that can be achieved by using this model in terms of identifying the most appropriate actions to be taken in order to increase the performance of the logistics processes in industrial companies. The paper also gives an example of the LMM's application to a well-known Italian women's fashion firm, which decided to use the model as a guideline for the optimization of its supply chain. Relying on a 5-level maturity staircase, specific achievement indicators as well as key performance indicators and best practices are defined and related to each logistics area/process/sub-process, allowing any user to easily and rapidly understand the most critical logistical issues in terms of process immaturity.

  15. On the application of copula in modeling maintenance contract

    International Nuclear Information System (INIS)

    Iskandar, B P; Husniah, H

    2016-01-01

This paper deals with the application of copula in maintenance contracts for a repairable item. Failures of the item are modeled using a two-dimensional approach involving the age and usage of the item, which requires a bivariate distribution to model failures. When the item fails, corrective maintenance (CM) is carried out as minimal repair. CM can be outsourced to an external agent or done in house. The decision problem for the owner is to find the maximum total profit, whilst that for the agent is to determine the optimal price of the contract. We obtain mathematical models of the decision problems for the owner and the agent using a Nash game theory formulation. (paper)

  16. Application of Z-Number Based Modeling in Psychological Research.

    Science.gov (United States)

    Aliev, Rafik; Memmedova, Konul

    2015-01-01

Pilates exercises have been shown to have a beneficial impact on the physical, physiological, and mental characteristics of human beings. In this paper, a Z-number based fuzzy approach is applied for modeling the effect of Pilates exercises on motivation, attention, anxiety, and educational achievement. The measurement of psychological parameters is performed using internationally recognized instruments: the Academic Motivation Scale (AMS), the Test of Attention (D2 Test), and Spielberger's Anxiety Test completed by students. The GPA of students was used as the measure of educational achievement. Application of Z-information modeling allows us to increase the precision and reliability of data processing results in the presence of uncertainty in the input data created from completed questionnaires. The basic steps of Z-number based modeling with numerical solutions are presented.

  17. Improved dual sided doped memristor: modelling and applications

    Directory of Open Access Journals (Sweden)

    Anup Shrivastava

    2014-05-01

Full Text Available The memristor, a novel and emerging electronic device with a vast range of applications, suffers from poor frequency response and saturation length. In this paper, the authors present a novel and innovative device structure for the memristor with two active layers, together with its non-linear ionic drift model, for improved frequency response and saturation length. The authors investigated and compared the I–V characteristics of the proposed model with those of conventional memristors and found better results in each case (different window functions) for the proposed dual-sided doped memristor. For circuit-level simulation, they developed a SPICE model of the proposed memristor and designed some logic gates based on hybrid complementary metal oxide semiconductor memristive logic (memristor-ratioed logic). The proposed memristor yields improved results in terms of noise margin, delay time and dynamic hazards compared with those of conventional memristors (single active layer memristors).
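For reference, the single-active-layer baseline that such dual-layer designs are compared against is the HP-style nonlinear ionic drift model with a window function. The sketch below simulates it under a sinusoidal drive using the Joglekar window; device parameters are typical textbook values chosen for illustration, not the paper's device.

```python
import numpy as np

# HP-style memristor: R(x) = Ron*x + Roff*(1-x), with the normalized doped-region
# width x drifting in proportion to current, pinned at the boundaries by a
# Joglekar window f(x) = 1 - (2x - 1)^(2p).
Ron, Roff, D, mu = 100.0, 16e3, 10e-9, 1e-14   # ohm, ohm, m, m^2/(V*s)
p = 2                                           # window order
dt, f = 1e-4, 1.0
t = np.arange(0.0, 2.0, dt)
v = np.sin(2 * np.pi * f * t)                   # 1 Hz sinusoidal drive

x = 0.1
current = np.empty_like(t)
for i, vi in enumerate(v):
    R = Ron * x + Roff * (1 - x)
    current[i] = vi / R
    window = 1 - (2 * x - 1) ** (2 * p)         # suppresses drift at x = 0, 1
    x += dt * mu * Ron / D ** 2 * current[i] * window
    x = min(max(x, 0.0), 1.0)

# Pinched hysteresis signature: zero current whenever the voltage is zero
print(abs(current[0]) < 1e-9)
```

Plotting `current` against `v` produces the pinched hysteresis loop; a dual-sided doped structure modifies the drift equation and window so the loop survives to higher frequencies.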

  18. Pyramidal Normalization Filter: Visual Model With Applications To Image Understanding

    Science.gov (United States)

    Schenker, P. S.; Unangst, D. R.; Knaak, T. F.; Huntley, D. T.; Patterson, W. R.

    1982-12-01

This paper introduces a new nonlinear filter model which has applications in low-level machine vision. We show that this model, which we designate the normalization filter, is the basis for non-directional, multiple-spatial-frequency-channel resolved detection of image edge structure. We show that the results obtained by this procedure are in close correspondence to the zero-crossing sets of the Marr-Hildreth edge detector [6]. By comparison to their model, ours has the additional feature of constant-contrast thresholding, viz., it is spatially brightness adaptive. We describe a highly efficient and flexible realization of the normalization filter based on Burt's algorithm for pyramidal filtering [18]. We present illustrative experimental results that we have obtained with a computer implementation of this filter design.

  19. Polycrystalline CVD diamond device level modeling for particle detection applications

    International Nuclear Information System (INIS)

    Morozzi, A.; Passeri, D.; Kanxheri, K.; Servoli, L.; Lagomarsino, S.; Sciortino, S.

    2016-01-01

Diamond is a promising material whose excellent physical properties foster its use for radiation detection applications, in particular in hostile operating environments where the behavior of silicon-based detectors is limited by the high radiation fluence. Within this framework, the application of Technology Computer Aided Design (TCAD) simulation tools is highly desirable for the study, optimization and predictive analysis of sensing devices. Given the novelty of using diamond in electronics, this material is not included in the libraries of commercial, state-of-the-art TCAD software tools. In this work, we propose the development, application and validation of numerical models to simulate the electrical behavior of polycrystalline (pc)CVD diamond conceived for diamond sensors for particle detection. The model focuses on the characterization of a physically-based pcCVD diamond bandgap, taking into account deep-level defects acting as recombination centers and/or trap states. While a definite picture of the polycrystalline diamond bandgap is still debated, the effect of the main parameters (e.g. trap densities, capture cross-sections, etc.) can be investigated in depth thanks to the simulation approach. The charge collection efficiency under β-particle irradiation of diamond materials provided by different vendors and with different electrode configurations has been selected as the figure of merit for model validation. The good agreement between measurements and simulation findings, with the trap density as the only fitting parameter, confirms the suitability of the TCAD modeling approach as a predictive tool for the design and optimization of diamond-based radiation detectors.
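Although the record's full TCAD model is far richer, the first-order link between trap density and charge collection efficiency (CCE) in a planar detector is often summarized by the single-carrier Hecht relation. The sketch below evaluates it for two invented trap-limited drift lengths; the numbers are illustrative, not the paper's fitted values.

```python
import numpy as np

# Hecht relation for one carrier type in a planar detector of thickness d:
# CCE = (lambda/d) * (1 - exp(-d/lambda)), with drift length lambda = mu*tau*E.
# Higher trap density => smaller mu*tau => shorter drift length => lower CCE.
def hecht_cce(mu_tau, E, d):
    lam = mu_tau * E               # trap-limited drift length [cm]
    return (lam / d) * (1 - np.exp(-d / lam))

d = 0.05                           # 500 um detector thickness [cm]
E = 1e4                            # applied field [V/cm]
cce_low_traps = hecht_cce(mu_tau=1e-6, E=E, d=d)   # illustrative low trap density
cce_high_traps = hecht_cce(mu_tau=1e-8, E=E, d=d)  # illustrative high trap density
print(cce_low_traps > cce_high_traps)
```

A TCAD model replaces this closed form with explicit trap capture/emission kinetics, but fitting the measured CCE with trap density as the free parameter follows the same logic.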

  20. Applications of the SWAT Model Special Section: Overview and Insights.

    Science.gov (United States)

    Gassman, Philip W; Sadeghi, Ali M; Srinivasan, Raghavan

    2014-01-01

    The Soil and Water Assessment Tool (SWAT) model has emerged as one of the most widely used water quality watershed- and river basin-scale models worldwide, applied extensively for a broad range of hydrologic and/or environmental problems. The international use of SWAT can be attributed to its flexibility in addressing water resource problems, extensive networking via dozens of training workshops and the several international conferences that have been held during the past decade, comprehensive online documentation and supporting software, and an open source code that can be adapted by model users for specific application needs. The catalyst for this special collection of papers was the 2011 International SWAT Conference & Workshops held in Toledo, Spain, which featured over 160 scientific presentations representing SWAT applications in 37 countries. This special collection presents 22 specific SWAT-related studies, most of which were presented at the 2011 SWAT Conference; it represents SWAT applications on five different continents, with the majority of studies being conducted in Europe and North America. The papers cover a variety of topics, including hydrologic testing at a wide range of watershed scales, transport of pollutants in northern European lowland watersheds, data input and routing method effects on sediment transport, development and testing of potential new model algorithms, and description and testing of supporting software. In this introduction to the special section, we provide a synthesis of these studies within four main categories: (i) hydrologic foundations, (ii) sediment transport and routing analyses, (iii) nutrient and pesticide transport, and (iv) scenario analyses. We conclude with a brief summary of key SWAT research and development needs. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  1. Modelling of Argon Cold Atmospheric Plasmas for Biomedical Applications

    Science.gov (United States)

    Atanasova, M.; Benova, E.; Degrez, G.; van der Mullen, J. A. M.

    2018-02-01

Plasmas for biomedical applications are one of the newest fields of plasma utilization. Interest is especially high in the use of plasma in medicine. Promising results have been achieved in blood coagulation, wound healing, and the treatment of some forms of cancer, diabetic complications, etc. However, investigations of the biomedical applications from the biological and medical viewpoint are much more advanced than studies of the dynamics of the plasma itself. In this work we aim to address some specific challenges in the field of plasma modelling arising from biomedical applications: what are the spatial distributions of the plasma's reactive species and electric fields, and their production mechanisms; what fluxes and energies do the various components of the plasma deliver to the treated surfaces; what is the gas flow pattern? The focus is on two devices, namely the capacitively coupled plasma jet and the microwave surface-wave sustained discharge. These devices are representatives of the so-called cold atmospheric plasmas (CAPs): discharges characterized by low gas temperature - less than 40°C at the point of application - and non-equilibrium chemistry.

  2. Stochastic linear hybrid systems: Modeling, estimation, and application

    Science.gov (United States)

    Seah, Chze Eng

    Hybrid systems are dynamical systems which have interacting continuous state and discrete state (or mode). Accurate modeling and state estimation of hybrid systems are important in many applications. We propose a hybrid system model, known as the Stochastic Linear Hybrid System (SLHS), to describe hybrid systems with stochastic linear system dynamics in each mode and stochastic continuous-state-dependent mode transitions. We then develop a hybrid estimation algorithm, called the State-Dependent-Transition Hybrid Estimation (SDTHE) algorithm, to estimate the continuous state and discrete state of the SLHS from noisy measurements. It is shown that the SDTHE algorithm is more accurate or more computationally efficient than existing hybrid estimation algorithms. Next, we develop a performance analysis algorithm to evaluate the performance of the SDTHE algorithm in a given operating scenario. We also investigate sufficient conditions for the stability of the SDTHE algorithm. The proposed SLHS model and SDTHE algorithm are illustrated to be useful in several applications. In Air Traffic Control (ATC), to facilitate implementations of new efficient operational concepts, accurate modeling and estimation of aircraft trajectories are needed. In ATC, an aircraft's trajectory can be divided into a number of flight modes. Furthermore, as the aircraft is required to follow a given flight plan or clearance, its flight mode transitions are dependent on its continuous state. However, the flight mode transitions are also stochastic due to navigation uncertainties or unknown pilot intents. Thus, we develop an aircraft dynamics model in ATC based on the SLHS. The SDTHE algorithm is then used in aircraft tracking applications to estimate the positions/velocities of aircraft and their flight modes accurately. Next, we develop an aircraft conformance monitoring algorithm to detect any deviations of aircraft trajectories in ATC that might compromise safety. 
In this application, the SLHS
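
    The SLHS idea above, linear per-mode dynamics plus continuous-state-dependent stochastic mode transitions, can be illustrated with a toy one-dimensional simulation. The dynamics, noise level and switching rule below are invented for illustration and are not the model from the thesis:

```python
import random

def simulate_slhs(steps=50, seed=1):
    """Toy 1-D Stochastic Linear Hybrid System with two modes.

    Mode 0 drifts the continuous state up, mode 1 drifts it down;
    the probability of a mode switch grows once |x| exceeds a
    threshold, i.e. mode transitions are continuous-state-dependent.
    """
    rng = random.Random(seed)
    x, mode = 0.0, 0
    trace = []
    for _ in range(steps):
        drift = 1.0 if mode == 0 else -1.0
        x = 0.9 * x + drift + rng.gauss(0.0, 0.1)  # stochastic linear dynamics
        # switch probability rises with the distance of x beyond the threshold 5
        p_switch = min(1.0, max(0.0, (abs(x) - 5.0) / 2.0))
        if rng.random() < p_switch:
            mode = 1 - mode
        trace.append((x, mode))
    return trace
```

    A hybrid estimator in the spirit of SDTHE would then try to recover both the continuous state and the mode from noisy observations of trajectories like this one.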

  3. Prognostic models in obstetrics: available, but far from applicable.

    Science.gov (United States)

    Kleinrouweler, C Emily; Cheong-See, Fiona M; Collins, Gary S; Kwee, Anneke; Thangaratinam, Shakila; Khan, Khalid S; Mol, Ben Willem J; Pajkrt, Eva; Moons, Karel G M; Schuit, Ewoud

    2016-01-01

    Health care provision is increasingly focused on the prediction of patients' individual risk for developing a particular health outcome in planning further tests and treatments. There has been a steady increase in the development and publication of prognostic models for various maternal and fetal outcomes in obstetrics. We undertook a systematic review to give an overview of the current status of available prognostic models in obstetrics in the context of their potential advantages and the process of developing and validating models. Important aspects to consider when assessing a prognostic model are discussed and recommendations on how to proceed on this within the obstetric domain are given. We searched MEDLINE (up to July 2012) for articles developing prognostic models in obstetrics. We identified 177 papers that reported the development of 263 prognostic models for 40 different outcomes. The most frequently predicted outcomes were preeclampsia (n = 69), preterm delivery (n = 63), mode of delivery (n = 22), gestational hypertension (n = 11), and small-for-gestational-age infants (n = 10). The performance of newer models was generally not better than that of older models predicting the same outcome. The most important measures of predictive accuracy (ie, a model's discrimination and calibration) were often (82.9%, 218/263) not both assessed. Very few developed models were validated in data other than the development data (8.7%, 23/263). Only two-thirds of the papers (62.4%, 164/263) presented the model such that validation in other populations was possible, and the clinical applicability was discussed in only 11.0% (29/263). The impact of developed models on clinical practice was unknown. We identified a large number of prognostic models in obstetrics, but there is relatively little evidence about their performance, impact, and usefulness in clinical practice so that at this point, clinical implementation cannot be recommended. New efforts should be directed

  4. Technology, applications and modelling of ohmic heating: a review.

    Science.gov (United States)

    Varghese, K Shiby; Pandey, M C; Radhakrishna, K; Bawa, A S

    2014-10-01

    Ohmic heating or Joule heating has immense potential for achieving rapid and uniform heating in foods, providing microbiologically safe and high quality foods. This review discusses the technology behind ohmic heating, the current applications and thermal modeling of the process. The success of ohmic heating depends on the rate of heat generation in the system, the electrical conductivity of the food, electrical field strength, residence time and the method by which the food flows through the system. Ohmic heating is appropriate for processing of particulate and protein rich foods. A vast amount of work is still necessary to understand food properties in order to refine system design and maximize performance of this technology in the field of packaged foods and space food product development. Various economic studies will also play an important role in understanding the overall cost and viability of commercial application of this technology in food processing. Some of the demerits of the technology are also discussed.

  5. Twin support vector machines models, extensions and applications

    CERN Document Server

    Jayadeva; Chandra, Suresh

    2017-01-01

    This book provides a systematic and focused study of the various aspects of twin support vector machines (TWSVM) and related developments for classification and regression. In addition to presenting most of the basic models of TWSVM and twin support vector regression (TWSVR) available in the literature, it also discusses the important and challenging applications of this new machine learning methodology. A chapter on “Additional Topics” has been included to discuss kernel optimization and support tensor machine topics, which are comparatively new but have great potential in applications. It is primarily written for graduate students and researchers in the area of machine learning and related topics in computer science, mathematics, electrical engineering, management science and finance.

  6. A Global Modeling Framework for Plasma Kinetics: Development and Applications

    Science.gov (United States)

    Parsey, Guy Morland

    The modern study of plasmas, and applications thereof, has developed synchronously with computer capabilities since the mid-1950s. Complexities inherent to these charged-particle, many-body systems have resulted in the development of multiple simulation methods (particle-in-cell, fluid, global modeling, etc.) in order to both explain observed phenomena and predict outcomes of plasma applications. Recognizing that different algorithms are chosen to best address specific topics of interest, this thesis centers around the development of an open-source global model framework for the focused study of non-equilibrium plasma kinetics. After verification and validation of the framework, it was used to study two physical phenomena: plasma-assisted combustion and the recently proposed optically-pumped rare gas metastable laser. Global models permeate chemistry and plasma science, relying on spatial averaging to focus attention on the dynamics of reaction networks. Defined by a set of species continuity and energy conservation equations, the required data and constructed systems are conceptually similar across most applications, providing a light platform for exploratory and result-search parameter scanning. Unfortunately, it is common practice for custom code to be developed for each application: an enormous duplication of effort which negatively affects the quality of the software produced. Presented herein, the Python-based Kinetic Global Modeling framework (KGMf) was designed to support all modeling phases: collection and analysis of reaction data, construction of an exportable system of model ODEs, and a platform for interactive evaluation and post-processing analysis. A symbolic ODE system is constructed for interactive manipulation and generation of a Jacobian, both of which are compiled as operation-optimized C-code. 
Plasma-assisted combustion and ignition (PAC/PAI) embody the modernization of burning fuel by opening up new avenues of control and optimization
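
    The core of any such global model is a set of volume-averaged species continuity equations. As a minimal, hypothetical illustration (a two-reaction ionization/recombination network with made-up rate coefficients, not KGMf itself), the coupled ODEs can be integrated directly:

```python
def integrate_global_model(k_ion=1e-2, k_rec=1e-3, dt=1e-3, steps=20000):
    """Minimal global (volume-averaged) model of a two-reaction network:
       ionization     e + A  -> 2e + A+   (rate k_ion * ne * nA)
       recombination  e + A+ -> A         (rate k_rec * ne * nAp)
    The species continuity equations are integrated with forward Euler.
    All rate coefficients and initial densities are illustrative."""
    ne, nA, nAp = 1.0, 100.0, 1.0
    for _ in range(steps):
        r_ion = k_ion * ne * nA
        r_rec = k_rec * ne * nAp
        ne += dt * (r_ion - r_rec)   # electrons produced by ionization
        nA += dt * (r_rec - r_ion)   # neutrals consumed by ionization
        nAp += dt * (r_ion - r_rec)  # ions mirror the electron balance
    return ne, nA, nAp
```

    At steady state the ionization and recombination rates balance, k_ion*nA = k_rec*nAp, which fixes the equilibrium composition; frameworks like KGMf automate exactly this construction, plus the Jacobian, for networks with hundreds of reactions.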

  7. Multivariate Birnbaum-Saunders Distributions: Modelling and Applications

    Directory of Open Access Journals (Sweden)

    Robert G. Aykroyd

    2018-03-01

    Since its origins and numerous applications in material science, the Birnbaum–Saunders family of distributions has now found widespread uses in some areas of the applied sciences such as agriculture, environment and medicine, as well as in quality control, among others. It is able to model varied data behaviour and hence provides a flexible alternative to the most usual distributions. The family includes Birnbaum–Saunders and log-Birnbaum–Saunders distributions in univariate and multivariate versions. There are now well-developed methods for estimation and diagnostics that allow in-depth analyses. This paper gives a detailed review of existing methods and of relevant literature, introducing properties and theoretical results in a systematic way. To emphasise the range of suitable applications, full analyses are included of examples based on regression and diagnostics in material science, spatial data modelling in agricultural engineering and control charts for environmental monitoring. However, potential future uses in new areas such as business, economics, finance and insurance are also discussed. This work is presented to provide a full tool-kit of novel statistical models and methods to encourage other researchers to implement them in these new areas. It is expected that the methods will have the same positive impact in the new areas as they have had elsewhere.
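
    A computationally convenient property of the Birnbaum–Saunders family is its closed-form relation to the standard normal distribution: if Z ~ N(0,1), then T = beta*(alpha*Z/2 + sqrt((alpha*Z/2)^2 + 1))^2 follows BS(alpha, beta), with mean beta*(1 + alpha^2/2). A short sampling sketch (illustrative only, not code from the paper):

```python
import math
import random

def rbirnsaun(n, alpha, beta, seed=0):
    """Draw n samples from a Birnbaum-Saunders(alpha, beta) distribution
    via the normal-transformation representation:
    T = beta * (alpha*Z/2 + sqrt((alpha*Z/2)**2 + 1))**2, Z ~ N(0,1)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        w = alpha * rng.gauss(0.0, 1.0) / 2.0
        out.append(beta * (w + math.sqrt(w * w + 1.0)) ** 2)
    return out
```

    Comparing the sample mean against the theoretical mean beta*(1 + alpha^2/2) is a quick sanity check on the generator.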

  8. Supply chain management models, applications, and research directions

    CERN Document Server

    Pardalos, Panos; Romeijn, H

    2005-01-01

    This work brings together some of the most up to date research in the application of operations research and mathematical modeling techniques to problems arising in supply chain management and e-Commerce. While research in the broad area of supply chain management encompasses a wide range of topics and methodologies, we believe this book provides a good snapshot of current quantitative modeling approaches, issues, and trends within the field. Each chapter is a self-contained study of a timely and relevant research problem in supply chain management. The individual works place a heavy emphasis on the application of modeling techniques to real world management problems. In many instances, the actual results from applying these techniques in practice are highlighted. In addition, each chapter provides important managerial insights that apply to general supply chain management practice. The book is divided into three parts. The first part contains chapters that address the new and rapidly growing role of the inte...

  9. Model-generated air quality statistics for application in vegetation response models in Alberta

    International Nuclear Information System (INIS)

    McVehil, G.E.; Nosal, M.

    1990-01-01

    To test and apply vegetation response models in Alberta, air pollution statistics representative of various parts of the Province are required. At this time, air quality monitoring data of the requisite accuracy and time resolution are not available for most parts of Alberta. Therefore, there exists a need to develop appropriate air quality statistics. The objectives of the work reported here were to determine the applicability of model-generated air quality statistics and to develop, by modelling, realistic and representative time series of hourly SO2 concentrations that could be used to generate the statistics demanded by vegetation response models.
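
    When monitoring records are unavailable, one simple statistical surrogate for an hourly concentration series is an AR(1) process on the log scale; the statistics demanded by vegetation response models (means, exceedance frequencies) can then be computed from the synthetic series. The parameters below are arbitrary placeholders, not the calibrated values from the study:

```python
import math
import random

def synth_so2_series(hours=8760, mean_log=0.0, sigma_log=0.8, phi=0.9, seed=42):
    """Generate a synthetic hourly SO2 concentration series as a lognormal
    AR(1) process: log-concentration follows an autocorrelated Gaussian
    with lag-1 correlation phi (all parameters hypothetical)."""
    rng = random.Random(seed)
    z = 0.0
    series = []
    for _ in range(hours):
        # innovations scaled so the stationary variance of z is 1
        z = phi * z + math.sqrt(1.0 - phi * phi) * rng.gauss(0.0, 1.0)
        series.append(math.exp(mean_log + sigma_log * z))
    return series

def exceedance_stats(series, threshold):
    """Two statistics typically fed to vegetation response models:
    mean concentration and the fraction of hours above a threshold."""
    mean = sum(series) / len(series)
    freq = sum(1 for c in series if c > threshold) / len(series)
    return mean, freq
```

    For this parameterization the median concentration is exp(mean_log) = 1, so roughly half of all hours should exceed a threshold of 1.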

  10. Modular coupling of transport and chemistry: theory and model applications

    International Nuclear Information System (INIS)

    Pfingsten, W.

    1994-06-01

    For the description of complex processes in the near-field of a radioactive waste repository, the coupling of transport and chemistry is necessary. A reason for the relatively minor use of coupled codes in this area is the high amount of computer time and storage capacity necessary for calculations by conventional codes, and lack of available data. The simple application of the sequentially coupled code MCOTAC, which couples one-dimensional advective, dispersive and diffusive transport with chemical equilibrium complexation and precipitation/dissolution reactions in a porous medium, shows some promising features with respect to applicability to relevant problems. Transport, described by a random walk of multi-species particles, and chemical equilibrium calculations are solved separately, coupled only by an exchange term to ensure mass conservation. The modular-structured code was applied to three problems: a) incongruent dissolution of hydrated silicate gels, b) dissolution of portlandite and c) calcite dissolution and hypothetical dolomite precipitation. This allows for a comparison with other codes and their applications. The incongruent dissolution of cement phases, important for degradation of cementitious materials in a repository, can be included in the model without the problems which occur with a directly coupled code. The handling of a sharp multi-mineral front system showed a much faster calculation time compared to a directly coupled code application. Altogether, the results are in good agreement with other code calculations. Hence, the chosen modular concept of MCOTAC is more open to an easy extension of the code to include additional processes like sorption, kinetically controlled processes, transport in two or three spatial dimensions, and adaptation to new developments in computing (hardware and software), an important factor for applicability. (author) figs., tabs., refs
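
    The transport half of such a sequentially coupled scheme, advection plus dispersion by a particle random walk, reduces to a one-line update per particle: x -> x + v*dt + N(0, sqrt(2*D*dt)). A minimal sketch follows (parameters hypothetical; the chemical equilibrium step of a code like MCOTAC is only indicated by a comment):

```python
import random

def random_walk_transport(n_particles=2000, steps=100, v=1.0, D=0.5, dt=0.1, seed=7):
    """1-D advective-dispersive transport by particle random walk:
    each particle advances v*dt plus a Gaussian step of std sqrt(2*D*dt)."""
    rng = random.Random(seed)
    sigma = (2.0 * D * dt) ** 0.5
    xs = [0.0] * n_particles
    for _ in range(steps):
        xs = [x + v * dt + rng.gauss(0.0, sigma) for x in xs]
        # a sequentially coupled code would apply the chemical equilibrium
        # step here each time step, exchanging mass with mineral phases
    return xs
```

    After time t = steps*dt the particle cloud should be centered near v*t with variance near 2*D*t, which is the standard verification check for this kind of transport kernel.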

  11. Modern EMC analysis techniques II models and applications

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of modern real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating on their merits or weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, numerical investigations delve into printed circuit boards, monolithic microwave integrated circuits, radio frequency microelectro

  12. Application of a mathematical model for ergonomics in lean manufacturing.

    Science.gov (United States)

    Botti, Lucia; Mora, Cristina; Regattieri, Alberto

    2017-10-01

    The data presented in this article are related to the research article "Integrating ergonomics and lean manufacturing principles in a hybrid assembly line" (Botti et al., 2017) [1]. The results refer to the application of the mathematical model for the design of lean processes in hybrid assembly lines, meeting both the lean principles and the ergonomic requirements for safe assembly work. Data show that the success of a lean strategy is possible when ergonomics of workers is a parameter of the assembly process design.

  13. Cognitive interference modeling with applications in power and admission control

    KAUST Repository

    Mahmood, Nurul Huda

    2012-10-01

    One of the key design challenges in a cognitive radio network is controlling the interference generated at coexisting primary receivers. In order to design efficient cognitive radio systems and to minimize their unwanted consequences, it is therefore necessary to effectively control the secondary interference at the primary receivers. In this paper, a generalized framework for the interference analysis of a cognitive radio network, in which the different secondary transmitters may transmit with different powers and transmission probabilities, is presented, and various applications of this interference model are demonstrated. The findings of the analytical performance analyses are confirmed through selected computer-based Monte-Carlo simulations. © 2012 IEEE.
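
    The kind of Monte-Carlo check mentioned above can be reproduced in a few lines: draw random activity states and positions for the secondary transmitters, accumulate their power-law path-loss contributions at the primary receiver, and average over trials. The geometry, powers and path-loss exponent below are invented for illustration and are not the paper's system model:

```python
import random

def mean_interference(n_tx=10, p_active=0.3, tx_power=1.0,
                      path_loss_exp=3.0, trials=20000, seed=3):
    """Monte-Carlo estimate of the mean aggregate secondary interference
    at a primary receiver: each of n_tx secondary transmitters is active
    with probability p_active at a uniform random distance in [1, 2],
    contributing tx_power * d**(-path_loss_exp) when active."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        agg = 0.0
        for _ in range(n_tx):
            if rng.random() < p_active:
                d = rng.uniform(1.0, 2.0)
                agg += tx_power * d ** (-path_loss_exp)
        total += agg
    return total / trials
```

    The estimate can be checked against the closed form n_tx * p_active * E[d^-a]; for d ~ U(1, 2) and a = 3, E[d^-3] = 3/8, giving a mean of 10 * 0.3 * 0.375 = 1.125.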

  14. Modeling & imaging of bioelectrical activity principles and applications

    CERN Document Server

    He, Bin

    2010-01-01

    Over the past several decades, much progress has been made in understanding the mechanisms of electrical activity in biological tissues and systems, and for developing non-invasive functional imaging technologies to aid clinical diagnosis of dysfunction in the human body. The book will provide full basic coverage of the fundamentals of modeling of electrical activity in various human organs, such as heart and brain. It will include details of bioelectromagnetic measurements and source imaging technologies, as well as biomedical applications. The book will review the latest trends in

  15. Application of nuclear models to neutron nuclear cross section calculations

    International Nuclear Information System (INIS)

    Young, P.G.

    1983-01-01

    Nuclear theory is used increasingly to supplement and extend the nuclear data base that is available for applied studies. Areas where theoretical calculations are most important include the determination of neutron cross sections for unstable fission products and transactinide nuclei in fission reactor or nuclear waste calculations and for meeting the extensive dosimetry, activation, and neutronic data needs associated with fusion reactor development, especially for neutron energies above 14 MeV. Considerable progress has been made in the use of nuclear models for data evaluation and, particularly, in the methods used to derive physically meaningful parameters for model calculations. Theoretical studies frequently involve use of spherical and deformed optical models, Hauser-Feshbach statistical theory, preequilibrium theory, direct-reaction theory and often make use of gamma-ray strength function models and phenomenological (or microscopic) level density prescriptions. The development, application and limitations of nuclear models for data evaluation are discussed in this paper, with emphasis on the 0.1 to 50 MeV energy range. (Auth.)

  16. Nonequilibrium thermodynamic models and applications to hydrogen plasma

    International Nuclear Information System (INIS)

    Cho, K.Y.

    1988-01-01

    A generalized multithermal equilibrium (GMTE) thermodynamic model is developed and presented with applications to hydrogen. A new chemical equilibrium equation for GMTE is obtained without the ensemble temperature concept used by a previous MTE model. The effects of the GMTE model on the derivation and calculation of the thermodynamic, transport, and radiative properties are presented, and significant differences from local thermal equilibrium (LTE) and two-temperature models are discussed. When the electron translational temperature (Te) is higher than the translational temperature of the heavy particles, the effects of hydrogen molecular species on the properties are significant at high Te compared with LTE results. The density variations of minor species are orders of magnitude with kinetic nonequilibrium at a constant electron temperature. A collisional-radiative model is also developed with the GMTE chemical equilibrium equation to study the effects of radiative transfer and ambipolar diffusion on the population distribution of the excited atoms. The nonlocal radiative transfer effect is parameterized by an absorption factor, which is defined as the ratio of the absorbed intensity to the spontaneous emission coefficient

  17. Development of dynamic Bayesian models for web application test management

    Science.gov (United States)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, mathematical models and methods of dynamic Bayesian networks provide a high coverage of stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets for multiple time slices. This approach gives an opportunity to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine in one management area individual units and testing components with different functionalities and a direct influence on each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
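
    The time-slice structure described above boils down to the standard forward-filtering recursion of a dynamic Bayesian network: predict the next slice through the transition model, then reweight by the likelihood of the observed test outcome. A minimal binary-state sketch with placeholder probability tables (not the paper's model) is:

```python
def dbn_filter(obs, p_trans=None, p_emit=None):
    """Forward filtering over time slices of a minimal dynamic Bayesian
    network with one binary state per slice (1 = 'defect present'):
    P(x_t | e_1:t) is proportional to P(e_t | x_t) * sum over x' of
    P(x_t | x') * P(x' | e_1:t-1). Tables are illustrative placeholders."""
    p_trans = p_trans or {0: {0: 0.9, 1: 0.1}, 1: {0: 0.3, 1: 0.7}}
    p_emit = p_emit or {0: {'pass': 0.9, 'fail': 0.1},
                        1: {'pass': 0.2, 'fail': 0.8}}
    belief = {0: 0.5, 1: 0.5}
    for e in obs:
        # prediction step across the inter-slice transition arcs
        pred = {x: sum(belief[xp] * p_trans[xp][x] for xp in (0, 1))
                for x in (0, 1)}
        # correction step with the observed test outcome, then normalize
        unnorm = {x: p_emit[x][e] * pred[x] for x in (0, 1)}
        z = unnorm[0] + unnorm[1]
        belief = {x: unnorm[x] / z for x in (0, 1)}
    return belief
```

    Feeding in a run of failing test outcomes drives the belief toward the defect state, while a run of passes drives it toward the defect-free state.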

  18. Web Based Laboratory Task-Submitter Application Model

    Directory of Open Access Journals (Sweden)

    Soetam Rizky Wicaksono

    2010-04-01

    Teaching and learning in the laboratory is obligatory in engineering education, especially in information technology (IT) courses. To make laboratory activities more engaging, lecturers build application-based exercises for their students; however, once students have completed these short exercises, lecturers find it difficult to compile and grade all of them. This paper starts from that problem and aims to make laboratory activities comfortable for both lecturers and students, with lecturers expected to gain full control of all practical activity and to save time automatically. Solving such a problem in IT requires a modelling step first, so this paper first explains the modelling approach to the problem above and then describes the IT design for laboratory practice. The design has been applied effectively in a real teaching and learning process.

  19. Soft Computing Models in Industrial and Environmental Applications

    CERN Document Server

    Abraham, Ajith; Corchado, Emilio; 7th International Conference, SOCO’12

    2013-01-01

    This volume of Advances in Intelligent and Soft Computing contains accepted papers presented at SOCO 2012, held in the beautiful and historic city of Ostrava (Czech Republic), in September 2012.   Soft computing represents a collection or set of computational techniques in machine learning, computer science and some engineering disciplines, which investigate, simulate, and analyze very complex issues and phenomena.   After a thorough peer-review process, the SOCO 2012 International Program Committee selected 75 papers which are published in these conference proceedings, and represents an acceptance rate of 38%. In this relevant edition a special emphasis was put on the organization of special sessions. Three special sessions were organized related to relevant topics as: Soft computing models for Control Theory & Applications in Electrical Engineering, Soft computing models for biomedical signals and data processing and Advanced Soft Computing Methods in Computer Vision and Data Processing.   The selecti...

  20. Modeling lifetime of high power IGBTs in wind power applications

    DEFF Research Database (Denmark)

    Busca, Cristian

    2011-01-01

    The wind power industry is continuously developing, bringing to the market larger and larger wind turbines. Nowadays reliability is more of a concern than in the past, especially for offshore wind turbines, since access to offshore wind turbines in case of failures is both costly and difficult. ... Lifetime modeling of future large wind turbines is needed in order to make reliability predictions about these new wind turbines early in the design phase. By doing reliability prediction in the design phase the manufacturer can ensure that the new wind turbines will live long enough. This paper represents ... an overview of the different aspects of lifetime modeling of high power IGBTs in wind power applications. In the beginning, wind turbine reliability survey results are briefly reviewed in order to gain an insight into wind turbine subassembly failure rates and associated downtimes. After that the most common

  1. Economic model predictive control theory, formulations and chemical process applications

    CERN Document Server

    Ellis, Matthew; Christofides, Panagiotis D

    2017-01-01

    This book presents general methods for the design of economic model predictive control (EMPC) systems for broad classes of nonlinear systems that address key theoretical and practical considerations including recursive feasibility, closed-loop stability, closed-loop performance, and computational efficiency. Specifically, the book proposes: Lyapunov-based EMPC methods for nonlinear systems; two-tier EMPC architectures that are highly computationally efficient; and EMPC schemes that explicitly handle uncertainty, time-varying cost functions, time-delays and multiple-time-scale dynamics. The proposed methods employ a variety of tools ranging from nonlinear systems analysis, through Lyapunov-based control techniques, to nonlinear dynamic optimization. The applicability and performance of the proposed methods are demonstrated through a number of chemical process examples. The book presents state-of-the-art methods for the design of economic model predictive control systems for chemical processes. In addition to being...

  2. Solid modeling and applications rapid prototyping, CAD and CAE theory

    CERN Document Server

    Um, Dugan

    2016-01-01

    The lessons in this fundamental text equip students with the theory of Computer Assisted Design (CAD), Computer Assisted Engineering (CAE), the essentials of Rapid Prototyping, as well as practical skills needed to apply this understanding in real world design and manufacturing settings. The book includes three main areas: CAD, CAE, and Rapid Prototyping, each enriched with numerous examples and exercises. In the CAD section, Professor Um outlines the basic concept of geometric modeling, Hermite and Bezier spline curve theory, and 3-dimensional surface theories as well as rendering theory. The CAE section explores mesh generation theory, matrix notation for FEM, the stiffness method, and truss equations. And in Rapid Prototyping, the author illustrates stereolithographic theory and introduces popular modern RP technologies. Solid Modeling and Applications: Rapid Prototyping, CAD and CAE Theory is ideal for university students in various engineering disciplines as well as design engineers involved in product...

  3. Joint Dynamics Modeling and Parameter Identification for Space Robot Applications

    Directory of Open Access Journals (Sweden)

    Adenilson R. da Silva

    2007-01-01

    Long-term mission identification and model validation for an in-flight manipulator control system in almost zero gravity and a hostile space environment are extremely important for robotic applications. In this paper, a robot joint mathematical model is developed in which several nonlinearities have been taken into account. In order to identify all the required system parameters, an integrated identification strategy is derived. This strategy uses a robust version of the least-squares (LS) procedure to obtain the initial conditions and a general nonlinear optimization method, the multilevel coordinate search (MCS) algorithm, to estimate the nonlinear parameters. The approach is applied to the intelligent robot joint (IRJ) experiment that was developed at DLR for a utilization opportunity on the International Space Station (ISS). The results using real and simulated measurements have shown that the developed algorithm and strategy have remarkable features in identifying all the parameters with good accuracy.
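
    The linear stage of such an identification strategy can be sketched with ordinary least squares on a simplified joint model tau = J*acc + b*vel + c*sign(vel), solved via the normal equations. The model structure and parameter names here are illustrative; the paper's full strategy adds a robust LS variant and a nonlinear MCS search for the remaining parameters:

```python
def identify_joint_params(samples):
    """Ordinary least-squares fit of a simplified joint model
    tau = J*acc + b*vel + c*sign(vel) via the normal equations.
    samples: list of (acc, vel, tau) tuples."""
    feats = [(a, v, float((v > 0) - (v < 0))) for a, v, _ in samples]
    ys = [t for _, _, t in samples]
    # build the normal equations (X^T X) p = X^T y for 3 features
    XtX = [[sum(f[i] * f[j] for f in feats) for j in range(3)] for i in range(3)]
    Xty = [sum(f[i] * y for f, y in zip(feats, ys)) for i in range(3)]
    # solve the 3x3 system by Gauss-Jordan elimination with partial pivoting
    M = [row[:] + [rhs] for row, rhs in zip(XtX, Xty)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                factor = M[r][col] / M[col][col]
                M[r] = [x - factor * y for x, y in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]
```

    With noiseless simulated samples the fit recovers the inertia, viscous and Coulomb terms exactly, which is a useful unit test before moving to real, noisy telemetry.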

  4. Application of data envelopment analysis models in supply chain management

    DEFF Research Database (Denmark)

    Soheilirad, Somayeh; Govindan, Kannan; Mardani, Abbas

    2017-01-01

    ... have been attained to reach a comprehensive review of data envelopment analysis models for evaluating supply chain management. Consequently, the selected published articles have been categorized by author name, year of publication, technique, application area, country, scope, data envelopment analysis purpose, study purpose, research gap and contribution, results and outcome, and the journals and conferences in which they appeared. The results of this study indicate that the areas of supplier selection, supply chain efficiency and sustainable supply chain have had the highest frequency ... of interest as a mathematical tool to evaluate supply chain management. While various data envelopment analysis models have been suggested to measure and evaluate supply chain management, there is a lack of research regarding systematic literature review and classification of studies in this field

  5. Chemical kinetic modeling of H2 applications

    Energy Technology Data Exchange (ETDEWEB)

    Marinov, N.M.; Westbrook, C.K.; Cloutman, L.D. [Lawrence Livermore National Lab., CA (United States)] [and others]

    1995-09-01

    Work being carried out at LLNL has concentrated on studies of the role of chemical kinetics in a variety of problems related to hydrogen combustion in practical combustion systems, with an emphasis on vehicle propulsion. Use of hydrogen offers significant advantages over fossil fuels, and computer modeling provides advantages when used in concert with experimental studies. Many numerical "experiments" can be carried out quickly and efficiently, reducing the cost and time of system development, and many new and speculative concepts can be screened to identify those with sufficient promise to pursue experimentally. This project uses chemical kinetic and fluid dynamic computational modeling to examine the combustion characteristics of systems burning hydrogen, either as the only fuel or mixed with natural gas. Oxidation kinetics are combined with pollutant formation kinetics, including formation of oxides of nitrogen but also including air toxics in natural gas combustion. We have refined many of the elementary kinetic reaction steps in the detailed reaction mechanism for hydrogen oxidation. To extend the model to pressures characteristic of internal combustion engines, it was necessary to apply theoretical pressure falloff formalisms for several key steps in the reaction mechanism. We have continued development of simplified reaction mechanisms for hydrogen oxidation, we have implemented those mechanisms into multidimensional computational fluid dynamics models, and we have used models of chemistry and fluid dynamics to address selected application problems. At the present time, we are using computed high pressure flame and auto-ignition data to further refine the simplified kinetics models that are then to be used in multidimensional fluid mechanics models. Detailed kinetics studies have investigated hydrogen flames and ignition of hydrogen behind shock waves, intended to refine the detailed reaction mechanisms.

  6. Modelling climate change policies : an application of ENERGY2020

    International Nuclear Information System (INIS)

    Timilsina, G.; Bhargava, A.; Backus, G.

    2005-01-01

    Researchers and policy-makers are increasingly analyzing the economic impacts of the Kyoto Protocol at national, regional and global levels. The analyses are generally based on numerical models integrating energy, the environment and the economy. Most models range from partial equilibrium types to complex multi-sector general equilibrium models, and typically represent the energy sector at an aggregate level, which limits their ability to reflect details of different sectors. In Canada, a model called ENERGY2020 has been widely used by the federal and provincial governments to analyze the sectoral and provincial impacts of implementing the Kyoto Protocol. ENERGY2020 uses stocks-and-flows simulation that captures the physical aspects of the processes utilizing energy, as well as qualitative choice theory, which captures human behavioural aspects. The model also has a database containing 20 years of time-series on all economic, environmental and energy variables, enabling the model to derive most parameters endogenously through econometric estimations. It has the capacity to analyze consumer and business responses over a wide range of policy initiatives such as energy and environment taxes, regulatory standards for buildings, equipment and motor vehicles, grants, rebates and subsidy initiatives, consumer awareness initiatives, technology improvements, and moratoriums and mandated cut-backs. It is also capable of producing long-term energy market forecasts as well as analyzing the impacts of policies on the markets. It was concluded that the model's application will serve as a useful analytical tool for a range of issues, and may be useful to developing countries and economies in transition. 6 refs., 5 figs
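The "qualitative choice theory" component is not specified in this abstract; models of this kind typically use a multinomial-logit share rule, which can be sketched as follows with hypothetical utilities (this is a generic illustration, not ENERGY2020's actual formulation or data):

```python
import math

def logit_shares(utilities):
    """Multinomial-logit market shares from option utilities."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical utilities for three competing energy options
shares = logit_shares([1.0, 0.5, 0.0])
```

Shares always sum to one, and the more attractive option takes the larger (but never the entire) market share, which is how such models capture gradual behavioural responses to prices or taxes.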

  7. Surrogate Model for Recirculation Phase LBLOCA and DET Application

    International Nuclear Information System (INIS)

    Fynan, Douglas A; Ahn, Kwang-Il; Lee, John C.

    2014-01-01

    In the nuclear safety field, response surfaces were used in the first demonstration of the code scaling, applicability, and uncertainty (CSAU) methodology to quantify the uncertainty of the peak clad temperature (PCT) during a large-break loss-of-coolant accident (LBLOCA). Surrogates could have applications in other nuclear safety areas such as dynamic probabilistic safety assessment (PSA). Dynamic PSA attempts to couple the probabilistic nature of failure events, component transitions, and human reliability to deterministic calculations of time-dependent nuclear power plant (NPP) responses, usually through the use of thermal-hydraulic (TH) system codes. The overall mathematical complexity of the dynamic PSA architectures, with many embedded, computationally expensive TH code calculations and large input/output data streams, has limited realistic studies of NPPs. This paper presents a time-dependent surrogate model for the recirculation phase of a hot leg LBLOCA in the OPR-1000. The surrogate model is developed through the ACE algorithm, a powerful nonparametric regression technique, trained on RELAP5 simulations of the LBLOCA. Benchmarking of the surrogate is presented, along with an application to a simplified dynamic event tree (DET). A time-dependent surrogate model to predict core subcooling during the recirculation phase of a hot leg LBLOCA in the OPR-1000 has been developed. The surrogate assumed the structure of a general discrete time dynamic model and learned the nonlinear functional form by performing nonparametric regression on RELAP5 simulations with the ACE algorithm. The surrogate model input parameters represent mass and energy flux terms to the RCS that appeared as user-supplied or code-calculated boundary conditions in the RELAP5 model. The surrogate accurately predicted the TH behavior of the core for a variety of HPSI system performance and containment conditions when compared with RELAP5 simulations. The surrogate was applied in a DET application replacing
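The "general discrete time dynamic model" structure described above — learning the next state as a function of the current state and an input, from simulation data — can be sketched as follows. An ordinary linear least-squares fit stands in for the ACE nonparametric regression, and the training data are synthetic, not RELAP5 output:

```python
import random

# Synthetic "simulation" data: the next state depends on state and input.
# A linear least-squares fit stands in for the ACE algorithm here.
random.seed(0)
a_true, b_true = 0.9, 0.2
data = []
x = 1.0
for _ in range(200):
    u = random.uniform(-1, 1)          # boundary-condition input
    x_next = a_true * x + b_true * u   # "code-calculated" response
    data.append((x, u, x_next))
    x = x_next

# Normal equations for the surrogate x_{t+1} = a*x_t + b*u_t
sxx = sum(x * x for x, u, y in data)
sxu = sum(x * u for x, u, y in data)
suu = sum(u * u for x, u, y in data)
sxy = sum(x * y for x, u, y in data)
suy = sum(u * y for x, u, y in data)
det = sxx * suu - sxu * sxu
a_hat = (sxy * suu - suy * sxu) / det
b_hat = (suy * sxx - sxy * sxu) / det
```

Once fitted, the surrogate can be stepped forward cheaply inside each DET branch instead of re-running the full TH code.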

  8. Acoustic Propagation Modeling for Marine Hydro-Kinetic Applications

    Science.gov (United States)

    Johnson, C. N.; Johnson, E.

    2014-12-01

    The combination of riverine, tidal, and wave energy has the potential to supply over one third of the United States' annual electricity demand. However, in order to deploy and test prototypes and commercial installations, marine hydrokinetic (MHK) devices must meet strict regulatory guidelines that determine the maximum amount of noise that can be generated and set particular thresholds for determining disturbance and injury caused by noise. An accurate model for predicting the propagation of sound from an MHK source in a real-life hydro-acoustic environment has been established. This model will help promote the growth and viability of marine and hydrokinetic energy by confidently assuring that federal regulations are met and that harmful impacts to marine fish and wildlife are minimal. Paracousti, a finite-difference solution to the acoustic equations, was originally developed for sound propagation in atmospheric environments and has been successfully validated for a number of different geophysical activities. The three-dimensional numerical implementation is advantageous over other acoustic propagation techniques for MHK applications where the domains of interest have complex 3D interactions from the seabed, banks, and other shallow water effects. A number of different cases for hydro-acoustic environments have been validated against both analytical and numerical results from canonical and benchmark problems. This includes a variety of hydrodynamic and physical environments that may be present in a potential MHK application, including shallow and deep water, sloping and canyon type bottoms, and varying sound speed and density profiles. With the model successfully validated for hydro-acoustic environments, more complex and realistic MHK sources from turbines and/or arrays can be modeled.
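Paracousti's approach — finite differences on the acoustic equations — can be illustrated in one dimension with the standard second-order scheme for the wave equation; the grid, sound speed, and source below are illustrative, not Paracousti itself:

```python
# Minimal 1-D finite-difference acoustic wave solver (illustrative sketch).
c, dx = 1500.0, 1.0            # sound speed in water (m/s), grid spacing (m)
dt = 0.5 * dx / c              # time step satisfying the CFL stability condition
n = 200
p_prev = [0.0] * n             # pressure field at the previous time level
p = [0.0] * n
p[n // 2] = 1.0                # initial pressure pulse at the centre

r2 = (c * dt / dx) ** 2        # squared Courant number
for _ in range(100):
    p_next = [0.0] * n         # rigid (zero-pressure) boundaries at both ends
    for i in range(1, n - 1):
        # standard second-order update: p_tt = c^2 * p_xx
        p_next[i] = 2 * p[i] - p_prev[i] + r2 * (p[i + 1] - 2 * p[i] + p[i - 1])
    p_prev, p = p, p_next
```

The same stencil extended to three dimensions, with spatially varying sound speed and density, is what allows such solvers to handle seabed, bank, and shallow-water interactions.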

  9. Applications of a simulation model to decisions in mallard management

    Science.gov (United States)

    Cowardin, L.M.; Johnson, D.H.; Shaffer, T.L.; Sparling, D.W.

    1988-01-01

    A system comprising simulation models and data bases for habitat availability and nest success rates was used to predict results from a mallard (Anas platyrhynchos) management plan and to compare six management methods with a control. Individual treatments in the applications included land purchase for waterfowl production, wetland easement purchase, lease of uplands for waterfowl management, cropland retirement, use of no-till winter wheat, delayed cutting of alfalfa, installation of nest baskets, nesting island construction, and use of predator-resistant fencing. The simulations predicted that implementation of the management plan would increase recruits by 24%. Nest baskets were the most effective treatment, accounting for 20.4% of the recruits. No-till winter wheat was the second most effective, accounting for 5.9% of the recruits. Wetland loss due to drainage would cause an 11% loss of breeding population in 10 years. The models were modified to account for migrational homing. The modification indicated that migrational homing would enhance the effects of management. Nest success rates were critical contributions to individual management methods. The most effective treatments, such as nest baskets, had high success rates and affected a large portion of the breeding population. Economic analyses indicated that nest baskets would be the most economical of the three techniques tested. The applications indicated that the system is a useful tool to aid management decisions, but data are scarce for several important variables. Basic research will be required to adequately model the effect of migrational homing and density dependence on production. The comprehensive nature of predictions desired by managers will also require that production models like the one described here be extended to encompass the entire annual cycle of waterfowl.

  10. Open Data in Mobile Applications, New Models for Service Information

    Directory of Open Access Journals (Sweden)

    Manuel GÉRTRUDIX BARRIO

    2016-06-01

    Full Text Available The combination of open data generated by governments and the proliferation of mobile devices enables the creation of new information services and improved delivery of existing ones. Significantly, it gives citizens simple, quick and effective access to information. Free applications that use open data provide useful information in real time, tailored to the user experience and/or geographic location. This changes the concept of “service information”. Both the infomediary sector and citizens now have new models of production and dissemination of this type of information. From the theoretical contextualization of aspects such as the datification of reality, the mobile registration of everyday experience, and the reinterpretation of service information, we analyze the role of open data in the public sector in Spain and its concrete application in building apps based on these data sets. The findings indicate that this is a phenomenon that will continue to grow, because these applications provide useful and efficient information for decision-making in everyday life.

  11. Applications of Skew Models Using Generalized Logistic Distribution

    Directory of Open Access Journals (Sweden)

    Pushpa Narayan Rathie

    2016-04-01

    Full Text Available We use the skew distribution generation procedure proposed by Azzalini [Scand. J. Stat., 1985, 12, 171–178] to create three new probability distribution functions. These models make use of the normal, Student-t and generalized logistic distributions, see Rathie and Swamee [Technical Research Report No. 07/2006. Department of Statistics, University of Brasilia: Brasilia, Brazil, 2006]. Expressions for the moments about the origin are derived. Graphical illustrations are also provided. The distributions derived in this paper can be seen as generalizations of the distributions given by Nadarajah and Kotz [Acta Appl. Math., 2006, 91, 1–37]. Applications with unimodal and bimodal data are given to illustrate the applicability of the results derived in this paper. The applications include the analysis of the following data sets: (a) spending on public education in various countries in 2003; (b) total expenditure on health in 2009 in various countries; and (c) waiting time between eruptions of the Old Faithful Geyser in Yellowstone National Park, Wyoming, USA. We compare the fit of the distributions introduced in this paper with the distributions given by Nadarajah and Kotz [Acta Appl. Math., 2006, 91, 1–37]. The results show that our distributions, in general, fit the data sets better. The general R codes for fitting the distributions introduced in this paper are given in Appendix A.
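Azzalini's generation procedure applied to the normal kernel gives the skew-normal density f(x) = 2·φ(x)·Φ(αx), where φ and Φ are the standard normal density and distribution function. A minimal numerical sketch (in Python, whereas the paper supplies R code):

```python
import math

def skew_normal_pdf(x, alpha):
    """Azzalini skew-normal density: f(x) = 2 * phi(x) * Phi(alpha * x)."""
    phi = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    Phi = 0.5 * (1.0 + math.erf(alpha * x / math.sqrt(2.0)))
    return 2.0 * phi * Phi

# The density integrates to 1 for any alpha (checked with a crude Riemann sum)
dx = 0.01
total = sum(skew_normal_pdf(-10.0 + i * dx, 3.0) * dx for i in range(2000))
```

Setting alpha = 0 recovers the symmetric normal density; the same 2·kernel·Φ(αx) construction with a Student-t or generalized logistic kernel gives the paper's other two families.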

  12. Multi-level decision making models, methods and applications

    CERN Document Server

    Zhang, Guangquan; Gao, Ya

    2015-01-01

    This monograph presents new developments in multi-level decision-making theory, techniques and methods, covering both modeling and solution issues. In particular, it presents how a decision support system can support managers in reaching a solution to a multi-level decision problem in practice. The monograph combines decision theories, methods, algorithms and applications effectively. It discusses in detail the models and solution algorithms of each issue of bi-level and tri-level decision-making, such as multi-leaders, multi-followers, multi-objectives, rule-set-based, and fuzzy parameters. Potential readers include organizational managers and practicing professionals, who can use the methods and software provided to solve their real decision problems; PhD students and researchers in the areas of bi-level and multi-level decision-making and decision support systems; and students at advanced undergraduate or master's level in information systems, business administration, or the application of computer science.

  13. Modeling multibody systems with uncertainties. Part II: Numerical applications

    Energy Technology Data Exchange (ETDEWEB)

    Sandu, Corina, E-mail: csandu@vt.edu; Sandu, Adrian; Ahmadian, Mehdi [Virginia Polytechnic Institute and State University, Mechanical Engineering Department (United States)

    2006-04-15

    This study applies generalized polynomial chaos theory to model complex nonlinear multibody dynamic systems operating in the presence of parametric and external uncertainty. Theoretical and computational aspects of this methodology are discussed in the companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part I: Theoretical and Computational Aspects'. In this paper we illustrate the methodology on selected test cases. The combined effects of parametric and forcing uncertainties are studied for a quarter car model. The uncertainty distributions in the system response in both time and frequency domains are validated against Monte-Carlo simulations. Results indicate that polynomial chaos is more efficient than Monte Carlo and more accurate than statistical linearization. The results of the direct collocation approach are similar to the ones obtained with the Galerkin approach. A stochastic terrain model is constructed using a truncated Karhunen-Loeve expansion. The application of polynomial chaos to differential-algebraic systems is illustrated using the constrained pendulum problem. Limitations of the polynomial chaos approach are studied on two different test problems, one with multiple attractor points, and the second with a chaotic evolution and a nonlinear attractor set. The overall conclusion is that, despite its limitations, generalized polynomial chaos is a powerful approach for the simulation of multibody dynamic systems with uncertainties.
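The Galerkin projection at the heart of polynomial chaos can be shown on a toy scalar response y = ξ² with ξ standard normal: project y onto probabilists' Hermite polynomials and read the mean and variance off the coefficients. This is a minimal illustration, not the companion paper's multibody formulation:

```python
import math

# Probabilists' 3-point Gauss-Hermite rule (exact for polynomials up to degree 5)
nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def He(n, x):
    """Probabilists' Hermite polynomials He_0, He_1, He_2."""
    return [1.0, x, x * x - 1.0][n]

g = lambda xi: xi * xi          # toy uncertain response y = g(xi), xi ~ N(0, 1)

# Galerkin projection: c_n = E[g(xi) * He_n(xi)] / E[He_n(xi)^2]
norms = [1.0, 1.0, 2.0]         # E[He_n^2] for n = 0, 1, 2
coeffs = [
    sum(w * g(x) * He(n, x) for x, w in zip(nodes, weights)) / norms[n]
    for n in range(3)
]
mean = coeffs[0]                                             # E[y] = c_0
var = sum(coeffs[n] ** 2 * norms[n] for n in range(1, 3))    # Var[y]
```

For y = ξ² the expansion is exact (y = 1·He₀ + 1·He₂), giving mean 1 and variance 2, which a Monte Carlo estimate would only approach slowly — the efficiency advantage the abstract reports.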

  14. Modeling multibody systems with uncertainties. Part II: Numerical applications

    International Nuclear Information System (INIS)

    Sandu, Corina; Sandu, Adrian; Ahmadian, Mehdi

    2006-01-01

    This study applies generalized polynomial chaos theory to model complex nonlinear multibody dynamic systems operating in the presence of parametric and external uncertainty. Theoretical and computational aspects of this methodology are discussed in the companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part I: Theoretical and Computational Aspects'. In this paper we illustrate the methodology on selected test cases. The combined effects of parametric and forcing uncertainties are studied for a quarter car model. The uncertainty distributions in the system response in both time and frequency domains are validated against Monte-Carlo simulations. Results indicate that polynomial chaos is more efficient than Monte Carlo and more accurate than statistical linearization. The results of the direct collocation approach are similar to the ones obtained with the Galerkin approach. A stochastic terrain model is constructed using a truncated Karhunen-Loeve expansion. The application of polynomial chaos to differential-algebraic systems is illustrated using the constrained pendulum problem. Limitations of the polynomial chaos approach are studied on two different test problems, one with multiple attractor points, and the second with a chaotic evolution and a nonlinear attractor set. The overall conclusion is that, despite its limitations, generalized polynomial chaos is a powerful approach for the simulation of multibody dynamic systems with uncertainties.

  15. [Application of DNDC model in estimating cropland nitrate leaching].

    Science.gov (United States)

    Li, Hu; Wang, Li-Gang; Qiu, Jian-Jun

    2009-07-01

    The leaching amount of soil water and nitrate from a winter wheat field under a typical planting system in Jinan City of Shandong Province was measured with a lysimeter during the whole growth season in 2008, and the feasibility of applying the DNDC model to estimate this leaching amount was tested against the obtained data. On the whole, the DNDC model could simulate the soil water movement in the crop field well, with acceptable accuracy. However, there was a definite deviation in the simulation of nitrate leaching. The simulated value (18.35 kg N x hm(-2)) was 3.46 kg N x hm(-2) higher than the observed value (14.89 kg N x hm(-2)), a relative error of about 20%, which suggested that some related parameters need to be further modified. The sensitivity test of the DNDC model showed that cropland nitrate leaching was easily affected by irrigation and fertilization. The model was proved to have definite applicability in the study area.

  16. Analysis and application of opinion model with multiple topic interactions.

    Science.gov (United States)

    Xiong, Fei; Liu, Yun; Wang, Liang; Wang, Ximeng

    2017-08-01

    To reveal heterogeneous behaviors of opinion evolution in different scenarios, we propose an opinion model with topic interactions. Individual opinions and topic features are represented by multidimensional vectors. We measure an agent's action towards a specific topic by the product of opinion and topic feature. When pairs of agents interact on a topic, their actions are introduced into opinion updates with bounded confidence. Simulation results show that a transition from a disordered state to a consensus state occurs at a critical point of the tolerance threshold, which depends on the opinion dimension. The critical point increases as the dimension of opinions increases. Multiple topics promote opinion interactions and lead to the formation of macroscopic opinion clusters. In addition, more topics accelerate the evolutionary process and weaken the effect of network topology. We use two sets of large-scale real data to evaluate the model, and the results prove its effectiveness in characterizing a real evolutionary process. Our model achieves high performance in individual action prediction and even outperforms state-of-the-art methods, while having much smaller computational complexity. This paper provides a demonstration of possible practical applications of theoretical opinion dynamics.
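The bounded-confidence update on multidimensional opinions can be sketched with a Deffuant-style pairwise rule: agents average toward each other only when their opinion distance is below a tolerance threshold. This is a generic illustration of the mechanism, not the authors' exact model (topics and actions are omitted):

```python
import random

def evolve(opinions, eps, mu, steps, rng):
    """Pairwise bounded-confidence updates on multidimensional opinions.

    Two randomly chosen agents move toward each other (rate mu) only if
    their Euclidean opinion distance is below the tolerance eps.
    """
    n = len(opinions)
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        d = sum((a - b) ** 2 for a, b in zip(opinions[i], opinions[j])) ** 0.5
        if d < eps:
            oi = [a + mu * (b - a) for a, b in zip(opinions[i], opinions[j])]
            oj = [b + mu * (a - b) for a, b in zip(opinions[i], opinions[j])]
            opinions[i], opinions[j] = oi, oj
    return opinions

# With a tolerance larger than any initial distance, the population reaches consensus
rng = random.Random(1)
ops = [[rng.random() for _ in range(2)] for _ in range(50)]
ops = evolve(ops, eps=2.0, mu=0.5, steps=5000, rng=rng)
```

Shrinking eps below a critical value instead fragments the population into separate clusters — the disorder-to-consensus transition the abstract describes, whose critical point there additionally depends on the opinion dimension.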

  17. Application of modified vector fitting to grounding system modeling

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, D.; Camargo, M.; Herrera, J.; Torres, H. [National University of Colombia (Colombia). Research Program on Acquisition and Analysis of Signals - PAAS], Emails: dyjimeneza@unal.edu.co, mpcamargom@unal.edu.co; Vargas, M. [Siemens S.A. - Power Transmission and Distribution - Energy Services (Colombia)

    2007-07-01

    The transient behavior of grounding systems (GS) greatly influences the performance of electrical networks under fault conditions. This fact has led the authors to present an application of the Modified Vector Fitting (MVF) methodology based upon the frequency response of the system, in order to find a rational function approximation and an equivalent electrical network whose transient behavior is similar to that of the original GS. The obtained network can be introduced into the EMTP/ATP program for simulating the transient behavior of the GS. The MVF technique, a modification of the Vector Fitting (VF) technique, allows identifying state space models from the frequency domain response for both single and multiple input-output systems. In this work, the methodology is used to fit the frequency response of a grounding grid, which is computed by means of the Hybrid Electromagnetic Model (HEM), finding the relation between voltages and input currents at two points of the grid in the frequency domain. The model obtained with the MVF shows good agreement with the frequency response of the GS. Besides, the model is tested in EMTP/ATP, showing a good fit to the calculated data, which demonstrates the validity and usefulness of the MVF. (author)
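The step from a fitted rational function to an equivalent electrical network can be illustrated for a single real pole: Z(s) = d + r/(s − p) with p < 0 is realized exactly by a resistance R0 in series with a parallel R1–C1 branch, since R1 ∥ C1 has impedance (1/C1)/(s + 1/(R1·C1)). The fitted values below are hypothetical, not taken from the paper:

```python
import math

# Hypothetical one-pole fit Z(s) = d + r/(s - p), with a stable pole p < 0
d, r, p = 5.0, 2.0e9, -4.0e5

# Network synthesis: R0 in series with R1 parallel C1
R0 = d            # direct term -> series resistance
C1 = 1.0 / r      # residue -> 1/C1 = r
R1 = -r / p       # pole -> p = -1/(R1*C1)

def z_rational(s):
    return d + r / (s - p)

def z_network(s):
    return R0 + R1 / (1.0 + s * R1 * C1)

# The two descriptions agree at any complex frequency, e.g. 100 kHz
s_test = 1j * 2.0 * math.pi * 1.0e5
err = abs(z_rational(s_test) - z_network(s_test))
```

A full vector-fitting result is a sum of such pole-residue terms (plus complex-conjugate pairs), each synthesized into a branch of the EMTP/ATP network in the same way.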

  18. Global Modeling of CO2 Discharges with Aerospace Applications

    Directory of Open Access Journals (Sweden)

    Chloe Berenguer

    2014-01-01

    Full Text Available We developed a global model aiming to study discharges in CO2 under various conditions, pertaining to a large spectrum of pressure, absorbed energy, and feeding values. Various physical conditions and form factors have been investigated. The model was applied to a case of radiofrequency discharge and to helicon type devices functioning in low and high feed conditions. In general, the main charged species were found to be CO2+ for sufficiently low pressure cases and O− for higher pressure ones, followed by CO2+, CO+, and O2+ in the latter case. The dominant reaction is dissociation of CO2, resulting in CO production. Electronegativity, important for radiofrequency discharges, increases with pressure, reaching up to 3 at high flow rates for an absorbed power of 250 W, and diminishes with increasing absorbed power. Model results pertaining to radiofrequency type plasma discharges are found in satisfactory agreement with those available from an existing experiment. Application to low and high flow rate feeding cases of a helicon thruster allowed for evaluation of thruster functioning conditions pertaining to absorbed powers from 50 W to 1.8 kW. The model allows for a detailed evaluation of the potential of CO2 to be used as propellant in electric propulsion devices.
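A global (volume-averaged) model reduces each species to a 0-D balance between volumetric sources and losses. The one-species sketch below, with made-up numbers, shows the basic structure — the paper's model couples many such balances through the CO2 reaction chemistry:

```python
# Minimal 0-D (global-model) particle balance for a single species:
# dn/dt = S - n/tau, a volumetric source balanced by a characteristic loss time.
S, tau = 1.0e16, 1.0e-3    # source (m^-3 s^-1) and loss time (s), illustrative
dt, n = 1.0e-5, 0.0        # time step (s) and initial density (m^-3)

for _ in range(2000):      # integrate out to ~20 loss times
    n += dt * (S - n / tau)

n_steady = S * tau         # analytic steady state for comparison
```

At steady state the density settles at S·τ; in the full model the source and loss terms are themselves functions of electron temperature, pressure, and feed rate, which is what produces the trends in dominant species and electronegativity reported above.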

  19. Impedance characterization and modeling of electrodes for biomedical applications.

    Science.gov (United States)

    Franks, Wendy; Schenker, Iwan; Schmutz, Patrik; Hierlemann, Andreas

    2005-07-01

    A low electrode-electrolyte impedance interface is critical in the design of electrodes for biomedical applications. To design low-impedance interfaces, a complete understanding of the physical processes contributing to the impedance is required. In this work a model describing these physical processes is validated and extended to quantify the effect of organic coatings and incubation time. Electrochemical impedance spectroscopy has been used to electrically characterize the interface for various electrode materials (platinum, platinum black, and titanium nitride) and electrode sizes (1 cm2 and 900 microm2). An equivalent circuit model comprising an interface capacitance, shunted by a charge transfer resistance, in series with the solution resistance has been fitted to the experimental results. Theoretical equations have been used to calculate the interface capacitance impedance and the solution resistance, yielding results that correspond well with the fitted parameter values, thereby confirming the validity of the equations. The effect of incubation time and of two organic cell-adhesion-promoting coatings, poly-L-lysine and laminin, on the interface impedance has been quantified using the model. This demonstrates the benefits of using this model in developing a better understanding of the physical processes occurring at the interface in more complex, biomedically relevant situations.
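The equivalent circuit described above — an interface capacitance shunted by a charge-transfer resistance, in series with the solution resistance — gives Z(ω) = Rs + Rct/(1 + jωRctC). A sketch with illustrative parameter values (not the paper's fitted data):

```python
import math

def interface_impedance(freq, Rs, Rct, C):
    """Equivalent-circuit impedance: solution resistance Rs in series with
    the interface capacitance C shunted by the charge-transfer resistance Rct."""
    w = 2.0 * math.pi * freq
    Zc = 1.0 / (1j * w * C)            # capacitor impedance
    Zpar = (Rct * Zc) / (Rct + Zc)     # parallel combination Rct || C
    return Rs + Zpar

# Illustrative values: Rs in ohms, Rct in ohms, C in farads
Rs, Rct, C = 100.0, 1.0e6, 1.0e-6
z_low = interface_impedance(0.001, Rs, Rct, C)   # low f: capacitor blocks, Z -> Rs + Rct
z_high = interface_impedance(1.0e6, Rs, Rct, C)  # high f: capacitor shorts, Z -> Rs
```

The two frequency limits are what the spectroscopy fit exploits: the low-frequency plateau fixes Rs + Rct, the high-frequency plateau fixes Rs, and the roll-off in between fixes C.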

  20. Educational and Scientific Applications of Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.

    2016-12-01

    Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threading computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed educational and scientific applications of CMDA. Educationally, CMDA has supported the summer school of the JPL Center for Climate Sciences for three years, starting in 2014. In the summer school, the students work on group research projects for which CMDA provides datasets and analysis tools. Each student is assigned a virtual machine with CMDA installed in Amazon Web Services. A provenance management system for CMDA has been developed to keep track of students' usage of CMDA, and to recommend datasets and analysis tools for their research topics. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case is described and listed in terms of a scientific goal, the datasets used, the analysis tools used, the scientific results discovered, an analysis result such as output plots and data files, and a link to the exact analysis service call with all the input arguments filled in. For example, one science use case is the evaluation of the NCAR CAM5 model with MODIS total cloud fraction. The analysis service used is Difference Plot Service of

  1. Application of GPS Measurements for Ionospheric and Tropospheric Modelling

    Science.gov (United States)

    Rajendra Prasad, P.; Abdu, M. A.; Furlan, Benedito. M. P.; Koiti Kuga, Hélio

    solar maximum period. In the equatorial region the irregularity structures are highly elongated in the north-south direction and are discrete in the east-west direction, with dimensions of several hundred km. Given such a spatial distribution of irregularities, it is necessary to determine how often the GPS receivers fail to provide navigation aid with the available constellation. The effects of scintillation on the performance of GPS navigation systems in the equatorial region can be analyzed by commissioning a few ground receivers. Incidentally, there are few GPS receivers near these latitudes. Despite the recent advances in ionospheric and tropospheric delay modeling for geodetic applications of GPS, the models currently used are not very precise. The conventional and operational ionosphere models, viz. the Klobuchar, Bent, and IRI models, have certain limitations in providing very precise accuracies at all latitudes. Tropospheric delay modeling also suffers in accuracy. The advances made in both computing power and knowledge of the atmosphere motivate an effort to upgrade some of these models to improve delay corrections in GPS navigation. The ionospheric group delay corrections for orbit determination can be minimized using dual frequencies; in single frequency measurements, however, the group delay correction is an involved task. In this paper an investigation is carried out to estimate the model coefficients of the ionosphere along with precise orbit determination modeling using GPS measurements. The locations of the ground-based receivers near the equator are known very exactly. Measurements from these ground stations to a precisely known satellite carrying a dual-frequency receiver are used for orbit determination. The ionosphere model parameters can be refined corresponding to spatially distributed GPS receivers spread over Brazil. The tropospheric delay effects are not significant for the satellites, given an appropriate choice of elevation angle. However it needs to be analyzed for user like
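The dual-frequency correction rests on the first-order ionospheric group delay being proportional to TEC/f², so measurements at two frequencies isolate the slant total electron content (TEC). A round-trip sketch — the GPS L1/L2 frequencies are real, the TEC and range values are illustrative:

```python
# GPS L1 and L2 carrier frequencies (Hz)
f1, f2 = 1575.42e6, 1227.60e6

def group_delay(tec, f):
    """First-order ionospheric group delay (m) at frequency f (Hz)
    for a slant TEC in electrons per square metre."""
    return 40.3 * tec / f**2

def slant_tec(p1, p2):
    """Slant TEC recovered from dual-frequency pseudoranges p1, p2 (m)."""
    return (p2 - p1) * (f1**2 * f2**2) / (40.3 * (f1**2 - f2**2))

# Round trip: synthesize pseudoranges from an assumed TEC, then recover it
tec_true = 5.0e17      # illustrative daytime slant TEC (electrons/m^2)
rho = 2.2e7            # geometric range (m), illustrative
p1 = rho + group_delay(tec_true, f1)
p2 = rho + group_delay(tec_true, f2)
tec_est = slant_tec(p1, p2)
```

Single-frequency users have no such differencing available, which is why they must fall back on broadcast models like Klobuchar — the involved task the abstract refers to.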

  2. Conceptual study of the application software manager using the Xlet model in the nuclear fields

    International Nuclear Information System (INIS)

    Lee, Joon Koo; Park, Heui Youn; Koo, In Soo; Park, Hee Seok; Kim, Jung Seon; Sohn, Chang Ho

    2003-01-01

    In order to reduce the cost of software maintenance, including software modification, we suggest an object-oriented program that checks the version of the application program using the Java language, together with a technique for executing the downloaded application program via the network using the application manager. In order to change the traditional scheduler to the application manager, we have adopted the Xlet concept in the nuclear fields using the network. Usually, an Xlet means a Java application that runs on a digital television receiver. The Java TV Application Program Interface (API) defines an application model called the Xlet application lifecycle. Java applications that use this lifecycle model are called Xlets. The Xlet application lifecycle is compatible with the existing application environment and virtual machine technology. The Xlet application lifecycle model defines the dialog (protocol) between an Xlet and its environment.

  3. An oral multispecies biofilm model for high content screening applications.

    Directory of Open Access Journals (Sweden)

    Nadine Kommerein

    Full Text Available Peri-implantitis caused by multispecies biofilms is a major complication in dental implant treatment. The bacterial infection surrounding dental implants can lead to bone loss and, in turn, to implant failure. A promising strategy to prevent these common complications is the development of implant surfaces that inhibit biofilm development. A reproducible and easy-to-use biofilm model as a test system for large scale screening of new implant surfaces with putative antibacterial potency is therefore of major importance. In the present study, we developed a highly reproducible in vitro four-species biofilm model consisting of the highly relevant oral bacterial species Streptococcus oralis, Actinomyces naeslundii, Veillonella dispar and Porphyromonas gingivalis. The application of live/dead staining, quantitative real time PCR (qRT-PCR), scanning electron microscopy (SEM) and urea-NaCl fluorescence in situ hybridization (urea-NaCl-FISH) revealed that the four-species biofilm community is robust in terms of biovolume, live/dead distribution and individual species distribution over time. The biofilm community is dominated by S. oralis, followed by V. dispar, A. naeslundii and P. gingivalis. The percentage distribution in this model closely reflects the situation in early native plaques and is therefore well suited as an in vitro model test system. Furthermore, despite its nearly native composition, the multispecies model does not depend on nutrient additives, such as native human saliva or serum, and is an inexpensive, easy to handle and highly reproducible alternative to the available model systems. The 96-well plate format enables high content screening for optimized implant surfaces impeding biofilm formation, or the testing of multiple antimicrobial treatment strategies to fight multispecies biofilm infections, both demonstrated by example in the manuscript.

  4. Micromagnetic Modeling and Analysis for Memory and Processing Applications

    Science.gov (United States)

    Lubarda, Marko V.

    Magnetic nanostructures are vital components of numerous existing and prospective magnetic devices, including hard disk drives, magnetic sensors, and microwave generators. The ability to examine and predict the behavior of magnetic nanostructures is essential for improving existing devices and exploring new technologies and areas of application. This thesis consists of three parts. In part I, key concepts of magnetism are covered (chapter 1), followed by an introduction to micromagnetics (chapter 2). Key interactions are discussed. The Landau-Lifshitz-Gilbert equation is introduced, and the variational approach of W. F. Brown is presented. Part II is devoted to computational micromagnetics. Interaction energies, fields and torques, introduced in part I, are transcribed from the continuum to their finite element form. The validity of developed models is discussed with reference to physical assumptions and discretization criteria. Chapter 3 introduces finite element modeling, and provides derivations of micromagnetic fields in the linear basis representation. Spin transfer torques are modeled in chapter 4. Thermal effects are included in the computational framework in chapter 5. Chapter 6 discusses an implementation of the nudged elastic band method for the computation of energy barriers. A model accounting for polycrystallinity is developed in chapter 7. The model takes into account the wide variety of distributions and imperfections which characterize true systems. The modeling presented in chapters 3-7 forms a general framework for the computational study of diverse magnetic phenomena in contemporary structures and devices. Chapter 8 concludes part II with an outline of powerful acceleration schemes, which were essential for the large-scale micromagnetic simulations presented in part III. Part III begins with the analysis of the perpendicular magnetic recording system (chapter 9). A simulation study of the recording process with readback analysis is presented
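The Landau-Lifshitz-Gilbert dynamics introduced in chapter 2 can be sketched in dimensionless form with an explicit Euler step. This is a toy single-spin integrator, not the thesis's finite-element solver; units, the gyromagnetic prefactor, and the field value are all normalized away:

```python
import math

def llg_step(m, h, alpha, dt):
    """One explicit Euler step of the dimensionless Landau-Lifshitz-Gilbert
    equation dm/dt = -m x h - alpha * m x (m x h), with renormalization
    (the continuous dynamics preserve |m| = 1; Euler drifts slightly)."""
    def cross(a, b):
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]
    mxh = cross(m, h)
    mxmxh = cross(m, mxh)
    m_new = [m[i] + dt * (-mxh[i] - alpha * mxmxh[i]) for i in range(3)]
    norm = math.sqrt(sum(c * c for c in m_new))
    return [c / norm for c in m_new]

# Illustrative relaxation: a moment tilted away from a field along +z
# precesses around it while the damping term pulls it into alignment.
m = [math.sin(0.5), 0.0, math.cos(0.5)]
h = [0.0, 0.0, 1.0]
for _ in range(2000):
    m = llg_step(m, h, alpha=0.1, dt=0.01)
```

In a full micromagnetic code the effective field h gathers exchange, anisotropy, magnetostatic, and (as in chapter 4) spin-transfer contributions, and one such update runs per mesh node.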

  5. [Application of three compartment model and response surface model to clinical anesthesia using Microsoft Excel].

    Science.gov (United States)

    Abe, Eiji; Abe, Mari

    2011-08-01

With the spread of total intravenous anesthesia, clinical pharmacology has become more important. We report a Microsoft Excel file applying a three-compartment model and a response surface model to clinical anesthesia. On the Microsoft Excel sheet, propofol, remifentanil and fentanyl effect-site concentrations are predicted (three-compartment model), and the probabilities of no response to prodding, shaking, surrogates of painful stimuli and laryngoscopy are calculated using the predicted effect-site drug concentrations. Time-dependent changes in these calculated values are shown graphically. Recent developments in anesthetic drug interaction studies are remarkable, and their application to clinical anesthesia with this Excel file is simple and helpful for clinical anesthesia.
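As a rough illustration of the kind of calculation such a spreadsheet performs, the sketch below integrates a three-compartment pharmacokinetic model with an effect-site compartment using a simple Euler scheme. All rate constants and the central volume are illustrative placeholders, not a published propofol or remifentanil parameter set, and the function name is hypothetical.

```python
def simulate_effect_site(infusion_rate, duration_s, dt=1.0,
                         v1=10.0,                       # central volume (L), illustrative
                         k10=0.12/60, k12=0.11/60, k21=0.055/60,
                         k13=0.042/60, k31=0.0033/60,   # inter-compartment rates (1/s)
                         ke0=0.26/60):                  # plasma/effect-site equilibration rate
    """Return plasma (c1) and effect-site (ce) concentration histories.

    infusion_rate is in mg/s; parameters are placeholders, not clinical values.
    """
    a1 = a2 = a3 = 0.0   # drug amounts in compartments 1-3 (mg)
    ce = 0.0             # effect-site concentration
    c1_hist, ce_hist = [], []
    for _ in range(int(duration_s / dt)):
        c1 = a1 / v1
        da1 = infusion_rate - (k10 + k12 + k13) * a1 + k21 * a2 + k31 * a3
        da2 = k12 * a1 - k21 * a2
        da3 = k13 * a1 - k31 * a3
        dce = ke0 * (c1 - ce)          # effect site lags plasma via ke0
        a1 += da1 * dt; a2 += da2 * dt; a3 += da3 * dt; ce += dce * dt
        c1_hist.append(c1); ce_hist.append(ce)
    return c1_hist, ce_hist
```

In a spreadsheet, each row would correspond to one Euler step of this loop, with the response-surface probabilities computed from the resulting effect-site concentrations.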

  6. Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review

    Science.gov (United States)

    Fahimi, Farzad; Yaseen, Zaher Mundher; El-shafie, Ahmed

    2017-05-01

Since the middle of the twentieth century, artificial intelligence (AI) models have been used widely in engineering and science problems. Water resource variable modeling and prediction are among the most challenging issues in water engineering. The artificial neural network (ANN) is a common approach used to tackle this problem with viable and efficient models. Numerous ANN models have been successfully developed to achieve more accurate results. In the current review, different ANN models in water resource applications and hydrological variable predictions are reviewed and outlined. In addition, recent hybrid models and their structures, input preprocessing, and optimization techniques are discussed and the results are compared with similar previous studies. Moreover, to achieve a comprehensive view of the literature, many articles that applied ANN models together with other techniques are included. Consequently, coupling procedures, model evaluation, and the performance of hybrid models compared with conventional ANN models are assessed, as well as the taxonomy and structures of hybrid ANN models. Finally, current challenges and recommendations for future research are indicated and new hybrid approaches are proposed.

  7. Exploring the Application of Capital Facility Investment Justification Model

    Directory of Open Access Journals (Sweden)

    Marijan Karić

    2013-07-01

Full Text Available For decades now, models for identifying and quantifying the level of risk of investment projects and for evaluating investment justification have been the subject of investigation by members of professional and research communities. It is important to quantify the level of risk because, by evaluating investment justification in terms of the risks involved, the decision-maker (investor) is able to choose from the available alternatives the one that will achieve the most favourable ratio of expected profit to the assumed risk. In this way, the economic entity can raise its productivity, profitability and the quality of business operation in general. The aim of this paper was to investigate the extent to which medium and large companies have been using modern methods of investment justification evaluation in their decision-making process and to determine the quality of the application of the selected methods in practice. The study was conducted on a sample of medium and large enterprises in eastern Croatia during 2011 and 2012, and it was established that despite the fact that a large number of modern investment project profitability and risk assessment models have been developed, the level of their application in practice is not high enough. The analyzed investment proposals included only basic methods of capital budgeting without risk assessment. Hence, it was concluded that individual investors were presented with low-quality and incomplete investment justification evaluation results on the basis of which decisions of key importance for the development of the economic entity as a whole were made. This paper aims to underline the need for financial managers to get informed and educate themselves about contemporary investment project profitability and risk assessment models as well as the need to create educational programmes and computer solutions that will encourage key people in companies to acquire new knowledge and apply modern

  8. A Web Application for Validating and Disseminating Surface Energy Balance Evapotranspiration Estimates for Hydrologic Modeling Applications

    Science.gov (United States)

    Schneider, C. A.; Aggett, G. R.; Nevo, A.; Babel, N. C.; Hattendorf, M. J.

    2008-12-01

The western United States faces an increasing threat from drought - and the social, economic, and environmental impacts that come with it. The combination of diminished water supplies and increasing demand for urban and other uses is rapidly depleting surface and ground water reserves traditionally allocated for agricultural use. Quantification of consumptive water use is increasingly important as water resources come under growing pressure from more users and interests. Scarce water supplies can be managed more efficiently through the use of information and prediction tools accessible via the internet. METRIC (Mapping ET at high Resolution with Internalized Calibration) represents a maturing technology for deriving a remote sensing-based surface energy balance for estimating ET from the earth's surface. This technology has the potential to become widely adopted and used by water resources communities, providing critical support to a host of water decision support tools. ET images created using METRIC or similar remote-sensing based processing systems could be routinely used as input to operational and planning models for water demand forecasting, reservoir operations, ground-water management, irrigation water supply planning, water rights regulation, and for the improvement, validation, and use of hydrological models. The ET modeling and subsequent validation and distribution of results via the web presented here provide a vehicle through which METRIC ET parameters can be made more accessible to hydrologic modelers. It will enable users of the data to assess the results of the spatially distributed ET modeling and compare them with results from conventional ET estimation methods prior to assimilation in surface and ground water models. In addition, this ET-Server application will provide rapid and transparent access to the data, enabling quantification of uncertainties due to errors in temporal sampling and METRIC modeling, while the GIS-based analytical

  9. Advanced applications of numerical modelling techniques for clay extruder design

    Science.gov (United States)

    Kandasamy, Saravanakumar

Ceramic materials play a vital role in our day-to-day life. Recent advances in research, manufacturing and processing techniques and production methodologies have broadened the scope of ceramic products such as bricks, pipes and tiles, especially in the construction industry. These are mainly manufactured using an extrusion process in auger extruders. During their long history of application in the ceramic industry, most of the design developments of extruder systems have resulted from expensive laboratory-based experimental work and field-based trial and error runs. In spite of these design developments, auger extruders continue to be energy-intensive devices with high operating costs. Limited understanding of the physical process involved and the cost and time requirements of lab-based experiments were found to be the major obstacles to the further development of auger extruders. An attempt has been made herein to use Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) based numerical modelling techniques to reduce the costs and time associated with research into design improvement by experimental trials. These two techniques, although used widely in other engineering applications, have rarely been applied to auger extruder development. This had been due to a number of reasons, including technical limitations of the CFD tools previously available. Modern CFD and FEA software packages have much enhanced capabilities and allow the modelling of the flow of complex fluids such as clay. This research work presents a methodology using a Herschel-Bulkley fluid flow based CFD model to simulate and assess the flow of a clay-water mixture through the extruder and the die of a vacuum de-airing type clay extrusion unit used in ceramic extrusion. The extruder design and the operating parameters were varied to study their influence on the power consumption and the extrusion pressure. The model results were then validated using results from
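The constitutive relation underlying such a model is the Herschel-Bulkley law, τ = τ_y + K·γ̇ⁿ, which combines a yield stress with power-law shear thinning. The sketch below evaluates it for a stiff paste; the parameter values are illustrative, not fitted to any clay-water mixture from the thesis.

```python
def herschel_bulkley_stress(gamma_dot, tau_y=200.0, K=30.0, n=0.4):
    """Shear stress (Pa) at shear rate gamma_dot (1/s) for a Herschel-Bulkley fluid.

    tau_y: yield stress (Pa), K: consistency index, n: flow-behaviour index.
    Values are illustrative placeholders for a clay-like paste, not measured data.
    """
    if gamma_dot <= 0.0:
        return tau_y  # below yield, the material does not flow
    return tau_y + K * gamma_dot ** n

def apparent_viscosity(gamma_dot, **params):
    """Apparent viscosity (Pa*s) = stress / shear rate."""
    return herschel_bulkley_stress(gamma_dot, **params) / gamma_dot
```

With n < 1 the apparent viscosity falls as the shear rate rises, which is the shear-thinning behaviour a CFD model of clay extrusion has to capture.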

  10. Three essays on multi-level optimization models and applications

    Science.gov (United States)

    Rahdar, Mohammad

The general form of a multi-level mathematical programming problem is a set of nested optimization problems, in which each level controls a series of decision variables independently. However, the values of the decision variables may also impact the objective functions of other levels. A two-level model is called a bilevel model and can be considered a Stackelberg game with a leader and a follower. The leader anticipates the response of the follower and optimizes its objective function, and then the follower reacts to the leader's action. The multi-level decision-making model has many real-world applications such as government decisions, energy policies, market economy, and network design. However, there is a lack of capable algorithms for solving medium- and large-scale problems of these types. The dissertation is devoted to both theoretical research on and applications of multi-level mathematical programming models, and consists of three parts, each in a paper format. The first part studies the renewable energy portfolio under two major renewable energy policies. The potential competition for biomass for the growth of the renewable energy portfolio in the United States and other interactions between the two policies over the next twenty years are investigated. This problem mainly has two levels of decision makers: the government/policy makers and the biofuel producers/electricity generators/farmers. We focus on the lower-level problem to predict the amount of capacity expansions, fuel production, and power generation. In the second part, we address uncertainty over demand and lead time in a multi-stage mathematical programming problem. We propose a two-stage tri-level optimization model based on a rolling horizon approach to reduce the dimensionality of the multi-stage problem. In the third part of the dissertation, we introduce a new branch and bound algorithm to solve bilevel linear programming problems. The total time is reduced by solving a smaller relaxation
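The bilevel (Stackelberg) structure can be illustrated with a toy leader-follower quantity game, unrelated to the dissertation's applications: the leader anticipates the follower's analytically known best response and optimizes its own decision by grid search. All numbers and function names are hypothetical.

```python
def follower_best_response(x, a=10.0, c=2.0):
    # Lower level: follower maximizes y * (a - c - x - y), so y* = (a - c - x) / 2
    return max(0.0, (a - c - x) / 2.0)

def leader_profit(x, a=10.0, c=2.0):
    # Upper level: leader's objective, evaluated at the follower's best response
    y = follower_best_response(x, a, c)
    return x * (a - c - x - y)

# The leader anticipates the reaction y*(x) and grid-searches its own decision.
best_x = max((i * 0.01 for i in range(801)), key=leader_profit)
```

For these numbers the Stackelberg solution is x* = (a - c)/2 = 4, which the grid search recovers; real bilevel algorithms such as branch and bound replace this enumeration with systematic bounding.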

  11. Testing simulation and structural models with applications to energy demand

    Science.gov (United States)

    Wolff, Hendrik

    2007-12-01

    This dissertation deals with energy demand and consists of two parts. Part one proposes a unified econometric framework for modeling energy demand and examples illustrate the benefits of the technique by estimating the elasticity of substitution between energy and capital. Part two assesses the energy conservation policy of Daylight Saving Time and empirically tests the performance of electricity simulation. In particular, the chapter "Imposing Monotonicity and Curvature on Flexible Functional Forms" proposes an estimator for inference using structural models derived from economic theory. This is motivated by the fact that in many areas of economic analysis theory restricts the shape as well as other characteristics of functions used to represent economic constructs. Specific contributions are (a) to increase the computational speed and tractability of imposing regularity conditions, (b) to provide regularity preserving point estimates, (c) to avoid biases existent in previous applications, and (d) to illustrate the benefits of our approach via numerical simulation results. The chapter "Can We Close the Gap between the Empirical Model and Economic Theory" discusses the more fundamental question of whether the imposition of a particular theory to a dataset is justified. I propose a hypothesis test to examine whether the estimated empirical model is consistent with the assumed economic theory. Although the proposed methodology could be applied to a wide set of economic models, this is particularly relevant for estimating policy parameters that affect energy markets. This is demonstrated by estimating the Slutsky matrix and the elasticity of substitution between energy and capital, which are crucial parameters used in computable general equilibrium models analyzing energy demand and the impacts of environmental regulations. Using the Berndt and Wood dataset, I find that capital and energy are complements and that the data are significantly consistent with duality

  12. Global CLEWs model - A novel application of OSeMOSYS

    Science.gov (United States)

    Avgerinopoulos, Georgios; Pereira Ramos, Eunice; Howells, Mark

    2017-04-01

Over the past years, studies that analyse Nexus issues from a holistic point of view, rather than energy, land or water separately, have been gaining momentum. This project aims at giving insights into global issues through the application and analysis of a global-scale OSeMOSYS model. The latter - which is based on fully open and amendable code - has been used successfully in recent years, as it produces fully accessible energy models suitable for capacity building and policy-making suggestions. This study develops a CLEWs (climate, land, energy and water) model with the objective of interrogating global challenges (e.g. increasing food demand) and international trade features, with policy priorities on food security, resource efficiency, low-carbon energy and climate change mitigation, water availability and vulnerability to water stress and floods, water quality, biodiversity and ecosystem services. It will for instance assess (i) the impact of water constraints on food security and human development (clean water for human use; industrial and energy water demands), as well as (ii) the impact of climate change on aggravating or relieving water problems.

  13. Application of Molecular Modeling to Urokinase Inhibitors Development

    Directory of Open Access Journals (Sweden)

    V. B. Sulimov

    2014-01-01

Full Text Available Urokinase-type plasminogen activator (uPA) plays an important role in the regulation of diverse physiologic and pathologic processes. Experimental research has shown that elevated uPA expression is associated with cancer progression, metastasis, and shortened survival in patients, whereas suppression of the proteolytic activity of uPA leads to an evident decrease in metastasis. Therefore, uPA has been considered a promising molecular target for the development of anticancer drugs. The present study sets out to develop new selective uPA inhibitors using computer-aided structure-based drug design methods. The investigation involves the following stages: computer modeling of the protein active site; development and validation of computer molecular modeling methods - docking (SOL program), postprocessing (DISCORE program), direct generalized docking (FLM program), and the application of quantum chemical calculations (MOPAC package); search for uPA inhibitors among molecules from databases of ready-made compounds; and design of new chemical structures, their optimization and experimental examination. On the basis of known uPA inhibitors and modeling results, 18 new compounds have been designed, calculated using the programs mentioned above, synthesized, and tested in vitro. Eight of them display inhibitory activity and two of them display activity of about 10 μM.

  14. Parallel computer processing and modeling: applications for the ICU

    Science.gov (United States)

    Baxter, Grant; Pranger, L. Alex; Draghic, Nicole; Sims, Nathaniel M.; Wiesmann, William P.

    2003-07-01

Current patient monitoring procedures in hospital intensive care units (ICUs) generate vast quantities of medical data, much of which is considered extemporaneous and not evaluated. Although sophisticated monitors to analyze individual types of patient data are routinely used in the hospital setting, this equipment lacks high-order signal analysis tools for detecting long-term trends and correlations between different signals within a patient data set. Without the ability to continuously analyze disjoint sets of patient data, it is difficult to detect slow-forming complications. As a result, the early onset of conditions such as pneumonia or sepsis may not be apparent until the advanced stages. We report here on the development of a distributed software architecture test bed and software medical models to analyze both asynchronous and continuous patient data in real time. Hardware and software have been developed to support a multi-node distributed computer cluster capable of amassing data from multiple patient monitors and projecting near- and long-term outcomes based upon the application of physiologic models to the incoming patient data stream. One computer acts as a central coordinating node; additional computers accommodate processing needs. A simple, non-clinical model for sepsis detection was implemented on the system for demonstration purposes. This work shows exceptional promise as a highly effective means to rapidly predict and thereby mitigate the effect of nosocomial infections.

  15. Computational multiscale modeling of fluids and solids theory and applications

    CERN Document Server

    Steinhauser, Martin Oliver

    2017-01-01

The idea of the book is to provide a comprehensive overview of computational physics methods and techniques that are used for materials modeling on different length and time scales. Each chapter first provides an overview of the basic physical principles which form the basis for the numerical and mathematical modeling on the respective length scale. The book includes the micro-scale, the meso-scale and the macro-scale, and the chapters follow this classification. The book explains in detail many tricks of the trade of some of the most important methods and techniques that are used to simulate materials on the respective levels of spatial and temporal resolution. Case studies are included to further illustrate some methods or theoretical considerations. Example applications for all techniques are provided, some of which are from the author’s own contributions to some of the research areas. The second edition has been expanded by new sections in computational models on meso/macroscopic scales for ocean and a...

  16. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to the physicists and computer scientists developing the simulation codes and runtimes, respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  17. Application of modeling to local chemistry in PWR steam generators

    International Nuclear Information System (INIS)

    Fauchon, C.; Millett, P.J.; Ollar, P.

    1998-01-01

Localized corrosion of the SG tubes and other components is due to the presence of an aggressive environment in local crevices and occluded regions. In crevices and on vertical and horizontal tube surfaces, corrosion products and particulate matter can accumulate in the form of porous deposits. The SG water contains impurities at extremely low levels (ppb). Low levels of non-volatile impurities, however, can be efficiently concentrated in crevices and sludge piles by a thermal-hydraulic mechanism. The temperature gradient across the SG tube, coupled with local flow starvation, produces local boiling in the sludge and crevices. Since mass transfer processes are inhibited in these geometries, the residual liquid becomes enriched in many of the species present in the SG water. The resulting concentrated solutions have been shown to be aggressive and can corrode the SG materials. This corrosion may occur under various conditions which result in different types of attack such as pitting, stress corrosion cracking, wastage and denting. A major goal of EPRI's research program has been the development of models of the concentration process and the resulting chemistry. An improved understanding should eventually allow utilities to reduce or eliminate the corrosion by appropriate manipulation of the steam generator water chemistry and/or crevice conditions. The application of these models to experimental data obtained for prototypical SG tube support crevices is described in this paper. The models adequately describe the key features of the experimental data, allowing extrapolations to be made to plant conditions. (author)

  18. Using the object modeling system for hydrological model development and application

    Directory of Open Access Journals (Sweden)

    S. Kralisch

    2005-01-01

Full Text Available State of the art challenges in the sustainable management of water resources have created demand for integrated, flexible and easy to use hydrological models which are able to simulate the quantitative and qualitative aspects of the hydrological cycle with a sufficient degree of certainty. Existing models which have been developed to fit these needs are often constrained to specific scales or purposes and thus cannot be easily adapted to meet different challenges. As a solution for flexible and modularised model development and application, the Object Modeling System (OMS) has been developed in a joint approach by the USDA-ARS GPSRU (Fort Collins, CO, USA), USGS (Denver, CO, USA), and the FSU (Jena, Germany). The OMS provides a modern modelling framework which allows the implementation of single process components to be compiled and applied as custom-tailored model assemblies. This paper describes the basic principles of the OMS and its main components and explains in more detail how the problems arising during coupling of models or model components are solved inside the system. It highlights the integration of different spatial and temporal scales by their representation as spatial modelling entities embedded into time compound components. As an example, the implementation of the hydrological model J2000 is discussed.

  19. Selected developments and applications of Leontief models in industrial ecology

    International Nuclear Information System (INIS)

    Stroemman, Anders Hammer

    2005-01-01

extended for this study through the application of multi-objective optimization techniques and is used to explore efficient trade-offs between reducing CO2 emissions and increasing global factor costs. Concluding Remarks: It has been the scope of this work to contribute to mapping the interdisciplinary landscape between input-output analysis and industrial ecology. The first three papers enter this landscape from the Industrial Ecology side, more specifically from the Life Cycle Assessment platform, and the two latter from the input-output paradigm. The fundamental learning obtained is that the linear section of this landscape is described by Leontief models. Life Cycle Assessment, Mass Flow Analysis, Substance Flow Analysis, etc. can all be represented in the mathematical form proposed by Leontief. The input-output framework offers a well-developed set of methodologies that can bridge the various sub-fields of industrial ecology addressing questions related to inter-process flows. It seems that an acknowledgement of Leontief models as the base framework for the family of linear models in industrial ecology would be beneficial. Following the acknowledgement of Leontief's work comes that of Dantzig and the development of linear programming. In investigating alternative arrangements of production and combinations of technologies to produce a given good, the common practice in LCA has been total enumeration of all scenarios. This might be feasible, and for that sake desirable, for a limited number of combinations. However, as the complexity and number of alternatives increase, this will not be feasible. Dantzig invented linear programming to address exactly this type of problem. The scientific foundation provided by Leontief and Dantzig has been crucial to the work in this thesis. It is my belief that the impact on industrial ecology of their legacy will increase further in the years to come. (Author)

  20. Modelling of gecko foot for future robot application

    Science.gov (United States)

    Kamaruddin, A.; Ong, N. R.; Aziz, M. H. A.; Alcain, J. B.; Haimi, W. M. W. N.; Sauli, Z.

    2017-09-01

Every gecko has approximately a million microscale hairs called setae which make it easy for them to cling to different surfaces at any orientation, with the van der Waals force as the primary mechanism used to adhere to any contact surface. In this paper, a strain simulation using the Comsol Multiphysics software was conducted on a 3D MEMS model of an actuated gecko foot with the aim of achieving optimal sticking with various polymeric materials for future robot applications. Based on the stress and strain analyses done on the seven different polymers, it was found that polysilicon had the best result, which was nearest to 0%, indicating the strongest elasticity among the others. PDMS, on the other hand, failed in the simulation due to its bulk-like nature. Thus, PDMS is not suitable for further study on the gecko foot robot.

  1. Low Dimensional Semiconductor Structures Characterization, Modeling and Applications

    CERN Document Server

    Horing, Norman

    2013-01-01

    Starting with the first transistor in 1949, the world has experienced a technological revolution which has permeated most aspects of modern life, particularly over the last generation. Yet another such revolution looms up before us with the newly developed capability to control matter on the nanometer scale. A truly extraordinary research effort, by scientists, engineers, technologists of all disciplines, in nations large and small throughout the world, is directed and vigorously pressed to develop a full understanding of the properties of matter at the nanoscale and its possible applications, to bring to fruition the promise of nanostructures to introduce a new generation of electronic and optical devices. The physics of low dimensional semiconductor structures, including heterostructures, superlattices, quantum wells, wires and dots is reviewed and their modeling is discussed in detail. The truly exceptional material, Graphene, is reviewed; its functionalization and Van der Waals interactions are included h...

  2. Mathematical and numerical modeling in porous media applications in geosciences

    CERN Document Server

    Diaz Viera, Martin A; Coronado, Manuel; Ortiz Tapia, Arturo

    2012-01-01

    Porous media are broadly found in nature and their study is of high relevance in our present lives. In geosciences porous media research is fundamental in applications to aquifers, mineral mines, contaminant transport, soil remediation, waste storage, oil recovery and geothermal energy deposits. Despite their importance, there is as yet no complete understanding of the physical processes involved in fluid flow and transport. This fact can be attributed to the complexity of the phenomena which include multicomponent fluids, multiphasic flow and rock-fluid interactions. Since its formulation in 1856, Darcy's law has been generalized to describe multi-phase compressible fluid flow through anisotropic and heterogeneous porous and fractured rocks. Due to the scarcity of information, a high degree of uncertainty on the porous medium properties is commonly present. Contributions to the knowledge of modeling flow and transport, as well as to the characterization of porous media at field scale are of great relevance. ...
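Since the record centers on Darcy's law, a minimal single-phase 1-D sketch of it may help; the function and the parameter values are illustrative, not taken from the book.

```python
def darcy_flux(k, mu, dp, length):
    """Darcy flux q = (k / mu) * (dp / length) for single-phase 1-D flow.

    k: permeability (m^2), mu: dynamic viscosity (Pa*s),
    dp: pressure drop (Pa) applied over the sample length (m).
    Returns the volumetric flux q in m/s (volume per unit area per time).
    """
    return (k / mu) * (dp / length)

# Illustrative values: a ~1 darcy (~1e-12 m^2) sand, water, a 1 bar drop over 10 m
q = darcy_flux(k=1e-12, mu=1e-3, dp=1e5, length=10.0)
```

The multiphase, compressible and anisotropic generalizations mentioned in the abstract replace the scalar k with a tensor and add relative-permeability and capillary terms, but they keep this same flux-gradient structure.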

  3. Urban design and modeling: applications and perspectives on GIS

    Directory of Open Access Journals (Sweden)

    Roberto Mingucci

    2013-05-01

Full Text Available In recent years, GIS systems have evolved because of technological advancements that make possible the simultaneous management of large amounts of information. Interesting aspects of their application concern site documentation at the territorial scale, taking advantage of CAD/BIM systems, which usually work at the building scale instead. In this sense, surveying with sophisticated equipment such as laser scanners or UAV drones quickly captures data that can be accessed even through new “mobile” technologies operating in the context of web-based information systems. This paper aims to investigate uses and perspectives pertaining to geographic information technologies and the analysis and design tools meant for modeling at different scales, referring to the results of research experiences conducted at the University of Bologna.

  4. High speed railway track dynamics models, algorithms and applications

    CERN Document Server

    Lei, Xiaoyan

    2017-01-01

    This book systematically summarizes the latest research findings on high-speed railway track dynamics, made by the author and his research team over the past decade. It explores cutting-edge issues concerning the basic theory of high-speed railways, covering the dynamic theories, models, algorithms and engineering applications of the high-speed train and track coupling system. Presenting original concepts, systematic theories and advanced algorithms, the book places great emphasis on the precision and completeness of its content. The chapters are interrelated yet largely self-contained, allowing readers to either read through the book as a whole or focus on specific topics. It also combines theories with practice to effectively introduce readers to the latest research findings and developments in high-speed railway track dynamics. It offers a valuable resource for researchers, postgraduates and engineers in the fields of civil engineering, transportation, highway & railway engineering.

  5. The Application of Adaptive Behaviour Models: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Jessica A. Price

    2018-01-01

    Adaptive behaviour has been viewed broadly as an individual's ability to meet the standards of social responsibility and independence; however, this definition has been a source of debate amongst researchers and clinicians. Given the rich history and importance of the construct, the current study aimed to provide a comprehensive overview of the application of adaptive behaviour models to assessment tools through a systematic review. A plethora of assessment measures has been developed to assess the construct adequately; however, it appears that the only definition on which authors agree is that adaptive behaviour is what adaptive behaviour scales measure. The importance of the construct for diagnosis, intervention and planning has been highlighted throughout the literature. It is recommended that researchers and clinicians critically review which measures of adaptive behaviour they are utilising, and it is suggested that the definition and theory be revisited.

  6. Heuristics for Hierarchical Partitioning with Application to Model Checking

    DEFF Research Database (Denmark)

    Möller, Michael Oliver; Alur, Rajeev

    2001-01-01

    Given a collection of connected components, it is often desired to cluster together parts of strong correspondence, yielding a hierarchical structure. We address the automation of this process and apply heuristics to battle the combinatorial and computational complexity. We define a cost function … We argue for a heuristic function based on four criteria: the number of enclosed connections, the number of components, the number of touched connections and the depth of the structure. We report on an application in the context of formal verification, where our algorithm serves as a preprocessor for a temporal scaling technique, called the "Next" heuristic [2]. The latter is applicable in reachability analysis and is included in a recent version of the Mocha model checking tool. We demonstrate the performance and benefits of our method, using an asynchronous parity computer and an opinion poll protocol as case studies. A longer version of this paper is available as technical report BRICS Research Series RS-00-21, Basic Research in Computer Science, Center of the Danish National Research Foundation.

  7. Application of the evolution theory in modelling of innovation diffusion

    Directory of Open Access Journals (Sweden)

    Krstić Milan

    2016-01-01

    The theory of evolution has found numerous analogies and applications in scientific disciplines other than biology. In this sense, so-called 'memetic evolution' is now widely accepted. Memes represent a complex adaptive system, where one 'meme' is an evolutionary cultural element, i.e. the smallest unit of information that can be identified and used to explain the evolution process. Among others, the field of innovation has proved to be a suitable area where the theory of evolution can be successfully applied. In this work the authors start from the assumption that the theory of evolution can also be applied to modelling the process of innovation diffusion. Based on the theoretical research conducted, the authors conclude that the process of innovation diffusion, interpreted through 'memes', is in fact a process of imitation of the innovation 'meme'. Since certain 'memes' replicate more successfully than others during the replication process, this eventually leads to their natural selection. For the survival of innovation 'memes', their longevity, fecundity and copying fidelity are of key importance. The results of the research confirm the assumption that the theory of evolution can be applied to innovation diffusion via innovation 'memes', which opens up perspectives for new research on the subject.

  8. Modelling and Designing Cryogenic Hydrogen Tanks for Future Aircraft Applications

    Directory of Open Access Journals (Sweden)

    Christopher Winnefeld

    2018-01-01

    In the near future, the challenge of reducing economic and social dependency on fossil fuels must increasingly be faced. A sustainable and efficient energy supply based on renewable energies enables large-scale applications of electro-fuels in, e.g., the transport sector. Its high gravimetric energy density makes liquefied hydrogen a reasonable candidate for energy storage in light-weight applications such as aviation. Current aircraft structures are designed to accommodate jet fuel and gas turbines, allowing only limited retrofitting. New designs, such as the blended wing body, enable more flexible integration of new storage technologies and energy converters, e.g., cryogenic hydrogen tanks and fuel cells. Against this background, a tank-design model is formulated that considers geometrical, mechanical and thermal aspects, as well as specific mission profiles with power supplied by a fuel cell. This design approach enables the determination of the required tank mass and storage density, respectively. A new evaluation metric is defined that includes the hydrogen mass vented throughout the flight, giving more transparent insight into mass shares. Subsequently, a systematic approach to tank partitioning illustrates the associated compromises in tank weight. The analysis shows that cryogenic hydrogen tanks are highly competitive with kerosene tanks in terms of overall mass, which is further improved by the use of a fuel cell.

  9. Potential biodefense model applications for portable chlorine dioxide gas production.

    Science.gov (United States)

    Stubblefield, Jeannie M; Newsome, Anthony L

    2015-01-01

    Development of decontamination methods and strategies to address potential infectious disease outbreaks and bioterrorism events is pertinent to this nation's biodefense strategies and general biosecurity. Chlorine dioxide (ClO2) gas has a history of use as a decontamination agent in response to an act of bioterrorism. However, more widespread use of ClO2 gas to meet current and unforeseen decontamination needs has been hampered because the gas is too unstable for shipment and must be prepared at the application site. Newer technology allows for easy, onsite gas generation without the need for dedicated equipment, electricity, water, or personnel with advanced training. In a laboratory model system, 2 unique applications (personal protective equipment [PPE] and animal skin) were investigated in the context of potential development of decontamination protocols. Such protocols could serve to reduce human exposure to bacteria in a decontamination response effort. Chlorine dioxide gas was capable of reducing culturable bacteria (2-7 logs of vegetative and spore-forming bacteria), and in some instances eliminating them, on difficult-to-clean areas of PPE facepieces. The gas was effective in eliminating naturally occurring bacteria on animal skin and also on skin inoculated with Bacillus spores. The culturable bacteria, including Bacillus spores, were eliminated in a time- and dose-dependent manner. Results of these studies suggest that portable, easily used ClO2 gas generation systems have excellent potential for protocol development to contribute to biodefense strategies and decontamination responses to infectious disease outbreaks or other biothreat events.

  10. Semantic Technologies for Nuclear Knowledge Modelling and Applications

    International Nuclear Information System (INIS)

    Beraha, D.; Gladyshev, M.

    2016-01-01

    The IAEA has been engaged in working with Member States to preserve and enhance nuclear knowledge, and in supporting wide dissemination of safety-related technical and technological information enhancing nuclear safety. Knowledge organization systems (ontologies, taxonomies, thesauri, etc.) provide one means to model and structure a given knowledge domain. The significance of knowledge organization systems (KOS) has been greatly enhanced by the evolution of semantic technologies, enabling machines to "understand" the concepts described in a KOS and to use them in a variety of applications. In recent years semantic technologies have emerged as an efficient means to improve access to information and knowledge. Semantic Web standards play an important role in creating an infrastructure of interoperable data sources based on the principles of Linked Data. The status of semantic technologies in the nuclear domain is briefly reviewed, noting that such technologies are at an early stage of adoption, and considering aspects specific to nuclear knowledge management. Several areas are described where semantic technologies are already deployed, and others are indicated where applications based on semantic technologies will have a strong impact on nuclear knowledge management in the near future. (author)

  11. Parametric Model for Astrophysical Proton-Proton Interactions and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Karlsson, Niklas [KTH Royal Institute of Technology, Stockholm (Sweden)

    2007-01-01

    Observations of gamma rays have been made from celestial sources such as active galaxies, gamma-ray bursts and supernova remnants, as well as the Galactic ridge. The study of gamma rays can provide information about production mechanisms and cosmic-ray acceleration. In the high-energy regime, one of the dominant mechanisms for gamma-ray production is the decay of neutral pions produced in interactions of ultra-relativistic cosmic-ray nuclei and interstellar matter. Presented here is a parametric model for calculating inclusive cross sections and transverse momentum distributions for secondary particles (gamma rays, e±, νe, ν̄e, νμ and ν̄μ) produced in proton-proton interactions. The parametric model is based on the proton-proton interaction model proposed by Kamae et al.; it includes the diffraction dissociation process, Feynman-scaling violation and the logarithmically rising inelastic proton-proton cross section. To improve fidelity to experimental data at lower energies, two baryon resonance excitation processes were added: one representing the Δ(1232), the other multiple resonances with masses around 1600 MeV/c². The model predicts the power-law spectral index for all secondary particles to be about 0.05 lower in absolute value than that of the incident proton, and their inclusive cross sections to be larger than those predicted by previous models based on the Feynman-scaling hypothesis. The applications of the presented model in astrophysics are plentiful. It has been implemented in the Galprop code to calculate the contribution of pion decays in the Galactic plane. The model has also been used to estimate the cosmic-ray flux in the Large Magellanic Cloud based on HI, CO and gamma-ray observations. The transverse momentum distributions enable calculations when the proton distribution is anisotropic. It is shown that the gamma-ray spectrum and flux due to a …

  12. Application and improvement of Raupach's shear stress partitioning model

    Science.gov (United States)

    Walter, B. A.; Lehning, M.; Gromke, C.

    2012-12-01

    Aeolian processes such as the entrainment, transport and redeposition of sand, soil or snow can significantly reshape the earth's surface. In times of increasing desertification and land degradation, often driven by wind erosion, investigations of aeolian processes are becoming more and more important in the environmental sciences. The reliable prediction of the sheltering effect of vegetation canopies against sediment erosion, for instance, is a clear practical application of such investigations for identifying suitable and sustainable countermeasures against wind erosion. This study presents an application and improvement of the theoretical model of Raupach (Boundary-Layer Meteorology, 1992, Vol. 60, 375-395, and Journal of Geophysical Research, 1993, Vol. 98, 3023-3029), which quantifies the sheltering effect of vegetation against sediment erosion. The model predicts the shear stress ratios τS′/τ and τS″/τ, where τS is the part of the total shear stress τ that acts on the ground beneath the plants. The spatial peak τS″ of the surface shear stress is responsible for the onset of particle entrainment, whereas the spatial mean τS′ can be used to quantify particle mass fluxes. Precise and accurate prediction of these quantities is essential when modeling wind erosion. Measurements of the surface shear stress distributions τS(x,y) on the ground beneath live vegetation canopies (plant species: Lolium perenne) were performed in a controlled wind tunnel environment to determine the model parameters and to evaluate the model performance. Rigid, non-porous wooden blocks were additionally tested in place of the plants for comparison, since previous wind tunnel studies on shear stress partitioning had used artificial plant imitations exclusively. The model constant c, which is needed to determine the total stress τ for a canopy of interest and which had remained rather unspecified to date, was found to be c ≈ 0…
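
    As a minimal illustration of the partitioning idea, Raupach's model expresses both stress ratios in terms of the roughness density λ (frontal area index), the drag coefficient ratio β = C_R/C_S, the basal-to-frontal area ratio σ, and an empirical parameter m relating the spatial mean to the spatial peak. The sketch below implements these two textbook expressions from memory of Raupach (1992); the parameter values are illustrative defaults, not the values fitted in this study.

```python
def stress_ratios(lam, beta=90.0, sigma=1.0, m=0.5):
    """Raupach-style shear stress partitioning (illustrative sketch).

    lam   : roughness density (frontal area index) of the canopy
    beta  : ratio of element to surface drag coefficients, C_R / C_S
    sigma : basal-to-frontal area ratio of the roughness elements
    m     : empirical parameter (0 < m <= 1) linking the spatial mean
            to the spatial peak of the surface shear stress

    Returns (mean_ratio, peak_ratio), i.e. (tau_S'/tau, tau_S''/tau).
    """
    mean_ratio = 1.0 / ((1.0 - sigma * lam) * (1.0 + beta * lam))
    peak_ratio = 1.0 / ((1.0 - m * sigma * lam) * (1.0 + m * beta * lam))
    return mean_ratio, peak_ratio

# A sparse canopy shelters the surface far less than a dense one:
for lam in (0.001, 0.01, 0.1):
    mean_r, peak_r = stress_ratios(lam)
    print(f"lambda={lam:5.3f}  tau_S'/tau={mean_r:.3f}  tau_S''/tau={peak_r:.3f}")
```

    As expected, both ratios fall quickly with increasing roughness density, and the peak ratio always exceeds the mean ratio, which is why the peak governs the onset of entrainment.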

  13. Modeling Markov Switching ARMA-GARCH Neural Networks Models and an Application to Forecasting Stock Returns

    Directory of Open Access Journals (Sweden)

    Melike Bildirici

    2014-01-01

    The study has two aims. The first is to propose a family of nonlinear GARCH models that incorporate fractional integration and asymmetric power properties into MS-GARCH processes. The second is to augment the MS-GARCH type models with artificial neural networks, exploiting their universal approximation properties to achieve improved forecasting accuracy. The proposed Markov-switching MS-ARMA-FIGARCH, APGARCH, and FIAPGARCH processes are therefore further augmented with MLP, recurrent NN, and hybrid NN type neural networks. The MS-ARMA-GARCH and MS-ARMA-GARCH-NN families are used to model daily stock returns in an emerging market, the Istanbul Stock Index (ISE100). Forecast accuracy is evaluated in terms of MAE, MSE, and RMSE error criteria and Diebold-Mariano equal forecast accuracy tests. The results suggest that the fractionally integrated and asymmetric power counterparts of Gray's MS-GARCH model provide promising results, while the best results are obtained for their neural-network-based counterparts. Further, among the models analyzed, those based on the hybrid MLP and recurrent NN, the MS-ARMA-FIAPGARCH-HybridMLP and MS-ARMA-FIAPGARCH-RNN, provide the best forecast performance over the baseline single-regime GARCH models and over Gray's MS-GARCH model. The models are therefore promising for various economic applications.
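
    To make the regime-switching mechanism concrete, the sketch below simulates a plain two-regime Markov-switching GARCH(1,1) process, the common ancestor of the MS-ARMA-GARCH variants compared in the study. All numbers (the per-regime GARCH coefficients and the transition matrix) are hypothetical, chosen only to contrast a persistent calm regime with a turbulent one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-regime GARCH(1,1) parameters (omega, alpha, beta):
# regime 0 is calm, regime 1 is turbulent.
params = {0: (0.01, 0.05, 0.90), 1: (0.20, 0.15, 0.80)}
P = np.array([[0.98, 0.02],    # Markov transition matrix between regimes
              [0.05, 0.95]])

def simulate(n=2000):
    s, h = 0, 0.5               # current regime and conditional variance
    r = np.empty(n)
    states = np.empty(n, dtype=int)
    for t in range(n):
        omega, alpha, beta = params[s]
        r[t] = np.sqrt(h) * rng.standard_normal()   # return draw
        h = omega + alpha * r[t] ** 2 + beta * h    # GARCH(1,1) recursion
        states[t] = s
        s = rng.choice(2, p=P[s])                   # Markov regime switch
    return r, states

r, states = simulate()
# Volatility should be markedly higher in the turbulent regime:
print(r[states == 1].std(), r[states == 0].std())
```

    Estimation (rather than simulation) is the hard part that Gray-type models and the paper's NN augmentations address, since the regime path is unobserved.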

  14. Animal models of osteogenesis imperfecta: applications in clinical research

    Directory of Open Access Journals (Sweden)

    Enderli TA

    2016-09-01

    Tanya A Enderli, Stephanie R Burtch, Jara N Templet, Alessandra Carriero; Department of Biomedical Engineering, Florida Institute of Technology, Melbourne, FL, USA. Abstract: Osteogenesis imperfecta (OI), commonly known as brittle bone disease, is a genetic disease characterized by extreme bone fragility and consequent skeletal deformities. This connective tissue disorder is caused by mutations affecting the quality and quantity of collagen, which in turn compromise the overall mechanical integrity of bone and increase its vulnerability to fracture. Animal models of the disease have played a critical role in understanding the pathology and causes of OI and in investigating a broad range of clinical therapies for the disease. Currently, at least 20 animal models, including mice, dogs, and fish, have been officially recognized as representing the phenotype and biochemistry of the 17 different types of OI in humans. Here, we describe each animal model and the type of OI it represents, and present their applications in clinical research for treatments of OI, such as drug therapies (ie, bisphosphonates and sclerostin) and mechanical stimulation (ie, vibrational loading). In the future, different dosages and treatment durations need to be further investigated in different animal models of OI using potentially promising treatments, such as cellular and chaperone therapies. A combination of therapies may also offer a viable treatment regime to improve bone quality and reduce fragility in animals before being introduced into clinical trials for OI patients. Keywords: OI, brittle bone, clinical research, mouse, dog, zebrafish

  15. X-ray ablation measurements and modeling for ICF applications

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Andrew Thomas [Univ. of California, Berkeley, CA (United States)

    1996-09-01

    X-ray ablation of material from the first wall and other components of an ICF (Inertial Confinement Fusion) chamber is a major threat to the laser final optics. Material condensing on these optics after a shot may cause damage with subsequent laser shots. To ensure the successful operation of the ICF facility, removal rates must be predicted accurately. The goal for this dissertation is to develop an experimentally validated x-ray response model, with particular application to the National Ignition Facility (NIF). Accurate knowledge of the x-ray and debris emissions from ICF targets is a critical first step in the process of predicting the performance of the target chamber system. A number of 1-D numerical simulations of NIF targets have been run to characterize target output in terms of energy, angular distribution, spectrum, and pulse shape. Scaling of output characteristics with variations of both target yield and hohlraum wall thickness are also described. Experiments have been conducted at the Nova laser on the effects of relevant x-ray fluences on various materials. The response was diagnosed using post-shot examinations of the surfaces with scanning electron microscope and atomic force microscope instruments. Judgments were made about the dominant removal mechanisms for each material. Measurements of removal depths were made to provide data for the modeling. The finite difference ablation code developed here (ABLATOR) combines the thermomechanical response of materials to x-rays with models of various removal mechanisms. The former aspect refers to energy deposition in such small characteristic depths (~ micron) that thermal conduction and hydrodynamic motion are significant effects on the nanosecond time scale. The material removal models use the resulting time histories of temperature and pressure-profiles, along with ancillary local conditions, to predict rates of surface vaporization and the onset of conditions that would lead to spallation.

  16. Application of Physically based landslide susceptibility models in Brazil

    Science.gov (United States)

    Carvalho Vieira, Bianca; Martins, Tiago D.

    2017-04-01

    Shallow landslides and floods are the processes responsible for most material and environmental damage in Brazil. In recent decades, some landslide events have caused large numbers of deaths (e.g., over 1,000 in a single event) and incalculable social and economic losses. The prediction of these processes is therefore an important tool for land-use planning. Among the different methods available, physically based landslide susceptibility models have been widely used in many countries, but their use in Brazil is still incipient compared to other approaches, such as statistical tools and frequency analyses. The main objective of this research was therefore to assess the application of physically based landslide susceptibility models in Brazil, identifying their main results, the efficiency of the susceptibility mapping, the parameters used, and the limitations of the tropical humid environment. To this end, Brazilian studies applying the SHALSTAB, SINMAP and TRIGRS models were evaluated, along with their geotechnical values, scales, DEM grid resolutions, and results, based on the agreement between predicted susceptibility and the landslide scar maps. Most studies in Brazil applied SHALSTAB, SINMAP and, to a lesser extent, TRIGRS. The majority of this research is concentrated in the Serra do Mar, a system of escarpments and rugged mountains extending more than 1,500 km along the southern and southeastern Brazilian coast that is regularly affected by heavy rainfall generating widespread mass movements. Most of these studies used conventional topographic maps at scales from 1:2,000 to 1:50,000 and DEM grid resolutions between 2 and 20 m. Regarding geotechnical and hydrological values, few studies used field-collected data, which could produce more efficient results, as indicated by the international literature. Therefore, even though these models have enormous potential in susceptibility mapping, even for comparison …

  17. Sky-Radiance Models for Monte Carlo Radiative Transfer Applications

    Science.gov (United States)

    Santos, I.; Dalimonte, D.; Santos, J. P.

    2012-04-01

    … differences were afterwards investigated by analyzing how these models vary the sun and sky photon fractions in MC simulations that use the diffuse-to-total irradiance ratio. In this case, differences of up to 14% were found for λ=665 nm and θ*=60°. The study therefore recommends using Lsky models that, like the ZV expression, account for the wavelength dependence of light interactions with atmospheric particles and molecules when initializing MC simulations for ocean color applications, especially for analyses including the blue region of the visible spectrum. Dr. Giuseppe Zibordi, Prof. Pedro Vieira and Tamito Kajiyama are duly acknowledged for valuable discussions. This study has been partially supported by ESA under contract n. 12595/09/I-OL with FCT/UNL, Portugal.

  18. Modeling of photochemical pollution in Athens, Greece. Application of the RAMS-CALGRID modeling system

    Science.gov (United States)

    Pilinis, Christodoulos; Kassomenos, Pavlos; Kallos, George

    The causes of the poor air quality in Athens, Greece during the severe episode of 25-26 May 1990 have been studied using a prognostic meteorological model (RAMS) and a three-dimensional Eulerian air quality model (CALGRID). The modeling effort indicates that the main urban area of Athens exhibited high concentrations of nitrogen oxides, the main source of which is automobiles, while the NNE suburban area exhibited high ozone concentrations, the product of photochemical activity on the primary pollutants transported by the sea breeze. The application of the models also demonstrated the need for an accurate emission inventory for improved prediction of pollutant concentrations. It was also found that a 50% reduction in nitrogen oxide emissions would increase ozone levels in the downtown area substantially.

  19. Focuss algorithm application in kinetic compartment modeling for PET tracer

    International Nuclear Information System (INIS)

    Huang Xinrui; Bao Shanglian

    2004-01-01

    Molecular imaging is an emerging field whose application largely depends on the discovery process for imaging probes and drugs, from the mouse to the patient, from research to clinical practice. Positron emission tomography (PET) can non-invasively monitor pharmacokinetic and functional processes of drugs in intact organisms at tracer concentrations by kinetic modeling. It is known that for any biological system, linear or nonlinear, if a tracer is injected in a steady state, the distribution of the tracer follows the kinetics of a linear compartmental system, whose solutions are sums of exponentials. Based on this general compartmental description of the tracer's fate in vivo, we present a novel kinetic modeling approach for the quantification of in vivo tracer studies with dynamic PET, which can determine a parsimonious model consistent with the measured data. This kinetic modeling technique allows parametric images to be estimated from a voxel-based analysis and requires no a priori decision about the tracer's fate in vivo, instead determining the most appropriate model from the information contained in the kinetic data. Choosing a set of exponential functions convolved with the plasma input function as basis functions, the time-activity curve of a region or a pixel can be written as a linear combination of the basis functions with corresponding coefficients. The number of non-zero coefficients returned corresponds to the model order, which is related to the number of tissue compartments. The system macro-parameters are determined using the focal underdetermined system solver (FOCUSS) algorithm, a nonparametric, recursive linear estimation procedure for finding localized energy solutions from limited data. The FOCUSS algorithm usually converges very quickly and therefore requires only a few iterations. Its effectiveness is verified by simulation and clinical …
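
    The core of FOCUSS is a re-weighted minimum-norm iteration: each pass solves the underdetermined system with weights taken from the previous estimate, so energy concentrates on a few coefficients (in the PET setting, the few basis functions that explain the time-activity curve). The sketch below is a generic toy version of that iteration, not the paper's PET-specific implementation; the matrix sizes and the sparse target vector are made up for illustration.

```python
import numpy as np

def focuss(A, b, iters=30, eps=1e-12):
    """FOCUSS sketch: recursive weighted minimum-norm estimation that
    converges toward a sparse (localized) solution of Ax = b.
    Each iteration re-weights by the current estimate, reinforcing
    large components and suppressing small ones."""
    m, n = A.shape
    x = np.ones(n)                       # uniform seed -> first pass is plain minimum-norm
    for _ in range(iters):
        W = np.diag(np.abs(x) + eps)     # weights from the previous estimate
        x = W @ np.linalg.pinv(A @ W) @ b
    return x

# Toy example: a 2-sparse coefficient vector observed through 6 measurements.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
b = A @ x_true
x_hat = focuss(A, b)
print(np.round(x_hat, 3))
```

    Every iterate satisfies the data exactly (A x_hat = b); what changes across iterations is how the solution's energy is distributed, typically collapsing onto a few non-zero coefficients.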

  20. 3-dimensional modeling of transcranial magnetic stimulation: Design and application

    Science.gov (United States)

    Salinas, Felipe Santiago

    Over the past three decades, transcranial magnetic stimulation (TMS) has emerged as an effective tool for many research, diagnostic and therapeutic applications in humans. TMS delivers highly localized brain stimulation via non-invasive, externally applied magnetic fields. This non-invasive, painless technique provides researchers and clinicians with a unique tool capable of stimulating both the central and peripheral nervous systems. However, a complete analysis of the macroscopic electric fields produced by TMS has not yet been performed. In this dissertation, we present a thorough examination of the total electric field induced by TMS in air and in a realistic head model with clinically relevant coil poses. In the first chapter, a detailed account of TMS coil wiring geometry is shown to provide significant improvements in the accuracy of primary E-field calculations. Three-dimensional models that accounted for the TMS coil's wire width, height, shape and number of turns clearly improved the fit of calculated to measured E-fields near the coil body. Detailed primary E-field models were accurate up to the surface of the coil body (within 0.5% of measured values), whereas simple models were often inadequate (up to 32% different from measured). In the second chapter, we addressed the importance of the secondary E-field created by surface charge accumulation during TMS, using the boundary element method (BEM). 3-D models were developed using simple head geometries in order to test the model and compare it with measured values. The effects of tissue geometry, size and conductivity were also investigated. Finally, a realistic head model was used to assess the effect of multiple surfaces on the total E-field. We found that secondary E-fields have the greatest impact in areas in close proximity to each tissue layer. Throughout the head, the secondary E-field magnitudes were predominantly between 25% and 45% of the primary E-field magnitude. The direction of the secondary E…

  1. Application of Stochastic Partial Differential Equations to Reservoir Property Modelling

    KAUST Repository

    Potsepaev, R.

    2010-09-06

    Existing geostatistical algorithms for stochastic modelling of reservoir parameters require a mapping (the 'uvt-transform') into parametric space and reconstruction of a stratigraphic co-ordinate system. The parametric space can be considered to represent a pre-deformed and pre-faulted depositional environment. Existing approximations of this mapping in many cases cause significant distortions of the correlation distances. In this work we propose a coordinate-free approach to modelling stochastic textures through the application of stochastic partial differential equations. By avoiding the construction of a uvt-transform and stratigraphic coordinates, one can generate realizations directly in physical space in the presence of deformations and faults. In particular, the solution of the modified Helmholtz equation driven by Gaussian white noise is a zero-mean Gaussian stationary random field with exponential correlation function (in 3-D). This equation can be used to generate realizations in parametric space. In order to sample in physical space we introduce a stochastic elliptic PDE with tensor coefficients, where the tensor is related to the correlation anisotropy and varies in physical space.
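
    On a periodic grid the modified Helmholtz operator diagonalizes under the FFT, so a sample of the white-noise-driven SPDE can be drawn in a few lines. The 2-D sketch below is illustrative only (in 2-D the resulting field is Matérn-type rather than exactly exponential, which the abstract notes holds in 3-D); the grid size and the value of κ are arbitrary, and no deformation or anisotropy tensor is included.

```python
import numpy as np

rng = np.random.default_rng(42)

def helmholtz_field(n=128, kappa=0.3):
    """Sample a stationary Gaussian random field by solving the modified
    Helmholtz SPDE (kappa^2 - Laplacian) u = white noise on a periodic
    2-D grid via FFT. The correlation length scales like 1/kappa."""
    w = rng.standard_normal((n, n))           # discrete white noise
    W = np.fft.fft2(w)
    k = 2 * np.pi * np.fft.fftfreq(n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    lap = kx**2 + ky**2                       # spectral symbol of -Laplacian
    U = W / (kappa**2 + lap)                  # invert the Helmholtz operator
    u = np.real(np.fft.ifft2(U))
    return (u - u.mean()) / u.std()           # normalize for comparison

field = helmholtz_field()
# Neighbouring cells are strongly correlated; the field is smooth at scale 1/kappa:
lag1 = np.mean(field[:, :-1] * field[:, 1:])
print(f"lag-1 correlation ≈ {lag1:.2f}")
```

    Decreasing κ lengthens the correlation range; replacing the scalar κ² with a spatially varying tensor is the step the abstract takes to handle anisotropy and deformation in physical space.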

  2. A general method for modeling population dynamics and its applications.

    Science.gov (United States)

    Shestopaloff, Yuri K

    2013-12-01

    Studying populations, be it a microbe colony or mankind, is important for understanding how complex systems evolve and exist. Such knowledge also often provides insights into evolution, history and different aspects of human life. By and large, the prosperity and decline of populations is about the transformation of certain resources into the quantity and other characteristics of populations through growth, replication, expansion and acquisition of resources. We introduce a general model of population change, applicable to different types of populations, which interconnects the numerous factors influencing population dynamics, such as nutrient influx and nutrient consumption, reproduction period, reproduction rate, etc. It is also possible to take into account specific growth features of individual organisms. We considered two recently discovered distinct growth scenarios: first, when organisms do not change their grown mass regardless of nutrient availability, and second, when organisms can reduce their grown mass several-fold in a nutritionally poor environment. We found that nutrient supply and reproduction period are the two major factors influencing the shape of population growth curves. There is also a difference in population dynamics between these two groups. Organisms belonging to the second group are significantly more adaptive to nutrient reduction and far more resistant to extinction. Such organisms also have substantially more frequent, lower-amplitude fluctuations of population quantity for the same periodic nutrient supply (compared to the first group). The proposed model can adequately describe virtually any growth scenario, including complex ones with periodic and irregular nutrient supply and other changing parameters, which present approaches cannot do.
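
    The qualitative mechanism (nutrient influx feeding consumption, which drives reproduction) can be illustrated with a deliberately simple toy model. The sketch below is not the paper's model; the update rule, the rates, and both supply schedules are invented purely to show how steady versus periodic nutrient supply shapes the growth curve.

```python
import numpy as np

def simulate(supply, n0=10.0, need=1.0, r=0.5, steps=200):
    """Toy nutrient-driven population model (illustrative only): each step a
    nutrient influx arrives, the population consumes up to `need` units per
    individual, and grows or declines depending on how well fed it is."""
    n, pool, hist = n0, 0.0, []
    for t in range(steps):
        pool += supply(t)                 # nutrient influx
        demand = n * need
        eaten = min(pool, demand)         # nutrient consumption
        pool -= eaten
        fed = eaten / demand if demand > 0 else 0.0
        n += r * n * (fed - 0.5)          # grow when well fed, decline when starved
        n = max(n, 0.0)
        hist.append(n)
    return np.array(hist)

steady = simulate(lambda t: 50.0)                            # constant supply
periodic = simulate(lambda t: 100.0 if t % 2 == 0 else 0.0)  # same average, pulsed
print(steady[-1], periodic[-1])
```

    With constant supply the population settles at the level the influx can feed, while the pulsed schedule of equal average produces persistent fluctuations, echoing the paper's point that the supply pattern, not just its total, shapes the growth curve.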

  3. DFT application for chlorin derivatives photosensitizer drugs modeling

    Science.gov (United States)

    Machado, Neila; Carvalho, B. G.; Téllez Soto, C. A.; Martin, A. A.; Favero, P. P.

    2018-04-01

    Photodynamic therapy is an alternative form of cancer treatment that meets the desire for a less aggressive approach to the body. It is based on the interaction between a photosensitizer, activating light, and molecular oxygen. This interaction results in a cascade of reactions that leads to localized cell death. Many studies have been conducted to discover an ideal photosensitizer that combines all the desirable characteristics of a potent cell killer while generating minimal side effects. Using Density Functional Theory (DFT) as implemented in the Vienna Ab-initio Simulation Package, new chlorin derivatives with different functional groups were simulated to evaluate their absorption wavelengths and permit resonant absorption with the incident laser. The Gaussian 09 program was used to determine vibrational wavenumbers and Natural Bond Orbitals. The candidate with the best photosensitizer characteristics was a modified version of the original chlorin, called thiol chlorin. According to our calculations it is stable and is 19.6% more efficient at optical absorption at 708 nm than conventional chlorin e6. Its vibrational modes and optical and electronic properties were predicted. In conclusion, this study is an attempt to improve the development of new photosensitizer drugs through computational methods, which save time and help decrease the number of animals required for model applications.

  4. Application of a theoretical model to evaluate COPD disease management

    Directory of Open Access Journals (Sweden)

    Asin Javier D

    2010-03-01

    Full Text Available Abstract Background Disease management programmes are heterogeneous in nature and often lack a theoretical basis. An evaluation model has been developed in which theoretically driven inquiries link disease management interventions to outcomes. The aim of this study is to methodically evaluate the impact of a disease management programme for patients with chronic obstructive pulmonary disease (COPD) on process, intermediate and final outcomes of care in a general practice setting. Methods A quasi-experimental study was performed with 12-month follow-up of 189 COPD patients in primary care in the Netherlands. The programme included patient education, protocolised assessment and treatment of COPD, structural follow-up and coordination by practice nurses at 3, 6 and 12 months. Data on intermediate outcomes (knowledge, psychosocial mediators, self-efficacy and behaviour) and final outcomes (dyspnoea, quality of life, measured by the CRQ and CCQ, and patient experiences) were obtained from questionnaires and electronic registries. Results Implementation of the programme was associated with significant improvements in dyspnoea (p Conclusions The application of a theory-driven model enhances the design and evaluation of disease management programmes aimed at improving health outcomes. This study supports the notion that a theoretical approach strengthens the evaluation designs of complex interventions. Moreover, it provides prudent evidence that the implementation of COPD disease management programmes can positively influence outcomes of care.

  5. Permeability of Two Parachute Fabrics - Measurements, Modeling, and Application

    Science.gov (United States)

    Cruz, Juan R.; O'Farrell, Clara; Hennings, Elsa; Runnells, Paul

    2016-01-01

    Two parachute fabrics, described by Parachute Industry Specifications PIA-C-7020D Type I and PIA-C-44378D Type I, were tested to obtain their permeabilities in air (i.e., flow-through volume of air per area per time) over the range of differential pressures from 0.146 psf (7 Pa) to 25 psf (1197 Pa). Both fabrics met their specification permeabilities at the standard differential pressure of 0.5 inch of water (2.60 psf, 124 Pa). The permeability results were transformed into an effective porosity for use in calculations related to parachutes. Models were created that related the effective porosity to the unit Reynolds number for each of the fabrics. As an application example, these models were used to calculate the total porosities for two geometrically-equivalent subscale Disk-Gap-Band (DGB) parachutes fabricated from each of the two fabrics, and tested at the same operating conditions in a wind tunnel. Using the calculated total porosities and the results of the wind tunnel tests, the drag coefficient of a geometrically-equivalent full-scale DGB operating on Mars was estimated.

  6. Modeling survival: application of the Andersen-Gill model to Yellowstone grizzly bears

    Science.gov (United States)

    Johnson, Christopher J.; Boyce, Mark S.; Schwartz, Charles C.; Haroldson, Mark A.

    2004-01-01

     Wildlife ecologists often use the Kaplan-Meier procedure or Cox proportional hazards model to estimate survival rates, distributions, and magnitude of risk factors. The Andersen-Gill formulation (A-G) of the Cox proportional hazards model has seen limited application to mark-resight data but has a number of advantages, including the ability to accommodate left-censored data, time-varying covariates, multiple events, and discontinuous intervals of risks. We introduce the A-G model including structure of data, interpretation of results, and assessment of assumptions. We then apply the model to 22 years of radiotelemetry data for grizzly bears (Ursus arctos) of the Greater Yellowstone Grizzly Bear Recovery Zone in Montana, Idaho, and Wyoming, USA. We used Akaike's Information Criterion (AICc) and multi-model inference to assess a number of potentially useful predictive models relative to explanatory covariates for demography, human disturbance, and habitat. Using the most parsimonious models, we generated risk ratios, hypothetical survival curves, and a map of the spatial distribution of high-risk areas across the recovery zone. Our results were in agreement with past studies of mortality factors for Yellowstone grizzly bears. Holding other covariates constant, mortality was highest for bears that were subjected to repeated management actions and inhabited areas with high road densities outside Yellowstone National Park. Hazard models developed with covariates descriptive of foraging habitats were not the most parsimonious, but they suggested that high-elevation areas offered lower risks of mortality when compared to agricultural areas.
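The counting-process data layout that lets the Andersen-Gill formulation handle left-truncated entry, time-varying covariates, and discontinuous risk intervals can be illustrated with a small sketch (hypothetical bear records; the `road_density` covariate and all values are invented for illustration, not taken from the study):

```python
from math import exp

# Each row: (subject id, start, stop, road_density, event at stop?)
# Intervals are half-open (start, stop]; gaps model periods off-air.
rows = [
    (1, 0.0, 2.0, 0.1, 0),   # bear 1, low road density, censored at t=2
    (1, 2.0, 5.0, 0.8, 1),   # covariate updated at t=2; death at t=5
    (2, 1.0, 4.0, 0.3, 0),   # left-truncated: enters study at t=1
    (2, 4.5, 9.0, 0.3, 0),   # discontinuous risk interval (collar off 4-4.5)
]

def risk_set(rows, t):
    """Subjects at risk at time t: some interval of theirs covers t."""
    return {r[0] for r in rows if r[1] < t <= r[2]}

def partial_likelihood_term(rows, t, beta):
    """Cox partial-likelihood contribution of an event at time t,
    for a single covariate (column 3) with coefficient beta."""
    at_risk = [r for r in rows if r[1] < t <= r[2]]
    num = sum(exp(beta * r[3]) for r in at_risk if r[4] == 1 and r[2] == t)
    den = sum(exp(beta * r[3]) for r in at_risk)
    return num / den
```

The risk set is rebuilt at each event time from the intervals, which is exactly how the A-G structure accommodates subjects entering late or dropping out and returning.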

  7. APPLICABILITY OF SIMILARITY CONDITIONS TO ANALOGUE MODELLING OF TECTONIC STRUCTURES

    Directory of Open Access Journals (Sweden)

    Mikhail A. Goncharov

    2010-01-01

    Full Text Available The publication is aimed at comparing concepts of V.V. Belousov and M.V. Gzovsky, outstanding researchers who established fundamentals of tectonophysics in Russia, specifically similarity conditions in application to tectonophysical modeling. Quotations from their publications illustrate differences in their views. In this respect, we can reckon V.V. Belousov as a «realist» as he supported «the liberal point of view» [Methods of modelling…, 1988, p. 21–22], whereas M.V. Gzovsky can be regarded as an «idealist» as he believed that similarity conditions should be mandatorily applied to ensure correctness of physical modeling of tectonic deformations and structures [Gzovsky, 1975, pp. 88 and 94]. Objectives of the present publication are (1) to be another reminder about desirability of compliance with similarity conditions in experimental tectonics; (2) to point out difficulties in ensuring such compliance; (3) to give examples which bring out the fact that similarity conditions are often met per se, i.e. automatically observed; (4) to show that modeling can be simplified in some cases without compromising quantitative estimations of parameters of structure formation. (1) Physical modelling of tectonic deformations and structures should be conducted, if possible, in compliance with conditions of geometric and physical similarity between experimental models and corresponding natural objects. In any case, a researcher should have a clear vision of conditions applicable to each particular experiment. (2) Application of similarity conditions is often challenging due to unavoidable difficulties caused by the following: (a) imperfection of experimental equipment and technologies (Fig. 1 to 3); (b) uncertainties in estimating parameters of formation of natural structures, including main ones: structure size (Fig. 4), time of formation (Fig. 5), deformation properties of the medium wherein such structures are formed, including, first of all, viscosity (Fig. 6)

  8. The "Biopsychosocial Model": 40 years of application in Psychiatry.

    Science.gov (United States)

    Papadimitriou, G

    2017-01-01

    of onset of an illness's manifestation, and they can also protect a vulnerable person from the disease. Stressful experiences modify immunological response and influence treatment compliance. Non-adherence to pharmacotherapy, as well as to psychosocial interventions, may cause defective recovery of psychosocial functioning, recurrence of the disorder, as well as insufficient use of health resources and higher health care costs. The psychoeducation of patients and their relatives by the application of the biopsychosocial model plays an important role in psychiatric therapeutics, and it may also be used via the Internet in the frame of telepsychiatry. Results from neuroimaging studies have shown that the different kinds of human experiences, traumatic or therapeutic, have measurable influences on brain function. Psychotherapy may modify the neuronal connections of the brain in the frame of its plasticity, as was found by the discovery of synaptogenesis in response to learning, and can thus be considered not only as a strictly psychological but also as a biopsychosocial form of treatment. Among the reported disadvantages of the biopsychosocial model are the lack of a concise theoretical framework regarding its function and content, its complexity, difficulties in its coordination and assignment of responsibilities, as well as problems with education on it being multifaceted. The biopsychosocial model has been criticized on the grounds that it does not constitute a scientific or philosophical model, that it does not answer the crucial question of how the biological, psychological and social variables interact in the disease's expression, that it does not provide guidance on the exact time of its application and, finally, that it allows for a wide range of interventions without providing specific guidelines for a concrete therapeutic scheme. 
The person-centered diagnosis is based on the biopsychosocial model, connects science with humanism and uses all the possible ways so

  9. An Object-Oriented Information Model for Policy-based Management of Distributed Applications

    NARCIS (Netherlands)

    Diaz, G.; Gay, V.C.J.; Horlait, E.; Hamza, M.H.

    2002-01-01

    This paper presents an object-oriented information model to support policy-based management for distributed multimedia applications. The information base contains application-level information about the users, the applications, and their profiles. Our information model is described in detail and

  10. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

    Full Text Available In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world introduced new methodological principles into their spatial development models. These methodologies and spatial planning guidelines are focused not only on the controlled development of high-rise buildings, but on the spatial modelling of the whole city by defining main development criteria and estimating possible consequences. Vilnius is no exception; however, the re-establishment of Lithuania's independence brought an uncontrolled urbanization process, so most of the city's development regulations emerged as a by-product of the unmanaged legalization of investors' expectations. The importance of a consistent urban fabric, as well as the conservation and representation of the city's most important objects, gained attention only when land-use projects created an actual threat of overshadowing them with new architecture, alongside unmanaged urbanization in the city center and urban sprawl in the suburbs. Current Vilnius spatial planning documents clearly define the urban structure and key development principles; however, the definitions are relatively abstract, resulting in uniform building coverage requirements for territories with distinct qualities and in simplified planar designs which do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modeling methods, their individual parts, principles, criteria for quality assessment and their applicability in Vilnius. The text contains an outline of possible building coverage regulations and impact assessment criteria for new development. The article contains a compendium of requirements for high-quality spatial planning and building design.

  11. Application of a free parameter model to plastic scintillation samples

    Energy Technology Data Exchange (ETDEWEB)

    Tarancon Sanz, Alex, E-mail: alex.tarancon@ub.edu [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain); Kossert, Karsten, E-mail: Karsten.Kossert@ptb.de [Physikalisch-Technische Bundesanstalt (PTB), Bundesallee 100, 38116 Braunschweig (Germany)

    2011-08-21

    In liquid scintillation (LS) counting, the CIEMAT/NIST efficiency tracing method and the triple-to-double coincidence ratio (TDCR) method have proved their worth for reliable activity measurements of a number of radionuclides. In this paper, an extended approach to apply a free-parameter model to samples containing a mixture of solid plastic scintillation microspheres and radioactive aqueous solutions is presented. Several beta-emitting radionuclides were measured in a TDCR system at PTB. For the application of the free parameter model, the energy loss in the aqueous phase must be taken into account, since this portion of the particle energy does not contribute to the creation of scintillation light. The energy deposit in the aqueous phase is determined by means of Monte Carlo calculations applying the PENELOPE software package. To this end, great efforts were made to model the geometry of the samples. Finally, a new geometry parameter was defined, which was determined by means of a tracer radionuclide with known activity. This makes the analysis of experimental TDCR data of other radionuclides possible. The deviations between the determined activity concentrations and reference values were found to be lower than 3%. The outcome of this research work is also important for a better understanding of liquid scintillation counting. In particular the influence of (inverse) micelles, i.e. the aqueous spaces embedded in the organic scintillation cocktail, can be investigated. The new approach makes clear that it is important to take the energy loss in the aqueous phase into account. In particular for radionuclides emitting low-energy electrons (e.g. M-Auger electrons from {sup 125}I), this effect can be very important.
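The statistical core shared by the TDCR and free-parameter approaches can be sketched for an idealized symmetric three-PMT counter (a simplified toy under strong assumptions: monoenergetic scintillation light with Poisson photon statistics, ignoring quenching and the spectral integration a real analysis requires):

```python
from math import exp

def tdcr_ratio(mean_photons):
    """Triple-to-double coincidence ratio for an expected number of detected
    photons split equally over three PMTs (Poisson statistics)."""
    p = 1.0 - exp(-mean_photons / 3.0)          # detection probability per PMT
    triple = p ** 3
    double = 3.0 * p * p * (1.0 - p) + triple   # at least 2 of 3 PMTs fire
    return triple / double

def solve_free_parameter(measured_ratio, lo=1e-3, hi=1e3, iters=80):
    """Invert tdcr_ratio by bisection: find the light yield (the 'free
    parameter') that reproduces a measured TDCR value."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if tdcr_ratio(mid) < measured_ratio:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Because the ratio rises monotonically with light yield, a measured TDCR pins down the free parameter, after which counting efficiencies follow; the paper's extension additionally subtracts the energy deposited in the aqueous phase before computing the light yield.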

  12. A GUIDED SWAT MODEL APPLICATION ON SEDIMENT YIELD MODELING IN PANGANI RIVER BASIN: LESSONS LEARNT

    Directory of Open Access Journals (Sweden)

    Preksedis Marco Ndomba

    2008-12-01

    Full Text Available The overall objective of this paper is to report on the lessons learnt from applying the Soil and Water Assessment Tool (SWAT) in a well guided sediment yield modelling study. The study area is the upstream part of the Pangani River Basin (PRB), the Nyumba Ya Mungu (NYM) reservoir catchment, located in the north-eastern part of Tanzania. It should be noted that previous modeling exercises in the region applied SWAT with the preassumption that inter-rill or sheet erosion was the dominant erosion type. In contrast, in this study SWAT model application was guided by results of analysis of high temporal resolution sediment flow data and hydro-meteorological data. The runoff component of the SWAT model was calibrated from six years (i.e. 1977–1982) of historical daily streamflow data. The sediment component of the model was calibrated using one year (1977–1988) of daily sediment loads estimated from a one-hydrological-year sampling programme (between March and November, 2005) rating curve. A long-term period of over 37 years (i.e. 1969–2005) of SWAT simulation results was validated against downstream NYM reservoir sediment accumulation information. The SWAT model captured 56 percent of the variance (CE) and underestimated the observed daily sediment loads by 0.9 percent according to Total Mass Control (TMC) performance indices during a normal wet hydrological year, i.e., between November 1, 1977 and October 31, 1978, as the calibration period. The SWAT model predicted the long-term sediment catchment yield satisfactorily, with a relative error of 2.6 percent. Also, the model has identified erosion sources spatially and has replicated some erosion processes as determined in other studies and field observations in the PRB. This result suggests that for catchments where sheet erosion is dominant the SWAT model may substitute the sediment-rating curve. However, the SWAT model could not capture the dynamics of sediment load delivery in some seasons to the catchment outlet.
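The sediment-rating curve that the study says SWAT may substitute for typically takes the power-law form Qs = a·Q^b, fitted by ordinary least squares in log space. A minimal sketch (generic textbook fit with made-up numbers, not the study's actual curve or data):

```python
from math import exp, log

def fit_rating_curve(discharge, sediment_load):
    """Fit Qs = a * Q**b by least squares on log-transformed data."""
    x = [log(q) for q in discharge]
    y = [log(qs) for qs in sediment_load]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))   # slope = exponent b
    a = exp(my - b * mx)                      # intercept back-transformed
    return a, b
```

Given paired discharge and sediment-load observations, the fitted (a, b) turn any discharge record into a sediment-load estimate, which is how short sampling programmes are extrapolated to long simulation periods.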

  14. Canadian Whole-Farm Model Holos - Development, Stakeholder Involvement, and Model Application

    Science.gov (United States)

    Kroebel, R.; Janzen, H.; Beauchemin, K. A.

    2017-12-01

    Agriculture and Agri-Food Canada's Holos model, based mostly on emission factors, aims to explore the effect of management on Canadian whole-farm greenhouse gas emissions. The model includes 27 commonly grown annual and perennial crops, summer fallow, grassland, and 8 types of tree plantings, along with beef, dairy, sheep, swine and other livestock or poultry operations. Model outputs encompass net emissions of CO2, CH4, and N2O (in CO2 equivalents), calculated for various farm components. Where possible, algorithms are drawn from peer-reviewed publications. For consistency, Holos is aligned with the Canadian sustainability indicator and national greenhouse gas inventory objectives. Although primarily an exploratory tool for research, the model's design makes it accessible and instructive also to agricultural producers, educators, and policy makers. Model development, therefore, proceeds iteratively, with extensive stakeholder feedback from training sessions or annual workshops. To make the model accessible to diverse users, the team developed a multi-layered interface, with general farming scenarios for general use, but giving access to detailed coefficients and assumptions to researchers. The model relies on extensive climate, soil, and agronomic databases to populate regionally-applicable default values thereby minimizing keyboard entries. In an initial application, the model was used to assess greenhouse gas emissions from the Canadian beef production system; it showed that enteric methane accounted for 63% of total GHG emissions and that 84% of emissions originated from the cow-calf herd. The model further showed that GHG emission intensity per kg beef, nationally, declined by 14% from 1981 to 2011, owing to gains in production efficiency. Holos is now being used to consider further potential advances through improved rations or other management options. We are now aiming to expand into questions of grazing management, and are developing a novel carbon
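The emission-factor approach a model like Holos relies on reduces, at its core, to multiplying activity data by per-gas factors and aggregating with global warming potentials. A schematic sketch (illustrative only: the GWP values are IPCC AR5 100-year figures and the enteric-methane constants are generic IPCC Tier-2-style values, not Holos's actual region-specific coefficients):

```python
# 100-year global warming potentials (IPCC AR5 values, for illustration)
GWP = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

def co2_equivalents(emissions_kg):
    """Aggregate a {gas: kg} inventory into kg CO2-equivalent."""
    return sum(kg * GWP[gas] for gas, kg in emissions_kg.items())

def enteric_ch4_kg(feed_intake_kg, ym=0.065, energy_mj_per_kg=18.45):
    """Tier-2-style enteric methane: a fraction Ym of gross energy intake
    is lost as CH4 (55.65 MJ per kg CH4)."""
    return feed_intake_kg * energy_mj_per_kg * ym / 55.65
```

Chaining such factors over herd, manure, soil, and energy components is what makes a whole-farm emission-factor model fast enough for the interactive, stakeholder-facing use described above.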

  15. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification

    Science.gov (United States)

    Sager, Jennifer E.; Yu, Jingjing; Ragueneau-Majlessi, Isabelle

    2015-01-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, has not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms “PBPK” and “physiologically based pharmacokinetic model” to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines. PMID:26296709
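The building block of any PBPK model is a set of mass-balance ODEs over compartments. A toy one-compartment model with first-order absorption, integrated by explicit Euler, shows the shape of the machinery (far simpler than a whole-body PBPK model; all parameter values are invented for illustration):

```python
def plasma_concentration(dose_mg, ka, cl, v, dt=0.01, t_end=24.0):
    """Euler integration of dA_gut/dt = -ka*A_gut and
    dA_c/dt = ka*A_gut - (CL/V)*A_c; returns [(t, C_plasma)]."""
    a_gut, a_central, out = dose_mg, 0.0, []
    steps = int(t_end / dt)
    for i in range(steps):
        absorbed = ka * a_gut * dt              # gut -> central transfer
        eliminated = (cl / v) * a_central * dt  # first-order clearance
        a_gut -= absorbed
        a_central += absorbed - eliminated
        out.append(((i + 1) * dt, a_central / v))
    return out
```

A full PBPK model replaces the single central compartment with organ compartments linked by blood flows and partition coefficients, but the absorption-then-elimination concentration profile below is the same qualitative output reviewers verify against clinical data.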

  16. Optimisation of Ionic Models to Fit Tissue Action Potentials: Application to 3D Atrial Modelling

    Directory of Open Access Journals (Sweden)

    Amr Al Abed

    2013-01-01

    Full Text Available A 3D model of atrial electrical activity has been developed with spatially heterogeneous electrophysiological properties. The atrial geometry, reconstructed from the male Visible Human dataset, included gross anatomical features such as the central and peripheral sinoatrial node (SAN, intra-atrial connections, pulmonary veins, inferior and superior vena cava, and the coronary sinus. Membrane potentials of myocytes from spontaneously active or electrically paced in vitro rabbit cardiac tissue preparations were recorded using intracellular glass microelectrodes. Action potentials of central and peripheral SAN, right and left atrial, and pulmonary vein myocytes were each fitted using a generic ionic model having three phenomenological ionic current components: one time-dependent inward, one time-dependent outward, and one leakage current. To bridge the gap between the single-cell ionic models and the gross electrical behaviour of the 3D whole-atrial model, a simplified 2D tissue disc with heterogeneous regions was optimised to arrive at parameters for each cell type under electrotonic load. Parameters were then incorporated into the 3D atrial model, which as a result exhibited a spontaneously active SAN able to rhythmically excite the atria. The tissue-based optimisation of ionic models and the modelling process outlined are generic and applicable to image-based computer reconstruction and simulation of excitable tissue.

  17. The application of chemical leasing business models in Mexico.

    Science.gov (United States)

    Schwager, Petra; Moser, Frank

    2006-03-01

    being achieved through the development of company-specific business models that implement the above-indicated Chemical Leasing concept with support from the Mexican National Cleaner Production Centre (NCPC). The implementation of Chemical Leasing in Mexico has proven to be an efficient instrument for enhancing sustainable chemical management and significantly reducing emissions in Mexico. Several companies from the chemical industrial sector have implemented or agreed to implement Chemical Leasing business models. Based on the positive findings of the project, several Mexican companies started to negotiate the contents of possible Chemical Leasing contracts with suitable business partners. The project further aimed at disseminating information on Chemical Leasing, and it successfully attracted globally operating companies in the chemicals sector to explore possibilities of implementing Chemical Leasing business models in Mexico. At the international level, the results of the UNIDO project were presented on 20 September 2005 during a side event of the Strategic Approach to International Chemicals Management (SAICM) Preparation Conference in Vienna. To facilitate the promotion and application of Chemical Leasing at the international level, UNIDO is currently developing a number of tools to standardize Chemical Leasing projects. These include, among others, Chemical Leasing contract models; a Chemical Leasing database to find partners for Chemical Leasing; and guidelines for implementing Chemical Leasing projects and work programmes.

  18. Comparing Job Applicants to Non-applicants Using an Item-level Bifactor Model on the HEXACO Personality Inventory

    NARCIS (Netherlands)

    Anglim, Jeromy; Morse, Gavin; De Vries, Reinout E.; MacCann, Carolyn; Marty, Andrew

    2017-01-01

    The present study evaluated the ability of item-level bifactor models (a) to provide an alternative explanation to current theories of higher order factors of personality and (b) to explain socially desirable responding in both job applicant and non-applicant contexts. Participants (46% male; mean

  19. Modeling of Photonic Band Gap Crystals and Applications

    Energy Technology Data Exchange (ETDEWEB)

    El-Kady, Ihab Fathy [Iowa State Univ., Ames, IA (United States)

    2002-01-01

    In this work, the authors have undertaken a theoretical approach to the complex problem of modeling the flow of electromagnetic waves in photonic crystals. The focus is to address the feasibility of using the exciting phenomenon of photonic band gaps (PBG) in actual applications. The authors start by providing analytical derivations of the computational electromagnetic methods used in their work. They also present a detailed explanation of the physics underlying each approach, as well as a comparative study of the strengths and weaknesses of each method. The Plane Wave Expansion, Transfer Matrix, and Finite-Difference Time-Domain methods are addressed. They also introduce a new theoretical approach, the Modal Expansion Method. They then shift attention to actual applications. They begin with a discussion of 2D photonic crystal waveguides. The structure addressed consists of a 2D hexagonal structure of air cylinders in a layered dielectric background. Comparison with the performance of a conventional guide is made, as well as suggestions for enhancing it. The studies provide an upper theoretical limit on the performance of such guides, as they assumed no crystal imperfections and non-absorbing media. Next, they study 3D metallic PBG materials at near-infrared and optical wavelengths. The main objective is to study the importance of absorption in the metal and the suitability of observing photonic band gaps in such structures. They study simple cubic structures where the metallic scatterers are either cubes or interconnected metallic rods. Several metals are studied (aluminum, gold, copper, and silver). The effect of topology is addressed, and isolated metallic cubes are found to be less lossy than the connected rod structures. The results reveal that the best performance is obtained by choosing metals with a large negative real part of the dielectric function, together with a relatively small imaginary part. Finally, they point out a new direction in photonic crystal
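Of the methods listed, the transfer-matrix method is compact enough to sketch for the 1D case: multiplying per-layer characteristic matrices yields the stack's transmittance, and a quarter-wave stack opens a clear stop band. This is the standard textbook construction at normal incidence for lossless dielectrics, not the thesis code, and the indices and thicknesses below are illustrative:

```python
import cmath

def transmittance(layers, lam, n_in=1.0, n_out=1.0):
    """Characteristic-matrix transmittance of a 1D stack at normal incidence.
    layers = [(refractive index, thickness), ...], in units of wavelength."""
    m11, m12, m21, m22 = 1.0, 0.0, 0.0, 1.0      # identity matrix
    for n, d in layers:
        delta = 2.0 * cmath.pi * n * d / lam     # optical phase of the layer
        c, s = cmath.cos(delta), cmath.sin(delta)
        a11, a12, a21, a22 = c, -1j * s / n, -1j * n * s, c
        m11, m12, m21, m22 = (m11 * a11 + m12 * a21, m11 * a12 + m12 * a22,
                              m21 * a11 + m22 * a21, m21 * a12 + m22 * a22)
    denom = n_in * m11 + n_in * n_out * m12 + m21 + n_out * m22
    t = 2.0 * n_in / denom
    return (n_out / n_in) * abs(t) ** 2
```

At the quarter-wave design wavelength the pair matrices multiply into a rapidly growing diagonal, so transmission collapses inside the gap, while a uniform slab transmits perfectly.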

  20. RF tunable devices and subsystems methods of modeling, analysis, and applications methods of modeling, analysis, and applications

    CERN Document Server

    Gu, Qizheng

    2015-01-01

    This book serves as a hands-on guide to RF tunable devices, circuits and subsystems. An innovative method of modeling for tunable devices and networks is described, along with a new tuning algorithm, adaptive matching network control approach, and novel filter frequency automatic control loop.  The author provides readers with the necessary background and methods for designing and developing tunable RF networks/circuits and tunable RF front-ends, with an emphasis on applications to cellular communications. ·      Discusses the methods of characterizing, modeling, analyzing, and applying RF tunable devices and subsystems; ·      Explains the necessary methods of utilizing RF tunable devices and subsystems, rather than discussing the RF tunable devices themselves; ·      Presents and applies methods for MEMS tunable capacitors, which can be used for any RF tunable device; ·      Uses analytic methods wherever possible and provides numerous, closed-form solutions; ·      Includ...
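As a back-of-the-envelope companion to the book's subject, the tuning range a variable capacitor buys in a resonant network follows directly from f = 1/(2π√(LC)) (generic circuit physics, not an example from the book; the component values are illustrative):

```python
from math import pi, sqrt

def resonant_frequency(l_henry, c_farad):
    """Resonant frequency of an ideal LC tank."""
    return 1.0 / (2.0 * pi * sqrt(l_henry * c_farad))

def tuning_ratio(c_min, c_max):
    """Frequency ratio covered by tuning C from c_min to c_max (fixed L)."""
    return sqrt(c_max / c_min)
```

For example, a 10 nH inductor with a 1 pF capacitor resonates near 1.59 GHz, and a MEMS capacitor tunable over a 5:1 range shifts that frequency by a factor of √5.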

  1. Integrated Medical Model Project - Overview and Summary of Historical Application

    Science.gov (United States)

    Myers, J.; Boley, L.; Butler, D.; Foy, M.; Goodenow, D.; Griffin, D.; Keenan, A.; Kerstman, E.; Melton, S.; McGuire, K.

    2015-01-01

    Introduction: The Integrated Medical Model (IMM) Project represents one aspect of NASA's Human Research Program (HRP) to quantitatively assess medical risks to astronauts for existing operational missions as well as missions associated with future exploration and commercial space flight ventures. The IMM takes a probabilistic approach to assessing the likelihood and specific outcomes of one hundred medical conditions within the envelope of accepted space flight standards of care over a selectable range of mission capabilities. A specially developed Integrated Medical Evidence Database (iMED) maintains evidence-based, organizational knowledge across a variety of data sources. Since becoming operational in 2011, version 3.0 of the IMM, the supporting iMED, and the expertise of the IMM project team have contributed to a wide range of decision and informational processes for the space medical and human research community. This presentation provides an overview of the IMM conceptual architecture and range of application through examples of actual space flight community questions posed to the IMM project. Methods: Figure 1 [see document] illustrates the IMM modeling system and scenario process. As illustrated, the IMM computational architecture is based on Probabilistic Risk Assessment techniques. Nineteen assumptions and limitations define the IMM application domain. Scenario definitions include crew medical attributes and mission specific details. The IMM forecasts probabilities of loss of crew life (LOCL), evacuation (EVAC), quality time lost during the mission, number of medical resources utilized and the number and type of medical events by combining scenario information with in-flight, analog, and terrestrial medical information stored in the iMED. In addition, the metrics provide the integrated information necessary to estimate optimized in-flight medical kit contents under constraints of mass and volume or acceptable level of mission risk. 
Results and Conclusions
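The Probabilistic Risk Assessment approach described above can be caricatured as a small Monte Carlo simulation over mission scenarios. The conditions, incidence rates, and evacuation probabilities below are invented placeholders for illustration only, not values from the iMED:

```python
import random

# Hypothetical per-mission incidence rates and evacuation probabilities;
# illustrative placeholders, not actual iMED evidence values.
CONDITIONS = {
    "renal_stone":    {"rate": 0.02, "p_evac": 0.30},
    "dental_abscess": {"rate": 0.05, "p_evac": 0.05},
    "back_injury":    {"rate": 0.10, "p_evac": 0.01},
}

def simulate_mission(rng):
    """Return True if the simulated mission ends in a medical evacuation."""
    for c in CONDITIONS.values():
        # Bernoulli approximation of a rare event occurring on this mission,
        # followed by a conditional draw for whether it forces evacuation.
        if rng.random() < c["rate"] and rng.random() < c["p_evac"]:
            return True
    return False

def estimate_p_evac(n_trials=100_000, seed=42):
    """Monte Carlo estimate of the per-mission evacuation probability."""
    rng = random.Random(seed)
    evacs = sum(simulate_mission(rng) for _ in range(n_trials))
    return evacs / n_trials

if __name__ == "__main__":
    print(f"Estimated P(EVAC) = {estimate_p_evac():.4f}")
```

The same sampling loop, extended with resource draws per simulated event, is how an optimized medical kit can be sized against a mass/volume or acceptable-risk constraint.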

  2. Advancement of Global-scale River Hydrodynamics Modelling and Its Potential Applications to Earth System Models

    Science.gov (United States)

    Yamazaki, D.

    2015-12-01

    Global river routing models have been developed to represent freshwater discharge from land to ocean in Earth System Models. Early global river models simulated river discharge along a prescribed river network map using a linear-reservoir assumption. Recently, in parallel with advances in remote sensing and computational power, many global river models have started to represent floodplain inundation by assuming sub-grid floodplain topography. Some of them further pursue a physically appropriate representation of river and floodplain dynamics, and have succeeded in utilizing "hydrodynamic flow equations" to realistically simulate channel/floodplain and upstream/downstream interactions. State-of-the-art global river hydrodynamic models can reproduce flood stage (e.g. inundated areas and water levels) well, in addition to river discharge. Flood stage simulation by global river models can potentially be coupled with land surface processes in Earth System Models. For example, evaporation from inundated water areas is not negligible for land-atmosphere interactions in arid regions (such as the Niger River basin). Surface water level and ground water level are correlated with each other in flat topography, and this interaction could dominate the wetting and drying of many small lakes in flatland and could also affect biogeochemical processes in these lakes. These land/surface-water interactions have not been implemented in Earth System Models, but they have a potential impact on the global climate and carbon cycle. In the AGU presentation, recent advancements in global river hydrodynamic modelling, including super-high-resolution river topography datasets, will be introduced. The potential applications of river and surface water modules within Earth System Models will also be discussed.
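The linear-reservoir assumption used by the early routing models can be sketched in a few lines: each cell's storage S drains as Q = S/k, so dS/dt = I - S/k. A minimal single-cell illustration with an explicit Euler step (the time constant k and inflow series are arbitrary example values):

```python
def linear_reservoir_route(inflow, k=5.0, dt=1.0, s0=0.0):
    """Route an inflow series (e.g. m^3/s) through one linear reservoir.

    Storage-discharge assumption: Q = S / k, hence dS/dt = I - S / k.
    Solved with an explicit Euler step; k and dt must share time units
    (stability requires dt < 2 * k).
    """
    s, outflow = s0, []
    for i in inflow:
        q = s / k                 # discharge from current storage
        s += (i - q) * dt         # storage update: inflow minus outflow
        outflow.append(q)
    return outflow

# A 5-step inflow pulse is attenuated and delayed into a smooth recession.
hydrograph = linear_reservoir_route([10.0] * 5 + [0.0] * 15, k=5.0)
```

Chaining such reservoirs along a prescribed network graph gives the classic pre-hydrodynamic routing scheme the abstract contrasts with modern floodplain-resolving models.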

  3. Nuclear security culture: a generic model for universal application

    International Nuclear Information System (INIS)

    Khripunov, I.

    2005-01-01

    Full text: Nuclear security culture found its way into professional parlance several years ago, but still lacks an agreed-upon definition and description. The February 2005 U.S.-Russian Joint Statement, issued at the presidential summit meeting in Bratislava, referred specifically to security culture, focusing renewed attention on the concept. Numerous speakers at the March 2005 International Atomic Energy Agency's (IAEA) international conference on nuclear security referred to security culture, but their visions and interpretations were often at odds with one another. Clearly, there is a need for a generic model of nuclear security culture with universal applicability. Internationally acceptable standards in this area would be invaluable for evaluation, comparison, cooperation, and assistance. They would also help international bodies better manage their relations with the nuclear sectors in various countries. This paper will develop such a model. It will use the IAEA definition of nuclear security, and then apply Edgar Schein's model of organizational culture to security culture at a generic nuclear facility. A cultural approach to physical protection involves determining what attitudes and beliefs need to be established in an organization, how these attitudes and beliefs manifest themselves in the behavior of assigned personnel, and how desirable attitudes and beliefs can be transcribed into formal working methods to produce good outcomes, i.e., effective protection. The security-culture mechanism I will propose is broken into four major units: facility leadership, proactive policies and procedures, personnel performance, and learning and professional improvement. The paper will amplify on the specific traits characteristic of each of these units. Security culture is not a panacea. In a time of mounting terrorist threats, it should nonetheless be looked upon as a necessary organizational tool that enhances the skills of nuclear personnel and ensures that

  4. BUILDING MODEL ANALYSIS APPLICATIONS WITH THE JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY (JUPITER) API

    Science.gov (United States)

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input ...

  5. Solving Enterprise Applications Performance Puzzles Queuing Models to the Rescue

    CERN Document Server

    Grinshpan, Leonid

    2012-01-01

    A groundbreaking scientific approach to solving enterprise applications performance problems Enterprise applications are the information backbone of today's corporations, supporting vital business functions such as operational management, supply chain maintenance, customer relationship administration, business intelligence, accounting, procurement logistics, and more. Acceptable performance of enterprise applications is critical for a company's day-to-day operations as well as for its profitability. Unfortunately, troubleshooting poorly performing enterprise applications has traditionally
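Queuing models of the kind the book applies to enterprise applications can be illustrated with the simplest case, an M/M/1 queue, where closed-form formulas give utilization and mean response time (the request rates below are arbitrary examples, not from the book):

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics of an M/M/1 queue (single server, Poisson
    arrivals, exponential service), the simplest of the queuing models
    used for application performance analysis.

    Rates are in requests per unit time; stability requires
    arrival_rate < service_rate.
    """
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival_rate must be < service_rate")
    rho = arrival_rate / service_rate                     # server utilization
    mean_in_system = rho / (1.0 - rho)                    # Little's law L
    mean_response = 1.0 / (service_rate - arrival_rate)   # Little's law W
    return {"utilization": rho,
            "mean_in_system": mean_in_system,
            "mean_response_time": mean_response}
```

For example, at 80 requests/s against a server completing 100 requests/s, utilization is 0.8 and mean response time is 1/(100 - 80) = 0.05 s; as the arrival rate approaches the service rate, response time grows without bound, which is the typical signature behind enterprise performance puzzles.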

  6. Reliable real-time applications - and how to use tests to model and understand

    DEFF Research Database (Denmark)

    Jensen, Peter Krogsgaard

    Test and analysis of real-time applications, where temporal properties are inspected, analyzed, and verified in a model developed from timed traces originating from measured test results on a running application.

  7. Towards metagenome-scale models for industrial applications-the case of Lactic Acid Bacteria

    NARCIS (Netherlands)

    Branco Dos Santos, F.; Vos, de W.M.; Teusink, B.

    2013-01-01

    We review the uses and limitations of modelling approaches that are in use in the field of Lactic Acid Bacteria (LAB). We describe recent developments in model construction and computational methods, starting from application of such models to monocultures. However, since most applications in food

  8. Application of a LUTI model for the assessment of land use plans and public transport investments

    NARCIS (Netherlands)

    de Bok, Michiel; Geurs, Karst Teunis; Zondag, Barry; Viegas, J.M.; Macario, R.

    2010-01-01

    Integrated land-use and transport interaction models (LUTI) are praised for their ability to evaluate land-use and transport planning in an integrated and consistent modeling system. However, applications of empirically estimated land use models are rare. This paper will present the application of

  9. Towards metagenome-scale models for industrial applications - the case of Lactic Acid Bacteria

    NARCIS (Netherlands)

    Branco dos Santos, F.; de Vos, W.M.; Teusink, B.

    We review the uses and limitations of modelling approaches that are in use in the field of Lactic Acid Bacteria (LAB). We describe recent developments in model construction and computational methods, starting from application of such models to monocultures. However, since most applications in food

  10. Political economy models and agricultural policy formation : empirical applicability and relevance for the CAP

    NARCIS (Netherlands)

    Zee, van der F.A.

    1997-01-01

    This study explores the relevance and applicability of political economy models for the explanation of agricultural policies. Part I (chapters 4-7) takes a general perspective and evaluates the empirical applicability of voting models and interest group models to agricultural policy

  11. Erosion and Sediment Transport Modelling in Shallow Waters: A Review on Approaches, Models and Applications.

    Science.gov (United States)

    Hajigholizadeh, Mohammad; Melesse, Assefa M; Fuentes, Hector R

    2018-03-14

    The erosion and sediment transport processes in shallow waters, which are discussed in this paper, begin when water droplets hit the soil surface. The transport mechanism caused by the consequent rainfall-runoff process determines the amount of generated sediment that can be transferred downslope. Many significant studies and models have been performed to investigate these processes; they differ in their governing factors, approaches, inputs and outputs, model structure, and the manner in which the processes are represented. This paper reviews the literature on sediment transport modelling in shallow waters. A classification based on the representational processes of the soil erosion and sediment transport models (empirical, conceptual, physical and hybrid) is adopted, and the commonly-used models and their characteristics are listed. This review is expected to be of interest to researchers and soil and water conservation managers who are working on erosion and sediment transport phenomena in shallow waters. The paper format should help practitioners to identify and generally characterize the types of available models, their strengths and their basic scope of applicability.

  12. Modelling radiation fluxes in simple and complex environments--application of the RayMan model.

    Science.gov (United States)

    Matzarakis, Andreas; Rutz, Frank; Mayer, Helmut

    2007-03-01

    The most important meteorological parameter affecting the human energy balance during sunny weather conditions is the mean radiant temperature T(mrt). It is defined as the uniform temperature of a surrounding surface giving off blackbody radiation that results in the same energy gain of a human body as the prevailing radiation fluxes; this energy gain usually varies considerably in open space conditions. In this paper, the model 'RayMan', used for the calculation of short- and long-wave radiation fluxes on the human body, is presented. The model, which takes complex urban structures into account, is suitable for several applications in urban areas such as urban planning and street design. The final output of the model is, however, the calculated T(mrt), which is required in the human energy balance model, and thus also for the assessment of the urban bioclimate, with the use of thermal indices such as predicted mean vote (PMV), physiologically equivalent temperature (PET) and standard effective temperature (SET*). The model has been developed based on the German VDI Guidelines 3789, Part II (environmental meteorology; interactions between atmosphere and surfaces; calculation of short- and long-wave radiation) and VDI 3787, Part I (environmental meteorology; methods for the human-biometeorological evaluation of climate and air quality for urban and regional planning: climate). Validation shows that the results of the RayMan model agree well with those obtained from experimental studies.

  13. Predictive market segmentation model: An application of logistic regression model and CHAID procedure

    Directory of Open Access Journals (Sweden)

    Soldić-Aleksić Jasna

    2009-01-01

    Market segmentation is one of the key concepts of modern marketing. Its main goal is to create groups (segments) of customers that have similar characteristics, needs, wishes and/or similar behavior regarding the purchase of a concrete product/service. Companies can create a specific marketing plan for each of these segments and therefore gain a short- or long-term competitive advantage on the market. Depending on the concrete marketing goal, different segmentation schemes and techniques may be applied. This paper presents a predictive market segmentation model based on the application of a logistic regression model and CHAID analysis. The logistic regression model was used to select, from the initial pool of eleven variables, those that are statistically significant for explaining the dependent variable. The selected variables were afterwards included in the CHAID procedure, which generated the predictive market segmentation model. The model results are presented on a concrete empirical example in the following form: summary model results, CHAID tree, gain chart, index chart, and risk and classification tables.
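The core of the CHAID step, selecting the predictor with the strongest chi-square association to the dependent variable, can be sketched in plain Python (full CHAID also merges predictor categories and applies Bonferroni-adjusted p-values, which this toy version omits; the customer records are invented):

```python
from collections import Counter

def chi_square(pairs):
    """Pearson chi-square statistic of a (predictor, target) contingency."""
    n = len(pairs)
    obs = Counter(pairs)
    row = Counter(p for p, _ in pairs)   # predictor-category totals
    col = Counter(t for _, t in pairs)   # target-category totals
    stat = 0.0
    for p, rp in row.items():
        for t, ct in col.items():
            expected = rp * ct / n
            stat += (obs.get((p, t), 0) - expected) ** 2 / expected
    return stat

def best_split(records, predictors, target):
    """Pick the predictor with the largest chi-square versus the target,
    i.e. the split-selection step at a single CHAID tree node."""
    scores = {v: chi_square([(r[v], r[target]) for r in records])
              for v in predictors}
    return max(scores, key=scores.get), scores
```

Run on records where income level perfectly separates buyers and region is uninformative, `best_split` picks income, which is exactly how CHAID grows each level of its tree.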

  14. Erosion and Sediment Transport Modelling in Shallow Waters: A Review on Approaches, Models and Applications

    Science.gov (United States)

    Fuentes, Hector R.

    2018-01-01

    The erosion and sediment transport processes in shallow waters, which are discussed in this paper, begin when water droplets hit the soil surface. The transport mechanism caused by the consequent rainfall-runoff process determines the amount of generated sediment that can be transferred downslope. Many significant studies and models have been performed to investigate these processes; they differ in their governing factors, approaches, inputs and outputs, model structure, and the manner in which the processes are represented. This paper reviews the literature on sediment transport modelling in shallow waters. A classification based on the representational processes of the soil erosion and sediment transport models (empirical, conceptual, physical and hybrid) is adopted, and the commonly-used models and their characteristics are listed. This review is expected to be of interest to researchers and soil and water conservation managers who are working on erosion and sediment transport phenomena in shallow waters. The paper format should help practitioners to identify and generally characterize the types of available models, their strengths and their basic scope of applicability. PMID:29538335

  15. Models to Study NK Cell Biology and Possible Clinical Application.

    Science.gov (United States)

    Zamora, Anthony E; Grossenbacher, Steven K; Aguilar, Ethan G; Murphy, William J

    2015-08-03

    Natural killer (NK) cells are large granular lymphocytes of the innate immune system, responsible for direct targeting and killing of both virally infected and transformed cells. NK cells rapidly recognize and respond to abnormal cells in the absence of prior sensitization due to their wide array of germline-encoded inhibitory and activating receptors, which differs from the receptor diversity found in B and T lymphocytes that is generated by recombination-activating gene (RAG) enzymes. Although NK cells have traditionally been described as natural killers that provide a first line of defense prior to the induction of adaptive immunity, a more complex view of NK cells is beginning to emerge, indicating they may also function in various immunoregulatory roles and have the capacity to shape adaptive immune responses. With the growing appreciation for the diverse functions of NK cells, and recent technological advancements that allow for a more in-depth understanding of NK cell biology, we can now begin to explore new ways to manipulate NK cells to increase their clinical utility. In this overview unit, we introduce the reader to various aspects of NK cell biology by reviewing topics ranging from NK cell diversity and function, mouse models, and the roles of NK cells in health and disease, to potential clinical applications. © 2015 John Wiley & Sons, Inc.

  16. CRISPR-Cas9 technology: applications and human disease modelling.

    Science.gov (United States)

    Torres-Ruiz, Raul; Rodriguez-Perales, Sandra

    2017-01-01

    Genome engineering is a powerful tool for a wide range of applications in biomedical research and medicine. The development of the clustered regularly interspaced short palindromic repeats (CRISPR)-Cas9 system has revolutionized the field of gene editing, facilitating efficient genome editing through the creation of targeted double-strand breaks in almost any organism and cell type. In addition, CRISPR-Cas9 technology has been used successfully for many other purposes, including regulation of endogenous gene expression, epigenome editing, live-cell labelling of chromosomal loci, editing of single-stranded RNA and high-throughput gene screening. The implementation of the CRISPR-Cas9 system has increased the number of available technological alternatives for studying gene function, thus enabling generation of CRISPR-based disease models. Although many mechanistic questions remain to be answered and several challenges have yet to be addressed, the use of CRISPR-Cas9-based genome engineering technologies will increase our knowledge of disease processes and their treatment in the near future. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  17. Energy materials. Advances in characterization, modelling and application

    International Nuclear Information System (INIS)

    Andersen, N.H.; Eldrup, M.; Hansen, N.; Juul Jensen, D.; Nielsen, E.M.; Nielsen, S.F.; Soerensen, B.F.; Pedersen, A.S.; Vegge, T.; West, S.S.

    2008-01-01

    Energy-related topics in the modern world and energy research programmes cover the range from basic research to applications, and structural length scales from micro to macro. Materials research and development is a central part of the energy area, as breakthroughs in many technologies depend on the successful development and validation of new or advanced materials. The Symposium is organized by the Materials Research Department at Risoe DTU - National Laboratory for Sustainable Energy. The Department concentrates on energy problems, combining basic and applied materials research with special focus on the key topics: wind, fusion, superconductors and hydrogen. The symposium is based on these key topics and focuses on the characterization of materials for energy using neutron, X-ray and electron diffraction. Of special interest is research carried out at large facilities such as reactors and synchrotrons, supplemented by other experimental techniques and by modelling on different length scales that underpins the experiments. The Proceedings contain 15 keynote presentations and 30 contributed presentations covering the above-mentioned key topics relevant to energy materials. The contributions clearly show the importance of materials research in developing sustainable energy technologies, and also that many challenges remain to be approached. (BA)

  18. In vitro models of cancer stem cells and clinical applications.

    Science.gov (United States)

    S Franco, Sara; Szczesna, Karolina; Iliou, Maria S; Al-Qahtani, Mohammed; Mobasheri, Ali; Kobolák, Julianna; Dinnyés, András

    2016-09-30

    Cancer cells, stem cells and cancer stem cells have long played a significant role in the biomedical sciences. Though cancer therapy is more effective than it was a few years ago, none of the current non-surgical treatments can cure cancer reliably. The reason may lie in the subpopulation called "cancer stem cells" (CSCs), defined as those cells within a tumour that have the properties of stem cells: self-renewal and the ability to differentiate into the multiple cell types that occur in tumours. CSCs are resistant to many of the current cancer therapies, which results in tumour relapse. Although further investigation of CSCs is still needed, there is already evidence that these cells may play an important role in the prognosis of cancer, its progression and therapeutic strategy. Therefore, long-term patient survival may depend on the elimination of CSCs. Consequently, isolation of pure CSC populations, or reprogramming of cancer cells into CSCs from cancer cell lines or primary tumours, would be a useful tool to gain in-depth knowledge about the heterogeneity and plasticity of CSC phenotypes and therefore about carcinogenesis. Herein, we discuss current CSC models, methods used to characterize CSCs, candidate markers, characteristic signalling pathways and clinical applications of CSCs. Some examples of CSC-specific treatments that are currently in early clinical phases are also presented in this review.

  19. Application of Total Productivity Model within Croatia Airlines

    Directory of Open Access Journals (Sweden)

    Željko Radačić

    2005-09-01

    By defining and selecting adequate factors of the total productivity model and by assigning specific relevance to each factor, the initial preconditions for the analysis and monitoring of the model application efficiency within the Croatia Airlines business policy have been established. Since the majority of the analyzed factors have realized a more intensive growth than planned, the business year 2004 can be assessed as the most successful one in the Croatia Airlines history. Consequently, the difference related to the productivity indicators of the Association of European Airlines (AEA) has been reduced: the aircraft productivity with a remnant of 5 to 10 percent, the productivity of the employees with a remnant of 15 to 20 percent, and the productivity of fuel expressed as quantity at AEA level, and expressed as value below that level. Finally, although there is no expressed correlation between the quantitative productivity indicators and business profitability, the highest realized net profit since the foundation of Croatia Airlines fully supplements the solid level of the comparison indicators, confirming its complete readiness and maturity to join the Star Alliance.

  20. Modelling and applications in mathematics education the 14th ICMI study

    CERN Document Server

    Galbraith, Peter L; Niss, Mogens

    2007-01-01

    The book aims at showing the state-of-the-art in the field of modeling and applications in mathematics education. This is the first volume to do this. The book deals with the question of how key competencies of applications and modeling at the heart of mathematical literacy may be developed; with the roles that applications and modeling may play in mathematics teaching, making mathematics more relevant for students.

  1. Application of Statistical Model in Wastewater Treatment Process Modeling Using Data Analysis

    Directory of Open Access Journals (Sweden)

    Alireza Raygan Shirazinezhad

    2015-06-01

    Background: Wastewater treatment comprises very complex and interrelated physical, chemical and biological processes which, using data analysis techniques, can be rigorously modeled with relatively simple mathematical models. Materials and Methods: In this study, data on wastewater treatment processes from the water and wastewater company of Kohgiluyeh and Boyer-Ahmad were used. A total of 3306 data points for COD, TSS, pH and turbidity were collected, then analyzed with SPSS 16 (descriptive statistics) and IBM SPSS Modeler 14.2 (data analysis) using 9 algorithms. Results: The logistic regression, neural network, Bayesian network, discriminant analysis, C5 decision tree, C&R tree, CHAID, QUEST and SVM algorithms achieved accuracies of 90.16, 94.17, 81.37, 70.48, 97.89, 96.56, 96.46, 96.84 and 88.92 percent, respectively. Discussion and conclusion: The C5 algorithm, with an accuracy of 97.89 percent, was chosen as the best and most applicable algorithm for modeling the wastewater treatment processes; the most influential variables in this model were pH, COD, TSS and turbidity.
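The C5 algorithm that won the comparison grows its tree by choosing splits that maximize entropy reduction. A minimal sketch of that criterion follows (gain ratio, boosting and pruning, which the real C5 adds, are omitted, and the water-quality records are invented for illustration):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(records, attr, target):
    """Entropy reduction from splitting records on attr: the split
    criterion behind C4.5/C5-style decision trees."""
    base = entropy([r[target] for r in records])
    n = len(records)
    groups = {}
    for r in records:
        groups.setdefault(r[attr], []).append(r[target])
    # Weighted entropy remaining after the split
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return base - remainder
```

At each node the tree picks the attribute with the largest gain, splits, and recurses; an attribute that perfectly separates the classes yields a gain equal to the full class entropy, while an uninformative one yields zero.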

  2. The NASA Lightning Nitrogen Oxides Model (LNOM): Application to Air Quality Modeling

    Science.gov (United States)

    Koshak, William; Peterson, Harold; Khan, Maudood; Biazar, Arastoo; Wang, Lihua

    2011-01-01

    Recent improvements to the NASA Marshall Space Flight Center Lightning Nitrogen Oxides Model (LNOM) and its application to the Community Multiscale Air Quality (CMAQ) modeling system are discussed. The LNOM analyzes Lightning Mapping Array (LMA) and National Lightning Detection Network (NLDN) data to estimate the raw (i.e., unmixed and otherwise environmentally unmodified) vertical profile of lightning NO(x) (= NO + NO2). The latest LNOM estimates of lightning channel length distributions, lightning 1-m segment altitude distributions, and the vertical profile of lightning NO(x) are presented. The primary improvement to the LNOM is the inclusion of non-return-stroke lightning NO(x) production due to hot core stepped and dart leaders, the stepped leader corona sheath, K-changes, continuing currents, and M-components. The impact of including LNOM estimates of lightning NO(x) in an August 2006 run of CMAQ is discussed.

  3. Peripheral arterial disease: application of the chronic care model.

    Science.gov (United States)

    Lovell, Marge; Myers, Kathryn; Forbes, Thomas L; Dresser, George; Weiss, Ed

    2011-12-01

    Management of chronic diseases is one of the greatest challenges facing health care professionals globally. With the aging population increasing worldwide, the number of patients afflicted with chronic diseases will increase. Peripheral Arterial Disease (PAD) is a common, chronic atherosclerotic vascular disease that is associated with a high risk of stroke, myocardial infarction and cardiovascular death. The objective of this study was to determine if a multidisciplinary Vascular Risk Management Clinic (VRMC) would improve risk factor management and health outcomes for patients with PAD with poorly-controlled risk factors. A multidisciplinary VRMC was established utilizing a novel application of the Chronic Care Model to meet the needs of PAD patients. Interventions included optimization of medical therapy, investigations for undiagnosed atherosclerosis in other vascular distributions, smoking cessation therapy, dietary assessment and counseling, and active involvement of patients in evaluating progress towards their risk factor target goals. Assessment of risk factor control was done at each clinic visit and included measures of symptom severity, blood pressure, fasting blood sugar (FBS), lipid profile, body mass index (BMI), and smoking status. Analysis of risk factors was performed for the first 103 patients followed in the clinic. Average follow-up time was 528 days, and statistically significant improvements were seen in blood pressure, LDL, HDL, total cholesterol (TC), and TC/HDL ratio, while BMI, FBS, and triglycerides remained stable. Participation in a specialized vascular risk management clinic resulted in significant improvement in risk factors for disease progression compared to baseline status. Copyright © 2011 Society for Vascular Nursing, Inc. Published by Mosby, Inc. All rights reserved.

  4. Analysing model fit of psychometric process models: An overview, a new test and an application to the diffusion model.

    Science.gov (United States)

    Ranger, Jochen; Kuhn, Jörg-Tobias; Szardenings, Carsten

    2017-05-01

    Cognitive psychometric models embed cognitive process models into a latent trait framework in order to allow for individual differences. Due to their close relationship to the response process, the models allow for profound conclusions about the test takers. However, before such a model can be used its fit has to be checked carefully. In this manuscript we give an overview of existing tests of model fit and show their relation to the generalized moment test of Newey (Econometrica, 53, 1985, 1047) and Tauchen (J. Econometrics, 30, 1985, 415). We also present a new test, the Hausman test of misspecification (Hausman, Econometrica, 46, 1978, 1251). The Hausman test consists of a comparison of two estimates of the same item parameters, which should be similar if the model holds. The performance of the Hausman test is evaluated in a simulation study, in which we illustrate its application to two popular models in cognitive psychometrics, the Q-diffusion model and the D-diffusion model (van der Maas, Molenaar, Maris, Kievit, & Borsboom, Psychol. Rev., 118, 2011, 339; Molenaar, Tuerlinckx, & van der Maas, J. Stat. Softw., 66, 2015, 1). We also compare the performance of the test to alternative tests of model fit, namely the M2 test (Molenaar et al., J. Stat. Softw., 66, 2015, 1), the moment test (Ranger et al., Br. J. Math. Stat. Psychol., 2016) and the test for binned time (Ranger & Kuhn, Psychol. Test. Assess., 56, 2014b, 370). The simulation study indicates that the Hausman test is superior to the latter tests: it closely adheres to the nominal Type I error rate and has higher power in most simulation conditions. © 2017 The British Psychological Society.
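The Hausman idea, comparing an efficient estimate with a consistent-under-misspecification estimate of the same parameter, reduces in the one-parameter case to a simple statistic. This is a generic sketch of Hausman's 1978 formula, not the authors' psychometric implementation:

```python
from math import erfc, sqrt

def hausman_1d(b_eff, v_eff, b_cons, v_cons):
    """Hausman misspecification statistic for a single parameter.

    b_eff/v_eff: efficient estimate and its variance (consistent only
    if the model is correct); b_cons/v_cons: estimate and variance that
    stay consistent under misspecification. Under a correct model,
    H = (b_cons - b_eff)^2 / (v_cons - v_eff) ~ chi-square(1).
    """
    dv = v_cons - v_eff
    if dv <= 0:
        raise ValueError("the consistent estimator must have larger variance")
    h = (b_cons - b_eff) ** 2 / dv
    p_value = erfc(sqrt(h / 2.0))  # chi-square(1) survival function
    return h, p_value
```

A small H (large p-value) means the two item-parameter estimates agree, which is exactly the "similar if the model holds" condition the abstract describes; the multi-parameter version replaces the ratio with a quadratic form in the difference of covariance matrices.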

  5. Usability evaluation model for mobile e-book applications

    Science.gov (United States)

    Matraf, Munya Saleh Ba; Hussain, Azham

    2017-10-01

    Usability evaluations of mobile e-book applications are limited and have not addressed all the important usability measurements. Hence, this study aimed to identify the characteristics that affect user satisfaction with the usability of mobile e-book applications. Five characteristics with a significant effect on user satisfaction were identified, namely readability, effectiveness, accessibility, efficiency, and navigation. A usability evaluation was conducted on three mobile e-book applications, namely Adobe Acrobat Reader, Ebook Reader, and Amazon Kindle. 30 students from Universiti Utara Malaysia evaluated the applications, and their satisfaction was measured using a questionnaire. The outcomes showed that the five characteristics have a significant positive relationship with user satisfaction. This provides insight into the main characteristics that increase user satisfaction.
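The reported positive relationships between each characteristic and satisfaction can be checked with an ordinary correlation over questionnaire scores; the Likert-scale data below are invented for illustration, not the study's data:

```python
def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical 5-point Likert scores from six respondents.
satisfaction = [4, 5, 3, 4, 2, 5]
characteristics = {
    "readability": [4, 5, 3, 4, 2, 4],
    "navigation":  [3, 5, 2, 4, 3, 5],
}

for name, scores in characteristics.items():
    print(f"{name}: r = {pearson(scores, satisfaction):.2f}")
```

With 30 respondents, each coefficient would additionally be tested for significance before claiming a relationship, as the study does.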

  6. The application of single particle hydrodynamics in continuum models of multiphase flow

    Science.gov (United States)

    Decker, Rand

    1988-01-01

    A review of the application of single particle hydrodynamics in models for the exchange of interphase momentum in continuum models of multiphase flow is presented. Considered are the equations of motion for a laminar, mechanical two phase flow. Inherent to this theory is a model for the interphase exchange of momentum due to drag between the dispersed particulate and continuous fluid phases. In addition, applications of two phase flow theory to de-mixing flows require the modeling of interphase momentum exchange due to lift forces. The applications of single particle analysis in deriving models for drag and lift are examined.
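For the interphase drag model discussed above, a commonly used single-particle result is the Schiller-Naumann correlation, which bridges the Stokes and Newton regimes. The sketch below uses it to compute steady drag on a single sphere; the correlation choice is illustrative, since the review itself surveys several such models:

```python
from math import pi

def drag_coefficient(re):
    """Sphere drag coefficient via the Schiller-Naumann correlation,
    valid up to Re ~ 1000; a constant 0.44 is used in the Newton regime."""
    if re <= 0:
        raise ValueError("Reynolds number must be positive")
    if re < 1000:
        return 24.0 / re * (1.0 + 0.15 * re ** 0.687)
    return 0.44

def drag_force(diameter, rel_velocity, rho_fluid, mu_fluid):
    """Steady drag (N) on one sphere: the building block of the
    interphase momentum exchange term in two-phase continuum models."""
    re = rho_fluid * abs(rel_velocity) * diameter / mu_fluid
    cd = drag_coefficient(re)
    area = pi * diameter ** 2 / 4.0  # projected frontal area
    return cd * 0.5 * rho_fluid * rel_velocity ** 2 * area
```

At small Reynolds number the result collapses to the Stokes law F = 3*pi*mu*d*U, which is the limit usually quoted when deriving the continuum drag closure.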

  7. Modeling of PWR plant by multilevel flow model and its application in fault diagnosis

    International Nuclear Information System (INIS)

    Ouyang, Jun; Yoshikawa, Hidekazu; Zhou, Yangping; Yang Ming

    2005-01-01

    The paper describes the application of Multilevel Flow Modeling (MFM), a modeling method based on means-end and part-whole decomposition, to the interface design for supervisory control of a Pressurized Water Reactor (PWR) plant and to automatic real-time fault diagnosis of PWR accidents. MFM decomposes the complex plant process from the main goal down to each component at multiple levels, representing the contribution of each component to the whole system and making clear how the main goal of the system is achieved. The plant process is described abstractly at the function level by mass, energy and information flows, which represent the interactions between different components and enable causal reasoning between functions according to the flow properties. Thus, in an abnormal status, goal-function-component oriented fault diagnosis can be performed with the model very quickly, and abnormal alarms can be fully explained by the reasoning relationships of the model. In this paper, an interface design for the PWR plant is built on the means-end and part-whole concepts using MFM, and several simulation cases are used to evaluate the fault diagnosis performance. The results show that the system has a good ability to detect and diagnose accidents in a timely manner before reactor trip. (author)
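The goal-function-component reasoning can be caricatured as a search over a dependency graph: an abnormal function whose upstream dependencies are all normal is a root-cause candidate, and every other abnormal function is explained by it. The miniature PWR-flavoured graph below is invented for illustration and is far simpler than a real MFM model:

```python
def root_causes(depends_on, abnormal):
    """Root-cause candidates in a means-end dependency graph.

    depends_on: function -> list of upstream functions it depends on.
    A candidate is an abnormal function with no abnormal upstream
    dependency, i.e. its failure is not explained by another failure.
    """
    abnormal = set(abnormal)
    return sorted(f for f in abnormal
                  if not any(u in abnormal for u in depends_on.get(f, [])))

# Toy means-end graph: core cooling depends on coolant flow, which in
# turn depends on pump energy supply (invented example).
PLANT = {
    "core_cooling": ["coolant_flow"],
    "coolant_flow": ["pump_energy"],
    "pump_energy":  [],
}
print(root_causes(PLANT, {"core_cooling", "coolant_flow", "pump_energy"}))
# -> ['pump_energy']
```

In MFM proper the edges carry typed relations (producer, condition, achieve, and so on) and the reasoning propagates qualitative flow states, but the diagnostic direction, from violated goal down to a primitive function, is the same.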

  8. Temperature-based modeling of reference evapotranspiration using several artificial intelligence models: application of different modeling scenarios

    Science.gov (United States)

    Sanikhani, Hadi; Kisi, Ozgur; Maroufpoor, Eisa; Yaseen, Zaher Mundher

    2018-02-01

    The establishment of an accurate computational model for predicting the reference evapotranspiration (ET0) process is essential for several agricultural and hydrological applications, especially for rural water resource systems, water use allocation, utilization and demand assessment, and the management of irrigation systems. In this research, six artificial intelligence (AI) models were investigated for modeling ET0 using a limited set of climatic data: the minimum and maximum air temperatures and the extraterrestrial radiation. The investigated models were multilayer perceptron (MLP), generalized regression neural networks (GRNN), radial basis neural networks (RBNN), adaptive neuro-fuzzy inference systems with grid partitioning and with subtractive clustering (ANFIS-GP and ANFIS-SC), and gene expression programming (GEP). The monthly time scale data set was collected at the Antalya and Isparta stations, located in the Mediterranean Region of Turkey. The Hargreaves-Samani (HS) equation and its calibrated version (CHS) were used to perform a verification analysis of the established AI models. Validation accuracy was assessed with multiple quantitative metrics, including root mean squared error (RMSE), mean absolute error (MAE), correlation coefficient (R²), coefficient of residual mass (CRM), and Nash-Sutcliffe efficiency coefficient (NS). The results of the conducted models were highly practical and reliable for the investigated case studies. At the Antalya station, the GEP and GRNN models performed better than the other investigated models, while the RBNN and ANFIS-SC models performed best at the Isparta station. Except for the MLP model, all the investigated models presented better accuracy than the HS and CHS empirical models when applied in a cross-station scenario. A cross-station scenario examination implies the
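
    The Hargreaves-Samani benchmark used in the study needs only the same temperature-based inputs as the AI models. A sketch of the HS equation and the Nash-Sutcliffe metric (the input values below are illustrative, not the Antalya/Isparta data):

```python
import math

def hargreaves_samani(t_min, t_max, ra, c=0.0023):
    """Reference evapotranspiration ET0 (mm/day) from the HS equation:
    ET0 = c * (Tmean + 17.8) * sqrt(Tmax - Tmin) * Ra,
    with Ra the extraterrestrial radiation in equivalent mm/day and
    c = 0.0023 in the original form (the CHS variant refits c locally)."""
    t_mean = (t_min + t_max) / 2.0
    return c * (t_mean + 17.8) * math.sqrt(t_max - t_min) * ra

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means no better
    than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

# Illustrative monthly values
et0 = hargreaves_samani(t_min=12.0, t_max=28.0, ra=14.5)
```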

  9. Physical model for GaN HEMT design optimization in high frequency switching applications

    OpenAIRE

    Cucak, Dejana; Vasic, Miroslav; García, Oscar; Bouvier, Yann; Oliver Ramírez, Jesús Angel; Alou Cervera, Pedro; Cobos Márquez, José Antonio; Wang, Ashu; Martin Horcajo, Sara; Romero Rojo, Fátima; Calle Gómez, Fernando

    2014-01-01

    In this paper, physical modeling of a GaN HEMT is proposed, with the objective of optimizing the device design for application in a high-frequency DC/DC converter. From the point of view of a switching application, a physical model for the input, output and reverse capacitances, as well as for the channel resistance, is very important, since these parameters determine the power losses in the circuit. The obtained physical model of the switching device can be used for simulation models such as PSpic...
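
    A first-order sketch of why these parameters matter: the output capacitance sets a charging loss, the voltage-current overlap times set the switching loss, and the channel resistance sets the conduction loss. The device values below are hypothetical, not fitted GaN data:

```python
def hard_switching_loss(v_ds, i_d, f_sw, c_oss, r_on, duty, t_rise, t_fall):
    """First-order loss estimate for one hard-switched device:
    output-capacitance loss + V-I overlap loss + conduction loss."""
    p_coss = 0.5 * c_oss * v_ds ** 2 * f_sw             # Eoss dumped each cycle
    p_sw = 0.5 * v_ds * i_d * (t_rise + t_fall) * f_sw  # triangular overlap
    p_cond = i_d ** 2 * r_on * duty                     # on-state conduction
    return p_coss + p_sw + p_cond

# Hypothetical 400 V / 5 A operating point at 1 MHz
p_total = hard_switching_loss(v_ds=400.0, i_d=5.0, f_sw=1e6, c_oss=100e-12,
                              r_on=0.05, duty=0.5, t_rise=5e-9, t_fall=5e-9)
```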

  10. Development of Shear Deformable Laminated Shell Element and Its Application to ANCF Tire Model

    Science.gov (United States)

    2015-04-24

    DEVELOPMENT OF SHEAR DEFORMABLE LAMINATED SHELL ELEMENT AND ITS APPLICATION TO ANCF TIRE MODEL. Hiroki Yamashita, Department of Mechanical and Industrial Engineering ... for application to the modeling of the fiber-reinforced rubber (FRR) structure of the physics-based ANCF tire model. The complex deformation coupling ... cornering forces. Since a tire consists of layers of plies and steel belts embedded in rubber, the tire structure needs to be modeled by cord-rubber

  11. Extensions and Applications of the Cox-Aalen Survival Model

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2003-01-01

    Aalen additive risk model; competing risk; counting processes; Cox model; cumulative incidence function; goodness of fit; prediction of survival probability; time-varying effects

  12. A parametric daily precipitation model application in Botswana ...

    African Journals Online (AJOL)

    Results show that a Markov-chain (MC) model can be used to model the persistence behaviour of the transition probability matrix (TPM) of dry- and wet-day rainfall sequences. With the MC model, the two-parameter gamma distribution is found to be the most robust and suitable model for describing the magnitude of rainfall depths in ...
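
    The MC-occurrence/gamma-amount structure described above can be sketched as a small weather generator; the transition probabilities and gamma parameters below are illustrative, not the fitted Botswana values:

```python
import random

def simulate_rainfall(days, p_wd, p_ww, shape, scale, seed=1):
    """Two-state (dry/wet) first-order Markov chain for rainfall occurrence,
    two-parameter gamma distribution for wet-day depth (mm).
    p_wd: P(wet | yesterday dry); p_ww: P(wet | yesterday wet)."""
    rng = random.Random(seed)
    wet, series = False, []
    for _ in range(days):
        p = p_ww if wet else p_wd          # persistence via the TPM row
        wet = rng.random() < p
        series.append(rng.gammavariate(shape, scale) if wet else 0.0)
    return series

rain = simulate_rainfall(days=365, p_wd=0.2, p_ww=0.6, shape=0.8, scale=10.0)
```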

  13. New weighted sum of gray gases model applicable to Computational Fluid Dynamics (CFD) modeling of oxy-fuel combustion

    DEFF Research Database (Denmark)

    Yin, Chungen; Johansen, Lars Christian Riis; Rosendahl, Lasse

    2010-01-01

    A new weighted sum of gray gases model (WSGGM) is derived, which is applicable to computational fluid dynamics (CFD) modeling of both air-fuel and oxy-fuel combustion. First, a computer code is developed to evaluate the emissivity of any gas mixture at any condition by using the exponential wide band model (EWBM...
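
    A WSGGM expresses the total emissivity of a gas mixture as a weighted sum of a few gray gases plus one clear gas. A sketch under assumed coefficients (the weights and absorption coefficients below are NOT a published set):

```python
import math

def wsgg_emissivity(weights, kappas, p_part, path_len):
    """Total emissivity from a weighted-sum-of-gray-gases fit:
    eps = sum_i a_i * (1 - exp(-k_i * p * L)).
    The gray gas with k_i = 0 is the clear gas; the weights a_i sum to 1.
    p_part: partial pressure (atm) of the radiating species; path_len in m."""
    return sum(a * (1.0 - math.exp(-k * p_part * path_len))
               for a, k in zip(weights, kappas))

# Hypothetical 3-gray-plus-clear-gas fit
a = [0.1, 0.4, 0.3, 0.2]    # clear gas first; sums to 1
k = [0.0, 0.5, 5.0, 50.0]   # absorption coefficients, 1/(atm*m)
eps = wsgg_emissivity(a, k, p_part=0.3, path_len=2.0)
```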

  14. Bayesian uncertainty analysis with applications to turbulence modeling

    International Nuclear Information System (INIS)

    Cheung, Sai Hung; Oliver, Todd A.; Prudencio, Ernesto E.; Prudhomme, Serge; Moser, Robert D.

    2011-01-01

    In this paper, we apply Bayesian uncertainty quantification techniques to the processes of calibrating complex mathematical models and predicting quantities of interest (QoIs) with such models. These techniques also enable the systematic comparison of competing model classes. The processes of calibration and comparison constitute the building blocks of a larger validation process, the goal of which is to accept or reject a given mathematical model for the prediction of a particular QoI in a particular scenario. In this work, we take the first step in this process by applying the methodology to the analysis of the Spalart-Allmaras turbulence model in the context of incompressible boundary layer flows. Three competing model classes based on the Spalart-Allmaras model are formulated, calibrated against experimental data, and used to issue a prediction with quantified uncertainty. The model classes are compared in terms of their posterior probabilities and their predictions of the QoIs. The model posterior probability represents the relative plausibility of a model class given the data; thus, it incorporates the model's ability to fit experimental observations. Alternatively, comparing models using the predicted QoI connects the process to the needs of the decision makers who use the results of the model. We show that by using both the model plausibility and the predicted QoI, one has the opportunity to reject some model classes after calibration, before subjecting the remaining classes to additional validation challenges.
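
    The model comparison step can be sketched as computing posterior model probabilities from (log-)evidences; the numbers below are hypothetical, not the paper's results:

```python
import math

def model_posteriors(log_evidences, priors=None):
    """Posterior plausibility of competing model classes given data D:
    P(M_j | D) = P(D | M_j) P(M_j) / sum_k P(D | M_k) P(M_k),
    evaluated from log-evidences log P(D | M_j) with log-sum-exp
    for numerical stability; default priors are uniform."""
    n = len(log_evidences)
    priors = priors if priors is not None else [1.0 / n] * n
    logs = [le + math.log(p) for le, p in zip(log_evidences, priors)]
    m = max(logs)
    weights = [math.exp(l - m) for l in logs]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical log-evidences for three calibrated model classes
post = model_posteriors([-105.2, -103.9, -110.4])
```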

  15. Physically unclonable functions (PUFs) applications, models, and future directions

    CERN Document Server

    Wachsmann, Christian

    2014-01-01

    Today, embedded systems are used in many security-critical applications, from access control, electronic tickets, sensors, and smart devices (e.g., wearables) to automotive applications and critical infrastructures. These systems are increasingly used to produce and process both security-critical and privacy-sensitive data, which bear many security and privacy risks. Establishing trust in the underlying devices and making them resistant to software and hardware attacks is a fundamental requirement in many applications and a challenging, yet unsolved, task. Solutions solely based on software ca
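
    A common mathematical abstraction in the PUF-modeling literature is the linear additive-delay model of an arbiter PUF; the toy sketch below is illustrative only and is not a secure construction:

```python
import random

def phi(challenge):
    """Parity feature vector of the linear additive-delay model of an
    arbiter PUF: phi_i = prod_{j >= i} (1 - 2*c_j), plus a bias term."""
    feat = []
    for i in range(len(challenge)):
        prod = 1
        for c in challenge[i:]:
            prod *= 1 - 2 * c
        feat.append(prod)
    feat.append(1)  # bias term
    return feat

def make_puf(n_stages, seed=7):
    """Instantiate one simulated device: random stage-delay weights; the
    response is the sign of the weighted feature sum."""
    rng = random.Random(seed)
    w = [rng.gauss(0.0, 1.0) for _ in range(n_stages + 1)]
    def respond(challenge):
        s = sum(wi * fi for wi, fi in zip(w, phi(challenge)))
        return 1 if s > 0 else 0
    return respond

puf = make_puf(16)
r = puf([0, 1] * 8)  # response is a deterministic function of the challenge
```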

  16. Multi-physics modeling in electrical engineering. Application to a magneto-thermo-mechanical model

    International Nuclear Information System (INIS)

    Journeaux, Antoine

    2013-01-01

    The modeling of multi-physics problems in electrical engineering is presented, with an application to the numerical computation of vibrations within the end windings of large turbo-generators. The study is divided into four parts: the imposition of current density, the computation of local forces, the transfer of data between disconnected meshes, and the computation of multi-physics problems using weak coupling. First, the representation of current density within numerical models is presented. The process is decomposed into two stages: the construction of the initial current density, and the determination of a divergence-free field. The representation of complex geometries makes the use of analytical methods impossible; a method based on an electrokinetic problem and a fully geometrical method are tested. The geometrical method produces results closer to the real current density than the electrokinetic one. Methods to compute forces are numerous, and this study focuses on the virtual work principle and the Laplace force, following the recommendations of the literature. The Laplace force is highly accurate but applicable only if the permeability is uniform; the virtual work principle is finally preferred as the most general way to compute local forces. Mesh-to-mesh data transfer methods are developed to compute multi-physics models using multiple meshes adapted to the subproblems and multiple computational codes. The interpolation method, a locally conservative projection, and an orthogonal projection are compared. The interpolation method is fast but highly diffusive, while the orthogonal projections are highly accurate. The locally conservative method produces results similar to the orthogonal projection but avoids the assembly of linear systems. The numerical computation of multi-physics problems using multiple meshes and projections is then presented. However, for a given class of problems, there is not a unique coupling
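
    The contrast drawn between interpolation and projection-based transfer can be illustrated in 1D; the meshes and field below are illustrative:

```python
def interpolate_transfer(x_src, f_src, x_dst):
    """Pointwise linear interpolation of a nodal field from a source mesh
    onto target points (fast, but diffusive under repeated transfers and
    not conservative, unlike the projection-based methods in the study)."""
    out = []
    for x in x_dst:
        # index of the source interval [x_src[j], x_src[j+1]] containing x
        j = max(i for i, xs in enumerate(x_src) if xs <= x)
        j = min(j, len(x_src) - 2)
        t = (x - x_src[j]) / (x_src[j + 1] - x_src[j])
        out.append((1.0 - t) * f_src[j] + t * f_src[j + 1])
    return out

# Transfer a nodal field from a coarse source mesh to a finer target mesh
field_on_target = interpolate_transfer([0.0, 0.5, 1.0], [0.0, 1.0, 2.0],
                                       [0.0, 0.25, 0.5, 0.75, 1.0])
```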

  17. An overview of topic modeling and its current applications in bioinformatics.

    Science.gov (United States)

    Liu, Lin; Tang, Lin; Dong, Wen; Yao, Shaowen; Zhou, Wei

    2016-01-01

    With the rapid accumulation of biological datasets, machine learning methods designed to automate data analysis are urgently needed. In recent years, so-called topic models, which originated in the field of natural language processing, have been receiving much attention in bioinformatics because of their interpretability. Our aim was to review the application and development of topic models for bioinformatics. This paper starts with a description of a topic model, with a focus on the understanding of topic modeling. A general outline is provided on how to build an application around a topic model and how to develop a topic model itself. Meanwhile, the literature on the application of topic models to biological data was searched and analyzed in depth. According to the types of models and the analogy between the document-topic-word concept and a biological object (as well as the tasks of a topic model), we categorized the related studies and provided an outlook on the use of topic models for the development of bioinformatics applications. Topic modeling is a useful method (in contrast to the traditional means of data reduction in bioinformatics) that enhances researchers' ability to interpret biological information. Nevertheless, due to the lack of topic models optimized for specific biological data, studies on topic modeling of biological data still have a long and challenging road ahead. We believe that topic models are a promising method for various applications in bioinformatics research.
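
    The document-topic-word analogy rests on the generative story behind LDA, the most common topic model, which can be sketched directly; the vocabulary size and topic distributions below are toy values (in bioinformatics the analogy often maps document to sample and word to gene or feature):

```python
import random

def generate_corpus(n_docs, doc_len, topic_word, alpha, seed=0):
    """Generative story of LDA: for each document draw a topic mixture from
    a symmetric Dirichlet(alpha), then for each word draw a topic and then
    a word from that topic's word distribution."""
    rng = random.Random(seed)
    k = len(topic_word)
    corpus = []
    for _ in range(n_docs):
        theta = [rng.gammavariate(alpha, 1.0) for _ in range(k)]
        total = sum(theta)
        theta = [t / total for t in theta]  # Dirichlet via normalized gammas
        doc = []
        for _ in range(doc_len):
            z = rng.choices(range(k), weights=theta)[0]
            doc.append(rng.choices(range(len(topic_word[z])),
                                   weights=topic_word[z])[0])
        corpus.append(doc)
    return corpus

# Two hypothetical topics over a 4-word vocabulary
topics = [[0.7, 0.2, 0.05, 0.05], [0.05, 0.05, 0.2, 0.7]]
corpus = generate_corpus(n_docs=5, doc_len=20, topic_word=topics, alpha=0.5)
```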

  18. Conceptual model of an application and its use for application documentation

    Directory of Open Access Journals (Sweden)

    Martin Vonka

    2015-04-01

    Full Text Available The following article proposes a methodology for the conceptual design of a software application. This form of design is suitable for dynamic development environments and agile principles of software development. The article discusses the required scope and style used for the description of the application. Unifying the documentation significantly reduces the time required for communication within the development team. Some parts of the documentation are obtained using reverse-engineering methods, for example by analysis of the application structure or its source code.

  19. The social networking application success model : An empirical study of Facebook and Twitter

    NARCIS (Netherlands)

    Ou, Carol; Davison, R.M.; Huang, Q.

    2016-01-01

    Social networking applications (SNAs) are among the fastest growing web applications of recent years. In this paper, we propose a causal model to assess the success of SNAs, grounded on DeLone and McLean’s updated information systems (IS) success model. In addition to their original three dimensions

  20. Multimedia Teleservices Modelled with the OSI Application Layer Structure

    NARCIS (Netherlands)

    van Rijssen, Erwin; Widya, I.A.; Michiels, E.F.; Hutchison, D.; Christiansen, H.; Coulson, G.; Danthine, A.A.S.

    This paper looks into the communications capabilities that are required by distributed multimedia applications to achieve relation-preserving information exchange. These capabilities are derived by analyzing the notion of information exchange and are embodied in communications functionalities. To