WorldWideScience

Sample records for factor analysis model

  1. Multistructure Statistical Model Applied To Factor Analysis

    Science.gov (United States)

    Bentler, Peter M.

    1976-01-01

    A general statistical model for the multivariate analysis of mean and covariance structures is described. Matrix calculus is used to develop the statistical aspects of one new special case in detail. This special case separates the confounding of principal components and factor analysis. (DEP)

  2. Model correction factor method for system analysis

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Johannesen, Johannes M.

    2000-01-01

    The Model Correction Factor Method (MCFM) is an intelligent response surface method based on simplified modelling, aimed at reliability analysis when the limit state is defined by an elaborate model. Herein it is demonstrated that the method is applicable to elaborate limit state surfaces on which several locally most central points exist without there being a simple geometric definition of the corresponding failure modes, such as is the case for collapse mechanisms in rigid plastic hinge models for frame structures. Taking as the simplified idealized model a model of similarity with the elaborate model but with clearly defined failure modes, the MCFM can be started from each idealized single-mode limit state in turn to identify a locally most central point on the elaborate limit state surface. Typically this procedure leads to fewer locally most central failure points on the elaborate limit state surface than exist in the idealized model.

  4. Analysis of effect factors-based stochastic network planning model

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Traditional stochastic network planning models treat all indeterminate factors as a whole and regard activity durations as independent random variables, thereby ignoring the inevitable dependence among activity durations when more than one activity is affected by the same indeterminate factors. Based on an analysis of the indeterminate factors affecting durations, the effect-factors-based stochastic network planning (EFBSNP) model is proposed, which accounts for the effects on the project period not only of logical and organizational relationships but also of the dependence among activity durations caused by shared indeterminate factors. By means of indeterminate-factor analysis, the model extracts and quantitatively describes the effect factors, and then evaluates their impact on the schedule using Monte Carlo simulation. The method is flexible enough to deal with effect factors and is consistent with practice. Software has been developed in Visual Studio .NET to simplify the model-based calculation. Finally, a case study demonstrates the applicability of the proposed model and shows some advantages over existing models.
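
The model's key move, correlating activity durations through a shared indeterminate factor before Monte Carlo simulation, can be sketched in a few lines. This is a minimal illustration, not the EFBSNP model itself; the two activities, the shared "weather" factor, and all coefficients are invented for the example.

```python
import random
import statistics

def simulate_project(n_sims=20000, seed=42):
    """Monte Carlo sketch: two serial activities whose durations load on a
    shared indeterminate factor, making them positively correlated."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        weather = rng.gauss(0.0, 1.0)           # shared indeterminate factor
        d1 = max(0.0, 10.0 + 2.0 * weather + rng.gauss(0.0, 1.0))
        d2 = max(0.0, 15.0 + 3.0 * weather + rng.gauss(0.0, 1.0))
        totals.append(d1 + d2)                  # activities in series
    return statistics.mean(totals), statistics.stdev(totals)

mean_dur, sd_dur = simulate_project()
# Treating the durations as independent would give sd = sqrt(5 + 10) ~ 3.9;
# the shared factor raises it to roughly sqrt(5 + 10 + 2*6) ~ 5.2.
```

The comparison in the final comment is the abstract's point: ignoring the common factor understates the spread of the project period.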

  5. Connections between Graphical Gaussian Models and Factor Analysis

    Science.gov (United States)

    Salgueiro, M. Fatima; Smith, Peter W. F.; McDonald, John W.

    2010-01-01

    Connections between graphical Gaussian models and classical single-factor models are obtained by parameterizing the single-factor model as a graphical Gaussian model. Models are represented by independence graphs, and associations between each manifest variable and the latent factor are measured by factor partial correlations. Power calculations…

  6. A Bayesian semiparametric factor analysis model for subtype identification.

    Science.gov (United States)

    Sun, Jiehuan; Warren, Joshua L; Zhao, Hongyu

    2017-04-25

    Disease subtype identification (clustering) is an important problem in biomedical research. Gene expression profiles are commonly utilized to infer disease subtypes, which often lead to biologically meaningful insights into disease. Despite many successes, existing clustering methods may not perform well when genes are highly correlated and many uninformative genes are included for clustering due to the high dimensionality. In this article, we introduce a novel subtype identification method in the Bayesian setting based on gene expression profiles. This method, called BCSub, adopts an innovative semiparametric Bayesian factor analysis model to reduce the dimension of the data to a few factor scores for clustering. Specifically, the factor scores are assumed to follow the Dirichlet process mixture model in order to induce clustering. Through extensive simulation studies, we show that BCSub has improved performance over commonly used clustering methods. When applied to two gene expression datasets, our model is able to identify subtypes that are clinically more relevant than those identified from the existing methods.
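
The two-stage idea, reduce correlated genes to a few factor scores and then cluster the scores, can be illustrated with a deliberately crude stand-in: a block average instead of a Bayesian factor model, and 2-means instead of a Dirichlet process mixture. All data and names below are synthetic.

```python
import random

rng = random.Random(0)

# Synthetic expression data: 40 samples x 10 highly correlated genes,
# driven by one latent factor whose level differs between two subtypes.
samples, labels = [], []
for i in range(40):
    subtype = i % 2
    factor = (2.0 if subtype else -2.0) + rng.gauss(0.0, 0.5)
    samples.append([factor + rng.gauss(0.0, 1.0) for _ in range(10)])
    labels.append(subtype)

# Crude stand-in for a factor score: the mean of the correlated genes.
scores = [sum(g) / len(g) for g in samples]

# 2-means clustering of the one-dimensional factor scores.
c = [min(scores), max(scores)]
for _ in range(20):
    g0 = [s for s in scores if abs(s - c[0]) <= abs(s - c[1])]
    g1 = [s for s in scores if abs(s - c[0]) > abs(s - c[1])]
    c = [sum(g0) / len(g0), sum(g1) / len(g1)]

assigned = [0 if abs(s - c[0]) <= abs(s - c[1]) else 1 for s in scores]
```

Because the genes share one latent driver, a single score per sample already separates the subtypes; the paper's contribution is doing this reduction and clustering jointly and in a principled Bayesian way.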

  7. The FIRO model of family therapy: implications of factor analysis.

    Science.gov (United States)

    Hafner, R J; Ross, M W

    1989-11-01

    Schutz's FIRO model contains three main elements: inclusion, control, and affection. It is used widely in mental health research and practice, but has received little empirical validation. The present study is based on a factor analysis of responses to the FIRO questionnaire by 120 normal couples and 191 couples who were attending a clinic for marital/psychiatric problems. Results confirmed the validity of the FIRO model for women only. The differences between the sexes reflected a considerable degree of sex-role stereotyping, the clinical implications of which are discussed.

  8. Replica Analysis for Portfolio Optimization with Single-Factor Model

    Science.gov (United States)

    Shinzato, Takashi

    2017-06-01

    In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of an optimal solution for the case where the return rate is described with a single-factor model and compare the findings obtained from our proposed methods with correlated return rates with those obtained with independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we also compare our approach with analytical procedures for minimizing the investment risk from operations research.
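
The effect the paper quantifies, correlation induced by a single common factor raising the investment risk, can be checked numerically. The sketch below uses invented parameters (equal weights, unit loadings) and is not the replica-analysis calculation itself.

```python
import random
import statistics

N = 50                        # number of assets
w = [1.0 / N] * N             # equal portfolio weights
beta = [1.0] * N              # factor loadings
sigma_f, sigma_e = 0.2, 0.3   # factor and idiosyncratic volatilities

rng = random.Random(1)
returns = []
for _ in range(20000):
    f = rng.gauss(0.0, sigma_f)                 # single common factor
    r = sum(wi * (bi * f + rng.gauss(0.0, sigma_e))
            for wi, bi in zip(w, beta))
    returns.append(r)

sim_var = statistics.variance(returns)
# Analytic portfolio variance under the single-factor model:
#   Var = (sum_i w_i beta_i)^2 sigma_f^2 + sum_i w_i^2 sigma_e^2
ana_var = (sum(wi * bi for wi, bi in zip(w, beta)) ** 2) * sigma_f ** 2 \
          + sum(wi ** 2 for wi in w) * sigma_e ** 2
# With independent returns only the second, diversifiable term survives,
# and it vanishes like 1/N; the common-factor term does not diversify away.
```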

  9. Quasi Maximum Likelihood Analysis of High Dimensional Constrained Factor Models

    OpenAIRE

    Li, Kunpeng; Li,Qi; Lu, Lina

    2016-01-01

    Factor models have been widely used in practice. However, an undesirable feature of a high dimensional factor model is that the model has too many parameters. An effective way to address this issue, proposed in a seminal work by Tsai and Tsay (2010), is to decompose the loadings matrix into a high-dimensional known matrix multiplied by a low-dimensional unknown matrix, which Tsai and Tsay (2010) call constrained factor models. This paper investigates the estimation and inferential theory ...

  10. Confirmatory factor analysis for the Eating Disorder Examination Questionnaire: Evidence supporting a three-factor model.

    Science.gov (United States)

    Barnes, Jennifer; Prescott, Tim; Muncer, Steven

    2012-12-01

    The purpose of this investigation was to compare the goodness-of-fit of a one factor model with the four factor model proposed by Fairburn (2008) and the three factor model proposed by Peterson and colleagues (2007) for the Eating Disorder Examination Questionnaire (EDE-Q 6.0) (Fairburn and Beglin, 1994). Using a cross-sectional design, the EDE-Q was completed by 569 adults recruited from universities and eating disorder charities in the UK. Confirmatory factor analysis (CFA) was carried out for both the student and non-student groups. CFA indicated that Peterson et al.'s (2007) three factor model was the best fit for both groups within the current data sample. Acceptable levels of internal reliability were observed and there was clear evidence for a hierarchical factor of eating disorder. The results of this study provide support for the three factor model of the EDE-Q suggested by Peterson and colleagues (2007) in that this model was appropriate for both the student and non-student sample populations. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    Science.gov (United States)

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amenable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  12. Factor Analysis of Drawings: Application to College Student Models of the Greenhouse Effect

    Science.gov (United States)

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-01-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance,…

  13. Foundations of factor analysis

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Introduction: Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis. Mathematical Foundations for Factor Analysis: Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions. Composite Variables and Linear Transformations: Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi…

  14. Factor Analysis of People Rather than Variables: Q and Other Two-Mode Factor Analytic Models.

    Science.gov (United States)

    Frederick, Brigitte N.

    Factor analysis attempts to study how different objects group together to form factors with the purposes of: (1) reducing the number of factorable entities (e.g., variables) with which the researcher needs to deal; (2) searching data for qualitative and quantitative differences; and (3) testing hypotheses (R. Gorsuch, 1983). While most factor…

  15. Analysis of the three-dimensional tongue shape using a three-index factor analysis model

    Science.gov (United States)

    Zheng, Yanli; Hasegawa-Johnson, Mark; Pizza, Shamala

    2003-01-01

    Three-dimensional tongue shape during vowel production is analyzed using the three-mode PARAFAC (parallel factors) model. Three-dimensional MRI images of five speakers (9 vowels) are analyzed. Sixty-five virtual fleshpoints (13 segments along the rostral-caudal dimension and 5 segments along the right-left direction) are chosen based on the interpolated tongue shape images. Methods used to adjust the alignment of MRI images, to set up the fleshpoints, and to measure the position of the fleshpoints are presented. PARAFAC analysis of this 3D coordinate data results in a stable two-factor solution that explains about 70% of the variance.

  16. Why factor analysis often is the incorrect model for analyzing bipolar concepts, and what model to use instead

    NARCIS (Netherlands)

    van Schuur, Wyijbrandt H.; Kiers, Henk A.L.

    Factor analysis of data that conform to the unfolding model often results in an extra factor. This artificial extra factor is particularly important when data that conform to a bipolar unidimensional unfolding scale are factor analyzed: one bipolar dimension is expected, but two factors are found.
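
The artifact can be reproduced in a few lines. The toy simulation below (Gaussian response functions and all parameter values are invented for illustration) generates single-peaked responses along one bipolar dimension; the resulting correlations cannot be reproduced by any single factor, because the implied squared loading of the middle item comes out negative.

```python
import math
import random

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

rng = random.Random(7)
# Respondents spread along one bipolar dimension (e.g. left-right).
theta = [rng.uniform(-3.0, 3.0) for _ in range(2000)]

def respond(item_pos):
    """Single-peaked (unfolding) response: agreement is highest for
    respondents located near the item's position."""
    return [math.exp(-(t - item_pos) ** 2) + rng.gauss(0.0, 0.05)
            for t in theta]

left, middle, right = respond(-2.0), respond(0.0), respond(2.0)

r_lr = pearson(left, right)    # opposite ends: clearly negative
r_lm = pearson(left, middle)
r_mr = pearson(middle, right)

# One factor would require r_lm * r_mr / r_lr = (loading of middle item)^2,
# a non-negative number; with unfolding data the ratio is negative.
implied_sq_loading = r_lm * r_mr / r_lr
```

Since no single-factor solution can produce these correlations, exploratory factor analysis responds by extracting a second, artificial factor.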

  17. Factor analysis

    CERN Document Server

    Gorsuch, Richard L

    2013-01-01

    Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.

  18. Multilevel Mixture Factor Models

    Science.gov (United States)

    Varriale, Roberta; Vermunt, Jeroen K.

    2012-01-01

    Factor analysis is a statistical method for describing the associations among sets of observed variables in terms of a small number of underlying continuous latent variables. Various authors have proposed multilevel extensions of the factor model for the analysis of data sets with a hierarchical structure. These Multilevel Factor Models (MFMs)…

  19. A latent profile analysis of the Five Factor Model of personality: Modeling trait interactions.

    Science.gov (United States)

    Merz, Erin L; Roesch, Scott C

    2011-12-01

    Interactions among the dimensions of the Five Factor Model (FFM) have not typically been evaluated in mental health research, with the extant literature focusing on bivariate relationships with psychological constructs of interest. This study used latent profile analysis to mimic higher-order interactions to identify homogenous personality profiles using the FFM, and also examined relationships between resultant profiles and affect, self-esteem, depression, anxiety, and coping efficacy. Participants (N = 371) completed self-report and daily diary questionnaires. A 3-profile solution provided the best fit to the data; the profiles were characterized as well-adjusted, reserved, and excitable. The well-adjusted group reported better psychological functioning in validation analyses. The reserved and excitable groups differed on anxiety, with the excitable group reporting generally higher anxiety than the reserved group. Latent profile analysis may be a parsimonious way to model personality heterogeneity.
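
Latent profile analysis fits a finite mixture to continuous indicators. As a minimal stand-in (one indicator and two profiles instead of five FFM dimensions and three profiles; all numbers invented), an EM fit of a two-component Gaussian mixture:

```python
import math
import random

rng = random.Random(3)
# Simulated scores from two latent profiles (invented parameters).
data = ([rng.gauss(-1.5, 0.7) for _ in range(300)]
        + [rng.gauss(1.5, 0.7) for _ in range(300)])

# EM for a two-component univariate Gaussian mixture.
mu, sd, prop = [-0.5, 0.5], [1.0, 1.0], [0.5, 0.5]
for _ in range(100):
    # E-step: posterior probability of each profile for each observation.
    resp = []
    for x in data:
        dens = [prop[k] / (sd[k] * math.sqrt(2.0 * math.pi))
                * math.exp(-0.5 * ((x - mu[k]) / sd[k]) ** 2)
                for k in range(2)]
        tot = dens[0] + dens[1]
        resp.append([dens[0] / tot, dens[1] / tot])
    # M-step: re-estimate proportions, means, and standard deviations.
    for k in range(2):
        nk = sum(r[k] for r in resp)
        prop[k] = nk / len(data)
        mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
        sd[k] = math.sqrt(sum(r[k] * (x - mu[k]) ** 2
                              for r, x in zip(resp, data)) / nk)
```

In a real LPA the same E and M steps run over multivariate profiles, and the number of components is chosen by fit indices, as in the 3-profile solution reported above.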

  20. DETERMINANTS OF SOVEREIGN RATING: FACTOR BASED ORDERED PROBIT MODELS FOR PANEL DATA ANALYSIS MODELING FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Dilek Teker

    2013-01-01

    The aim of this research is to compose a new rating methodology and assign credit notches to 23 countries, of which 13 are developed and 10 are emerging. There is a broad literature on the determinants of credit ratings. Following this literature, we select 11 variables for our model, of which 5 are eliminated by factor analysis. We use specific dummies to investigate structural breaks in time and cross-section, such as pre-crisis, post-crisis, BRIC membership, EU membership, OPEC membership, shipbuilder country, and platinum-reserve country. We then run an ordered probit model and assign credit notches to the countries, using Fitch ratings as the benchmark. Finally, we compare the Fitch notches with the ones derived from our estimated model.
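
An ordered probit maps a latent score x'b into rating categories through estimated cutpoints; each category's probability is a difference of normal CDFs. A sketch with invented cutpoints (the paper's estimates are not reproduced here):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordered_probit_probs(xb, cuts):
    """P(rating = k) = Phi(c_k - x'b) - Phi(c_{k-1} - x'b),
    with c_0 = -inf and c_K = +inf."""
    bounds = [-math.inf] + list(cuts) + [math.inf]
    return [norm_cdf(b - xb) - norm_cdf(a - xb)
            for a, b in zip(bounds, bounds[1:])]

# Illustrative: three rating notches with cutpoints at -1 and 1,
# for a country whose factor-based score is x'b = 0.5.
probs = ordered_probit_probs(xb=0.5, cuts=[-1.0, 1.0])
```

Estimation then chooses the coefficients and cutpoints that maximize the likelihood of the observed notches; the predicted notch is the category with the highest probability.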

  1. Entrance and exit region friction factor models for annular seal analysis. Ph.D. Thesis

    Science.gov (United States)

    Elrod, David Alan

    1988-01-01

    The Mach number definition and boundary conditions in Nelson's nominally-centered, annular gas seal analysis are revised. A method is described for determining the wall shear stress characteristics of an annular gas seal experimentally. Two friction factor models are developed for annular seal analysis; one model is based on flat-plate flow theory; the other uses empirical entrance and exit region friction factors. The friction factor predictions of the models are compared to experimental results. Each friction model is used in an annular gas seal analysis. The seal characteristics predicted by the two seal analyses are compared to experimental results and to the predictions of Nelson's analysis. The comparisons are for smooth-rotor seals with smooth and honeycomb stators. The comparisons show that the analysis which uses empirical entrance and exit region shear stress models predicts the static and stability characteristics of annular gas seals better than the other analyses. The analyses predict direct stiffness poorly.
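
For context, bulk-flow seal analyses of this kind compute wall shear stress from a friction-factor correlation; a common flat-plate-style choice is a Blasius-type power law. The coefficients below are generic smooth-surface values, not the entrance- and exit-region factors fitted in the thesis.

```python
def blasius_friction_factor(re, n=0.079, m=-0.25):
    """Blasius-type power-law (Fanning) friction factor f = n * Re**m.
    n and m are typical smooth-surface values, used here for illustration."""
    return n * re ** m

# Bulk-flow seal codes apply such a law to get the wall shear stress:
#   tau = f * (rho * u**2 / 2)
rho, u = 1.2, 50.0                 # air density [kg/m^3], bulk speed [m/s]
f = blasius_friction_factor(2.0e4) # Reynolds number of the seal flow
tau = f * rho * u ** 2 / 2.0       # wall shear stress [Pa]
```

The thesis's contribution is replacing a single correlation of this form with separate empirical entrance- and exit-region friction factors.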

  2. Analysis of Korean Students' International Mobility by 2-D Model: Driving Force Factor and Directional Factor

    Science.gov (United States)

    Park, Elisa L.

    2009-01-01

    The purpose of this study is to understand the dynamics of Korean students' international mobility to study abroad by using the 2-D Model. The first D, "the driving force factor," explains how and what components of the dissatisfaction with domestic higher education perceived by Korean students drives students' outward mobility to seek…

  3. A latent factor linear mixed model for high-dimensional longitudinal data analysis.

    Science.gov (United States)

    An, Xinming; Yang, Qing; Bentler, Peter M

    2013-10-30

    High-dimensional longitudinal data involving latent variables such as depression and anxiety that cannot be quantified directly are often encountered in biomedical and social sciences. Multiple responses are used to characterize these latent quantities, and repeated measures are collected to capture their trends over time. Furthermore, substantive research questions may concern issues such as interrelated trends among latent variables that can only be addressed by modeling them jointly. Although statistical analysis of univariate longitudinal data has been well developed, methods for modeling multivariate high-dimensional longitudinal data are still under development. In this paper, we propose a latent factor linear mixed model (LFLMM) for analyzing this type of data. This model is a combination of the factor analysis and multivariate linear mixed models. Under this modeling framework, we reduced the high-dimensional responses to low-dimensional latent factors by the factor analysis model, and then we used the multivariate linear mixed model to study the longitudinal trends of these latent factors. We developed an expectation-maximization algorithm to estimate the model. We used simulation studies to investigate the computational properties of the expectation-maximization algorithm and compare the LFLMM model with other approaches for high-dimensional longitudinal data analysis. We used a real data example to illustrate the practical usefulness of the model. Copyright © 2013 John Wiley & Sons, Ltd.

  4. A Hierarchical Linear Model with Factor Analysis Structure at Level 2

    Science.gov (United States)

    Miyazaki, Yasuo; Frank, Kenneth A.

    2006-01-01

    In this article the authors develop a model that employs a factor analysis structure at Level 2 of a two-level hierarchical linear model (HLM). The model (HLM2F) imposes a structure on a deficient rank Level 2 covariance matrix [tau], and facilitates estimation of a relatively large [tau] matrix. Maximum likelihood estimators are derived via the…

  5. Verification of the model of predisposition in triathlon – structural model of confirmative factor analysis

    Directory of Open Access Journals (Sweden)

    Lenka Kovářová

    2012-09-01

    BACKGROUND: The triathlon combines three different sports: swimming, cycling, and running. Each requires different top-level predispositions, and a comprehensive approach to talent selection is a rather difficult process. Attempts to identify predispositions in the triathlon have so far been narrow, focusing only on some groups of predispositions (physiology, motor tests, and psychology); recent studies have lacked a structural approach and were based on determinants of sport performance, the theory of sports training, and expert assessment. OBJECTIVE: The aim of our study was to verify a model of predispositions in the short triathlon for talent assessment of young male athletes aged 17–20 years. METHODS: The research sample consisted of 55 top-level male triathletes who were included in the government-supported sports talent programme in the Czech Republic at the age of 17–20 years. We used confirmatory factor analysis (FA) and a path diagram to verify the model, which allows us to explain mutual relationships among observed variables. For statistical processing we used structural equation modeling (SEM) with the software Lisrel L88. RESULTS: The study confirms the best structural model for talent selection in the triathlon for men aged 17–20 years, which comprised seventeen indicators (tests) and explained 91% of all cross-correlations (Goodness of Fit Index /GFI/ 0.91, Root Mean Square Residual /RMSR/ 0.13). Tests for predispositions in the triathlon were grouped into five clusters: three motor predispositions (swimming, cycling, and running skills), aerobic predispositions, and psychological predispositions. Aerobic predispositions showed the highest importance for the general factor (1.00; 0. Running predispositions were a very significant factor (–0.85; 0.28), which confirms the importance of this critical stage of the race. Lower factor weights were shown by the clusters of swimming (–0.61; 0.63) and cycling (0.53; 0

  6. Multilevel Factor Analysis and Structural Equation Modeling of Daily Diary Coping Data: Modeling Trait and State Variation

    Science.gov (United States)

    Roesch, Scott C.; Aldridge, Arianna A.; Stocking, Stephanie N.; Villodas, Feion; Leung, Queenie; Bartley, Carrie E.; Black, Lisa J.

    2010-01-01

    This study used multilevel modeling of daily diary data to model within-person (state) and between-person (trait) components of coping variables. This application included the introduction of multilevel factor analysis (MFA) and a comparison of the predictive ability of these trait/state factors. Daily diary data were collected on a large (n =…

  7. Dynamic factor analysis in the frequency domain: causal modeling of multivariate psychophysiological time series

    NARCIS (Netherlands)

    Molenaar, P.C.M.

    1987-01-01

    Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic a

  8. A receptor model for urban aerosols based on oblique factor analysis

    DEFF Research Database (Denmark)

    Keiding, Kristian; Sørensen, Morten S.; Pind, Niels

    1987-01-01

    A procedure is outlined for the construction of receptor models of urban aerosols based on factor analysis. The advantage of the procedure is that the covariation of source impacts is included in the construction of the models. The results are compared with results obtained by other receptor-modelling procedures. It was found that procedures based on correlated sources were physically sound as well as in mutual agreement, whereas procedures based on non-correlated sources were found to generate physically obscure models.

  9. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  10. The five-factor model of the Positive and Negative Syndrome Scale - I : Confirmatory factor analysis fails to confirm 25 published five-factor solutions

    NARCIS (Netherlands)

    van der Gaag, Mark; Cuijpers, Anke; Hoffman, Tonko; Remijsen, Mila; Hijman, Ron; de Haan, Lieuwe; van Meijel, Berno; van Harten, Peter N.; Valmaggia, Lucia; de Hert, Marc; Wiersma, Durk

    2006-01-01

    Objective: The aim of this study was to test the goodness-of-fit of all previously published five-factor models of the Positive and Negative Syndrome Scale (PANSS). Methods: We used confirmatory factor analysis (CFA) with a large data set (N = 5769). Results: The different subsamples were tested for

  11. Sensitivity Analysis to Select the Most Influential Risk Factors in a Logistic Regression Model

    Directory of Open Access Journals (Sweden)

    Jassim N. Hussain

    2008-01-01

    The traditional variable selection methods for survival data depend on iterative procedures, and control of this process requires tuning parameters, which is problematic and time-consuming, especially if the models are complex and have a large number of risk factors. In this paper, we propose a new method based on global sensitivity analysis (GSA) to select the most influential risk factors. This simplifies the logistic regression model by excluding irrelevant risk factors and thus eliminates the need to fit and evaluate a large number of models. Data from medical trials are used to test the efficiency and capability of this method and its ability to simplify the model, leading to the construction of an appropriate model. The proposed method ranks the risk factors according to their importance.
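
As a much-simplified stand-in for the proposed GSA ranking (a one-at-a-time variance screen rather than full global sensitivity indices; the logistic model and its coefficients are invented), risk factors can be ranked by how much output variance each one induces on its own:

```python
import math
import random
import statistics

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical logistic risk model with three standardized risk factors.
beta = {"age": 0.8, "bmi": 0.3, "smoking": 1.5}

def risk(x):
    return sigmoid(sum(beta[k] * x[k] for k in beta) - 1.0)

rng = random.Random(11)

def oat_variance(name, n=5000):
    """Variance of the predicted risk when only `name` varies
    (the other factors are held at their means)."""
    out = [risk({k: (rng.gauss(0.0, 1.0) if k == name else 0.0)
                 for k in beta})
           for _ in range(n)]
    return statistics.variance(out)

ranking = sorted(beta, key=oat_variance, reverse=True)
# The factor with the largest coefficient should dominate the ranking,
# flagging the others as candidates for exclusion.
```

This one-at-a-time screen agrees with first-order sensitivity indices only when inputs are independent and the model is near-additive; the paper's GSA handles the general case.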

  12. Sparse Probabilistic Parallel Factor Analysis for the Modeling of PET and Task-fMRI Data

    DEFF Research Database (Denmark)

    Beliveau, Vincent; Papoutsakis, Georgios; Hinrich, Jesper Løve

    2017-01-01

    Modern datasets are often multiway in nature and can contain patterns common to a mode of the data (e.g. space, time, and subjects). Multiway decompositions such as parallel factor analysis (PARAFAC) take into account the intrinsic structure of the data, and sparse versions of these methods improve interpretability of the results. Here we propose a variational Bayesian parallel factor analysis (VB-PARAFAC) model and an extension with sparse priors (SP-PARAFAC). Notably, our formulation admits time- and subject-specific noise modeling as well as subject-specific offsets (i.e., mean values). We confirmed the validity of the models through simulation and performed exploratory analysis of positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) data. Although more constrained, the proposed models performed similarly to more flexible models in approximating the PET data, which supports...

  13. The application of Global Sensitivity Analysis to quantify the dominant input factors for hydraulic model simulations

    Science.gov (United States)

    Savage, James; Pianosi, Francesca; Bates, Paul; Freer, Jim; Wagener, Thorsten

    2015-04-01

    Predicting flood inundation extents using hydraulic models is subject to a number of critical uncertainties. For a specific event, these uncertainties are known to have a large influence on model outputs and any subsequent analyses made by risk managers. Hydraulic modellers often approach such problems by applying uncertainty analysis techniques such as the Generalised Likelihood Uncertainty Estimation (GLUE) methodology. However, these methods do not allow one to attribute which source of uncertainty has the most influence on the various model outputs that inform flood risk decision making. Another issue facing modellers is the amount of computational resource that is available to spend on modelling flood inundations that are 'fit for purpose' to the modelling objectives. Therefore a balance needs to be struck between computation time, realism and spatial resolution, and effectively characterising the uncertainty spread of predictions (for example from boundary conditions and model parameterisations). However, it is not fully understood how much of an impact each factor has on model performance, for example how much influence changing the spatial resolution of a model has on inundation predictions in comparison to other uncertainties inherent in the modelling process. Furthermore, when resampling fine scale topographic data in the form of a Digital Elevation Model (DEM) to coarser resolutions, there are a number of possible coarser DEMs that can be produced. Deciding which DEM is then chosen to represent the surface elevations in the model could also influence model performance. In this study we model a flood event using the hydraulic model LISFLOOD-FP and apply Sobol' Sensitivity Analysis to estimate which input factor, among the uncertainty in model boundary conditions, uncertain model parameters, the spatial resolution of the DEM and the choice of resampled DEM, have the most influence on a range of model outputs. These outputs include whole domain maximum
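
The Sobol' first-order index used in such attribution can be estimated with the standard two-matrix (Saltelli-style) scheme. The sketch below runs it on a toy additive model standing in for the far more expensive LISFLOOD-FP simulations; the model and its coefficients are purely illustrative.

```python
import random

def model(x1, x2, x3):
    # Toy stand-in for the hydraulic code: x1 dominates the output.
    return 4.0 * x1 + 1.0 * x2 + 0.25 * x3

rng = random.Random(5)
N = 20000
A = [[rng.random() for _ in range(3)] for _ in range(N)]   # base sample
B = [[rng.random() for _ in range(3)] for _ in range(N)]   # resample

fA = [model(*a) for a in A]
fB = [model(*b) for b in B]
mean_f = sum(fA) / N
var_f = sum((y - mean_f) ** 2 for y in fA) / (N - 1)

def first_order(i):
    """Saltelli-style estimator of the Sobol' first-order index:
    S_i = E[f(B) * (f(A with column i from B) - f(A))] / Var(f)."""
    acc = 0.0
    for a, b, ya, yb in zip(A, B, fA, fB):
        ab = list(a)
        ab[i] = b[i]                 # A with column i taken from B
        acc += yb * (model(*ab) - ya)
    return acc / (N * var_f)

S = [first_order(i) for i in range(3)]
# For an additive model the first-order indices sum to ~1,
# and the dominant input is identified directly.
```

In the study's setting each "model call" is a full inundation run, so the estimator's cost (N times the number of inputs plus two base samples) is exactly the budget constraint the abstract discusses.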

  14. Factor analysis models for structuring covariance matrices of additive genetic effects: a Bayesian implementation

    Directory of Open Access Journals (Sweden)

    Gianola Daniel

    2007-09-01

    Multivariate linear models are increasingly important in quantitative genetics. In high dimensional specifications, factor analysis (FA) may provide an avenue for structuring (co)variance matrices, thus reducing the number of parameters needed for describing (co)dispersion. We describe how FA can be used to model genetic effects in the context of a multivariate linear mixed model. An orthogonal common factor structure is used to model genetic effects under the Gaussian assumption, so that the marginal likelihood is multivariate normal with a structured genetic (co)variance matrix. Under standard prior assumptions, all fully conditional distributions have closed form, and samples from the joint posterior distribution can be obtained via Gibbs sampling. The model and the algorithm developed for its Bayesian implementation were used to describe five repeated records of milk yield in dairy cattle, and a model with one common factor was compared with a standard multiple trait model. The Bayesian Information Criterion favored the FA model.

  15. Modeling Indicator Systems for Evaluating Environmental Sustainable Development Based on Factor Analysis

    Institute of Scientific and Technical Information of China (English)

    WU Hao; CHEN Xiaoling; HE Ying; HE Xiaorong; CAI Xiaobin; XU Keyan

    2006-01-01

    Indicator systems for environmental sustainable development in the Poyang Lake Basin are established from 51 elementary indexes by factor analysis, which comprises four steps: specifying the factor model, estimating its parameters, rotating the factors, and computing factor scores. Under the condition that the cumulative proportion of variance explained is greater than 85%, 5 explicit factors of environmental sustainable development, together with their factor scores by region, are obtained. The results indicate that the main factors affecting the basin environment, in descending order of impact, are volume of water, volume of waste gas discharged, volume of solid wastes, the degree of comprehensive utilization of waste gas, waste water and solid wastes, and the emission volume of waste gas, waste water and solid wastes. The results provide decision support for formulating sustainable development strategies and for evaluating the sustainable development status of each city.

  16. Assessing Model Fit: Caveats and Recommendations for Confirmatory Factor Analysis and Exploratory Structural Equation Modeling

    Science.gov (United States)

    Perry, John L.; Nicholls, Adam R.; Clough, Peter J.; Crust, Lee

    2015-01-01

    Despite the limitations of overgeneralizing cutoff values for confirmatory factor analysis (CFA; e.g., Marsh, Hau, & Wen, 2004), they are still often employed as golden rules for assessing factorial validity in sport and exercise psychology. The purpose of this study was to investigate the appropriateness of using the CFA approach with these…

  17. Multilevel Factor Analysis by Model Segregation: New Applications for Robust Test Statistics

    Science.gov (United States)

    Schweig, Jonathan

    2014-01-01

    Measures of classroom environments have become central to policy efforts that assess school and teacher quality. This has sparked a wide interest in using multilevel factor analysis to test measurement hypotheses about classroom-level variables. One approach partitions the total covariance matrix and tests models separately on the…

  18. Light-Front Quark Model Analysis of Meson-Photon Transition Form Factor

    CERN Document Server

    Choi, Ho-Meoyng

    2016-01-01

    We discuss $(\pi^0,\eta,\eta')\to\gamma^*\gamma$ transition form factors using the light-front quark model. Our discussion includes the analysis of the mixing angles for $\eta-\eta'$. Our results for $Q^2 F_{(\pi^0,\eta,\eta')\to\gamma^*\gamma}(Q^2)$ show scaling behavior at high $Q^2$ consistent with pQCD predictions.

  19. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor......, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  20. Assessing Heterogeneity for Factor Analysis Model with Continuous and Ordinal Outcomes

    Directory of Open Access Journals (Sweden)

    Ye-Mao Xia

    2016-01-01

    Full Text Available Factor analysis models with continuous and ordinal responses are a useful tool for assessing relations between latent variables and mixed observed responses. These models have been successfully applied to many fields, including the behavioral, educational, and social-psychological sciences. However, within the Bayesian analysis framework, most developments are constrained within parametric families, in which particular distributions are specified for the parameters of interest. This leads to difficulty in dealing with outliers and/or distributional deviations. In this paper, we propose a Bayesian semiparametric approach for the factor analysis model with continuous and ordinal variables. A truncated stick-breaking prior is used to model the distributions of the intercept and/or covariance structural parameters. Bayesian posterior analysis is carried out through simulation-based methods, with a blocked Gibbs sampler implemented to draw observations from the complicated posterior. For model selection, the logarithm of the pseudo-marginal likelihood is used to compare the competing models. Empirical results are presented to illustrate the application of the methodology.
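The truncated stick-breaking prior mentioned above builds a discrete random distribution from Beta draws. A sketch of drawing the truncated weights; the concentration parameter alpha and the truncation level are illustrative, not values from the paper:

```python
import random

def stick_breaking_weights(alpha, n_atoms, seed=0):
    """Truncated stick-breaking construction: v_k ~ Beta(1, alpha) and
    w_k = v_k * prod_{j<k}(1 - v_j); the last atom takes the leftover mass."""
    rng = random.Random(seed)
    weights = []
    remaining = 1.0
    for _ in range(n_atoms - 1):
        v = rng.betavariate(1.0, alpha)
        weights.append(v * remaining)
        remaining *= 1.0 - v
    weights.append(remaining)  # truncation at n_atoms atoms
    return weights

w = stick_breaking_weights(alpha=2.0, n_atoms=10)
print(round(sum(w), 6))  # weights sum to 1 by construction
```

Larger alpha spreads the mass over more atoms; smaller alpha concentrates it on the first few.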

  1. Model endophenotype for bipolar disorder: Qualitative Analysis, etiological factors, and research areas

    Directory of Open Access Journals (Sweden)

    Naraiana de Oliveira Tavares

    2014-12-01

    Full Text Available The aim of this study is to present an updated view of the writings on the endophenotype model for bipolar disorder using analytical methodologies. A review and analysis of networks was performed through descriptors and keywords that characterize the composition of the endophenotype model as a model of health. Information was collected from between 1992 and 2014, and the main thematic areas covered in the articles were identified. We discuss the results and question their cohesion, emphasizing the need to strengthen and identify the points of connection between etiological factors and characteristics that make up the model of endophenotypes for bipolar disorder.

  2. Confirmatory Factor Analysis of the Structure of Statistics Anxiety Measure: An Examination of Four Alternative Models

    Directory of Open Access Journals (Sweden)

    Hossein Bevrani, PhD

    2011-09-01

    Full Text Available Objective: The aim of this study is to report the confirmatory factor analysis results for the Persian adaptation of the Statistics Anxiety Measure (SAM), proposed by Earp. Method: The validity and reliability assessments of the scale were performed on 298 college students chosen randomly from Tabriz University in Iran. Confirmatory factor analysis (CFA) was carried out to determine the factor structure of the Persian adaptation of the SAM. Results: As expected, the second-order model provided a better fit to the data than the three alternative models. Conclusions: Hence, the SAM provides an equally valid measure for use among college students. The study both expands and adds support to the existing body of math anxiety literature.

  3. Reliability Analysis of a Composite Blade Structure Using the Model Correction Factor Method

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimiroy; Friis-Hansen, Peter; Berggreen, Christian

    2010-01-01

    This paper presents a reliability analysis of a composite blade profile. The so-called Model Correction Factor technique is applied as an effective alternative to the response surface technique. The structural reliability is determined by use of a simplified idealised analytical model which, in a probabilistic sense, is model corrected so that it, close to the design point, represents the same structural behaviour as a realistic FE model. This approach leads to considerable improvement of computational efficiency over classical response surface methods, because the numerically “cheap” idealised model is used as the response surface, while the time-consuming detailed model is called only a few times until the simplified model is calibrated to the detailed model....

  4. An integrated factor analysis model for product eco-design based on full life cycle assessment

    Directory of Open Access Journals (Sweden)

    Zhi fang Zhou

    2016-02-01

    Full Text Available Purpose: Among the methods of comprehensive analysis for a product or an enterprise, there exist defects and deficiencies in traditional standard cost analyses and life cycle assessment methods. For example, some methods emphasize only one dimension (such as economic or environmental factors) while neglecting other relevant dimensions. This paper builds a factor analysis model of resource value flow, based on full life cycle assessment and eco-design theory, in order to expose the internal logic between these two factors. Design/methodology/approach: The model takes the efficient multiplication of resources, economic efficiency, and environmental efficiency as its core objectives. It studies the status of resource value flow during the entire life cycle of a product, and gives an in-depth analysis of the mutual logical relationships among product performance, value, resource consumption, and environmental load to reveal symptoms and potentials in different dimensions. Originality/value: This provides comprehensive, accurate and timely decision-making information for enterprise managers regarding product eco-design, as well as production and management activities. To conclude, the applicability of this evaluation and analysis model is verified using a Chinese SUV manufacturer as an example.

  5. Structural equation modeling analysis of factors influencing architects' trust in project design teams

    Institute of Scientific and Technical Information of China (English)

    DING Zhi-kun; NG Fung-fai; WANG Jia-yuan

    2009-01-01

    This paper describes a structural equation modeling (SEM) analysis of factors influencing architects' trust in project design teams. We undertook a survey of architects, during which we distributed 193 questionnaires in 29 A-level architectural firms. We used Amos 6.0 for SEM to identify significant personal-construct-based factors affecting interpersonal trust. The results show that only social interaction between architects significantly affects their interpersonal trust. The explained variance of trust is not very high in the model; therefore, future research should add more factors to the current model. The practical implication is that team managers should promote social interactions between team members so that the level of interpersonal trust between them can be improved.

  6. Modeling and Analysis of Mechanical Quality Factor of the Resonator for Cylinder Vibratory Gyroscope

    Institute of Scientific and Technical Information of China (English)

    XI Xiang; WU Xuezhong; WU Yulie; ZHANG Yongmeng

    2017-01-01

    Mechanical Quality factor (Q factor) of the resonator is an important parameter for the cylinder vibratory gyroscope (CVG). Traditional analytical methods mainly focus on a partial energy loss during the vibration process of the CVG resonator, thus are not accurate for the mechanical Q factor prediction. Therefore an integrated model including air damping loss, surface defect loss, support loss, thermoelastic damping loss and internal friction loss is proposed to obtain the mechanical Q factor of the CVG resonator. Based on structural dynamics and energy dissipation analysis, the contribution of each energy loss to the total mechanical Q factor is quantificationally analyzed. For the resonator with radius ranging from 10 mm to 20 mm, its mechanical Q factor is mainly related to the support loss, thermoelastic damping loss and internal friction loss, which are fundamentally determined by the geometric sizes and material properties of the resonator. In addition, resonators made of alloy 3J53 (Ni42CrTiAl), with different sizes, were experimentally fabricated to test the mechanical Q factor. The theoretical model is well verified by the experimental data, thus provides an effective theoretical method to design and predict the mechanical Q factor of the CVG resonator.
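Independent loss mechanisms of the kind listed above combine reciprocally into the total Q factor: 1/Q_total = sum_i 1/Q_i. A minimal sketch with hypothetical per-mechanism Q values, not figures from the paper:

```python
def total_q_factor(q_contributions):
    """Independent loss mechanisms add reciprocally:
    1/Q_total = 1/Q_support + 1/Q_thermoelastic + 1/Q_friction + ..."""
    return 1.0 / sum(1.0 / q for q in q_contributions)

# Hypothetical Q values for support, thermoelastic and internal-friction losses:
print(round(total_q_factor([50000.0, 30000.0, 100000.0])))  # -> 15789
```

The reciprocal sum means the smallest individual Q (the dominant loss) largely sets the total, which is why the paper singles out the few dominant mechanisms.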

  7. Modeling and analysis of mechanical Quality factor of the resonator for cylinder vibratory gyroscope

    Science.gov (United States)

    Xi, Xiang; Wu, Xuezhong; Wu, Yulie; Zhang, Yongmeng

    2016-08-01

    Mechanical Quality factor(Q factor) of the resonator is an important parameter for the cylinder vibratory gyroscope(CVG). Traditional analytical methods mainly focus on a partial energy loss during the vibration process of the CVG resonator, thus are not accurate for the mechanical Q factor prediction. Therefore an integrated model including air damping loss, surface defect loss, support loss, thermoelastic damping loss and internal friction loss is proposed to obtain the mechanical Q factor of the CVG resonator. Based on structural dynamics and energy dissipation analysis, the contribution of each energy loss to the total mechanical Q factor is quantificationally analyzed. For the resonator with radius ranging from 10 mm to 20 mm, its mechanical Q factor is mainly related to the support loss, thermoelastic damping loss and internal friction loss, which are fundamentally determined by the geometric sizes and material properties of the resonator. In addition, resonators made of alloy 3J53 (Ni42CrTiAl), with different sizes, were experimentally fabricated to test the mechanical Q factor. The theoretical model is well verified by the experimental data, thus provides an effective theoretical method to design and predict the mechanical Q factor of the CVG resonator.

  8. Five-factor model of personality and job satisfaction: a meta-analysis.

    Science.gov (United States)

    Judge, Timothy A; Heller, Daniel; Mount, Michael K

    2002-06-01

    This study reports results of a meta-analysis linking traits from the 5-factor model of personality to overall job satisfaction. Using the model as an organizing framework, 334 correlations from 163 independent samples were classified according to the model. The estimated true score correlations with job satisfaction were -.29 for Neuroticism, .25 for Extraversion, .02 for Openness to Experience, .17 for Agreeableness, and .26 for Conscientiousness. Results further indicated that only the relations of Neuroticism and Extraversion with job satisfaction generalized across studies. As a set, the Big Five traits had a multiple correlation of .41 with job satisfaction, indicating support for the validity of the dispositional source of job satisfaction when traits are organized according to the 5-factor model.
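The estimated true-score correlations above come from psychometric meta-analysis. Its basic first step, a sample-size-weighted mean of observed correlations, can be sketched with made-up study data (Hunter-Schmidt style, without the artifact corrections the paper applies):

```python
def weighted_mean_correlation(studies):
    """Sample-size-weighted mean observed correlation across studies;
    each study is an (n, r) pair."""
    total_n = sum(n for n, _ in studies)
    return sum(n * r for n, r in studies) / total_n

# Hypothetical (sample size, observed r) pairs for one trait:
studies = [(120, -0.31), (250, -0.27), (90, -0.33)]
print(round(weighted_mean_correlation(studies), 3))  # -> -0.292
```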

  9. Confirmatory factor analysis, latent profile analysis, and factor mixture modeling of the syndromes of the Child Behavior Checklist and Teacher Report Form.

    Science.gov (United States)

    Gomez, Rapson; Vance, Alasdair

    2014-12-01

    The current study used confirmatory factor analysis (CFA), latent profile analysis (LPA), and factor mixture modeling (FMM) to examine the co-occurrence of childhood syndromes using the Child Behavior Checklist (CBCL) and Teacher Report Form (TRF). Parents and teachers completed the CBCL and TRF, respectively, for a clinic-referred sample of 720 children, ages 7-12 years. For the CBCL, the analyses indicated most support for a 2-class 2-factor FMM, and for the TRF, there was most support for a 2-class 3-factor model. The two classes comprised children with all syndromes at average levels and children with all syndromes at high levels. The findings indicate high syndrome co-occurrence. The implications of the findings for understanding syndrome co-occurrence in the CBCL and TRF, theories of syndrome co-occurrence, and the clinical use of the CBCL and TRF are discussed.

  10. A factor analysis-multiple regression model for source apportionment of suspended particulate matter

    Science.gov (United States)

    Okamoto, Shin'ichi; Hayashi, Masayuki; Nakajima, Masaomi; Kainuma, Yasutaka; Shiozawa, Kiyoshige

    A factor analysis-multiple regression (FA-MR) model has been used for a source apportionment study in the Tokyo metropolitan area. By varimax-rotated factor analysis, five source types could be identified: refuse incineration, soil and automobile, secondary particles, sea salt, and steel mill. Quantitative estimates from the FA-MR model corresponded to the contributing concentrations calculated with a weighted least-squares CMB model. However, the refuse-incineration source identified by the FA-MR model was more similar to biomass burning than to an incineration plant. The FA-MR estimates of the sea salt and steel mill contributions contained those of other sources with the same temporal variation of contributing concentrations; this artifact was caused by multicollinearity. Although this result shows the limitation of the multivariate receptor model, the model gives useful information concerning source types and their distribution when compared with the results of the CMB model. In the Tokyo metropolitan area, the contributions from soil (including road dust), automobile, secondary particles and refuse incineration (biomass burning) were larger than the industrial contributions from fuel oil combustion and steel mill. However, since vanadium is highly correlated with SO₄²⁻ and other elements related to secondary particles, a major portion of the secondary particles is considered to be related to fuel oil combustion.

  11. Multi-factor Analysis Model for Improving Profit Management Using Excel in Shellfish Farming Projects

    Institute of Scientific and Technical Information of China (English)

    Zhuming ZHAO; Changlin LIU; Xiujuan SHAN; Jin YU

    2013-01-01

    Using data from a farm in Yantai City, the theory of Cost-Volume-Profit analysis, and financial management methods, this paper constructs a multi-factor analysis model in Excel 2007 for improving profit management in shellfish farming projects, and describes the procedures for constructing such a model. The model can quickly calculate profit, improve the level of profit management, find the breakeven point, and enhance the decision-making efficiency of businesses. It can also serve as a simple analysis tool offering suggestions for government decisions and for corporate economic decisions. While effort has been exerted to construct a four-variable model, some equally important variables may not be discussed sufficiently due to limitations of the paper's space and the authors' knowledge. All variables can be listed in Excel 2007 and associated in a logical way to manage the profit of shellfish farming projects more efficiently and more practically.
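The breakeven computation at the heart of Cost-Volume-Profit analysis is simple to reproduce outside a spreadsheet; the figures below are hypothetical, not the farm's actual numbers:

```python
def breakeven_units(fixed_cost, price_per_unit, variable_cost_per_unit):
    """Sales volume at which revenue exactly covers fixed plus variable costs."""
    contribution_margin = price_per_unit - variable_cost_per_unit
    if contribution_margin <= 0:
        raise ValueError("price must exceed variable cost per unit")
    return fixed_cost / contribution_margin

def profit(units, fixed_cost, price_per_unit, variable_cost_per_unit):
    """Profit = volume * contribution margin - fixed cost."""
    return units * (price_per_unit - variable_cost_per_unit) - fixed_cost

# Hypothetical shellfish-farm figures (currency units per kg):
print(breakeven_units(200000.0, 12.0, 7.0))  # -> 40000.0
print(profit(50000, 200000.0, 12.0, 7.0))    # -> 50000.0
```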

  12. An analysis of a three-factor model proposed by the Danish Society of Actuaries for forecasting and risk analysis

    DEFF Research Database (Denmark)

    Jørgensen, Peter Løchte; Slipsager, Søren Kærgaard

    2016-01-01

    This paper provides the explicit solution to the three-factor diffusion model recently proposed by the Danish Society of Actuaries to the Danish industry of life insurance and pensions. The solution is obtained by use of the known general solution to multidimensional linear stochastic differential...... well-known risk measures under both schemes. Finally, we conduct a sensitivity analysis and find that the relative performance of the two schemes depends on the chosen model parameter estimates....

  13. The SAM framework: modeling the effects of management factors on human behavior in risk analysis.

    Science.gov (United States)

    Murphy, D M; Paté-Cornell, M E

    1996-08-01

    Complex engineered systems, such as nuclear reactors and chemical plants, have the potential for catastrophic failure with disastrous consequences. In recent years, human and management factors have been recognized as frequent root causes of major failures in such systems. However, classical probabilistic risk analysis (PRA) techniques do not account for the underlying causes of these errors because they focus on the physical system and do not explicitly address the link between components' performance and organizational factors. This paper describes a general approach for addressing the human and management causes of system failure, called the SAM (System-Action-Management) framework. Beginning with a quantitative risk model of the physical system, SAM expands the scope of analysis to incorporate first the decisions and actions of individuals that affect the physical system. SAM then links management factors (incentives, training, policies and procedures, selection criteria, etc.) to those decisions and actions. The focus of this paper is on four quantitative models of action that describe this last relationship. These models address the formation of intentions for action and their execution as a function of the organizational environment. Intention formation is described by three alternative models: a rational model, a bounded rationality model, and a rule-based model. The execution of intentions is then modeled separately. These four models are designed to assess the probabilities of individual actions from the perspective of management, thus reflecting the uncertainties inherent to human behavior. The SAM framework is illustrated for a hypothetical case of hazardous materials transportation. This framework can be used as a tool to increase the safety and reliability of complex technical systems by modifying the organization, rather than, or in addition to, re-designing the physical system.

  14. A comparison of ordinal regression models in an analysis of factors associated with periodontal disease

    Directory of Open Access Journals (Sweden)

    Javali Shivalingappa

    2010-01-01

    Full Text Available Aim: The study aimed to determine the factors associated with periodontal disease (at different levels of severity) by using different regression models for ordinal data. Design: A cross-sectional design was employed, using clinical examination and a 'questionnaire with interview' method. Materials and Methods: The study was conducted from June 2008 to October 2008 in Dharwad, Karnataka, India. It involved a systematic random sample of 1760 individuals aged 18-40 years. The periodontal disease examination was conducted using the Community Periodontal Index for Treatment Needs (CPITN). Statistical Analysis Used: Regression models for ordinal data with different built-in link functions were used to determine the factors associated with periodontal disease. Results: The study findings indicated that the ordinal regression models with four built-in link functions (logit, probit, clog-log and nlog-log) displayed similar results, with negligible differences in the significant factors associated with periodontal disease. Factors such as religion, caste, source of drinking water, timing of sweet consumption, timing of cleaning or brushing the teeth, and materials used for brushing the teeth were significantly associated with periodontal disease in all ordinal models. Conclusions: The ordinal regression model with the clog-log link is a better fit for determining significant factors associated with periodontal disease than models with the logit, probit and nlog-log built-in link functions. Factors such as caste and timing of sweet consumption are negatively associated with periodontal disease, while religion, source of drinking water, timing of cleaning or brushing the teeth, and materials used for brushing the teeth are significantly and positively associated with periodontal disease.
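The four built-in link functions compared in the study map a cumulative probability onto the linear-predictor scale. They can be sketched as follows, using Python's `statistics.NormalDist` for the probit:

```python
import math
from statistics import NormalDist

def logit(p):
    return math.log(p / (1.0 - p))

def probit(p):
    return NormalDist().inv_cdf(p)

def cloglog(p):                      # complementary log-log
    return math.log(-math.log(1.0 - p))

def nloglog(p):                      # negative log-log
    return -math.log(-math.log(p))

p = 0.5
print(round(logit(p), 4), round(probit(p), 4))  # -> 0.0 0.0
```

Logit and probit are symmetric around p = 0.5, whereas clog-log and nlog-log are asymmetric, which is why they can fit better when the response distribution is skewed across categories.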

  15. Multivariate sensitivity analysis to measure global contribution of input factors in dynamic models

    Energy Technology Data Exchange (ETDEWEB)

    Lamboni, Matieyendou [INRA, Unite MIA (UR341), F78352 Jouy en Josas Cedex (France); Monod, Herve, E-mail: herve.monod@jouy.inra.f [INRA, Unite MIA (UR341), F78352 Jouy en Josas Cedex (France); Makowski, David [INRA, UMR Agronomie INRA/AgroParisTech (UMR 211), BP 01, F78850 Thiverval-Grignon (France)

    2011-04-15

    Many dynamic models are used for risk assessment and decision support in ecology and crop science. Such models generate time-dependent model predictions, with time either discretised or continuous. Their global sensitivity analysis is usually applied separately on each time output, but Campbell et al. (2006) advocated global sensitivity analyses on the expansion of the dynamics in a well-chosen functional basis. This paper focuses on the particular case when principal components analysis is combined with analysis of variance. In addition to the indices associated with the principal components, generalised sensitivity indices are proposed to synthesize the influence of each parameter on the whole time series output. Index definitions are given when the uncertainty on the input factors is either discrete or continuous and when the dynamic model is either discrete or functional. A general estimation algorithm is proposed, based on classical methods of global sensitivity analysis. The method is applied to a dynamic wheat crop model with 13 uncertain parameters. Three methods of global sensitivity analysis are compared: the Sobol'-Saltelli method, the extended FAST method, and the fractional factorial design of resolution 6.
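For intuition about the sensitivity indices involved, the first-order index of a linear model with independent, unit-variance inputs has a closed form, S_i = c_i^2 / sum_j c_j^2. This is a toy sketch, not the Sobol'-Saltelli or extended FAST estimators compared in the paper:

```python
def first_order_indices(coeffs):
    """First-order (main-effect) sensitivity indices for the linear model
    Y = sum_i c_i * X_i with independent, unit-variance inputs:
    S_i = c_i**2 / sum_j c_j**2, so the indices sum to 1."""
    total_variance = sum(c * c for c in coeffs)
    return [c * c / total_variance for c in coeffs]

print(first_order_indices([3.0, 1.0]))  # -> [0.9, 0.1]
```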

  16. Factor Analysis and AIC.

    Science.gov (United States)

    Akaike, Hirotugu

    1987-01-01

    The Akaike Information Criterion (AIC) was introduced to extend the method of maximum likelihood to the multimodel situation. Use of the AIC in factor analysis is interesting when it is viewed as the choice of a Bayesian model; thus, wider applications of AIC are possible. (Author/GDC)
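Model choice by AIC reduces to penalizing each candidate's maximized log-likelihood by its number of free parameters. A sketch with hypothetical factor-model fits, not values from the paper:

```python
def aic(max_log_likelihood, n_free_params):
    """Akaike Information Criterion: -2 log L + 2k (lower is better)."""
    return -2.0 * max_log_likelihood + 2.0 * n_free_params

# Hypothetical maximized log-likelihoods and parameter counts for
# 1-, 2- and 3-factor models fitted to the same data:
candidates = {1: (-1520.3, 16), 2: (-1490.1, 23), 3: (-1488.7, 29)}
best = min(candidates, key=lambda m: aic(*candidates[m]))
print(best)  # -> 2
```

Here the 3-factor model fits slightly better in likelihood, but its extra parameters cost more than they gain, so AIC picks the 2-factor model.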

  17. Confirmatory Factor Analysis of WAIS-IV in a Clinical Sample: Examining a Bi-Factor Model

    Directory of Open Access Journals (Sweden)

    Rachel Collinson

    2016-12-01

    Full Text Available There have been a number of studies that have examined the factor structure of the Wechsler Adult Intelligence Scale IV (WAIS-IV) using the standardization sample. In this study, we investigate its factor structure on a clinical neuropsychology sample of mixed aetiology. Correlated factor, higher-order and bi-factor models are all tested. Overall, the results suggest that the WAIS-IV will be suitable for use with this population.

  18. Confirmatory Factor Analysis of WAIS-IV in a Clinical Sample: Examining a Bi-Factor Model

    OpenAIRE

    Rachel Collinson; Stephen Evans; Miranda Wheeler; Don Brechin; Jenna Moffitt; Geoff Hill; Steven Muncer

    2016-01-01

    There have been a number of studies that have examined the factor structure of the Wechsler Adult Intelligence Scale IV (WAIS-IV) using the standardization sample. In this study, we investigate its factor structure on a clinical neuropsychology sample of mixed aetiology. Correlated factor, higher-order and bi-factor models are all tested. Overall, the results suggest that the WAIS-IV will be suitable for use with this population.

  19. An entrance region friction factor model applied to annular seal analysis - Theory versus experiment for smooth and honeycomb seals

    Science.gov (United States)

    Elrod, D.; Nelson, C.; Childs, D.

    1989-01-01

    A friction factor model is developed for the entrance region of a duct. The model is used in an annular gas seal analysis similar to Nelson's (1984). Predictions of the analysis are compared to experimental results for a smooth-stator/smooth-rotor seal and three honeycomb-stator/smooth-rotor seals. The model predicts leakage and direct damping well, but overpredicts the dependence of cross-coupled stiffness on fluid prerotation and predicts direct stiffness poorly.

  20. FACTOR 9.2: A Comprehensive Program for Fitting Exploratory and Semiconfirmatory Factor Analysis and IRT Models

    Science.gov (United States)

    Lorenzo-Seva, Urbano; Ferrando, Pere J.

    2013-01-01

    FACTOR 9.2 was developed for three reasons. First, exploratory factor analysis (FA) is still an active field of research, although most recent developments have not been incorporated into available programs. Second, there is now renewed interest in semiconfirmatory (SC) solutions as suitable approaches to the complex structures commonly found…

  1. The Five-Factor Model personality traits in schizophrenia: A meta-analysis.

    Science.gov (United States)

    Ohi, Kazutaka; Shimada, Takamitsu; Nitta, Yusuke; Kihara, Hiroaki; Okubo, Hiroaki; Uehara, Takashi; Kawasaki, Yasuhiro

    2016-06-30

    Personality is one of the important factors in the pathogenesis of schizophrenia because it affects patients' symptoms, cognition and social functioning. Several studies have reported specific personality traits in patients with schizophrenia compared with healthy subjects, but the results were inconsistent across studies. The NEO Five-Factor Inventory (NEO-FFI) measures five personality traits: Neuroticism (N), Extraversion (E), Openness (O), Agreeableness (A) and Conscientiousness (C). Here, we performed a meta-analysis of these personality traits as assessed by the NEO-FFI in 460 patients with schizophrenia and 486 healthy subjects from the published literature and investigated possible associations between schizophrenia and these traits. There was no publication bias for any trait. Because we found evidence of significant heterogeneity in all traits among the studies, we applied a random-effects model to perform the meta-analysis. Patients with schizophrenia showed a higher score for N and lower scores for E, O, A and C compared with healthy subjects. The effect sizes of these personality traits ranged from moderate to large. These differences were not affected by possible moderator factors, such as the gender distribution and mean age in each study, except for a gender effect on A. These findings suggest that patients with schizophrenia have a different personality profile compared with healthy subjects.

  2. Model-based analysis of the role of biological, hydrological and geochemical factors affecting uranium bioremediation.

    Science.gov (United States)

    Zhao, Jiao; Scheibe, Timothy D; Mahadevan, R

    2011-07-01

    Uranium contamination is a serious concern at several sites, motivating the development of novel treatment strategies such as the Geobacter-mediated reductive immobilization of uranium. However, this bioremediation strategy has not yet been optimized for sustained uranium removal. While several reactive-transport models have been developed to represent Geobacter-mediated bioremediation of uranium, these models often lack a detailed quantitative description of the microbial processes (e.g., biomass build-up in both groundwater and sediments, the electron transport system, etc.) and of the interaction between biogeochemical and hydrological processes. In this study, a novel multi-scale model was developed by integrating our recent model of the electron capacitance of Geobacter (Zhao et al., 2010) with a comprehensive simulator of coupled fluid flow, hydrologic transport, heat transfer, and biogeochemical reactions. This mechanistic reactive-transport model accurately reproduces the experimental data for the bioremediation of uranium with acetate amendment. We subsequently performed global sensitivity analysis with the reactive-transport model in order to identify the main sources of prediction uncertainty caused by synergistic effects of biological, geochemical, and hydrological processes. The proposed approach successfully captured significant contributing factors across time and space, thereby improving the structure and parameterization of the comprehensive reactive-transport model. The global sensitivity analysis also provides a potentially useful tool to evaluate the uranium bioremediation strategy. The simulations suggest that under difficult environments (e.g., highly contaminated with U(VI) at a high migration rate of solutes), the efficiency of uranium removal can be improved by adding Geobacter species to the contaminated site (bioaugmentation) in conjunction with the addition of electron donor (biostimulation). The simulations also highlight the interactive effect of

  3. Factor analysis using mixed models of multi-environment trials with different levels of unbalancing.

    Science.gov (United States)

    Nuvunga, J J; Oliveira, L A; Pamplona, A K A; Silva, C P; Lima, R R; Balestre, M

    2015-11-13

    This study aimed to analyze the robustness of mixed models for the study of genotype-environment interactions (G x E). Simulated unbalancing of real data was used to determine if the method could predict missing genotypes and select stable genotypes. Data from multi-environment trials containing 55 maize hybrids, collected during the 2005-2006 harvest season, were used in this study. Analyses were performed in two steps: the variance components were estimated by restricted maximum likelihood, using the expectation-maximization (EM) algorithm, and factor analysis (FA) was used to calculate the factor scores and relative position of each genotype in the biplot. Random unbalancing of the data was performed by removing 10, 30, and 50% of the plots; the scores were then re-estimated using the FA model. It was observed that 10, 30, and 50% unbalancing exhibited mean correlation values of 0.7, 0.6, and 0.56, respectively. Overall, the genotypes classified as stable in the biplot had smaller prediction error sum of squares (PRESS) value and prediction amplitude of ellipses. Therefore, our results revealed the applicability of the PRESS statistic to evaluate the performance of stable genotypes in the biplot. This result was confirmed by the sizes of the prediction ellipses, which were smaller for the stable genotypes. Therefore, mixed models can confidently be used to evaluate stability in plant breeding programs, even with highly unbalanced data.
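The PRESS statistic used above to compare genotype predictions is a plain sum of squared prediction errors over held-out observations. A minimal sketch with made-up values, not data from the trials:

```python
def press(observed, predicted):
    """Prediction error sum of squares over held-out observations;
    a smaller PRESS means better out-of-sample prediction."""
    return sum((o - p) ** 2 for o, p in zip(observed, predicted))

# Hypothetical yields of removed plots vs. model predictions:
print(round(press([5.1, 6.0, 4.8], [5.0, 6.2, 4.5]), 2))  # -> 0.14
```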

  4. A path analysis model of factors influencing children's requests for unhealthy foods.

    Science.gov (United States)

    Pettigrew, Simone; Jongenelis, Michelle; Miller, Caroline; Chapman, Kathy

    2017-01-01

    Little is known about the complex combination of factors influencing the extent to which children request unhealthy foods from their parents. The aim of this study was to develop a comprehensive model of influencing factors to provide insight into potential methods of reducing these requests. A web panel provider was used to administer a national online survey to a sample of 1302 Australian parent-child dyads (total sample n=2604). Initial univariate analyses identified potential predictors of children's requests for and consumption of unhealthy foods. The identified variables were subsequently incorporated into a path analysis model that included both parents' and children's reports of children's requests for unhealthy foods. The resulting model accounted for a substantial 31% of the variance in parent-reported food request frequency and 27% of the variance in child-reported request frequency. The variable demonstrating the strongest direct association with both parents' and children's reports of request frequency was the frequency of children's current intake of unhealthy foods. Parents' and children's exposure to food advertising and television viewing time were also positively associated with children's unhealthy food requests. The results highlight the need to break the habitual provision of unhealthy foods to avoid a vicious cycle of requests resulting in consumption.

  5. Confirmatory Factor Analysis of the Combined Social Phobia Scale and Social Interaction Anxiety Scale: Support for a Bifactor Model

    Science.gov (United States)

    Gomez, Rapson; Watson, Shaun D.

    2017-01-01

    For the Social Phobia Scale (SPS) and the Social Interaction Anxiety Scale (SIAS) together, this study examined support for a bifactor model, and also the internal consistency reliability and external validity of the factors in this model. Participants (N = 526) were adults from the general community who completed the SPS and SIAS. Confirmatory factor analysis (CFA) of their ratings indicated good support for the bifactor model. For this model, the loadings for all but six items were higher on the general factor than the specific factors. The three positively worded items had negligible loadings on the general factor. The general factor explained most of the common variance in the SPS and SIAS, and demonstrated good model-based internal consistency reliability (omega hierarchical) and a strong association with fear of negative evaluation and extraversion. The practical implications of the findings for the utilization of the SPS and SIAS, and the theoretical and clinical implications for social anxiety are discussed. PMID:28210232
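Omega hierarchical, the reliability index reported above, can be computed directly from standardized bifactor loadings as the squared sum of general-factor loadings over the total variance of the sum score. The loadings below are hypothetical placeholders (nine items, three specific factors), not the SPS/SIAS estimates:

```python
import numpy as np

# Hypothetical standardized bifactor loadings for 9 items:
# one general factor and three specific factors (3 items each)
lam_g = np.array([0.70, 0.65, 0.60, 0.72, 0.68, 0.55, 0.62, 0.66, 0.58])
lam_s = np.array([0.30, 0.25, 0.35, 0.28, 0.32, 0.30, 0.27, 0.33, 0.29])
groups = np.repeat([0, 1, 2], 3)

theta = 1 - lam_g**2 - lam_s**2           # item uniquenesses
total = (lam_g.sum()**2
         + sum(lam_s[groups == k].sum()**2 for k in range(3))
         + theta.sum())                    # total variance of the sum score

omega_h = lam_g.sum()**2 / total           # share due to the general factor
omega_total = (total - theta.sum()) / total
print(round(omega_h, 3), round(omega_total, 3))   # → 0.828 0.888
```

A high omega hierarchical relative to omega total, as in this toy example, is the pattern the record interprets as a dominant general social anxiety factor.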

  7. Understanding influential factors on implementing green supply chain management practices: An interpretive structural modelling analysis.

    Science.gov (United States)

    Agi, Maher A N; Nishant, Rohit

    2017-03-01

    In this study, we establish a set of 19 influential factors on the implementation of Green Supply Chain Management (GSCM) practices and analyse the interaction between these factors and their effect on the implementation of GSCM practices using the Interpretive Structural Modelling (ISM) method and the "Matrice d'Impacts Croisés Multiplication Appliquée à un Classement" (MICMAC) analysis on data compiled from interviews with supply chain (SC) executives based in the Gulf countries (Middle East region). The study reveals a strong influence and driving power of the nature of the relationships between SC partners on the implementation of GSCM practices. In particular, we found that dependence, trust, and durability of the relationship with SC partners have a very high influence. In addition, the size of the company, top management commitment, the implementation of quality management, and employee training and education exert a critical influence on the implementation of GSCM practices. Contextual elements such as the industry sector and region, and their effect on the prominence of specific factors, are also highlighted by our study. Finally, implications for research and practice are discussed.

  8. Rasch modeling and confirmatory factor analysis of the systemizing quotient-revised (SQ-R) scale.

    Science.gov (United States)

    Allison, Carrie; Baron-Cohen, Simon; Stone, Mark H; Muncer, Steven J

    2015-01-01

    This study assessed the dimensionality of the Systemizing Quotient-Revised (SQ-R), a measure of the strength of a person's interest in systems, using two statistical approaches: Rasch modeling and confirmatory factor analysis (CFA). Participants included N = 675 with an autism spectrum condition (ASC), N = 1369 family members of people with ASC, and N = 2014 typical controls. Data were fitted to the Rasch model (Rating Scale) using WINSTEPS. The data fit the Rasch model quite well, lending support to the idea that systemizing can be seen as unidimensional. Reliability estimates were .99 for items and .92 for persons. A CFA parceling approach confirmed that a unidimensional model fit the data. There was, however, differential functioning by sex for some of the items. An abbreviated 44-item version of the scale, consisting of items without differential item functioning by sex, was developed. This shorter scale was also tested from a Rasch perspective and confirmed through CFA. Both versions showed differences in total scale scores between participants with and without ASC (d = 0.71, p < .005), and between the sexes (d = 0.53, p < .005). We conclude that the SQ-R is an appropriate measure of systemizing, which can be measured along a single dimension.

  9. Modeling and analysis of PM2.5 generation for key factors identification in China

    Science.gov (United States)

    Xia, Dehong; Jiang, Binfan; Xie, Yulei

    2016-06-01

    Recently, PM2.5 pollution in China has occurred frequently and caused widespread concern. To identify the key factors driving PM2.5 generation, the formation characteristics of PM2.5 must first be revealed. In this paper, the electric neutrality of PM2.5 was proposed as a property under the least-energy principle and verified through electric-charge calculation. This indicated that PM2.5 is formed through electromagnetic forces, including the effects of ionic bonds, hydrogen bonds, and polarization. Based on an analysis of the interactive forces among different chemical components, a simulation model was developed to describe the random process of PM2.5 generation. In addition, an orthogonal test with two levels and four factors was designed and carried out using the proposed model. The test analysis showed that PM2.5 becomes looser and remains suspended longer in the atmosphere when organic compounds (OC) are present (OC can reduce the density of PM2.5 by about 67%). Considering that NH4+ is the only cation among the main chemical components of PM2.5, it is vital for anions (such as SO42- and NO3-) to aggregate in order to facilitate PM2.5 growth. Therefore, to relieve PM2.5 pollution, government control strategies for OC and NH4+ should be strengthened by improving the quality of oils and solvent products, decreasing the amount of nitrogenous fertilizer used, or changing the fertilizing environment from dry to wet conditions.

  10. Analysis of causal relationships by structural equation modeling to determine the factors influencing cognitive function in elderly people in Japan.

    Science.gov (United States)

    Kimura, Daisuke; Nakatani, Ken; Takeda, Tokunori; Fujita, Takashi; Sunahara, Nobuyuki; Inoue, Katsumi; Notoya, Masako

    2015-01-01

    The purpose of this study was to identify potential factors that protect against decline in cognitive function, and to clarify the causal relationship between each potential factor and its influence on cognitive function. Subjects were 366 elderly community residents (mean age 73.7 ± 6.4; 51 male, 315 female) who participated in the Taketoyo Project from 2007 to 2011. Factor analysis was conducted to identify groupings within mental, social, life, physical, and cognitive functions. To detect clusters among the 14 variables, the item scores were subjected to confirmatory factor analysis. We performed structural equation modeling to calculate the standardization coefficient and correlation coefficient for every factor. The cause-and-effect hypothesis model combined two intervention-theory hypotheses for dementia prevention (direct effect and indirect effect) in one system. Finally, we performed another structural equation modeling analysis to estimate the standardized coefficients of the cause-and-effect hypothesis model. Social participation was found to be activated by the improvement of four factors; in turn, the activated "Social participation" factor acted on cognitive function.

  11. Analysis of neurotrophic factors in limb and extraocular muscles of mouse model of amyotrophic lateral sclerosis.

    Directory of Open Access Journals (Sweden)

    Vahid M Harandi

    Full Text Available Amyotrophic lateral sclerosis (ALS) is currently an incurable, fatal motor neuron syndrome characterized by progressive weakness and muscle wasting, with death ensuing 3-5 years after diagnosis. Neurotrophic factors (NTFs) are known to be important in both nervous system development and maintenance. However, attempts to translate the potential of NTFs into therapeutic options remain limited despite the substantial number of approaches that have been tested clinically. Using the quantitative RT-PCR (qRT-PCR) technique, the present study investigated the mRNA expression of four different NTFs: brain-derived neurotrophic factor (BDNF), neurotrophin-3 (NT-3), neurotrophin-4/5 (NT-4), and glial cell line-derived neurotrophic factor (GDNF) in limb muscles and extraocular muscles (EOMs) from SOD1G93A transgenic mice at early and terminal stages of ALS. General morphological examination revealed that muscle fibres were well preserved in both limb muscles and EOMs in early-stage ALS mice. However, in terminal ALS mice, most muscle fibres were either atrophied or hypertrophied in limb muscles but unaffected in EOMs. qRT-PCR analysis showed that in early-stage ALS mice, NT-4 was significantly down-regulated in limb muscles, whereas NT-3 and GDNF were markedly up-regulated in EOMs. In terminal ALS mice, only GDNF was significantly up-regulated in limb muscles. We conclude that the early down-regulation of NT-4 in limb muscles is closely associated with muscle dystrophy and dysfunction at the late stage, whereas the early up-regulation of GDNF and NT-3 in EOMs is closely associated with the relatively well-preserved muscle morphology at the late stage. Collectively, the data suggest that comparing NTF expression between limb muscles and EOMs at different stages of ALS animal models is a useful method for revealing the pathophysiology and progression of ALS, and eventually for rescuing motor neurons in ALS patients.

  12. Modeling of Iranian Cheetah Habitat using Ecological Niche Factor Analysis (Case Study: Dare Anjir Wildlife Refuge

    Directory of Open Access Journals (Sweden)

    N. Zamani

    2016-03-01

    Full Text Available Evaluation of habitat suitability indices is essential in wildlife management and in the conservation of rare species. Suitable habitats are required for wildlife management and conservation because they increase the reproduction and survival rates of species. In this study, in order to map habitat suitability and identify the habitat requirements of the Iranian cheetah (Acinonyx jubatus venaticus), field data from the Dare Anjir wildlife refuge were collected from autumn 2009 to summer 2011. The Ecological Niche Factor Analysis (ENFA) approach was used to develop the habitat suitability model. In this method, primary maps of habitat variables, including elevation, slope, aspect, vegetation cover, distance from water sources, and environmental monitoring stations, were produced with the Idrisi and Biomapper software and imported into Biomapper. The output scores obtained from the analysis showed that the Iranian cheetah tends toward mountainous areas, which offer more topographical features for camouflage while hunting, and toward northern aspects, which have more humidity, denser vegetation cover, and more prey. Our results showed that the Iranian cheetah has a medium niche width and prefers marginal habitats.
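Biomapper's ENFA summarizes each eco-geographical variable through marginality (how far the species' mean departs from the area mean) and specialization (how narrow the used range is relative to the available range). A minimal single-variable illustration with simulated data follows; real ENFA performs a multivariate factor extraction, and all numbers here are invented:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated elevation values: all raster cells vs. cells with cheetah signs
global_cells = rng.normal(1500, 300, size=5000)   # whole study area (m)
used_cells = rng.normal(1800, 120, size=200)      # presence locations (m)

# Marginality: departure of the species mean from the area mean
# (the 1.04 constant scales M to ~1 for a species using the distribution tail)
M = abs(used_cells.mean() - global_cells.mean()) / (1.04 * global_cells.std())

# Specialization: available range relative to the used range
S = global_cells.std() / used_cells.std()
print(round(M, 2), round(S, 2))
```

Here the simulated species sits well above the area's mean elevation (M near 1) and uses a much narrower band than is available (S well above 1), the same qualitative pattern the record reports for mountainous terrain.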

  13. Global combustion sources of organic aerosols: model comparison with 84 AMS factor-analysis data sets

    Science.gov (United States)

    Tsimpidi, Alexandra P.; Karydis, Vlassis A.; Pandis, Spyros N.; Lelieveld, Jos

    2016-07-01

    Emissions of organic compounds from biomass, biofuel, and fossil fuel combustion strongly influence the global atmospheric aerosol load. Some of the organics are directly released as primary organic aerosol (POA). Most are emitted in the gas phase and undergo chemical transformations (i.e., oxidation by the hydroxyl radical) to form secondary organic aerosol (SOA). In this work we use the global chemistry climate model ECHAM/MESSy Atmospheric Chemistry (EMAC) with a computationally efficient module for the description of organic aerosol (OA) composition and evolution in the atmosphere (ORACLE). The tropospheric burden of open biomass and anthropogenic (fossil and biofuel) combustion particles is estimated to be 0.59 and 0.63 Tg, respectively, accounting for about 30 and 32 % of the total tropospheric OA load. About 30 % of the open biomass burning and 10 % of the anthropogenic combustion aerosols originate from direct particle emissions, whereas the rest is formed in the atmosphere. A comprehensive data set of aerosol mass spectrometer (AMS) measurements along with factor-analysis results from 84 field campaigns across the Northern Hemisphere is used to evaluate the model results. Both the AMS observations and the model results suggest that over urban areas both POA (25-40 %) and SOA (60-75 %) contribute substantially to the overall OA mass, whereas further downwind and in rural areas the POA concentrations decrease substantially and SOA dominates (80-85 %). EMAC does a reasonable job of reproducing POA and SOA levels during most of the year. However, it tends to underpredict POA and SOA concentrations during winter, indicating that the model misses wintertime sources of OA (e.g., residential biofuel use) and SOA formation pathways (e.g., multiphase oxidation).

  14. Improved Dynamic Modeling of the Cascade Distillation Subsystem and Analysis of Factors Affecting Its Performance

    Science.gov (United States)

    Perry, Bruce A.; Anderson, Molly S.

    2015-01-01

    The Cascade Distillation Subsystem (CDS) is a rotary multistage distiller being developed to serve as the primary processor for wastewater recovery during long-duration space missions. The CDS could be integrated with a system similar to the International Space Station Water Processor Assembly to form a complete water recovery system for future missions. A preliminary chemical process simulation was previously developed using Aspen Custom Modeler® (ACM), but it could not simulate thermal startup and lacked detailed analysis of several key internal processes, including heat transfer between stages. This paper describes modifications to the ACM simulation of the CDS that improve its capabilities and the accuracy of its predictions. Notably, the modified version can be used to model thermal startup and predicts the total energy consumption of the CDS. The simulation has been validated for both NaCl solution and pretreated urine feeds and no longer requires retuning when operating parameters change. The simulation was also used to predict how internal processes and operating conditions of the CDS affect its performance. In particular, it is shown that the coefficient of performance of the thermoelectric heat pump used to provide heating and cooling for the CDS is the largest factor in determining CDS efficiency. Intrastage heat transfer affects CDS performance indirectly through effects on the coefficient of performance.

  15. The Application of the Model Correction Factor Method to a Reliability Analysis of a Composite Blade Structure

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimiroy; Friis-Hansen, Peter; Berggreen, Christian

    2009-01-01

    This paper presents a reliability analysis of a composite blade profile. The so-called Model Correction Factor technique is applied as an effective alternate approach to the response surface technique. The structural reliability is determined by use of a simplified idealised analytical model which...

  16. Moderating Factors of Video-Modeling with Other as Model: A Meta-Analysis of Single-Case Studies

    Science.gov (United States)

    Mason, Rose A.; Ganz, Jennifer B.; Parker, Richard I.; Burke, Mack D.; Camargo, Siglia P.

    2012-01-01

    Video modeling with other as model (VMO) is a more practical method for implementing video-based modeling techniques, such as video self-modeling, which requires significantly more editing. Despite this, identification of contextual factors such as participant characteristics and targeted outcomes that moderate the effectiveness of VMO has not…

  18. Factor analysis and missing data

    NARCIS (Netherlands)

    Kamakura, WA; Wedel, M

    2000-01-01

    The authors study the estimation of factor models and the imputation of missing data and propose an approach that provides direct estimates of factor weights without the replacement of missing data with imputed values. First, the approach is useful in applications of factor analysis in the presence
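The record above proposes direct estimation of factor weights without imputing missing values. The sketch below is not that method; it shows the more common EM-style alternative such approaches are contrasted with, in which missing entries are repeatedly reconstructed from the current factor model (simulated data, scikit-learn's FactorAnalysis, invented sizes):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n, p, k = 300, 6, 1
X = rng.normal(size=(n, k)) @ rng.uniform(0.5, 1.0, size=(k, p)) \
    + rng.normal(scale=0.4, size=(n, p))

X_miss = X.copy()
X_miss[rng.random(X.shape) < 0.15] = np.nan   # 15% missing completely at random

# EM-style loop: start from column means, then iterate model-based reconstruction
filled = np.where(np.isnan(X_miss), np.nanmean(X_miss, axis=0), X_miss)
for _ in range(20):
    fa = FactorAnalysis(n_components=k, random_state=0).fit(filled)
    recon = fa.transform(filled) @ fa.components_ + fa.mean_
    filled = np.where(np.isnan(X_miss), recon, X_miss)

print(fa.components_.round(2))   # loadings recovered despite the missing cells
```

The point of contrast is that each iteration here effectively imputes values, which is exactly the step the authors' direct-estimation approach is designed to avoid.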

  19. Latent factor in Large-Dimensional datasets: forecasting and data analysis using factor models. An application to the insurance sector

    OpenAIRE

    2016-01-01

    The purpose of this thesis is to retrace the main steps taken in the evolution of factor models and, in addition, to introduce two examples of how to apply the newest techniques developed in this field to two different types of dataset: one traditional, composed mainly of macroeconomic and financial time series, and the other 'new', which includes time series relevant to the Italian insurance sector and a set of macroeconomic and financial series r...

  20. Exploratory structural equation modeling, bifactor models, and standard confirmatory factor analysis models: application to the BASC-2 Behavioral and Emotional Screening System Teacher Form.

    Science.gov (United States)

    Wiesner, Margit; Schanding, G Thomas

    2013-12-01

    Several psychological assessment instruments are based on the assumption of a general construct that is composed of multiple interrelated domains. Standard confirmatory factor analysis is often not well suited for examining the factor structure of such scales. This study used data from 1885 elementary school students (mean age=8.77 years, SD=1.47 years) to examine the factor structure of the Behavioral Assessment System for Children, Second Edition (BASC-2) Behavioral and Emotional Screening System (BESS) Teacher Form that was designed to assess general risk for emotional/behavioral difficulty among children. The modeling sequence included the relatively new exploratory structural equation modeling (ESEM) approach and bifactor models in addition to more standard techniques. Findings revealed that the factor structure of the BASC-2 BESS Teacher Form is multidimensional. Both ESEM and bifactor models showed good fit to the data. Bifactor models were preferred on conceptual grounds. Findings illuminate the hypothesis-generating power of ESEM and suggest that it might not be optimal for instruments designed to assess a predominant general factor underlying the data.

  1. Behavioral analysis of EPA's MOBILE emission factor model. Discussion paper

    Energy Technology Data Exchange (ETDEWEB)

    Harrington, W.; McConnell, V.D.; Cannon, M.

    1998-06-01

    This report provides a review and assessment of several important aspects of the MOBILE model, EPA's computer model for estimating emission factors for mobile sources. Inventory models like MOBILE have many uses, but the authors focus primarily on the model's role in estimating emission-reduction credits from I/M [Inspection/Maintenance] programs. The authors concentrate on how the model incorporates behavioral responses to I/M regulations. The effectiveness of I/M programs in practice is likely to be strongly influenced by the behavior of motorists, mechanics, and even state regulatory authorities. In addition to an examination of model structure and assumptions, the authors examine the empirical data used to calibrate the model, much of which is hard-coded and not amenable to change by users. Finally, the authors test the sensitivity of MOBILE results to certain assumptions with behavioral content.

  2. Analysis on influencing factors of clinical teachers’ job satisfaction by structural equation model

    Directory of Open Access Journals (Sweden)

    Haiyi Jia

    2017-02-01

    Full Text Available [Research objective] To analyze the factors influencing clinical teachers' job satisfaction. [Research method] The ERG theory was used as the framework to design the questionnaires. Data were analyzed by structural equation modeling to investigate the influencing factors. [Research result] The modified model shows that existence needs and growth needs have a direct influence on the job satisfaction of clinical teachers, with influence coefficients of 0.540 and 0.380, respectively. The three influencing factors have positive effects on each other, with correlation coefficients of 0.620, 0.400, and 0.330, respectively. [Research conclusion] Relevant departments should take active measures to improve the job satisfaction of clinical teachers with respect to both existence needs and growth needs, thereby improving their work enthusiasm and teaching quality.

  3. A common spatial factor analysis model for measured neighborhood-level characteristics: The Multi-Ethnic Study of Atherosclerosis.

    Science.gov (United States)

    Nethery, Rachel C; Warren, Joshua L; Herring, Amy H; Moore, Kari A B; Evenson, Kelly R; Diez-Roux, Ana V

    2015-11-01

    The purpose of this study was to reduce the dimensionality of a set of neighborhood-level variables collected on participants in the Multi-Ethnic Study of Atherosclerosis (MESA) while appropriately accounting for the spatial structure of the data. A common spatial factor analysis model in the Bayesian setting was utilized in order to properly characterize dependencies in the data. Results suggest that use of the spatial factor model can result in more precise estimation of factor scores, improved insight into the spatial patterns in the data, and the ability to more accurately assess associations between the neighborhood environment and health outcomes.

  5. A dynamic factor model for the analysis of multivariate time series

    NARCIS (Netherlands)

    Molenaar, P.C.M.

    1985-01-01

    Describes the new statistical technique of dynamic factor analysis (DFA), which accounts for the entire lagged covariance function of an arbitrary 2nd-order stationary time series. DFA is shown to be applicable to a relatively short stretch of observations and is therefore considered worthwhile for

  6. The Five-Factor Model of Personality Traits and Organizational Citizenship Behaviors: A Meta-Analysis

    Science.gov (United States)

    Chiaburu, Dan S.; Oh, In-Sue; Berry, Christopher M.; Li, Ning; Gardner, Richard G.

    2011-01-01

    Using meta-analytic tests based on 87 statistically independent samples, we investigated the relationships between the five-factor model (FFM) of personality traits and organizational citizenship behaviors in both the aggregate and specific forms, including individual-directed, organization-directed, and change-oriented citizenship. We found that…

  7. Bayesian Exploratory and Confirmatory Factor Analysis: Perspectives on Constrained-Model Selection

    NARCIS (Netherlands)

    Peeters, C.F.W.

    2012-01-01

    The dissertation revolves around three aims. The first aim is the construction of a conceptually and computationally simple Bayes factor for Type I constrained-model selection (dimensionality determination) that is determinate under usage of improper priors and the subsequent utilization of this

  8. Linear model analysis of the influencing factors of boar longevity in Southern China.

    Science.gov (United States)

    Wang, Chao; Li, Jia-Lian; Wei, Hong-Kui; Zhou, Yuan-Fei; Jiang, Si-Wen; Peng, Jian

    2017-04-15

    This study aimed to investigate the factors influencing the boar herd life month (BHLM) in Southern China. A total of 1630 records of culled boars from nine artificial insemination centers were collected from January 2013 to May 2016. A logistic regression model and two linear models were used to analyze the effects of breed, housing type, age at herd entry, and seed stock herd on boar removal reason and BHLM, respectively. Boar breed and age at herd entry had significant effects on the removal reasons (P < 0.05). The linear models (with or without removal reason included) showed that boars raised individually in stalls exhibited shorter BHLM than those raised in pens (P < 0.05) … introduction.

  9. Analysis of factors affecting satisfaction level on problem based learning approach using structural equation modeling

    Science.gov (United States)

    Hussain, Nur Farahin Mee; Zahid, Zalina

    2014-12-01

    Nowadays, the job market demands graduates who not only perform well academically but also excel in soft skills. Problem-Based Learning (PBL) has a number of distinct advantages as a learning method, as it can deliver graduates who will be highly prized by industry. This study attempts to determine the satisfaction level of engineering students with the PBL approach and to evaluate its determinant factors. Structural Equation Modeling (SEM) was used to investigate how the factors Good Teaching Scale, Clear Goals, Student Assessment, and Level of Workload affected student satisfaction with the PBL approach.

  10. A PLS model based on dominant factor for coal analysis using laser-induced breakdown spectroscopy.

    Science.gov (United States)

    Feng, Jie; Wang, Zhe; West, Logan; Li, Zheng; Ni, Weidou

    2011-07-01

    Thirty-three bituminous coal samples were used to test the application of the laser-induced breakdown spectroscopy technique for measuring coal elemental concentrations in air. The heterogeneity of the samples and the pyrolysis or combustion of coal during the laser-sample interaction were identified as the main reasons for the large fluctuation of the detected spectra and the low calibration quality. Compared with the generally applied normalization by the whole spectral area, normalization by segmental spectral areas was found to largely improve the measurement precision and accuracy. The concentration of the major element C in coal was determined by a novel partial least squares (PLS) model based on a dominant factor. The dominant C concentration information was taken from the carbon characteristic line intensity, since it contains the most closely related information, even if imprecisely. This dominant-factor model was further improved by introducing a non-linear relation that partially models the inter-element interference effect. The residuals were then corrected by PLS using the full spectrum information. With the physics-based dominant factor providing the main quantitative information and the non-linear relation partially, explicitly included, the proposed PLS model avoids over-reliance on unrelated noise to some extent and becomes more robust over a wider C concentration range. Results show that the RMSEP of the proposed PLS model decreased to 4.47% from 5.52% for conventional PLS with full-spectrum input, while R(2) remained as high as 0.999 and RMSEC&P was reduced from 3.60% to 2.92%, showing the overall improvement of the proposed PLS model.

  11. [Analysis on establishment and affecting factors of qi stagnation and blood stasis rat model].

    Science.gov (United States)

    Wang, Tingting; Jia, Cheng; Chen, Yu; Li, Xin; Cheng, Jiayi

    2012-06-01

    To study the method for establishing a Qi stagnation and blood stasis rat model and to analyze the affecting factors. An orthogonal design was adopted to study the influence on model rats of joint stimulations including noise, light, electricity, ice-water bath, and tail-clamping. The 'flying spot' method was used to dynamically track blood flow velocity in the microcirculation, the MOTO pressure-sensing technology was adopted to detect hemorheology-related indicators, and the coagulation method was used to detect blood coagulation-related indicators. Compared with the negative control group, all model groups showed a significant reduction in blood flow velocity in the mesenteric microcirculation and increases in whole blood viscosity at high, medium, and low shear rates, in plasma viscosity, and in the fibrinogen content among the four blood coagulation indicators. Noise, light, electricity, tail-clamping, bondage, and ice-water bath made a significant impact on the model rats.

  12. Somatic symptom reports in the general population: Application of a bi-factor model to the analysis of change.

    Science.gov (United States)

    Porsius, Jarry T; Martens, Astrid L; Slottje, Pauline; Claassen, Liesbeth; Korevaar, Joke C; Timmermans, Danielle R M; Vermeulen, Roel; Smid, Tjabe

    2015-11-01

    To investigate the latent structure of somatic symptom reports in the general population with a bi-factor model and apply the structure to the analysis of change in reported symptoms after the emergence of an uncertain environmental health risk. Somatic symptoms were assessed in two general population environmental health cohorts (AMIGO, n=14,829 & POWER, n=951) using the somatization scale of the four-dimensional symptom questionnaire (4DSQ-S). Exploratory bi-factor analysis was used to determine the factor structure in the AMIGO cohort. Multi-group and longitudinal models were applied to assess measurement invariance. For a subsample of residents living close to a newly introduced power line (n=224), we compared a uni- and a multidimensional method for the analysis of change in reported symptoms after the power line was put into operation. We found a good fit (RMSEA=0.03, CFI=0.98) for a bi-factor model with one general and three symptom-specific factors (musculoskeletal, gastrointestinal, cardiopulmonary). The latent structure was found to be invariant between cohorts and over time. A significant increase (p…) was observed in reported symptoms. The factor structure of somatic symptom reports was equivalent between cohorts and over time. Our findings suggest that taking this structure into account can lead to a more informative interpretation of a change in symptom reports compared to a unidimensional approach. Copyright © 2015 Elsevier Inc. All rights reserved.
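
The bi-factor structure described here (one general factor plus symptom-specific factors) can be illustrated with a small simulation; the loadings, item counts and sample size below are made up for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Nine items: all load on a general factor; each block of three items
# also loads on one of three orthogonal specific factors.
general_load = np.full(9, 0.5)
specific_load = 0.6
g = rng.normal(size=(n, 1))                 # general factor scores
s = rng.normal(size=(n, 3))                 # three specific factors
e = rng.normal(size=(n, 9)) * 0.5           # unique item noise
items = g * general_load + np.repeat(s, 3, axis=1) * specific_load + e

corr = np.corrcoef(items, rowvar=False)
within = corr[0, 1]    # two items sharing a specific factor
across = corr[0, 3]    # items from different blocks (general factor only)
```

Items sharing a specific factor correlate more strongly than items linked only through the general factor, which is exactly the pattern a bi-factor model separates into general and group sources of variance.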

  13. Loneliness and solitude in adolescence: A confirmatory factor analysis of alternative models

    DEFF Research Database (Denmark)

    Goossens, Luc; Lasgaard, Mathias; Luyckx, Koen

    2009-01-01

    The present study tested a four-factor model of adolescent loneliness and solitude that comprises peer-related loneliness, family loneliness, negative attitude toward solitude, and positive attitude toward solitude. Nine different instruments for a total of 14 scales and derivative subscales were...... of the Loneliness and Aloneness Scale for Children and Adolescents (LACA) is recommended, because the instrument measures all four aspects of the model. Implications for current theories on adolescent loneliness and associated phenomena, such as adolescents' attitude toward being on their own, are briefly discussed....

  14. The Infinite Hierarchical Factor Regression Model

    CERN Document Server

    Rai, Piyush

    2009-01-01

    We propose a nonparametric Bayesian factor regression model that accounts for uncertainty in the number of factors, and the relationship between factors. To accomplish this, we propose a sparse variant of the Indian Buffet Process and couple this with a hierarchical model over factors, based on Kingman's coalescent. We apply this model to two problems (factor analysis and factor regression) in gene-expression data analysis.

  15. A comparison of ordinal regression models in an analysis of factors associated with periodontal disease

    OpenAIRE

    Javali Shivalingappa; Pandit Parameshwar

    2010-01-01

    Aim: The study aimed to determine the factors associated with periodontal disease (different levels of severity) by using different regression models for ordinal data. Design: A cross-sectional design was employed using clinical examination and 'questionnaire with interview' method. Materials and Methods: The study was conducted during June 2008 to October 2008 in Dharwad, Karnataka, India. It involved a systematic random sample of 1760 individuals aged 18-40 years. The periodon...

  16. [Analysis of dietary pattern and diabetes mellitus influencing factors identified by classification tree model in adults of Fujian].

    Science.gov (United States)

    Yu, F L; Ye, Y; Yan, Y S

    2017-05-10

    Objective: To identify the dietary patterns and explore the relationship between environmental factors (especially dietary patterns) and diabetes mellitus in the adults of Fujian. Methods: A multi-stage sampling method was used to survey residents aged ≥18 years by questionnaire, physical examination and laboratory detection at 10 disease surveillance points in Fujian. Factor analysis was used to identify the dietary patterns, a logistic regression model was applied to analyze the relationship between dietary patterns and diabetes mellitus, and a classification tree model was adopted to identify the influencing factors for diabetes mellitus. Results: There were four dietary patterns in the population: meat, plant, high-quality protein, and fried food and beverages patterns. The logistic analysis showed that the plant pattern, which had higher factor loadings for fresh fruit-vegetables and cereal-tubers, was protective against diabetes mellitus. The risk of diabetes mellitus in participants at the T2 and T3 levels of the factor score was 0.727 (95%CI: 0.561-0.943) and 0.736 (95%CI: 0.573-0.944) times that of those at the lowest level (T1), respectively. Thirteen influencing factors and eleven high-risk groups for diabetes mellitus were identified by the classification tree model. The influencing factors were dyslipidemia, age, family history of diabetes, hypertension, physical activity, career, sex, sedentary time, abdominal adiposity, BMI, marital status, sleep time and the high-quality protein pattern. Conclusion: There is a close association between dietary patterns and diabetes mellitus. It is necessary to promote a healthy and reasonable diet, strengthen the monitoring and control of blood lipids, blood pressure and body weight, and maintain a good lifestyle for the prevention and control of diabetes mellitus.
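
The protective-pattern result can be mimicked with a toy simulation: a factor score that lowers diabetes risk yields an odds ratio below 1 for the top versus bottom tertile. All numbers below are invented for illustration, not the Fujian data.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 9000

# A "plant pattern" factor score that lowers diabetes risk via a
# logistic data-generating model (coefficients are made up).
score = rng.normal(size=n)
p = 1 / (1 + np.exp(-(-2.0 - 0.3 * score)))   # higher score -> lower risk
diabetes = rng.random(n) < p

# Tertiles of the factor score (T1 = lowest, T3 = highest).
t1, t2 = np.quantile(score, [1/3, 2/3])
low = diabetes[score <= t1]
high = diabetes[score > t2]

# Odds ratio of T3 vs T1: expected below 1 for a protective pattern.
odds = lambda d: d.mean() / (1 - d.mean())
or_t3_vs_t1 = odds(high) / odds(low)
```

An odds ratio under 1, as in the reported 0.727 and 0.736, indicates lower odds of diabetes at higher factor-score levels.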

  17. Models and Strategies for Factor Mixture Analysis: An Example Concerning the Structure Underlying Psychological Disorders

    Science.gov (United States)

    Clark, Shaunna L.; Muthén, Bengt; Kaprio, Jaakko; D’Onofrio, Brian M.; Viken, Richard; Rose, Richard J.

    2013-01-01

    The factor mixture model (FMM) uses a hybrid of both categorical and continuous latent variables. The FMM is a good model for the underlying structure of psychopathology because the use of both categorical and continuous latent variables allows the structure to be simultaneously categorical and dimensional. This is useful because both diagnostic class membership and the range of severity within and across diagnostic classes can be modeled concurrently. While the conceptualization of the FMM has been explained in the literature, the use of the FMM is still not prevalent. One reason is that there is little research about how such models should be applied in practice and, once a well-fitting model is obtained, how it should be interpreted. In this paper, the FMM will be explored by studying a real data example on conduct disorder. By exploring this example, this paper aims to explain the different formulations of the FMM, the various steps in building an FMM, as well as how to decide between an FMM and alternative models. PMID:24302849
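
The FMM's hybrid of categorical and continuous latent variables can be sketched as a data-generating process: a latent class shifts item means while a continuous factor creates graded severity within class. The class proportion, loadings and shift below are illustrative assumptions, not the conduct-disorder data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4000

# Categorical part: latent class membership shifts the item intercepts.
cls = rng.random(n) < 0.3                 # 30% in the "disordered" class
mu_shift = 2.0                            # class-specific intercept shift

# Dimensional part: a continuous severity factor within every class.
eta = rng.normal(size=(n, 1))
load = np.array([0.8, 0.7, 0.6, 0.9])
items = cls[:, None] * mu_shift + eta * load + rng.normal(size=(n, 4)) * 0.4

# Both sources of variation are visible: classes separate the item means,
# and within a class the factor induces positive item correlations.
mean_gap = items[cls].mean() - items[~cls].mean()
within_corr = np.corrcoef(items[~cls], rowvar=False)[0, 1]
```

A pure latent class model would miss the within-class correlations, and a pure factor model would miss the mean separation; the FMM captures both concurrently.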

  18. Quantification of source impact to PM using three-dimensional weighted factor model analysis on multi-site data

    Science.gov (United States)

    Shi, Guoliang; Peng, Xing; Huangfu, Yanqi; Wang, Wei; Xu, Jiao; Tian, Yingze; Feng, Yinchang; Ivey, Cesunica E.; Russell, Armistead G.

    2017-07-01

    Source apportionment technologies are used to understand the impacts of important sources of particulate matter (PM) on air quality, and are widely used for both scientific studies and air quality management. Generally, receptor models apportion speciated PM data from a single sampling site. With the development of large-scale monitoring networks, PM speciation is observed at multiple sites in an urban area. For these situations, the models should account for three factors, or dimensions, of the PM data: the chemical species concentrations, the sampling periods and the sampling site information, suggesting the potential power of a three-dimensional source apportionment approach. However, the ordinary three-dimensional Parallel Factor Analysis (PARAFAC) model does not always work well in real environmental situations for multi-site receptor datasets. In this work, a new three-way receptor model, called the "multi-site three-way factor analysis" model, is proposed to deal with multi-site receptor datasets. Synthetic datasets were developed and introduced into the new model to test its performance. The average absolute error (AAE, between estimated and true contributions) for the extracted sources was less than 50% in all cases. Additionally, three-dimensional ambient datasets from a Chinese mega-city, Chengdu, were analyzed using the new model to assess its application. Four factors were extracted by the multi-site WFA3 model: secondary sources had the highest contributions (64.73 and 56.24 μg/m3), followed by vehicular exhaust (30.13 and 33.60 μg/m3), crustal dust (26.12 and 29.99 μg/m3) and coal combustion (10.73 and 14.83 μg/m3). The model was also compared to PMF, with general agreement, though PMF suggested a lower crustal contribution.
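
The plain CP/PARAFAC baseline that such three-way receptor models build on can be sketched with alternating least squares on a species × time × site tensor. The tensor below is noiseless synthetic data with two known "sources", far simpler than a real multi-site dataset.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two synthetic sources define a 6-species x 8-period x 3-site tensor.
A = rng.random((6, 2)) + 0.1   # source profiles over species
B = rng.random((8, 2)) + 0.1   # contributions over sampling periods
C = rng.random((3, 2)) + 0.1   # weights over sites
T = np.einsum('ir,jr,kr->ijk', A, B, C)

def khatri_rao(U, V):
    # Column-wise Kronecker product used in CP/PARAFAC unfoldings.
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

# Alternating least squares for a rank-2 CP (PARAFAC-style) model.
Ah = rng.random((6, 2))
Bh = rng.random((8, 2))
Ch = rng.random((3, 2))
for _ in range(300):
    Ah = T.reshape(6, -1) @ np.linalg.pinv(khatri_rao(Bh, Ch).T)
    Bh = np.moveaxis(T, 1, 0).reshape(8, -1) @ np.linalg.pinv(khatri_rao(Ah, Ch).T)
    Ch = np.moveaxis(T, 2, 0).reshape(3, -1) @ np.linalg.pinv(khatri_rao(Ah, Bh).T)

T_hat = np.einsum('ir,jr,kr->ijk', Ah, Bh, Ch)
rel_err = float(np.linalg.norm(T - T_hat) / np.linalg.norm(T))
```

On noiseless low-rank data this baseline fits almost exactly; the paper's point is that real multi-site data violate the strict trilinear assumption, motivating the weighted multi-site variant.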

  19. An analysis of a three-factor model proposed by the Danish Society of Actuaries for forecasting and risk analysis

    DEFF Research Database (Denmark)

    Jørgensen, Peter Løchte; Slipsager, Søren Kærgaard

    2015-01-01

    This paper provides the explicit solution to the three-factor diffusion model recently proposed by the Danish Society of Actuaries to the Danish industry of life insurance and pensions. The solution is obtained by use of the known general solution to multidimensional linear stochastic differential...

  20. Developing a Model for Agility in Health Humanitarian Supply Chains Using Factor Analysis: Evidence From Iran

    Directory of Open Access Journals (Sweden)

    Jahanbani

    2014-10-01

    Full Text Available Background Since 2000, the increase in the frequency and severity of natural disasters has necessitated the design of agile relief supply chains to help affected people. Objectives This study aimed to develop an agility model for supply chains, with emphasis on health services, in Iranian relief organizations. Materials and Methods This was a descriptive-comparative study. In order to design the conceptual model, agility patterns of supply chains were reviewed in a comparative study. Moreover, a questionnaire was prepared to examine construct (model) validity. The validity of the questionnaire was confirmed by the judgment of experts, and its reliability was ensured by the test-retest method, with a coefficient of 0.99 and a Cronbach's alpha of 0.98. All data were analyzed through factor analyses using SPSS 16 and LISREL 8.53. Results Responsiveness, effectiveness, and flexibility, with an eigenvalue of 2.628, were identified as the main aspects of agility in humanitarian health supply chains. The components of visibility, reactivity, and speed were identified as components of responsiveness. Quality, reliability, and completeness were identified as components of effectiveness. Volume, delivery, mix, and product were identified as components of flexibility. Conclusions The findings confirmed the agility dimensions and the relationships between them in a model that can be considered comprehensive and appropriate for the establishment, promotion, and evaluation of relief supply chains by policy-makers and authorities of the Red Crescent Society and Medical Emergencies.

  1. An exploratory analysis of treatment completion and client and organizational factors using hierarchical linear modeling.

    Science.gov (United States)

    Woodward, Albert; Das, Abhik; Raskin, Ira E; Morgan-Lopez, Antonio A

    2006-11-01

    Data from the Alcohol and Drug Services Study (ADSS) are used to analyze the structure and operation of the substance abuse treatment industry in the United States. Published literature contains little systematic empirical analysis of the interaction between organizational characteristics and treatment outcomes. This paper addresses that deficit. It develops and tests a hierarchical linear model (HLM) to address questions about the empirical relationship between treatment inputs (industry costs, types and use of counseling and medical personnel, diagnosis mix, patient demographics, and the nature and level of services used in substance abuse treatment), and patient outcomes (retention and treatment completion rates). The paper adds to the literature by demonstrating a direct and statistically significant link between treatment completion and the organizational and staffing structure of the treatment setting. Related reimbursement issues, questions for future analysis, and limitations of the ADSS for this analysis are discussed.
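
The two-level structure that motivates HLM here (clients nested in treatment facilities) can be illustrated by simulating a random-intercept model and recovering the intraclass correlation with a one-way ANOVA estimator. The variance components are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(5)

# Clients (level 1) nested in facilities (level 2), with a facility
# random intercept: between-facility sd = 1.0, within-facility sd = 2.0.
n_fac, n_cli = 40, 25
fac_effect = rng.normal(0, 1.0, n_fac)
y = fac_effect[:, None] + rng.normal(0, 2.0, (n_fac, n_cli))

# One-way ANOVA estimators of the two variance components.
within_var = y.var(axis=1, ddof=1).mean()
between_var = y.mean(axis=1).var(ddof=1) - within_var / n_cli

# Intraclass correlation: share of outcome variance due to facilities.
icc = between_var / (between_var + within_var)
```

A non-trivial ICC (here around 0.2 by construction) is what makes ordinary regression, which assumes independent observations, inappropriate for such data.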

  2. Regression analysis in modeling of air surface temperature and factors affecting its value in Peninsular Malaysia

    Science.gov (United States)

    Rajab, Jasim Mohammed; Jafri, Mohd. Zubir Mat; Lim, Hwee San; Abdullah, Khiruddin

    2012-10-01

    This study encompasses air surface temperature (AST) modeling in the lower atmosphere. A dataset of four atmospheric pollutant gases (CO, O3, CH4, and H2O), retrieved from the National Aeronautics and Space Administration Atmospheric Infrared Sounder (AIRS) from 2003 to 2008, was employed to develop a model to predict AST values over the Malaysian peninsula using the multiple regression method. For the entire period, the pollutants were highly correlated (R=0.821) with predicted AST. Comparisons among five stations in 2009 showed close agreement between the predicted AST and the observed AST from AIRS, especially in the southwest monsoon (SWM) season, within 1.3 K, and for in situ data, within 1 to 2 K. The validation of AST against AST from AIRS showed a high correlation coefficient (R=0.845 to 0.918), indicating the model's efficiency and accuracy. Statistical analysis in terms of β showed that H2O (0.565 to 1.746) tended to contribute significantly to high AST values during the northeast monsoon season. Generally, these results clearly indicate the advantage of using the satellite AIRS data and a correlation analysis to investigate the impact of atmospheric greenhouse gases on AST over the Malaysian peninsula. A model was developed that is capable of retrieving the peninsular Malaysian AST in all weather conditions, with total uncertainties ranging between 1 and 2 K.
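
Multiple regression of a temperature-like response on several predictors reduces to ordinary least squares. The sketch below uses invented coefficients and stand-in predictors, not the AIRS retrievals.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500

# Four standardized predictors standing in for the CO, O3, CH4 and H2O
# retrievals; the weights are made up, with H2O given the largest one.
X = rng.normal(size=(n, 4))
true_beta = np.array([0.5, -0.3, 0.2, 1.2])
ast = 300 + X @ true_beta + rng.normal(0, 0.5, n)   # "AST" in kelvin

# Multiple linear regression by ordinary least squares.
design = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(design, ast, rcond=None)
pred = design @ beta
r = float(np.corrcoef(pred, ast)[0, 1])
```

The fitted β coefficients recover the relative importance of each predictor, analogous to the study's finding that the H2O β dominates.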

  3. The five-factor model of personality and borderline personality disorder: a genetic analysis of comorbidity.

    Science.gov (United States)

    Distel, Marijn A; Trull, Timothy J; Willemsen, Gonneke; Vink, Jacqueline M; Derom, Catherine A; Lynskey, Michael; Martin, Nicholas G; Boomsma, Dorret I

    2009-12-15

    Recently, the nature of personality disorders and their relationship with normal personality traits has received extensive attention. The five-factor model (FFM) of personality, consisting of the personality traits neuroticism, extraversion, openness to experience, agreeableness, and conscientiousness, is one of the proposed models to conceptualize personality disorders as maladaptive variants of continuously distributed personality traits. The present study examined the phenotypic and genetic association between borderline personality and FFM personality traits. Data were available for 4403 monozygotic twins, 4425 dizygotic twins, and 1661 siblings from 6140 Dutch, Belgian, and Australian families. Broad-sense heritability estimates for neuroticism, agreeableness, conscientiousness, extraversion, openness to experience, and borderline personality were 43%, 36%, 43%, 47%, 54%, and 45%, respectively. Phenotypic correlations between borderline personality and the FFM personality traits ranged from .06 for openness to experience to .68 for neuroticism. Multiple regression analyses showed that a combination of high neuroticism and low agreeableness best predicted borderline personality. Multivariate genetic analyses showed the genetic factors that influence individual differences in neuroticism, agreeableness, conscientiousness, and extraversion account for all genetic liability to borderline personality. Unique environmental effects on borderline personality, however, were not completely shared with those for the FFM traits (33% is unique to borderline personality). Borderline personality shares all genetic variation with neuroticism, agreeableness, conscientiousness, and extraversion. The unique environmental influences specific to borderline personality may cause individuals with a specific pattern of personality traits to cross a threshold and develop borderline personality.

  4. Analysis and modelling of the factors controlling seed oil concentration in sunflower: a review

    Directory of Open Access Journals (Sweden)

    Andrianasolo Fety Nambinina

    2016-03-01

    Full Text Available Sunflower appears as a potentially highly competitive crop, thanks to the diversification of its market and the richness of its oil. However, seed oil concentration (OC) – a commercial criterion for the crushing industry – is subject to genotypic and environmental effects that make it sometimes hardly predictable. It is assumed that a better understanding of oil physiology, combined with the use of crop models, should permit improved prediction and management of grain quality for various end-users. The main effects of temperature, water, nitrogen, plant density and fungal diseases are reviewed in this paper. Current generic and specific crop models that simulate oil concentration were found to be empirical and to lack proper evaluation processes. Recently, two modeling approaches integrating ecophysiological knowledge were developed by Andrianasolo (2014, Statistical and dynamic modelling of sunflower (Helianthus annuus L.) grain composition as a function of agronomic and environmental factors, Ph.D. Thesis, INP Toulouse): (i) a statistical approach relating OC to a range of explanatory variables (potential OC, temperature, water and nitrogen stress indices, intercepted radiation, plant density), which resulted in prediction quality of 1.9 to 2.5 oil points depending on the nature of the models; (ii) a dynamic approach, based on "source-sink" relationships involving leaves, stems and receptacles (as sources) and hulls, proteins and oil (as sinks), using priority rules for carbon and nitrogen allocation. The latter model reproduced the dynamic patterns of all source and sink components faithfully, but tended to overestimate OC. A better description of photosynthesis and nitrogen uptake, as well as of genotypic parameters, is expected to improve its performance.

  5. Model Correction Factor Method

    DEFF Research Database (Denmark)

    Christensen, Claus; Randrup-Thomsen, Søren; Morsing Johannesen, Johannes

    1997-01-01

    The model correction factor method is proposed as an alternative to traditional polynomial-based response surface techniques in structural reliability, considering a computationally time-consuming limit state procedure as a 'black box'. The class of polynomial functions is replaced by a limit state based on an idealized mechanical model, to be adapted to the original limit state by the model correction factor. Reliable approximations are obtained by iterative use of gradient information on the original limit state function, analogously to previous response surface approaches. However, the strength of the model correction factor method is that, in a simpler form not using gradient information on the original limit state function, or only using this information once, a drastic reduction of the number of limit state evaluations is obtained together with good approximations of the reliability. Methods...

  6. First course in factor analysis

    CERN Document Server

    Comrey, Andrew L

    2013-01-01

    The goal of this book is to foster a basic understanding of factor analytic techniques so that readers can use them in their own research and critically evaluate their use by other researchers. Both the underlying theory and correct application are emphasized. The theory is presented through the mathematical basis of the most common factor analytic models and several methods used in factor analysis. On the application side, considerable attention is given to the extraction problem, the rotation problem, and the interpretation of factor analytic results. Hence, readers are given a background of

  7. Comparison of objective Bayes factors for variable selection in parametric regression models for survival analysis.

    Science.gov (United States)

    Cabras, Stefano; Castellanos, Maria Eugenia; Perra, Silvia

    2014-11-20

    This paper considers the problem of selecting a set of regressors when the response variable is distributed according to a specified parametric model and observations are censored. Under a Bayesian perspective, the most widely used tools are Bayes factors (BFs), which are undefined when improper priors are used. In order to overcome this issue, fractional (FBF) and intrinsic (IBF) BFs have become common tools for model selection. Both depend on the size, Nt, of a minimal training sample (MTS), while the IBF also depends on the specific MTS used. In the case of regression with censored data, the definition of an MTS is problematic, because only uncensored data allow the improper prior to be turned into a proper posterior, and because full exploration of the space of the MTSs, which also includes censored observations, is needed to avoid bias in model selection. To address this concern, a sequential MTS was proposed, but it has the drawback of an increase in the number of possible MTSs as Nt becomes random. For this reason, we explore the behaviour of the FBF, contextualizing its definition to censored data. We show that these BFs are consistent, providing also the corresponding fractional prior. Finally, a large simulation study and an application to real data are used to compare the IBF, the FBF and the well-known Bayesian information criterion.
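
The BIC baseline mentioned above can be illustrated on a toy variable-selection problem without censoring; the data-generating model is invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200

# y depends on x1 but not on x2 (a spurious candidate regressor).
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.8 * x1 + rng.normal(0, 1, n)

def bic(X, y):
    # BIC for a Gaussian linear model: n*log(RSS/n) + k*log(n).
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return len(y) * np.log(rss / len(y)) + X.shape[1] * np.log(len(y))

ones = np.ones(n)
bic_null = bic(np.column_stack([ones]), y)          # intercept only
bic_x1 = bic(np.column_stack([ones, x1]), y)        # true model
bic_full = bic(np.column_stack([ones, x1, x2]), y)  # adds the spurious x2
```

The informative regressor reduces the fit term far more than its log(n) complexity penalty, so BIC decisively prefers the model containing x1; the spurious regressor typically fails to pay for its penalty.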

  8. Model analysis of the world data on the pion transition form factor

    Energy Technology Data Exchange (ETDEWEB)

    Noguera, S.; Vento, V. [Universidad de Valencia-CSIC, Departamento de Fisica Teorica and Instituto de Fisica Corpuscular, Valencia (Spain)

    2012-10-15

    We discuss the impact of recent Belle data on our description of the pion transition form factor, based on the assumption that a perturbative formalism and a nonperturbative one can be matched in a physically acceptable manner at a certain hadronic scale Q_0. We discuss the implications of the different parameters of the model in comparing with world data and conclude that within experimental errors our description remains valid. Thus we can assert that the low-Q^2 nonperturbative description together with an additional 1/Q^2 term at the matching scale have a strong influence on the Q^2 behavior up to very high values of Q^2. (orig.)

  9. Positive and negative affectivity in children: confirmatory factor analysis of a two-factor model and its relation to symptoms of anxiety and depression.

    Science.gov (United States)

    Lonigan, C J; Hooe, E S; David, C F; Kistner, J A

    1999-06-01

    The positive affect (PA) and negative affect (NA) framework that is embodied in the tripartite model of anxiety and depression has proved useful with adult populations; however, there is as yet little investigation with children concerning either the measurement of PA and NA or the relation between PA and NA and levels of adjustment. A confirmatory factor analysis was used in this study to examine the structure of self-reported affect and its relation to depressive and anxious symptoms in school children (4th to 11th grade). Results supported a 2-factor orthogonal model that was invariant across age and sex. Support for the expected pattern of relations between NA and PA with symptoms of depression and anxiety was strong for the older sample (M = 14.2 years) but weaker for the younger sample (M = 10.3 years). Results also provide preliminary support for the reliability and validity of the Positive and Negative Affect Schedule for children.

  10. Higher Education End-of-Course Evaluations: Assessing the Psychometric Properties Utilizing Exploratory Factor Analysis and Rasch Modeling Approaches

    Directory of Open Access Journals (Sweden)

    Kelly D. Bradley

    2016-07-01

    Full Text Available This paper offers a critical assessment of the psychometric properties of a standard higher education end-of-course evaluation. Using both exploratory factor analysis (EFA) and Rasch modeling, the authors investigate (a) an overall assessment of dimensionality using EFA, (b) a secondary assessment of dimensionality using a principal components analysis (PCA) of the residuals when the items are fit to the Rasch model, and (c) an assessment of item-level properties using the item-level statistics provided when the items are fit to the Rasch model. The results support the usage of the scale as a supplement to high-stakes decision making such as tenure. However, the lack of precise targeting of item difficulty to person ability, combined with the low person separation index, renders rank-ordering professors according to minuscule differences in overall subscale scores a highly questionable practice.
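
The Rasch model referred to above specifies the probability of endorsing an item as a logistic function of person ability minus item difficulty. The simulation below uses invented abilities and difficulties, not the course-evaluation data.

```python
import numpy as np

rng = np.random.default_rng(8)

def rasch_prob(theta, b):
    # Rasch model: P(endorse) = exp(theta - b) / (1 + exp(theta - b)).
    return 1 / (1 + np.exp(-(theta - b)))

# 2000 respondents answer 5 items of increasing difficulty.
theta = rng.normal(0, 1, (2000, 1))            # person abilities
b = np.array([-1.5, -0.5, 0.0, 0.5, 1.5])      # item difficulties
responses = rng.random((2000, 5)) < rasch_prob(theta, b)

# Easier items are endorsed more often: proportions decrease with b.
p_correct = responses.mean(axis=0)
```

The "targeting" issue raised in the abstract is visible in this parameterization: when item difficulties b sit far from the bulk of the ability distribution, responses carry little information and persons are poorly separated.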

  11. Quantifying the importance of spatial resolution and other factors through global sensitivity analysis of a flood inundation model

    Science.gov (United States)

    Thomas Steven Savage, James; Pianosi, Francesca; Bates, Paul; Freer, Jim; Wagener, Thorsten

    2016-11-01

    Where high-resolution topographic data are available, modelers are faced with the decision of whether it is better to spend computational resource on resolving topography at finer resolutions or on running more simulations to account for various uncertain input factors (e.g., model parameters). In this paper we apply global sensitivity analysis to explore how influential the choice of spatial resolution is when compared to uncertainties in the Manning's friction coefficient parameters, the inflow hydrograph, and those stemming from the coarsening of topographic data used to produce Digital Elevation Models (DEMs). We apply the hydraulic model LISFLOOD-FP to produce several temporally and spatially variable model outputs that represent different aspects of flood inundation processes, including flood extent, water depth, and time of inundation. We find that the most influential input factor for flood extent predictions changes during the flood event, starting with the inflow hydrograph during the rising limb before switching to the channel friction parameter during peak flood inundation, and finally to the floodplain friction parameter during the drying phase of the flood event. Spatial resolution and uncertainty introduced by resampling topographic data to coarser resolutions are much more important for water depth predictions, which are also sensitive to different input factors spatially and temporally. Our findings indicate that the sensitivity of LISFLOOD-FP predictions is more complex than previously thought. Consequently, the input factors that modelers should prioritize will differ depending on the model output assessed, and the location and time of when and where this output is most relevant.
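
Variance-based global sensitivity analysis of the kind applied here can be sketched on a toy additive stand-in for the flood model, where first-order indices reduce to squared correlations between each input and the output. The input ranges and weights are invented, not LISFLOOD-FP quantities.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 20000

# Toy additive stand-in for a flood model: predicted depth depends on
# channel friction, floodplain friction and an inflow scale factor.
chan = rng.uniform(0.02, 0.08, n)
flood = rng.uniform(0.03, 0.12, n)
inflow = rng.uniform(0.5, 2.0, n)
depth = 3.0 * chan + 0.5 * flood + 1.0 * inflow + rng.normal(0, 0.05, n)

# First-order sensitivity of each input, estimated as the squared
# correlation with the output (valid here because the model is additive).
def s1(x, y):
    return float(np.corrcoef(x, y)[0, 1] ** 2)

indices = {name: s1(x, depth)
           for name, x in [('channel_n', chan), ('floodplain_n', flood),
                           ('inflow', inflow)]}
```

Ranking the indices identifies which uncertain input dominates the output variance, mirroring the paper's finding that the most influential factor changes with the output considered and the phase of the flood.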

  12. [Factors influencing long-term survival in patients with nonoperable lung cancer: an analysis by Cox model].

    Science.gov (United States)

    Dong, W; Zhao, W; Sun, L

    1996-09-01

    This paper reports a prospective survey of 173 patients with nonoperable lung cancer between January 1, 1983 and March 1, 1985. The follow-up rate was 97.7% over five years. Fourteen factors that might influence long-term survival, including sex, age, course of disease before treatment, clinical stage, performance status, size of mass, metastatic status, hemoglobin before treatment and short-term response to treatment, were studied by univariate analysis (Kruskal-Wallis test for Kaplan-Meier survival curves) and by multivariate analysis (Cox's proportional hazards model and a visual chart test for goodness of fit). Multivariate analysis using Cox's model revealed 6 significant prognostic factors: performance status, short-term response to treatment, clinical stage, hemoglobin before treatment, smoking index and method of treatment. The survival prediction equation gave χ² = 72.14, ν = 6, P < 0.0001. The results indicate that the performance status and the CR rate of the initial treatment are, among other things, the major factors affecting prognosis.
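
The Kaplan-Meier estimator used in the univariate analysis can be computed in a few lines; the survival times below are invented for illustration.

```python
import numpy as np

# Tiny illustrative survival dataset (times in months; event=1 death,
# event=0 censored) -- not the lung cancer data.
times = np.array([3, 5, 5, 8, 12, 12, 15, 20])
event = np.array([1, 1, 0, 1, 1, 0, 1, 0])

def kaplan_meier(times, event):
    # Product-limit estimate of S(t) at each observed event time.
    surv = 1.0
    curve = {}
    for t in np.unique(times[event == 1]):
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (event == 1))
        surv *= 1 - deaths / at_risk
        curve[int(t)] = surv
    return curve

km = kaplan_meier(times, event)
```

Censored subjects leave the risk set without contributing a death, which is how the estimator uses the incomplete follow-up that a naive proportion would discard; a Kruskal-Wallis-type test then compares such curves across factor levels.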

  13. Reliability Analysis of a Composite Wind Turbine Blade Section Using the Model Correction Factor Method: Numerical Study and Validation

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian

    2013-01-01

    Reliability analysis of fiber-reinforced composite structures is a relatively unexplored field, and it is therefore expected that engineers and researchers trying to apply such an approach will meet certain challenges until more knowledge is accumulated. While doing the analyses included in the present paper, the authors have experienced some of the possible pitfalls on the way to completing a precise and robust reliability analysis for layered composites. Results showed that in order to obtain accurate reliability estimates it is necessary to account for the various failure modes described by the composite failure criteria. Each failure mode has been considered in a separate component reliability analysis, followed by a system analysis which gives the total probability of failure of the structure. The Model Correction Factor method used in connection with FORM (First-Order Reliability Method) proved...

  14. Topographic factor analysis: a Bayesian model for inferring brain networks from neural data.

    Directory of Open Access Journals (Sweden)

    Jeremy R Manning

    Full Text Available The neural patterns recorded during a neuroscientific experiment reflect complex interactions between many brain regions, each comprising millions of neurons. However, the measurements themselves are typically abstracted from that underlying structure. For example, functional magnetic resonance imaging (fMRI) datasets comprise a time series of three-dimensional images, where each voxel in an image (roughly) reflects the activity of the brain structure(s) located at the corresponding point in space at the time the image was collected. fMRI data often exhibit strong spatial correlations, whereby nearby voxels behave similarly over time as the underlying brain structure modulates its activity. Here we develop topographic factor analysis (TFA), a technique that exploits spatial correlations in fMRI data to recover the underlying structure that the images reflect. Specifically, TFA casts each brain image as a weighted sum of spatial functions. The parameters of those spatial functions, which may be learned by applying TFA to an fMRI dataset, reveal the locations and sizes of the brain structures activated while the data were collected, as well as the interactions between those structures.
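
TFA's core idea, each image as a weighted sum of spatial functions, can be sketched in one dimension with Gaussian basis functions whose weights are recovered by least squares. In the real method the centers and widths are also learned (via Bayesian inference); here they are fixed, invented values.

```python
import numpy as np

rng = np.random.default_rng(10)

# A 1-D "brain image" on 200 voxels, written as a weighted sum of three
# Gaussian spatial functions plus noise.
x = np.linspace(0, 10, 200)
centers = np.array([2.0, 5.0, 8.0])   # "source" locations (fixed here)
width = 0.8                           # "source" size (fixed here)
basis = np.exp(-(x[None, :] - centers[:, None])**2 / (2 * width**2))

w_true = np.array([1.5, -0.7, 2.2])   # per-source activations
image = w_true @ basis + rng.normal(0, 0.01, x.size)

# With the spatial functions known, the activations follow from
# ordinary least squares on the basis matrix.
w_hat, *_ = np.linalg.lstsq(basis.T, image, rcond=None)
```

Recovering a handful of source activations instead of 200 voxel values is the dimensionality reduction that makes the inferred structure interpretable.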

  15. Analysis of Tumor Necrosis Factor Function Using the Resonant Recognition Model.

    Science.gov (United States)

    Cosic, Irena; Cosic, Drasko; Lazar, Katarina

    2016-06-01

    The tumor necrosis factor (TNF) is a complex protein that plays a very important role in a number of biological functions including apoptotic cell death, tumor regression, cachexia, inflammation, inhibition of tumorigenesis and viral replication. Its most interesting function is that it is an inhibitor of tumorigenesis and an inductor of apoptosis. Thus, the TNF could be a good candidate for cancer therapy. However, the TNF also has inflammatory and toxic effects. Therefore, it would be very important to understand the complex functions of the TNF and consequently be able to predict mutations or even design new TNF-related proteins that will have only the tumor inhibition function, without the other side effects. This can be achieved by applying the resonant recognition model (RRM), a unique computational model for analysing macromolecular sequences of proteins, DNA and RNA. The RRM is based on the finding that certain periodicities in the distribution of free electron energies along protein, DNA and RNA are strongly correlated with the biological function of these macromolecules. Thus, based on these findings, the RRM has capabilities of protein function identification, prediction of bioactive amino acids and protein design with a desired biological function. Using the RRM, we separate different functions of the TNF as different periodicities (frequencies) within the distribution of free electron energies along the TNF protein. Interestingly, these characteristic TNF frequencies are related to previously identified characteristics of proto-oncogene and oncogene proteins describing TNF involvement in oncogenesis. Consequently, we identify the key amino acids related to the crucial TNF function, i.e. receptor recognition. We have also designed a peptide which will have the ability to recognise the receptor without side effects.
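
The RRM pipeline (map residues to a numerical series, Fourier-transform it, read off characteristic frequencies) can be sketched as follows. The residue values are made-up placeholders, not the published electron-ion interaction potentials the actual RRM uses, and the sequence is artificial.

```python
import numpy as np

# Placeholder per-residue values (NOT the real RRM potentials).
values = {aa: v for aa, v in zip('ACDEFGHIKLMNPQRSTVWY',
                                 np.linspace(0.01, 0.15, 20))}

# A toy sequence with an artificial period-4 pattern, so the spectrum
# has a known characteristic frequency of 0.25 cycles per residue.
seq = 'ACDE' * 32
series = np.array([values[a] for a in seq])
series -= series.mean()          # remove the DC component

spectrum = np.abs(np.fft.rfft(series))**2
freqs = np.fft.rfftfreq(series.size)       # cycles per residue
peak_freq = float(freqs[spectrum.argmax()])
```

In the RRM proper, spectra of proteins sharing a biological function are multiplied together and the common peak is taken as that function's characteristic frequency.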

  16. Multilevel modeling versus cross-sectional analysis for assessing the longitudinal tracking of cardiovascular risk factors over time.

    Science.gov (United States)

    Xanthakis, Vanessa; Sullivan, Lisa M; Vasan, Ramachandran S

    2013-12-10

    Correlated data are obtained in longitudinal epidemiological studies, where repeated measurements are taken on individuals or groups over time. Such longitudinal data are ideally analyzed using multilevel modeling approaches, which appropriately account for the correlation of repeated responses within the same individual. Commonly used regression models are inappropriate because they assume that measurements are independent. In this tutorial, we demonstrate the use of multilevel modeling for the analysis of correlated data obtained from serial examinations of individuals. We focus on cardiovascular epidemiological research, where investigators are often interested in quantifying the relations between clinical risk factors and outcome measures (X and Y, respectively), where X and Y are measured repeatedly over time, for example, using serial observations on participants attending multiple examinations in a longitudinal cohort study. For instance, it may be of interest to evaluate the relations between serial measures of left ventricular mass (outcome) and of its potential determinants (i.e., body mass index and blood pressure), both of which are measured over time. In this tutorial, we describe the application of multilevel modeling to cardiovascular risk factor and outcome data (using serial echocardiographic data as an example of an outcome). We suggest an analytical approach that can be implemented to evaluate relations between any potential outcome of interest and risk factors, including assessment of random effects and nonlinear relations. We illustrate these steps using echocardiographic data from the Framingham Heart Study with SAS PROC MIXED.
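
    The random-intercept approach described above can be sketched in Python with statsmodels, whose MixedLM plays a role analogous to SAS PROC MIXED. The data below are synthetic, not the Framingham measurements, and all variable names are illustrative:

    ```python
    # Random-intercept multilevel model for repeated measures on individuals
    # (synthetic longitudinal data; the per-subject intercept u accounts for
    # within-person correlation that ordinary regression would ignore).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_subj, n_visits = 200, 3
    subj = np.repeat(np.arange(n_subj), n_visits)
    bmi = rng.normal(27, 4, n_subj)[subj] + rng.normal(0, 1, n_subj * n_visits)
    u = rng.normal(0, 5, n_subj)[subj]            # subject-level random intercepts
    lv_mass = 80 + 2.0 * bmi + u + rng.normal(0, 4, n_subj * n_visits)
    df = pd.DataFrame({"subj": subj, "bmi": bmi, "lv_mass": lv_mass})

    # One random intercept per subject; fixed effect of the risk factor
    fit = smf.mixedlm("lv_mass ~ bmi", df, groups=df["subj"]).fit()
    print(fit.params["bmi"])  # slope estimate, close to the true value 2.0
    ```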

  17. A Two-Factor Model of Temperament

    OpenAIRE

    Evans, David E.; Rothbart, Mary K.

    2009-01-01

    The higher order structure of temperament was examined in two studies using the Adult Temperament Questionnaire. Because previous research showed robust levels of convergence between Rothbart’s constructs of temperament and the Big Five factors, we hypothesized a higher order two-factor model of temperament based on Digman’s higher order two-factor model of personality traits derived from factor analysis of the Big Five factors. Study 1 included 258 undergraduates. Digman’s model did not fit ...

  18. Cutaneous leishmaniasis prevalence and morbidity based on environmental factors in Ilam, Iran: Spatial analysis and land use regression models.

    Science.gov (United States)

    Mokhtari, Mehdi; Miri, Mohammad; Nikoonahad, Ali; Jalilian, Ali; Naserifar, Razi; Ghaffari, Hamid Reza; Kazembeigi, Farogh

    2016-11-01

    The aim of this study was to investigate the impact of environmental factors on cutaneous leishmaniasis (CL) prevalence and morbidity in Ilam province, western Iran, a known endemic area for this disease. Accurate locations of 3237 CL patients diagnosed from 2013 to 2015, their demographic information, and data on 17 potentially predictive environmental variables (PPEVs) were prepared for use in Geographic Information System (GIS) and Land-Use Regression (LUR) analyses. The prevalence, risk, and predictive risk maps were produced using the Inverse Distance Weighting (IDW) model in GIS software. Regression analysis was used to determine how environmental variables affect CL prevalence. All maps and regression models were developed based on the annual and three-year averages of CL prevalence. The results showed a statistically significant relationship (P value ≤ 0.05) between CL prevalence and 11 (64%) of the PPEVs: elevation, population, rainfall, temperature, urban land use, poorland, dry farming, inceptisol and aridisol soils, and forest and irrigated lands. The highest probability of CL prevalence was predicted in the west of the study area, on the frontier with Iraq. An inverse relationship was found between CL prevalence and environmental factors including elevation, covering soil, rainfall, and agricultural irrigation, while the relationship was positive for temperature, urban land use, and population density. Environmental factors were found to be important predictive variables for CL prevalence and should be considered in management strategies for CL control.

  19. A Bayesian Belief Network modelling of organisational factors in risk analysis: A case study in maritime transportation

    Energy Technology Data Exchange (ETDEWEB)

    Trucco, P. [Department of Management, Economics and Industrial Engineering-Politecnico di Milano, Piazza Leonardo da Vinci, 32, I-20133 Milan (Italy)], E-mail: paolo.trucco@polimi.it; Cagno, E. [Department of Management, Economics and Industrial Engineering-Politecnico di Milano, Piazza Leonardo da Vinci, 32, I-20133 Milan (Italy); Ruggeri, F. [CNR IMATI, via E.Bassini, 15, I-20133 Milan (Italy); Grande, O. [Department of Management, Economics and Industrial Engineering-Politecnico di Milano, Piazza Leonardo da Vinci, 32, I-20133 Milan (Italy)

    2008-06-15

    The paper presents an innovative approach to integrating Human and Organisational Factors (HOF) into risk analysis. The approach has been developed and applied to a case study in the maritime industry, but it can also be utilised in other sectors. A Bayesian Belief Network (BBN) has been developed to model the Maritime Transport System (MTS), taking into account its different actors (i.e., ship-owner, shipyard, port and regulator) and their mutual influences. The latter have been modelled by means of a set of dependent variables whose combinations express the relevant functions performed by each actor. The BBN model of the MTS has been used in a case study for the quantification of HOF in the risk analysis carried out at the preliminary design stage of High Speed Craft (HSC). The study has focused on the hazard of collision in open sea, analysed by means of an original method integrating a Fault Tree Analysis (FTA) of technical elements with a BBN model of the influences of organisational functions and regulations, as suggested by the International Maritime Organisation's (IMO) Guidelines for Formal Safety Assessment (FSA). The approach has allowed the identification of probabilistic correlations between the basic events of a collision accident and the BBN model of the operational and organisational conditions. The linkage can be exploited in different ways, especially to support the identification and evaluation of risk control options, also at the organisational level. Conditional probabilities for the BBN have been estimated by means of expert judgments, collected from an international panel across different European countries. Finally, a sensitivity analysis has been carried out over the model to identify configurations of the MTS leading to a significant reduction of accident probability during the operation of the HSC.
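
    As a toy illustration of the FTA-BBN linkage described above (all numbers and node names here are hypothetical, not taken from the study), an organisational condition modelled in a BBN can modulate the probability of a fault-tree basic event via the law of total probability:

    ```python
    # Hypothetical two-state organisational node influencing a basic event.
    p_poor_mgmt = 0.2                       # P(organisational condition = poor)
    p_event = {"poor": 0.05, "good": 0.01}  # P(basic event | condition)

    # Law of total probability: P(event) = sum over conditions of P(event|c) * P(c)
    p_total = (p_event["poor"] * p_poor_mgmt
               + p_event["good"] * (1 - p_poor_mgmt))
    print(p_total)  # p_total ≈ 0.018
    ```

    In the full approach, such marginal probabilities feed back into the fault tree, so that changing an organisational configuration propagates to the top-event (collision) probability.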

  20. Best-fit model of exploratory and confirmatory factor analysis of the 2010 Medical Council of Canada Qualifying Examination Part I clinical decision-making cases.

    Science.gov (United States)

    Champlain, André F De

    2015-01-01

    This study aims to assess the fit of a number of exploratory and confirmatory factor analysis models to the 2010 Medical Council of Canada Qualifying Examination Part I (MCCQE1) clinical decision-making (CDM) cases. The outcomes of this study have important implications for a range of domains, including scoring and test development. The examinees included all first-time Canadian medical graduates and international medical graduates who took the MCCQE1 in spring or fall 2010. The fit of one- to five-factor exploratory models was assessed for the item response matrix of the 2010 CDM cases. Five confirmatory factor analytic models were also examined with the same CDM response matrix. The structural equation modeling software program Mplus was used for all analyses. Out of the five exploratory factor analytic models that were evaluated, a three-factor model provided the best fit. Factor 1 loaded on three medicine cases, two obstetrics and gynecology cases, and two orthopedic surgery cases. Factor 2 corresponded to pediatrics, and the third factor loaded on psychiatry cases. Among the five confirmatory factor analysis models examined in this study, the three- and four-factor lifespan period models and the five-factor discipline model provided the best fit. The results suggest that knowledge of broad disciplinary domains best accounts for performance on CDM cases. In test development, particular effort should be placed on developing CDM cases according to broad discipline and patient age domains; CDM testlets should be assembled largely using the criteria of discipline and age.
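
    The exploratory step of comparing one- to five-factor models by fit can be sketched as follows. This uses scikit-learn rather than Mplus, on synthetic continuous data with a known two-factor structure rather than the actual CDM response matrix, so it only illustrates the idea of selecting the number of factors by out-of-sample fit:

    ```python
    # Compare 1- to 5-factor exploratory models by cross-validated log-likelihood
    # (synthetic data generated from a known 2-factor structure).
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    F = rng.normal(size=(500, 2))            # two latent factors
    load = rng.normal(size=(2, 8))           # loadings on 8 observed variables
    X = F @ load + rng.normal(scale=0.5, size=(500, 8))

    # FactorAnalysis.score is the average log-likelihood; higher is better
    cv = {k: cross_val_score(FactorAnalysis(n_components=k, random_state=0),
                             X, cv=5).mean()
          for k in range(1, 6)}
    best = max(cv, key=cv.get)  # number of factors with the best held-out fit
    ```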

  1. Convergence, Admissibility, and Fit of Alternative Confirmatory Factor Analysis Models for MTMM Data

    Science.gov (United States)

    Lance, Charles E.; Fan, Yi

    2016-01-01

    We compared six different analytic models for multitrait-multimethod (MTMM) data in terms of convergence, admissibility, and model fit to 258 samples of previously reported data. Two well-known models, the correlated trait-correlated method (CTCM) and the correlated trait-correlated uniqueness (CTCU) models, were fit for reference purposes in…

  3. Latent variable models an introduction to factor, path, and structural equation analysis

    CERN Document Server

    Loehlin, John C

    2004-01-01

    This fourth edition introduces multiple-latent variable models by utilizing path diagrams to explain the underlying relationships in the models. The book is intended for advanced students and researchers in the areas of social, educational, clinical, ind

  4. Comparative Analysis of the Effects of Neurotrophic Factors CDNF and GDNF in a Nonhuman Primate Model of Parkinson's Disease.

    Directory of Open Access Journals (Sweden)

    Enrique Garea-Rodríguez

    Full Text Available Cerebral dopamine neurotrophic factor (CDNF) belongs to a newly discovered family of evolutionarily conserved neurotrophic factors. We demonstrate for the first time a therapeutic effect of CDNF in a unilateral 6-hydroxydopamine (6-OHDA) lesion model of Parkinson's disease in marmoset monkeys. Furthermore, we tested the impact of high chronic doses of human recombinant CDNF on unlesioned monkeys and analyzed the amino acid sequence of marmoset CDNF. The severity of 6-OHDA lesions and treatment effects were monitored in vivo using 123I-FP-CIT (DaTSCAN) SPECT. Quantitative analysis of 123I-FP-CIT SPECT showed a significant increase of dopamine transporter binding activity in lesioned animals treated with CDNF. Glial cell line-derived neurotrophic factor (GDNF), a well-characterized and potent neurotrophic factor for dopamine neurons, served as a control in a parallel comparison with CDNF. In contrast to CDNF, only individual animals responded to the treatment with GDNF, and no statistical difference was observed in the GDNF group. However, increased numbers of tyrosine hydroxylase immunoreactive neurons, observed within the lesioned caudate nucleus of GDNF-treated animals, indicate a strong bioactive potential of GDNF.

  5. Human Factors Model

    Science.gov (United States)

    1993-01-01

    Jack is an advanced human factors software package that provides a three-dimensional model for predicting how a human will interact with a given system or environment. It can be used for a broad range of computer-aided design applications. Jack was developed by the Computer Graphics Research Laboratory of the University of Pennsylvania with assistance from NASA's Johnson Space Center, Ames Research Center and the Army. It is the University's first commercial product. Jack is still used for academic purposes at the University of Pennsylvania. Commercial rights were given to Transom Technologies, Inc.

  6. Linking Socioeconomic Status to Social Cognitive Career Theory Factors: A Partial Least Squares Path Modeling Analysis

    Science.gov (United States)

    Huang, Jie-Tsuen; Hsieh, Hui-Hsien

    2011-01-01

    The purpose of this study was to investigate the contributions of socioeconomic status (SES) in predicting social cognitive career theory (SCCT) factors. Data were collected from 738 college students in Taiwan. The results of the partial least squares (PLS) analyses indicated that SES significantly predicted career decision self-efficacy (CDSE);…

  7. Identification and analysis of explanatory variables for a multi-factor productivity model of passenger airlines

    Directory of Open Access Journals (Sweden)

    Antonio Henriques de Araújo Jr

    2011-05-01

    Full Text Available The paper aimed to identify and analyze the explanatory variables for airline productivity during 2000-2005 by testing the Pearson correlation between the single-factor productivities (capital, energy and labor) of a sample of 45 selected international airlines (4 Brazilian carriers among them) and their productivity explanatory variables, such as medium stage length, aircraft load factor, hours flown and cruise speed for selected routes, besides aircraft seat configuration and airlines' number of employees. The research demonstrated that a set of variables can explain differences in productivity for passenger airlines, such as: investment in personnel training processes, automation, airplane seat density, occupation of aircraft, average flight stage length, and density and extension of routes, among others.

  8. Organic aerosol concentration and composition over Europe: insights from comparison of regional model predictions with aerosol mass spectrometer factor analysis

    Directory of Open Access Journals (Sweden)

    C. Fountoukis

    2014-03-01

    Full Text Available A detailed three-dimensional regional chemical transport model (PMCAMx) was applied over Europe, focusing on the formation and chemical transformation of organic matter. Three periods representative of different seasons were simulated, corresponding to intensive field campaigns. An extensive set of AMS measurements was used to evaluate the model and, using factor analysis results, gain more insight into the sources and transformations of organic aerosol (OA). Overall, the agreement between predictions and measurements for OA concentration is encouraging, with the model reproducing two thirds of the data (daily average mass concentrations) within a factor of two. Oxygenated OA (OOA) is predicted to contribute 93% to total OA during May, 87% during winter and 96% during autumn, with the rest consisting of fresh primary OA (POA). Predicted OOA concentrations compare well with the observed OOA values for all periods, with an average fractional error of 0.53 and a bias equal to −0.07 (mean error = 0.9 μg m−3, mean bias = −0.2 μg m−3). The model systematically underpredicts fresh POA at most sites during late spring and autumn (mean bias up to −0.8 μg m−3). Based on results from a source apportionment algorithm running in parallel with PMCAMx, most of the POA originates from biomass burning (fires and residential wood combustion), and therefore biomass burning OA is most likely underestimated in the emission inventory. The model performs well at all sites when the PMF-estimated low-volatility OOA is compared against the OA with C* ≤ 0.1 μg m−3 and the semivolatile OOA against the OA with C* > 0.1 μg m−3, respectively.
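
    The evaluation statistics quoted above (mean bias and error, fractional bias and error, and the fraction of data within a factor of two) can be computed as sketched below. The formulas are the conventional ones from the air-quality model-evaluation literature, and the example numbers are made up:

    ```python
    # Model-evaluation metrics for predicted vs. observed concentrations.
    import numpy as np

    def eval_metrics(pred, obs):
        pred, obs = np.asarray(pred, float), np.asarray(obs, float)
        bias = np.mean(pred - obs)                          # mean bias
        err = np.mean(np.abs(pred - obs))                   # mean error
        frac_bias = np.mean(2 * (pred - obs) / (pred + obs))
        frac_err = np.mean(2 * np.abs(pred - obs) / (pred + obs))
        within2 = np.mean((pred >= obs / 2) & (pred <= 2 * obs))  # factor-of-2 fraction
        return bias, err, frac_bias, frac_err, within2

    # Hypothetical daily-average OA concentrations (μg m−3)
    bias, err, fb, fe, w2 = eval_metrics([1.0, 2.0, 4.0], [1.2, 1.8, 8.0])
    ```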

  9. Modeling and Finite Element Analysis of Load-Carrying Performance of a Wind Turbine Considering the Influence of Assembly Factors

    Directory of Open Access Journals (Sweden)

    Jianmei Wang

    2017-03-01

    Full Text Available In this work, a wind turbine shrink disk is used as the research object to investigate the load-carrying performance of a multi-layer interference fit, and a theoretical model and a finite element model are constructed. According to those models, a MW-level turbine shrink disk is designed, and a test device is developed to apply torque to this shrink disk by hydraulic jack. The circumferential slip between the contact surfaces is then monitored; the slip of all contact surfaces is zero, which verifies the reasonability of the proposed models. The effects of the key influencing factors, such as machining deviation, assembly clearance and propel stroke, were analyzed. The contact pressure and load torque of the mating surfaces were obtained by building typical models with different parameters using finite element analysis (FEA). The results show that the minimum assembly clearance and the machining deviation within the machining range have little influence on the load-carrying performance of the multi-layer interference fit, while the maximum assembly clearance and the propel stroke have a greater influence. The results also show that the load-carrying performance of a multi-layer interference fit can be ensured only if the key factors are set within a reasonable design range. To avoid abnormal operation of equipment caused by insufficient load torque, the propel stroke during practical assembly should be at least 0.95 times the designed propel stroke, which is significant in guiding the design and assembly of the multi-layer interference fit.

  10. A general psychopathology factor (P factor) in children: Structural model analysis and external validation through familial risk and child global executive function.

    Science.gov (United States)

    Martel, Michelle M; Pan, Pedro M; Hoffmann, Maurício S; Gadelha, Ary; do Rosário, Maria C; Mari, Jair J; Manfro, Gisele G; Miguel, Eurípedes C; Paus, Tomás; Bressan, Rodrigo A; Rohde, Luis A; Salum, Giovanni A

    2017-01-01

    High rates of comorbidities and poor validity of disorder diagnostic criteria for mental disorders hamper advances in mental health research. Recent work has suggested the utility of continuous cross-cutting dimensions, including general psychopathology and specific factors of externalizing and internalizing (e.g., distress and fear) syndromes. The current study evaluated the reliability of competing structural models of psychopathology and examined external validity of the best fitting model on the basis of family risk and child global executive function (EF). A community sample of 8,012 families from Brazil with children ages 6-12 years completed structured interviews about the child and parental psychiatric syndromes, and a subsample of 2,395 children completed tasks assessing EF (i.e., working memory, inhibitory control, and time processing). Confirmatory factor analyses tested a series of structural models of psychopathology in both parents and children. The model with a general psychopathology factor ("P factor") with 3 specific factors (fear, distress, and externalizing) exhibited the best fit. The general P factor accounted for most of the variance in all models, with little residual variance explained by each of the 3 specific factors. In addition, associations between child and parental factors were mainly significant for the P factors and nonsignificant for the specific factors from the respective models. Likewise, the child P factor-but not the specific factors-was significantly associated with global child EF. Overall, our results provide support for a latent overarching P factor characterizing child psychopathology, supported by familial associations and child EF.

  11. Developing a Model for Agility in Health Humanitarian Supply Chains Using Factor Analysis: Evidence From Iran

    OpenAIRE

    Jahanbani; Nasiripour; Raeissi; Tabibi

    2014-01-01

    Background Since 2000, the increase in frequency and severity of natural disasters has necessitated designing an agile relief supply chain to help affected people. Objectives This study aimed to develop an agile model for supply chains, with emphasis on health services, in Iranian relief organizations. Materials and Methods This was a descriptive-comparative study. In order to design t...

  12. Five-factor model personality traits and inflammatory markers: new data and a meta-analysis.

    Science.gov (United States)

    Luchetti, Martina; Barkley, James M; Stephan, Yannick; Terracciano, Antonio; Sutin, Angelina R

    2014-12-01

    The purpose of this research is to examine the association between five major dimensions of personality and systemic inflammation through (a) new data on C-reactive protein (CRP) from three large national samples of adults that together cover most of the adult lifespan and (b) a meta-analysis of published studies on CRP and interleukin-6 (IL-6). New data (total N=26,305) were drawn from the National Longitudinal Study of Adolescent Health, the Midlife in the United States study, and the Health and Retirement Study. PRISMA guidelines were used for the meta-analysis to combine results of up to seven studies on CRP (N=34,067) and six on IL-6 (N=7538). Across the three new samples, higher conscientiousness was associated with lower CRP. The conscientiousness-CRP relation was virtually identical controlling for smoking; controlling for body mass index attenuated this association but did not eliminate it. Compared to participants in the highest quartile of conscientiousness, participants in the lowest quartile had an up to 50% increased risk of CRP levels that exceeded the clinical threshold (≥3 mg/l). The meta-analysis supported the association between conscientiousness and both CRP and IL-6 and also suggested a negative association between openness and CRP; no associations were found for neuroticism, extraversion and agreeableness. The present work indicates a modest, but consistent, association between conscientiousness and a more favorable inflammatory profile, which may contribute to the role of conscientiousness in better health across the lifespan.
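
    A fixed-effect (inverse-variance) pooling of study estimates, of the kind underlying such a meta-analysis, can be sketched as follows; the three effect sizes and standard errors below are invented for illustration, not the study's values:

    ```python
    # Fixed-effect meta-analysis: weight each study by the inverse of its
    # sampling variance, then pool.
    import math

    def fixed_effect(effects, ses):
        w = [1.0 / se**2 for se in ses]                    # inverse-variance weights
        pooled = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
        pooled_se = math.sqrt(1.0 / sum(w))
        return pooled, pooled_se

    # e.g., three hypothetical estimates of a conscientiousness-CRP association
    est, se = fixed_effect([-0.10, -0.07, -0.12], [0.02, 0.03, 0.05])
    ```

    The pooled estimate is pulled toward the most precise (lowest-SE) study, which is the design choice that distinguishes this from a simple average.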

  13. Structural Analysis of the Factors Influencing the Financing of Forestry Enterprises Based on Interpretive Structural Modeling(ISM)

    Institute of Scientific and Technical Information of China (English)

    Zhen; WANG; Weiping; LIU; Xiaomin; JIANG

    2015-01-01

    Through the collection of related literature, we point out six major factors influencing the financing of China's forestry enterprises: insufficient national support; regulatory and institutional environmental factors; narrow financing channels; an inappropriate existing mortgage-backed approach; forestry production characteristics; and forestry enterprises' own defects. We then use interpretive structural modeling (ISM) from systems engineering to analyze the structure of the six factors and set up a ladder-type structure. Three factors (forestry production characteristics; forestry enterprises' own defects; and regulatory, institutional and environmental factors) are identified as basic factors, and the other three as important factors. From the perspective of the government and enterprises, we put forward suggestions based on the basic and important factors to ease the financing difficulties of forestry enterprises.

  14. Factors influencing adoption of farm management practices in three agrobiodiversity hotspots in India: an analysis using the Count Data Model

    Directory of Open Access Journals (Sweden)

    Prabhakaran T. Raghu

    2014-07-01

    Full Text Available Sustainable agricultural practices require, among other factors, the adoption of improved nutrient management techniques, pest mitigation technology and soil conservation measures. Such improved management practices can be tools for enhancing crop productivity. Data on micro-level farm management practices in developing countries are either scarce or unavailable, despite the importance of their policy implications with regard to resource allocation. The present study investigates the adoption of farm management practices and the factors influencing the adoption behavior of farm households in three agrobiodiversity hotspots in India: Kundra block in the Koraput district of Odisha, Meenangadi panchayat in the Wayanad district of Kerala and Kolli Hills in the Namakkal district of Tamil Nadu. Information on farm management practices was collected from November 2011 to February 2012 from 3845 households, of which data from 2726 farm households were used for analysis. The three most popular farm management practices adopted by farmers were: application of chemical fertilizers, farm yard manure and green manure for managing nutrients; application of chemical pesticides, inter-cropping and mixed cropping for mitigating pests; and contour bunds, grass bunds and trenches for soil conservation. A negative binomial count data regression model was used to estimate the factors influencing farmers' decision-making on farm management practices. The regression results indicate that receiving information from agricultural extension services is statistically significantly and positively related to the adoption of farm management practices. Another key finding is the negative relationship between cultivation of local varieties and adoption of farm management practices.
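
    The count-data modeling step can be sketched with a negative binomial regression in statsmodels. The data are synthetic and the variable names and coefficients are illustrative, not the study's estimates:

    ```python
    # Negative binomial regression of a count outcome (number of adopted
    # practices) on two binary predictors, on synthetic data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 1000
    extension = rng.integers(0, 2, n)    # received extension information (0/1)
    local_var = rng.integers(0, 2, n)    # cultivates local varieties (0/1)
    mu = np.exp(0.8 + 0.5 * extension - 0.4 * local_var)
    # negative binomial counts via a gamma-Poisson mixture (dispersion alpha = 0.5)
    y = rng.poisson(rng.gamma(shape=2.0, scale=mu / 2.0))

    X = sm.add_constant(np.column_stack([extension, local_var]))
    fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
    # fit.params recovers roughly [0.8, 0.5, -0.4]: extension raises the
    # expected count, cultivating local varieties lowers it
    ```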

  15. Dissecting mechanisms of mouse embryonic stem cells heterogeneity through a model-based analysis of transcription factor dynamics.

    Science.gov (United States)

    Herberg, Maria; Glauche, Ingmar; Zerjatke, Thomas; Winzi, Maria; Buchholz, Frank; Roeder, Ingo

    2016-04-01

    Pluripotent mouse embryonic stem cells (mESCs) show heterogeneous expression levels of transcription factors (TFs) involved in pluripotency regulation, among them Nanog and Rex1. The expression of both TFs can change dynamically between states of high and low activity, correlating with the cells' capacity for self-renewal. Stochastic fluctuations as well as sustained oscillations in gene expression are possible mechanisms to explain this behaviour, but the lack of suitable data hampered their clear distinction. Here, we present a systems biology approach in which novel experimental data on TF heterogeneity is complemented by an agent-based model of mESC self-renewal. Because the model accounts for intracellular interactions, cell divisions and heredity structures, it allows for evaluating the consistency of the proposed mechanisms with data on population growth and on TF dynamics after cell sorting. Our model-based analysis revealed that a bistable, noise-driven network model fulfils the minimal requirements to consistently explain Nanog and Rex1 expression dynamics in heterogeneous and sorted mESC populations. Moreover, we studied the impact of TF-related proliferation capacities on the frequency of state transitions and demonstrate that cellular genealogies can provide insights into the heredity structures of mESCs.

  16. Assessing the discriminating power of item and test scores in the linear factor-analysis model

    Directory of Open Access Journals (Sweden)

    Pere J. Ferrando

    2012-01-01

    Full Text Available Rigorous proposals based on a psychometric model for studying the imprecise concept of "discriminating power" are scarce and generally limited to nonlinear models for binary items. This article proposes a general framework for assessing the discriminating power of item and test scores that are calibrated with the common-factor model. The proposal is organized around three criteria: (a) type of score, (b) range of discrimination, and (c) the specific aspect that is assessed. Within the proposed framework, (a) 16 measures are discussed, of which 6 appear to be new, and (b) the relations among them are studied. The usefulness of the proposal in psychometric applications that use the factor model is illustrated with an empirical example.

  17. Computational Modelling Approaches on Epigenetic Factors in Neurodegenerative and Autoimmune Diseases and Their Mechanistic Analysis

    Directory of Open Access Journals (Sweden)

    Afroza Khanam Irin

    2015-01-01

    Full Text Available Neurodegenerative and autoimmune diseases have unclear aetiologies, but a growing body of evidence points to a combination of genetic and epigenetic alterations that predispose to the development of disease. This review examines the major milestones in epigenetics research in the context of disease and the various computational approaches developed in recent decades to unravel new epigenetic modifications. However, few studies systematically link genetic and epigenetic alterations of DNA to the aetiology of diseases. In this work, we demonstrate how disease-related epigenetic knowledge can be systematically captured and integrated with heterogeneous information into a functional context using the Biological Expression Language (BEL). This novel methodology, based on BEL, enables us to integrate epigenetic modifications such as DNA methylation or acetylation of histones into a specific disease network. As an example, we depict the integration of epigenetic and genetic factors in a functional context specific to Parkinson's disease (PD) and multiple sclerosis (MS).

  18. Modelling suitable estuarine habitats for Zostera noltii , using Ecological Niche Factor Analysis and Bathymetric LiDAR

    Science.gov (United States)

    Valle, Mireia; Borja, Ángel; Chust, Guillem; Galparsoro, Ibon; Garmendia, Joxe Mikel

    2011-08-01

    Predicting species distribution and habitat suitability is of considerable use in supporting the implementation of environmental legislation, the protection and conservation of marine waters, and ecosystem-based management. Like other seagrasses, Zostera noltii has declined worldwide, mainly due to human pressures such as eutrophication and habitat loss. In the case of the Basque Country (northern Spain), the species is present in only 3 out of 12 estuaries. From the literature, it is known that at least 6 of these estuaries were formerly vegetated by this seagrass. Consequently, efforts to monitor and restore (potential) habitats have been enhanced. Therefore, we aim: (i) to determine the main environmental variables explaining Zostera noltii distribution within the Basque estuaries, based upon the Oka estuary; (ii) to model habitat suitability for this species, as a widely applicable management-decision tool for seagrass restoration; and (iii) to assess the applicability and predictive accuracy of the model by using internal and external validation methods. For this purpose, Ecological Niche Factor Analysis (ENFA) has been used to model habitat suitability, based upon topographical variables obtained from bathymetric Light Detection And Ranging (LiDAR), sediment characteristics variables, and hydrodynamic variables. The results obtained for the ecological factors of the ENFA (Marginality: 1.00; Specialization: 2.59) indicate that the species' habitat differs considerably from the mean environmental conditions over the study area, and likewise that the species is restrictive in its selection of the range of conditions within which it dwells. The main environmental variables relating to the species distribution are, in order of importance: mean grain size; redox potential; intertidal height; sediment sorting; slope of intertidal flat; percentage of gravels; and percentage of organic matter content. The model has a high predictive accuracy (Boyce index: 0.92). Model

  19. Best-fit model of exploratory and confirmatory factor analysis of the 2010 Medical Council of Canada Qualifying Examination Part I clinical decision-making cases

    Directory of Open Access Journals (Sweden)

    André F. De Champlain

    2015-04-01

    Full Text Available Purpose: This study aims to assess the fit of a number of exploratory and confirmatory factor analysis models to the 2010 Medical Council of Canada Qualifying Examination Part I (MCCQE1) clinical decision-making (CDM) cases. The outcomes of this study have important implications for a range of domains, including scoring and test development. Methods: The examinees included all first-time Canadian medical graduates and international medical graduates who took the MCCQE1 in spring or fall 2010. The fit of one- to five-factor exploratory models was assessed for the item response matrix of the 2010 CDM cases. Five confirmatory factor analytic models were also examined with the same CDM response matrix. The structural equation modeling software program Mplus was used for all analyses. Results: Of the five exploratory factor analytic models evaluated, a three-factor model provided the best fit. Factor 1 loaded on three medicine cases, two obstetrics and gynecology cases, and two orthopedic surgery cases. Factor 2 corresponded to pediatrics, and the third factor loaded on psychiatry cases. Among the five confirmatory factor analysis models examined, the three- and four-factor lifespan period models and the five-factor discipline model provided the best fit. Conclusion: The results suggest that knowledge of broad disciplinary domains best accounts for performance on CDM cases. In test development, particular effort should be placed on developing CDM cases according to broad discipline and patient-age domains; CDM testlets should be assembled largely using the criteria of discipline and age.
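    The model-comparison step described above can be sketched in a few lines. The sketch below is a minimal illustration with synthetic continuous scores, not the MCCQE1 item matrix (which the authors fit in Mplus); the dimensions, loadings, and the use of scikit-learn's FactorAnalysis are assumptions for the demo:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic "case score" matrix driven by 3 latent factors
# (dimensions and loadings are made up for this illustration)
rng = np.random.default_rng(0)
n_examinees, n_items, k_true = 500, 12, 3
loadings = rng.normal(size=(n_items, k_true))
ability = rng.normal(size=(n_examinees, k_true))
scores = ability @ loadings.T + 0.5 * rng.normal(size=(n_examinees, n_items))

# Fit 1- to 5-factor exploratory models; score() returns the mean
# log-likelihood, so better-fitting models score higher
fit = {k: FactorAnalysis(n_components=k, random_state=0).fit(scores).score(scores)
       for k in range(1, 6)}
```

With three true factors, the jump in log-likelihood from the one-factor to the three-factor model is large, while further factors add little, which is the pattern the abstract reports.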

  20. The Beliefs about Paranoia Scale: Confirmatory factor analysis and tests of a metacognitive model of paranoia in a clinical sample.

    Science.gov (United States)

    Murphy, Elizabeth K; Tully, Sarah; Pyle, Melissa; Gumley, Andrew I; Kingdon, David; Schwannauer, Matthias; Turkington, Douglas; Morrison, Anthony P

    2017-02-01

    This study aimed to confirm the factor structure of the Beliefs about Paranoia Scale (BaPS), a self-report measure to assess metacognitive beliefs about paranoia, and to test hypotheses of a metacognitive model. We hypothesised that positive and negative beliefs about paranoia would be associated with severity of suspiciousness, and that the co-occurrence of positive and negative beliefs would be associated with increased suspiciousness. A total of 335 patients meeting criteria for a schizophrenia spectrum disorder completed the BaPS, the Positive and Negative Syndromes Scale (PANSS), and the Psychotic Symptom Rating Scales (PSYRATS). Confirmatory factor analysis verified that the three BaPS subscales (negative beliefs about paranoia, paranoia as a survival strategy, and normalizing beliefs) provided an adequate fit to the data. Ordinal regression showed that positive beliefs about paranoia as a survival strategy and negative beliefs were both associated with severity of suspiciousness. This was the first study to show that the co-occurrence of positive and negative beliefs was associated with increased suspiciousness. All hypotheses were confirmed, suggesting that a metacognitive approach has utility for the conceptualization of paranoia. Clinical implications suggest a role for metacognitive therapy, including strategies such as detached mindfulness and worry postponement.

  1. Study of Factors Preventing Children from Enrolment in Primary School in the Republic of Honduras: Analysis Using Structural Equation Modelling

    Science.gov (United States)

    Ashida, Akemi

    2015-01-01

    Studies have investigated factors that impede enrolment in Honduras. However, they have not analysed individual factors as a whole or identified the relationships among them. This study used longitudinal data for 1971 children who entered primary schools from 1986 to 2000, and employed structural equation modelling to examine the factors…

  2. Risk Factors of Work-related Upper Extremity Musculoskeletal Disorders in Male Shipyard Workers: Structural Equation Model Analysis

    Directory of Open Access Journals (Sweden)

    Byung-Chan Park

    2010-12-01

    Conclusion: The model in this study provides a better approximation of the complexity of the actual relationship between risk factors and work-related musculoskeletal disorders. Among the variables evaluated in this study, physical factors (work posture) had the strongest association with musculoskeletal disorders.

  3. Genome-wide analysis of auxin response factor gene family members in medicinal model plant Salvia miltiorrhiza

    Directory of Open Access Journals (Sweden)

    Zhichao Xu

    2016-06-01

    Full Text Available Auxin response factors (ARFs) can function as transcriptional activators or repressors to regulate the expression of auxin response genes by specifically binding to auxin response elements (AuxREs) during plant development. Based on a genome-wide strategy using the medicinal model plant Salvia miltiorrhiza, 25 S. miltiorrhiza ARF (SmARF) gene family members in four classes (classes Ia, IIa, IIb and III) were comprehensively analyzed to identify characteristics including gene structures, conserved domains, phylogenetic relationships and expression patterns. In a combined analysis of the phylogenetic tree, microRNA targets, and expression patterns of SmARFs in different organs, root tissues, and methyl jasmonate or indole-3-acetic acid treatment conditions, we screened for candidate SmARFs involved in various developmental processes of S. miltiorrhiza. Based on this analysis, we predicted that SmARF25, SmARF7, SmARF16 and SmARF20 are involved in flower, leaf, stem and root development, respectively. With further insight into the targets of miR160 and miR167, specific SmARF genes in S. miltiorrhiza might encode products that participate in biological processes as described for ARF genes in Arabidopsis. Our results provide a foundation for understanding the molecular basis and regulatory mechanisms of SmARFs in S. miltiorrhiza.

  4. The Relationship between Root Mean Square Error of Approximation and Model Misspecification in Confirmatory Factor Analysis Models

    Science.gov (United States)

    Savalei, Victoria

    2012-01-01

    The fit index root mean square error of approximation (RMSEA) is extremely popular in structural equation modeling. However, its behavior under different scenarios remains poorly understood. The present study generates continuous curves where possible to capture the full relationship between RMSEA and various "incidental parameters," such as…

  5. Model Analysis of the Factors Regulating Trends and Variability of Methane, Carbon Monoxide and OH: 1. Model Validation

    Science.gov (United States)

    Elshorbany, Y. F.; Strode, S.; Wang, J.; Duncan, B.

    2014-01-01

    Methane (CH4) is the second most important anthropogenic greenhouse gas (GHG). Its 100-year global warming potential (GWP) is 25 times that of carbon dioxide, and the 100-yr integrated GWP of CH4 is sensitive to changes in OH levels. Methane's atmospheric growth rate was estimated to be more than 10 ppb yr⁻¹ in 1998 but less than zero in 2001, 2004 and 2005 (Kirschke et al., 2013). Since 2006, CH4 has been increasing again; this phenomenon is not yet well understood. Oxidation by OH is the main loss process for CH4, thus affecting the oxidizing capacity of the atmosphere and contributing to the global ozone background. Current models typically use an annual cycle of offline OH fields to simulate CH4, and the OH fields implemented in these models are typically tuned so that simulated CH4 growth rates match those measured. For future and climate simulations, this OH tuning technique may not be suitable. In addition, running full-chemistry, multi-decadal CH4 simulations is a serious challenge and, due to computational intensity, currently almost impossible.
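    The loss process described above, oxidation of CH4 by OH, implies a chemical lifetime of roughly a decade, which a one-box budget model illustrates. This is a minimal sketch with illustrative rate-constant and OH values, not the full-chemistry model discussed in the abstract:

```python
import numpy as np

# One-box CH4 budget: dC/dt = E - k*[OH]*C (illustrative values only)
k = 3.0e-15           # cm^3 molec^-1 s^-1, rough effective CH4+OH rate constant
OH = 1.0e6            # molec cm^-3, typical global-mean OH level
tau = 1.0 / (k * OH)  # chemical lifetime against OH, in seconds
tau_years = tau / (365.25 * 24 * 3600)

# Euler integration toward the steady state C* = E * tau
E = 1.0e3             # arbitrary emission rate (mixing-ratio units per second)
C = 0.0
dt = 0.01 * tau       # small step relative to the lifetime, for stability
for _ in range(5000):
    C += dt * (E - k * OH * C)
```

The lifetime of about a decade emerging from these rough numbers matches the sensitivity of CH4 to OH that motivates the paper: tuning OH directly rescales tau and hence the simulated growth rate.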

  6. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xueqin [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); School of Social Development and Public Policy, Beijing Normal University, Beijing 100875 (China); Li, Ning [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); Yuan, Shuai, E-mail: syuan@nmemc.org.cn [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); Xu, Ning; Shi, Wenqin; Chen, Weibin [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China)

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism, and comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Considering the importance of, and deficiencies in, multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in this paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors, whereas the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest that the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays a foundation for the prediction and warning of other natural disasters. - Highlights: • A method to estimate the multidimensional joint return periods is presented. • The 2D function gives better fitting results at the lower tail of hazard factors. • The three-dimensional simulation has obvious advantages in extreme value fitting. • Joint return periods are closer to reality.
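    The joint return periods described above can be illustrated with a bivariate Frank copula. The sketch below uses hypothetical marginal probabilities and a hypothetical dependence parameter theta (the paper fits these to the dust-storm data); the mean interarrival time follows from 91 events in 54 years:

```python
import numpy as np

def frank_copula(u, v, theta):
    """Bivariate Frank copula C(u, v; theta), theta != 0."""
    num = np.expm1(-theta * u) * np.expm1(-theta * v)
    return -np.log1p(num / np.expm1(-theta)) / theta

# Marginal non-exceedance probabilities of two hazard factors
# (hypothetical values; theta would be estimated from the data)
u, v, theta = 0.9, 0.9, 2.0
C = frank_copula(u, v, theta)

mu = 54 / 91  # mean interarrival time of severe events, in years

# "OR" return period: at least one factor exceeds its threshold
T_or = mu / (1 - C)
# "AND" return period: both factors exceed their thresholds
T_and = mu / (1 - u - v + C)
```

The AND return period is always the longer of the two, which is why joint (multidimensional) analysis matters most for the rare, extreme events the abstract highlights.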

  7. Application of Parametric Models of Survival Analysis in Determining the Cancer Influencing Factors in Patients with Thyroid Nodules

    Directory of Open Access Journals (Sweden)

    J Yazdani Charati

    2015-06-01

    Full Text Available Background & aim: One of the most common clinical problems is thyroid nodule disease, characterized by one or more nodules in the thyroid that are usually benign; thyroid cancer is the most common endocrine cancer worldwide. This study aimed to determine the risk factors for cancer in patients with thyroid nodules in Mazandaran province, Iran, using parametric survival analysis. Methods: In this historical cohort study, 26,730 patients with thyroid nodules who were referred to health care centers from July 2002 to March 2008 were identified. Parametric log-normal and log-logistic models were compared with and without taking frailty into account; the criterion for comparing models was Akaike's criterion. All calculations were performed with the SPSS software and the significance level was set at 0.05. Results: The mean time to conversion of thyroid nodules to cancer was found to be 29.32 months. Using the Kaplan-Meier method, the one-year, five-year and ten-year rates of nodule conversion to cancer were calculated as 94.6, 88.6 and respectively. According to the log-rank test, a significant relationship was seen for age (p = 0.03), hypothyroidism (p = 0.01), bilateral nodules (p < 0.001), a multi-nodular goiter (p < 0.001), TSH hormone (p < 0.001), T4 hormone (p = 0.005), cholesterol (p = 0.03) and creatinine levels (p = 0.001). Based on Akaike's criterion, the log-normal model taking frailty into account best fit the data. Conclusion: Based on the log-normal model with frailty, thyroid nodule patients with abnormal TSH hormone are 6.55 times more likely to develop thyroid cancer than patients with normal TSH. The model also indicated that patients with heart palpitations are 5.52 times more likely to develop cancer than patients without.

  8. Analysis of risk factors and the establishment of a risk model for peripherally inserted central catheter thrombosis

    Institute of Scientific and Technical Information of China (English)

    Fang Hu; Ruo-Nan Hao; Jie Zhang; Zhi-Cheng Ma

    2016-01-01

    Objective: To investigate the main risk factors of peripherally inserted central catheter (PICC)-related upper extremity deep venous thrombosis and to establish a risk-predictive model for PICC-related upper extremity deep venous thrombosis. Methods: Patients with a PICC who were hospitalized between January 2014 and July 2015 were studied retrospectively; they were divided into a thrombosis group (n = 52), with patients who had a venous thrombosis complication after PICC, and a no-thrombosis group (n = 144), with patients without venous thrombosis. Variables that differed significantly between the two groups were entered into a multivariate logistic regression to establish the risk-predictive model. Results: PICC catheter history, catheter tip position, and blood vessel diameter were the key factors for thrombosis. The logistic regression predictive model was: Y = 3.338 + 2.040 × PICC catheter history + 1.964 × catheter tip position − 1.572 × vessel diameter. The area under the receiver operating characteristic curve for the model was 0.872, 95% CI (0.817–0.927). The cut-off point was 0.801, the sensitivity of the model was 0.832, and the specificity was 0.745. Conclusions: PICC catheterization history, catheter tip position, and blood vessel diameter were the key factors for thrombosis, and the logistic regression risk model based on these factors is reliable for predicting PICC-related upper extremity deep venous thrombosis.
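    As a quick illustration, a linear predictor of this form can be turned into a predicted thrombosis probability with the logistic transform. The coefficients below are as best recoverable from the abstract's garbled formula, and the 0/1 coding of the categorical predictors is an assumption for the demo:

```python
import math

# Linear predictor of the PICC thrombosis risk model (coefficients as
# recovered from the abstract; predictor coding is assumed: 0/1 flags for
# catheter history and tip position, vessel diameter in mm)
def risk_score(catheter_history, tip_position, vessel_diameter_mm):
    return (3.338 + 2.040 * catheter_history
                  + 1.964 * tip_position
                  - 1.572 * vessel_diameter_mm)

def probability(y):
    # Logistic transform maps the score to a probability in (0, 1)
    return 1.0 / (1.0 + math.exp(-y))

p_low = probability(risk_score(0, 0, 5.0))   # wide vessel, no prior PICC
p_high = probability(risk_score(1, 1, 3.0))  # prior PICC, unfavourable tip
```

The negative coefficient on vessel diameter matches the clinical intuition in the abstract: narrower vessels raise the predicted thrombosis risk.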

  9. A Case Study of Probit Model Analysis of Factors Affecting Consumption of Packed and Unpacked Milk in Turkey

    OpenAIRE

    Uzunoz, Meral; Akcay, Yasar

    2012-01-01

    This paper focuses on the effects of some sociodemographic factors on the decision of the consumer to purchase packed or unpacked fluid milk in Sivas, Turkey. The data were collected from 300 consumers using a face-to-face survey technique, and the sample size was determined using the possibility-sampling method. A probit model was used to analyze the socioeconomic factors affecting milk consumption of households. Four estimators (household size, income, milk preferences reason, and milk pric...

  10. Analysis of Influencing Factors of Water Footprint Based on the STIRPAT Model: Evidence from the Beijing Agricultural Sector

    OpenAIRE

    Chen Jin; Kai Huang; Yajuan Yu; Yue Zhang

    2016-01-01

    Beijing suffers from a severe water shortage. To find the key factors that impact the agricultural water footprint (WF) within Beijing to relieve the pressure on water resources, this study quantifies the agricultural WF within Beijing from 1980 to 2012 and examines the factors of population, urbanization level, GDP per capita, Engel coefficient, and total rural power using an extended stochastic impact by regression on population, affluence and technology (STIRPAT) model. Ridge regression is...
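    The extended STIRPAT model is log-linear, ln I = a + b ln P + c ln A + d ln T, and ridge regression stabilizes the coefficient estimates when the drivers are collinear. A minimal sketch on synthetic data (the drivers, true coefficients, and the use of scikit-learn's Ridge are assumptions for the demo, not the Beijing series):

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(42)
n = 200
# Synthetic logged drivers: population, affluence (GDP per capita), technology
lnP = rng.normal(5.0, 0.3, n)
lnA = rng.normal(2.0, 0.5, n)
lnT = rng.normal(1.0, 0.4, n)
# STIRPAT: ln I = a + b lnP + c lnA + d lnT + e (true b, c, d chosen for demo)
lnI = 0.5 + 1.2 * lnP + 0.8 * lnA - 0.4 * lnT + rng.normal(0, 0.05, n)

X = np.column_stack([lnP, lnA, lnT])
# Ridge shrinks coefficients to stabilise estimates under collinearity
model = Ridge(alpha=0.1).fit(X, lnI)
```

Because the model is log-log, each fitted coefficient reads directly as an elasticity: a 1% change in the driver implies a coefficient-sized percent change in the water footprint.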

  11. Using Hospital Anxiety and Depression Scale (HADS) on patients with epilepsy: Confirmatory factor analysis and Rasch models.

    Science.gov (United States)

    Lin, Chung-Ying; Pakpour, Amir H

    2017-02-01

    The problems of mood disorders are critical in people with epilepsy, so there is a need to validate a useful screening tool for this population. The Hospital Anxiety and Depression Scale (HADS) has been used in this population and shown to be a satisfactory screening tool; however, more evidence on its construct validity is needed. A total of 1041 people with epilepsy were recruited in this study, and each completed the HADS. Confirmatory factor analysis (CFA) and Rasch analysis were used to assess the construct validity of the HADS. In addition, internal consistency was tested using Cronbach's α, person separation reliability, and item separation reliability. Ordering of the response descriptors and differential item functioning (DIF) were examined using the Rasch models. The HADS showed that 55.3% of our participants had anxiety and 56.0% had depression based on its cutoffs. CFA and Rasch analyses both showed satisfactory construct validity of the HADS, and the internal consistency was acceptable (α = 0.82 in anxiety and 0.79 in depression; person separation reliability = 0.82 in anxiety and 0.73 in depression; item separation reliability = 0.98 in anxiety and 0.91 in depression). The difficulties of the four-point Likert scale used in the HADS increased monotonically, indicating no disordered response categories. No DIF was displayed in the HADS across male and female patients or across types of epilepsy. The HADS has promising psychometric properties for construct validity in people with epilepsy. Moreover, the additive item score is supported for calculating the cutoff. Copyright © 2016 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
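    Cronbach's α, used above for internal consistency, is computed from item and total-score variances as α = k/(k−1) · (1 − Σσ²_item / σ²_total). A minimal sketch with synthetic items driven by one latent trait (a stand-in for a HADS subscale, not the authors' data):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_subjects, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Synthetic 7-item subscale: each item = latent trait + noise
# (real HADS items are 0-3 Likert responses; continuous here for simplicity)
rng = np.random.default_rng(1)
latent = rng.normal(size=(1000, 1))
items = latent + 0.5 * rng.normal(size=(1000, 7))
alpha = cronbach_alpha(items)
```

With items sharing one strong latent trait, α comes out well above the conventional 0.7 threshold, the same qualitative pattern as the 0.82 and 0.79 reported for the HADS subscales.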

  12. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors.

    Science.gov (United States)

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism, and comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Considering the importance of, and deficiencies in, multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in this paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors, whereas the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest that the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays a foundation for the prediction and warning of other natural disasters.

  13. Factor analysis of multivariate data

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, A.A.; Mahadevan, R.

    A brief introduction to factor analysis is presented. A FORTRAN program, which can perform the Q-mode and R-mode factor analysis and the singular value decomposition of a given data matrix, is presented in Appendix B. This computer program uses...
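    R-mode loadings (on variables) and Q-mode loadings (on samples) both fall out of a single singular value decomposition of the data matrix, which is presumably why the program computes all three. A minimal numpy sketch of that relationship (not the FORTRAN program from Appendix B):

```python
import numpy as np

# R-mode factors load on variables (columns); Q-mode factors load on
# samples (rows). Both come from one SVD of the standardised matrix:
# X = U S V^T, R-mode loadings = V S, Q-mode loadings = U S.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 6))                    # 30 samples, 6 variables
Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
r_mode_loadings = Vt.T * s                      # variable loadings (R-mode)
q_mode_loadings = U * s                         # sample loadings (Q-mode)

# Sanity check: the SVD factors reconstruct the standardised matrix
X_rec = U @ np.diag(s) @ Vt
```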

  14. Factor Analysis of Intern Effectiveness

    Science.gov (United States)

    Womack, Sid T.; Hannah, Shellie Louise; Bell, Columbus David

    2012-01-01

    Four factors in teaching intern effectiveness, as measured by a Praxis III-similar instrument, were found among observational data of teaching interns during the 2010 spring semester. Those factors were lesson planning, teacher/student reflection, fairness & safe environment, and professionalism/efficacy. This factor analysis was as much of a…

  15. Kinetic Model Facilitates Analysis of Fibrin Generation and Its Modulation by Clotting Factors: Implications for Hemostasis-Enhancing Therapies

    Science.gov (United States)

    2014-01-01


  16. Factors influencing voluntary premarital medical examination in Zhejiang province, China: a culturally-tailored health behavioral model analysis

    Science.gov (United States)

    2014-01-01

    Background Premarital medical examination (PME) compliance rates have dropped drastically since the examination became voluntary in China in 2003. This study aimed to establish a prediction model as a theoretic framework for analyzing factors affecting PME compliance in Zhejiang province, China. Methods A culturally-tailored health behavioral model combining the Health Belief Model (HBM) and the Theory of Reasoned Action (TRA) was established to analyze data from a cross-sectional questionnaire survey (n = 2,572) collected using the intercept method at county marriage registration offices in 12 counties of Zhejiang in 2010. Participants were grouped into high (n = 1,795) and low (n = 777) social desirability responding tendency (SDRT) by the Marlowe-Crowne Social Desirability Scale (MCSDS). Structural equation modeling (SEM) was conducted to evaluate behavioral determinants and their influences on PME compliance in both the high and low SDRT groups. Results 69.8% of the participants had high SDRT and tended to over-report benefits and under-report barriers, which may affect the accuracy of predictions of PME participation. In the low SDRT group, the prediction model showed that the most influential factor in PME compliance was behavioral intention, with standardized structural coefficients (SSCs) of 0.75 (P social environmental factors. The verified prediction model was tested to be an effective theoretic framework for predicting factors affecting voluntary PME compliance. It should also be noted that internationally available behavioral theories and models need to be culturally tailored to adapt to particular populations. This study provides new insights for establishing a theoretical model to understand health behaviors in China. PMID:24972866

  17. Theoretical Analysis Model of the Adoption of Reactive and Proactive Eco-Innovation Strategies: the Influence of Contextual Factors Internal and External to Organizations

    Directory of Open Access Journals (Sweden)

    Marlete Beatriz Maçaneiro

    2014-01-01

    Full Text Available This article aims to provide research propositions on the relationship between internal and external contextual factors and strategies for reactive and proactive eco-innovation, defining the theoretical analysis model from these propositions. This study is therefore characterized as a theoretical essay, adopting the methodological procedure of literature review through an analysis of national and international articles on the subject of eco-innovation strategy. As a result, the external and internal factors that affect the adoption of eco-innovation strategies were defined and propositions were made for their relationship. With this theoretical approach and the establishment of the factors, it was possible to design the theoretical model to analyze the impact of contextual factors on the adoption of eco-innovation strategies.

  18. Shell model and spectroscopic factors

    Energy Technology Data Exchange (ETDEWEB)

    Poves, P. [Madrid Univ. Autonoma and IFT, UAM/CSIC, E-28049 (Spain)

    2007-07-01

    In these lectures, I introduce the notion of spectroscopic factor in the shell model context. A brief review is given of the present status of the large scale applications of the Interacting Shell Model. The spectroscopic factors and the spectroscopic strength are discussed for nuclei in the vicinity of magic closures and for deformed nuclei. (author)

  19. Choice Model and Influencing Factor Analysis of Travel Mode for Migrant Workers: Case Study in Xi’an, China

    Directory of Open Access Journals (Sweden)

    Hong Chen

    2015-01-01

    Full Text Available Based on the basic theory and methods of the disaggregate choice model, the factors influencing travel mode choice for migrant workers are analyzed, using 1366 data samples of Xi'an migrant workers. Walking, bus, subway, and taxi are taken as the alternative travel modes for migrant workers, and a multinomial logit (MNL) model of travel mode for migrant workers is set up. The validity of the model is verified by the hit rate, and the hit rates of the four travel modes are all greater than 80%. Finally, the influence of the different factors affecting travel mode choice is analyzed in detail, and the elasticity of each factor is analyzed with elasticity theory. Influencing factors such as age, education level, and monthly gross income have a significant impact on travel mode choice for migrant workers. The elasticity values of education degree are greater than 1, indicating that its effect on travel mode choice is elastic, while the elasticity values of gender, industry distribution, and travel purpose are less than 1, indicating that these factors are inelastic with respect to travel mode choice.
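    The MNL estimation and hit-rate check described above can be sketched with a generic multinomial logistic classifier. Everything below is synthetic; the feature set, class structure, and use of scikit-learn are assumptions for the demo, not the Xi'an survey data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for the survey: 4 travel modes (walk, bus, subway, taxi)
# predicted from traveller attributes (age, income, education, ...)
X, y = make_classification(n_samples=1366, n_features=6, n_informative=5,
                           n_redundant=0, n_classes=4, n_clusters_per_class=1,
                           class_sep=2.0, random_state=0)

# Multinomial logit: lbfgs maximises the softmax (multinomial) likelihood
mnl = LogisticRegression(max_iter=1000).fit(X, y)

# "Hit rate" per mode = share of that mode's trips predicted correctly
pred = mnl.predict(X)
hit_rates = {mode: (pred[y == mode] == mode).mean() for mode in np.unique(y)}
```

Elasticities would follow from the fitted coefficients by evaluating how predicted choice probabilities respond to a percentage change in each continuous attribute.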

  20. Integrating Factor Analysis and a Transgenic Mouse Model to Reveal a Peripheral Blood Predictor of Breast Tumors

    Directory of Open Access Journals (Sweden)

    Nevins Joseph R

    2011-07-01

    Full Text Available Abstract Background Transgenic mouse tumor models have the advantage of facilitating controlled in vivo oncogenic perturbations in a common genetic background. This provides an idealized context for generating transcriptome-based diagnostic models while minimizing the inherent noisiness of high-throughput technologies. However, the question remains whether models developed in such a setting are suitable prototypes for useful human diagnostics. We show that latent factor modeling of the peripheral blood transcriptome in a mouse model of breast cancer provides the basis for using computational methods to link a mouse model to a prototype human diagnostic based on a common underlying biological response to the presence of a tumor. Methods We used gene expression data from mouse peripheral blood cell (PBC) samples to identify significantly differentially expressed genes using supervised classification and sparse ANOVA. We employed these transcriptome data as the starting point for developing a breast tumor predictor from human peripheral blood mononuclear cells (PBMCs) using a factor modeling approach. Results The predictor distinguished breast cancer patients from healthy individuals in a cohort of patients independent from that used to build the factors and train the model, with 89% sensitivity, 100% specificity and an area under the curve (AUC) of 0.97, using Youden's J-statistic to objectively select the model's classification threshold. Both permutation testing of the model and evaluating the model strategy by swapping the training and validation sets highlight its stability. Conclusions We describe a human breast tumor predictor based on the gene expression of mouse PBCs. This strategy overcomes many of the limitations of earlier studies by using the model system to reduce noise and identify transcripts associated with the presence of a breast tumor over other potentially confounding factors.
Our results serve as a proof-of-concept for using an

  1. Analysis of Total Factor Efficiency of Water Resource and Energy in China: A Study Based on DEA-SBM Model

    Directory of Open Access Journals (Sweden)

    Weixin Yang

    2017-07-01

    Full Text Available One of the serious issues that China faces during its fast economic development is the low input–output efficiency of water and energy resources and growing water pollution; under China's current economic development model, economic growth still requires large inputs of water and energy resources. This paper focuses on the total factor efficiency of water and energy resources in each province of China. We treat the undesirable outputs as outputs in the DEA-SBM model, instead of as inputs as in previous studies, and design a new MATLAB program to obtain optimization solutions of multi-variable constrained nonlinear functions, in order to evaluate the Total Factor Efficiency of Water resource (TFEW) and the Total Factor Efficiency of Energy (TFEE) in China accurately. Using this method, the paper analyzes the TFEW and TFEE in China from 2003 to 2014 by economic zones and typical provinces and provides corresponding policy recommendations.
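    As a simplified illustration of total-factor efficiency evaluation, the sketch below solves the input-oriented CCR envelopment linear program, a simpler radial cousin of the SBM model the paper uses (SBM additionally handles slacks and undesirable outputs). The DMU data are made up:

```python
import numpy as np
from scipy.optimize import linprog

# Toy decision-making units: two inputs (e.g. water, energy), one output (GDP)
X = np.array([[10.0, 20.0], [8.0, 15.0], [12.0, 30.0], [6.0, 12.0]])  # inputs
Y = np.array([[100.0], [90.0], [110.0], [80.0]])                      # outputs
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o):
    # Variables: [theta, lambda_1..lambda_n]; minimise theta
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[[o]].T, X.T])
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (1 + n), method="highs")
    return res.fun

effs = [ccr_efficiency(o) for o in range(n)]  # efficiency score per DMU
```

Each score is at most 1, and at least one DMU sits on the efficient frontier with a score of exactly 1; inefficient units get the proportional input reduction that would move them onto it.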

  2. The cyclical component factor model

    DEFF Research Database (Denmark)

    Dahl, Christian Møller; Hansen, Henrik; Smidt, John

    Forecasting using factor models based on large data sets has received ample attention due to the models' ability to increase forecast accuracy with respect to a range of key macroeconomic variables in the US and the UK. However, forecasts based on such factor models do not uniformly outperform the simple autoregressive model when using data from other countries. In this paper we propose to estimate the factors based on the pure cyclical components of the series entering the large data set. Monte Carlo evidence and an empirical illustration using Danish data show that this procedure can indeed...
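    The two-step idea, extract each series' cyclical component and then estimate factors from the filtered panel, can be sketched as follows. Linear detrending stands in for the paper's cyclical-component extraction, and the first principal component stands in for the factor estimate; the panel is synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)
T, N = 200, 10
t = np.arange(T)
cycle = np.sin(2 * np.pi * t / 40)                # common cyclical component
trends = np.outer(t, rng.uniform(0.01, 0.05, N))  # idiosyncratic linear trends
loadings = rng.uniform(0.5, 1.5, N)
panel = trends + np.outer(cycle, loadings) + 0.1 * rng.normal(size=(T, N))

# Step 1: strip each series to its cyclical part (linear detrending here)
detrended = np.empty_like(panel)
for i in range(N):
    coefs = np.polyfit(t, panel[:, i], 1)
    detrended[:, i] = panel[:, i] - np.polyval(coefs, t)

# Step 2: estimate the common factor as the first principal component
detrended -= detrended.mean(axis=0)
U, sv, Vt = np.linalg.svd(detrended, full_matrices=False)
factor = U[:, 0] * sv[0]

corr = np.corrcoef(factor, cycle)[0, 1]  # sign of the PC is arbitrary
```

Removing the trends before factor extraction is what keeps the estimated factor aligned with the common cycle rather than with the heterogeneous trends.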

  3. A model combining spectrum standardization and dominant factor based partial least square method for carbon analysis in coal using laser-induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Li, Xiongwei; Wang, Zhe, E-mail: zhewang@tsinghua.edu.cn; Fu, Yangting; Li, Zheng; Ni, Weidou

    2014-09-01

    Quantitative measurement of carbon content in coal is essentially important for coal property analysis. However, quantitative measurement of carbon content in coal using laser-induced breakdown spectroscopy (LIBS) suffers from low measurement accuracy due to measurement uncertainty as well as matrix effects. In this study, our previously proposed spectrum standardization method and dominant-factor-based partial least squares (PLS) method were combined to improve the measurement accuracy of carbon content in coal using LIBS. The combination model uses the spectrum standardization method to accurately calculate the dominant carbon concentration as the dominant factor, and then applies PLS with full-spectrum information to correct residual errors. The combination model was applied to measure the carbon content in 24 bituminous coal samples. Results demonstrated that the combination model can further improve measurement accuracy compared with the spectrum standardization model and the dominant-factor-based PLS model in which the dominant factor was calculated using the traditional univariate method. The coefficient of determination, root-mean-square error of prediction, and average relative error for the combination model were 0.99, 1.63%, and 1.82%, respectively; the values for the spectrum standardization model were 0.90, 2.24%, and 2.75%, whereas those for the dominant-factor-based PLS model were 0.99, 2.66%, and 3.64%. The results indicate that LIBS has great potential for coal analysis. - Highlights: • The spectrum standardization method is utilized to establish a more accurate dominant factor model. • The PLS algorithm is applied to further compensate for residual errors using the entire spectrum information. • Measurement accuracy is improved.

  4. Bayesian Constrained-Model Selection for Factor Analytic Modeling

    OpenAIRE

    Peeters, Carel F.W.

    2016-01-01

    My dissertation revolves around Bayesian approaches towards constrained statistical inference in the factor analysis (FA) model. Two interconnected types of restricted-model selection are considered. These types have a natural connection to selection problems in the exploratory FA (EFA) and confirmatory FA (CFA) model and are termed Type I and Type II model selection. Type I constrained-model selection is taken to mean the determination of the appropriate dimensionality of a model. This type ...

  5. Impact of DEM Resolution and Spatial Scale: Analysis of Influence Factors and Parameters on Physically Based Distributed Model

    Directory of Open Access Journals (Sweden)

    Hanchen Zhang

    2016-01-01

    Physically based distributed hydrological models are used to describe small-scale hydrological information in detail. However, the sensitivity of such models to spatially varied parameters and inputs limits their accuracy in application. In this paper, relevant influence factors and sensitive parameters were analyzed to address this problem. First, a set of digital elevation model (DEM) resolutions and channel thresholds were generated to extract the hydrological influence factors. Second, a numerical relationship between sensitive parameters and influence factors was established to define parameters reasonably. Next, the topographic index (TI) was computed to study the similarity. Finally, simulation results were analyzed in two different ways: (1) observing how the influence factors and sensitive parameters change with DEM resolution and channel threshold, and (2) comparing the simulation accuracy of the nested catchment, particularly in the subcatchments and interior grids. Increasing the grid size from 250 m to 1000 m, the TI increased from 9.08 to 11.16 and the Nash-Sutcliffe efficiency (NSE) decreased from 0.77 to 0.75. Utilizing the parameters calculated by the established relationship, the simulation results show the same NSE at the outlet and a better NSE in the simple subcatchment than the calculated interior grids.
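    The topographic index used above for similarity analysis is TI = ln(a / tan β), with a the upslope contributing area per unit contour length and β the local slope. A minimal sketch with illustrative grid values, not the study's data:

```python
# Topographic index TI = ln(a / tan(beta)) per grid cell.
import numpy as np

def topographic_index(contrib_area, slope_rad, eps=1e-6):
    """TI per grid cell; eps keeps flat cells (tan(beta) ~ 0) finite."""
    return np.log(contrib_area / (np.tan(slope_rad) + eps))

# Coarsening the DEM raises the contributing area per cell, which is one
# reason mean TI grows with grid size (9.08 at 250 m vs 11.16 at 1000 m above).
area_250 = np.array([500.0, 1200.0, 8000.0])   # m^2 per unit contour length
slope = np.radians([5.0, 3.0, 1.0])
ti_250 = topographic_index(area_250, slope)

area_1000 = area_250 * 16                       # 4x coarser cells in each direction
ti_1000 = topographic_index(area_1000, slope)
print(ti_250.mean(), ti_1000.mean())            # mean TI increases with cell size
```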

  6. Inclusion of climatic and touristic factors in the analysis and modelling of the municipal water demand in a Mediterranean region

    Science.gov (United States)

    Toth, Elena; Bragalli, Cristiana; Neri, Mattia

    2017-04-01

    In Mediterranean regions, inherently affected by water scarcity, the gap between water availability and demand may further increase in the near future due to both climatic and anthropogenic drivers. In particular, the high degree of urbanization and the concentration of population and activities in coastal areas often severely impact water availability for the residential sector as well. It is therefore crucial to analyse the importance of both climatic and touristic factors as drivers of water demand in such areas, in order to better understand and model the expected consumption and improve water management policies and practices. The study presents an analysis of a large number of municipalities, covering almost the whole Romagna region in Northern Italy, one of the most economically developed areas in Europe, characterized by an extremely profitable tourist industry, especially in the coastal cities. For this region it is therefore extremely important to assess the significance of the drivers that may influence demand in different periods of the year: climatic factors (rainfall depths and occurrence, temperature averages and extremes), but also the presence of tourists, both in official tourist accommodation structures and in holiday homes (the latter being very difficult to estimate). Analyses of the Italian water industry at the seasonal or monthly time scale have so far been extremely limited in the literature by the scarce availability of data on water demands, which are made public only as annual volumes. All the study municipalities are supplied by the same water company, which provided monthly consumption volumes at the main inlet points of the entire distribution network for a period of 7 years (2009-2015). For the same period, precipitation and temperature data have been collected and summarised in indexes representing monthly averages, days of occurrence and over-threshold values

  7. Risk factors and visual fatigue of baggage X-ray security screeners: a structural equation modelling analysis.

    Science.gov (United States)

    Yu, Rui-Feng; Yang, Lin-Dong; Wu, Xin

    2016-06-03

    This study identified the risk factors influencing visual fatigue in baggage X-ray security screeners and estimated the strength of correlations between those factors and visual fatigue using a structural equation modelling approach. Two hundred and five X-ray security screeners participated in a questionnaire survey. The results showed that satisfaction with the VDT's physical features and with the work environment conditions was negatively correlated with the intensity of visual fatigue, whereas job stress and job burnout had direct positive influences. The path coefficient between the image quality of the VDT and visual fatigue was not significant. The total effects of job burnout, job stress, the VDT's physical features, and the work environment conditions on visual fatigue were 0.471, 0.469, -0.268 and -0.251, respectively. These findings indicate that both extrinsic factors relating to the VDT and workplace environment and psychological factors including job burnout and job stress should be considered in the workplace design and work organisation of security screening tasks to reduce screeners' visual fatigue. Practitioner Summary: This study identified the risk factors influencing visual fatigue in baggage X-ray security screeners and estimated the strength of correlations between those factors and visual fatigue. The findings are of great importance to the workplace design and work organisation of security screening tasks to reduce screeners' visual fatigue.
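    In a recursive path model of this kind, a total effect combines the direct effect with all mediated paths; in matrix form, total = (I - B)^-1 - I, where B[i, j] is the direct path from variable j to variable i. A minimal illustration with invented coefficients, not the study's estimates:

```python
# Total effects from a path-coefficient matrix: total = (I - B)^-1 - I.
# Illustrative values only (not the fitted screener model).
import numpy as np

labels = ["job_stress", "job_burnout", "visual_fatigue"]
B = np.zeros((3, 3))
B[1, 0] = 0.50   # job_stress -> job_burnout
B[2, 0] = 0.25   # job_stress -> visual_fatigue (direct)
B[2, 1] = 0.40   # job_burnout -> visual_fatigue

total = np.linalg.inv(np.eye(3) - B) - np.eye(3)

# total effect of stress on fatigue: direct 0.25 + indirect 0.50 * 0.40 = 0.45
print(total[2, 0])
```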

  8. Application of Parametric Models of Survival Analysis in Determining the Cancer Influencing Factors in Patients with Thyroid Nodules

    OpenAIRE

    J.Yazdani Charati; O. Akha; AR Baghestani; F. Khosravi; Y Kavyani Charati

    2015-01-01

    Background & aim: Thyroid nodules are among the most common clinical problems; they are characterized by one or more nodules in the thyroid gland and are usually benign. Thyroid cancer is the most common endocrine cancer worldwide. This study aimed to determine the risk factors for cancer in patients with thyroid nodules in Mazandaran Province, Iran, using parametric survival analysis. Methods: In the present historical cohort study, 26,730 patients w...

  9. Analysis of Influencing Factors of Water Footprint Based on the STIRPAT Model: Evidence from the Beijing Agricultural Sector

    Directory of Open Access Journals (Sweden)

    Chen Jin

    2016-11-01

    Beijing suffers from a severe water shortage. To find the key factors that affect the agricultural water footprint (WF) within Beijing and so relieve the pressure on water resources, this study quantifies the agricultural WF within Beijing from 1980 to 2012 and examines the factors of population, urbanization level, GDP per capita, Engel coefficient, and total rural power using an extended stochastic impacts by regression on population, affluence and technology (STIRPAT) model. Ridge regression is employed to fit the extended STIRPAT model. The empirical results reveal that the Engel coefficient, defined as the share of food expenses in total personal consumption expenditure, has the largest positive impact on the increase in the agricultural WF, followed by urbanization. In contrast, total rural power, population, and GDP per capita can decrease the agricultural WF. Finally, policy recommendations on technological development, adjustment of the agricultural plantation structure, and virtual water imports are provided to cope with water shortages.
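    A rough sketch of the estimation strategy named above: the STIRPAT model is log-linear, ln(I) = a + b·ln(P) + c·ln(A) + ..., and its drivers are typically collinear, which motivates fitting it with ridge regression. Data, drivers and coefficients below are synthetic and purely illustrative:

```python
# Fitting a log-linear STIRPAT-style specification with ridge regression.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(42)
n_years = 33                                  # e.g. 1980-2012

ln_pop = np.linspace(7.0, 7.4, n_years) + rng.normal(0, 0.01, n_years)
ln_gdp = 2.0 * ln_pop + rng.normal(0, 0.05, n_years)   # collinear with population
ln_engel = np.linspace(-0.5, -1.2, n_years) + rng.normal(0, 0.02, n_years)
X = np.column_stack([ln_pop, ln_gdp, ln_engel])

# Synthetic "water footprint" generated from the drivers.
ln_wf = 0.3 * ln_pop - 0.2 * ln_gdp + 0.8 * ln_engel + rng.normal(0, 0.01, n_years)

model = Ridge(alpha=1.0).fit(X, ln_wf)
print(model.coef_)   # elasticities of the water footprint w.r.t. each driver
```

Ridge shrinks the coefficients toward zero, trading a little bias for stability when the log-drivers are nearly collinear, as population and GDP series usually are.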

  10. A comparative analysis of transcription factor binding models learned from PBM, HT-SELEX and ChIP data.

    Science.gov (United States)

    Orenstein, Yaron; Shamir, Ron

    2014-04-01

    Understanding gene regulation is a key challenge in today's biology. The new technologies of protein-binding microarrays (PBMs) and high-throughput SELEX (HT-SELEX) allow measurement of the binding intensities of one transcription factor (TF) to numerous synthetic double-stranded DNA sequences in a single experiment. Recently, Jolma et al. reported the results of 547 HT-SELEX experiments covering human and mouse TFs. Because 162 of these TFs were also covered by PBM technology, for the first time, a large-scale comparison between implementations of these two in vitro technologies is possible. Here we assessed the similarities and differences between binding models, represented as position weight matrices, inferred from PBM and HT-SELEX, and also measured how well these models predict in vivo binding. Our results show that HT-SELEX- and PBM-derived models agree for most TFs. For some TFs, the HT-SELEX-derived models are longer versions of the PBM-derived models, whereas for other TFs, the HT-SELEX models match the secondary PBM-derived models. Remarkably, PBM-based 8-mer ranking is more accurate than that of HT-SELEX, but models derived from HT-SELEX predict in vivo binding better. In addition, we reveal several biases in HT-SELEX data including nucleotide frequency bias, enrichment of C-rich k-mers and oligos and underrepresentation of palindromes.
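    The binding models compared above are position weight matrices (PWMs), which score a candidate site by summing per-position log-odds. A toy example — the matrix values are invented, not derived from PBM or HT-SELEX data:

```python
# Scoring DNA sites with a toy 4-position PWM (log-odds vs. background).
import numpy as np

bases = {"A": 0, "C": 1, "G": 2, "T": 3}
# rows: A, C, G, T; columns: motif positions
pwm = np.array([
    [ 1.2, -0.5, -1.0,  0.8],
    [-0.7,  1.0, -0.5, -0.9],
    [-0.6, -0.8,  1.5, -0.4],
    [-0.9, -0.3, -0.7,  0.5],
])

def score_site(site):
    return sum(pwm[bases[b], i] for i, b in enumerate(site))

def best_site(sequence):
    """Highest-scoring window, as used to rank candidate binding sites."""
    w = pwm.shape[1]
    windows = [sequence[i:i + w] for i in range(len(sequence) - w + 1)]
    return max(windows, key=score_site)

print(score_site("ACGA"), best_site("TTACGATT"))  # -> 4.5 ACGA
```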

  11. The beliefs about paranoia scale: confirmatory factor analysis and tests of a metacognitive model of paranoia in a clinical sample

    OpenAIRE

    Murphy, Elizabeth K.; Tully, Sarah; Pyle, Melissa; Gumley, Andrew I.; Kingdon, David; Schwannauer, Matthias; Turkington, Douglas; Morrison, Anthony P.

    2017-01-01

    This study aimed to confirm the factor structure of the Beliefs about Paranoia Scale (BaPS), a self-report measure to assess metacognitive beliefs about paranoia, and to test hypotheses of a metacognitive model. We hypothesised that positive and negative beliefs about paranoia would be associated with severity of suspiciousness, and that the co-occurrence of positive and negative beliefs would be associated with increased suspiciousness. A total of 335 patients meeting criteria for a schizoph...

  12. Stepwise Variable Selection in Factor Analysis.

    Science.gov (United States)

    Kano, Yutaka; Harada, Akira

    2000-01-01

    Takes several goodness-of-fit statistics as measures of variable selection and develops backward elimination and forward selection procedures in exploratory factor analysis. A newly developed variable selection program, SEFA, can print several fit measures for a current model and models obtained by removing an internal variable or adding an…

  13. A Case Study of Probit Model Analysis of Factors Affecting Consumption of Packed and Unpacked Milk in Turkey

    Directory of Open Access Journals (Sweden)

    Meral Uzunoz

    2012-01-01

    This paper focuses on the effects of sociodemographic factors on the consumer's decision to purchase packed or unpacked fluid milk in Sivas, Turkey. The data were collected from 300 consumers using a face-to-face survey technique. The sample size was determined using a probability-sampling method. A probit model was used to analyze the socioeconomic factors affecting the milk consumption of households. Four estimators (household size, income, reason for milk preference, and milk price) in the probit model were found statistically significant. According to the empirical results, consumers with smaller households and higher income levels tend to consume packed milk. Our findings suggest that consumers who were sensitive to price were less likely to consume packed milk and believed that packed milk is expensive compared with unpacked milk. Milk price was thus an effective factor in packed and unpacked milk consumption behavior. The majority of consumers read the contents of packed fluid milk and are influenced by food safety in their shopping preferences.

  14. Longitudinal analysis of osteogenic and angiogenic signaling factors in healing models mimicking atrophic and hypertrophic non-unions in rats.

    Directory of Open Access Journals (Sweden)

    Susann Minkwitz

    Impaired bone healing can have devastating consequences for the patient. Clinically relevant animal models are necessary to understand the pathology of impaired bone healing. In this study, two impaired healing models, a hypertrophic and an atrophic non-union, were compared to physiological bone healing in rats. The aim was to provide detailed information about differences in gene expression, vascularization and histology during the healing process. The change from a closed fracture (healing control group) to an open osteotomy (hypertrophy group) led to prolonged healing with reduced mineralized bridging after 42 days. RT-PCR data revealed higher gene expression of most tested osteogenic and angiogenic factors in the hypertrophy group at day 14. After 42 days a significant reduction of gene expression was seen for Bmp4 and Bambi in this group. The inhibition of angiogenesis by Fumagillin (atrophy group) decreased the formation of new blood vessels and led to a non-healing situation with diminished chondrogenesis. RT-PCR results showed an attempt to overcome the early perturbation by significant upregulation of the angiogenic regulators Vegfa, Angiopoietin 2 and Fgf1 at day 7 and a further continuous increase of Fgf1, -2 and Angiopoietin 2 over time. However, µCT angiograms showed incomplete recovery after 42 days. Furthermore, lower expression values were detected for the Bmps at day 14 and 21. The Bmp antagonists Dan and Twsg1 tended to be more highly expressed in the atrophy group at day 42. In conclusion, the investigated animal models are suitable to mimic human fracture healing complications and can be used for longitudinal studies. Analyzing osteogenic and angiogenic signaling patterns, clear changes in expression were identified between these three healing models, revealing the importance of a coordinated interplay of different factors for successful bone healing.

  15. DeFCoM: analysis and modeling of transcription factor binding sites using a motif-centric genomic footprinter.

    Science.gov (United States)

    Quach, Bryan; Furey, Terrence S

    2017-04-01

    Identifying the locations of transcription factor binding sites is critical for understanding how gene transcription is regulated across different cell types and conditions. Chromatin accessibility experiments such as DNaseI sequencing (DNase-seq) and Assay for Transposase Accessible Chromatin sequencing (ATAC-seq) produce genome-wide data that include distinct 'footprint' patterns at binding sites. Nearly all existing computational methods to detect footprints from these data assume that footprint signals are highly homogeneous across footprint sites. Additionally, a comprehensive and systematic comparison of footprinting methods for specifically identifying which motif sites for a specific factor are bound has not been performed. Using DNase-seq data from the ENCODE project, we show that a large degree of previously uncharacterized site-to-site variability exists in footprint signal across motif sites for a transcription factor. To model this heterogeneity in the data, we introduce a novel, supervised learning footprinter called Detecting Footprints Containing Motifs (DeFCoM). We compare DeFCoM to nine existing methods using evaluation sets from four human cell-lines and eighteen transcription factors and show that DeFCoM outperforms current methods in determining bound and unbound motif sites. We also analyze the impact of several biological and technical factors on the quality of footprint predictions to highlight important considerations when conducting footprint analyses and assessing the performance of footprint prediction methods. Finally, we show that DeFCoM can detect footprints using ATAC-seq data with similar accuracy as when using DNase-seq data. Python code available at https://bitbucket.org/bryancquach/defcom. bquach@email.unc.edu or tsfurey@email.unc.edu. Supplementary data are available at Bioinformatics online.

  16. Using factor analysis scales of generalized amino acid information for prediction and characteristic analysis of β-turns in proteins based on a support vector machine model

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    This paper offers a new combined approach to predict and characterize β-turns in proteins. The approach includes two key steps, i.e., how to represent the features of β-turns and how to develop a predictor. The first step is to use factor analysis scales of generalized amino acid information (FASGAI), involving hydrophobicity, alpha and turn propensities, bulky properties, compositional characteristics, local flexibility and electronic properties, to represent the features of β-turns in proteins. The second step is to construct a support vector machine (SVM) predictor of β-turns based on 426 training proteins by a seven-fold cross-validation test. The SVM predictor then predicted β-turns on 547 and 823 proteins by an external validation test, separately. Our results are compared with the previously best known β-turn prediction methods and are shown to give comparable performance. Most significantly, the SVM model provides some information related to β-turn residues in proteins. The results demonstrate that the present combined approach may be used in the prediction of protein structures.

  17. Modeling contextual effects using individual-level data and without aggregation: an illustration of multilevel factor analysis (MLFA) with collective efficacy.

    Science.gov (United States)

    Dunn, Erin C; Masyn, Katherine E; Johnston, William R; Subramanian, S V

    2015-01-01

    Population health scientists increasingly study how contextual-level attributes affect individual health. A major challenge in this domain relates to measurement, i.e., how best to measure and create variables that capture characteristics of individuals and their embedded contexts. This paper presents an illustration of multilevel factor analysis (MLFA), an analytic method that enables researchers to model contextual effects using individual-level data without using derived variables. MLFA uses the shared variance in sets of observed items among individuals within the same context to estimate a measurement model for latent constructs; it does this by decomposing the total sample variance-covariance matrix into within-group (e.g., individual-level) and between-group (e.g., contextual-level) matrices and simultaneously modeling distinct latent factor structures at each level. We illustrate the MLFA method using items capturing collective efficacy, which were self-reported by 2,599 adults in 65 census tracts from the Los Angeles Family and Neighborhood Survey (LAFANS). MLFA identified two latent factors at the individual level and one factor at the neighborhood level. Indicators of collective efficacy performed differently at each level. The ability of MLFA to identify different latent factor structures at each level underscores the utility of this analytic tool to model and identify attributes of contexts relevant to health.

  18. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2005-04-28

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis

  19. A nonlinearized multivariate dominant factor-based partial least squares (PLS) model for coal analysis by using laser-induced breakdown spectroscopy.

    Science.gov (United States)

    Feng, Jie; Wang, Zhe; Li, Lizhi; Li, Zheng; Ni, Weidou

    2013-03-01

    A nonlinearized multivariate dominant factor-based partial least-squares (PLS) model was applied to coal elemental concentration measurement. For C concentration determination in bituminous coal, the intensities of multiple characteristic lines of the main elements in coal were applied to construct a comprehensive dominant factor that would provide main concentration results. A secondary PLS thereafter applied would further correct the model results by using the entire spectral information. In the dominant factor extraction, nonlinear transformation of line intensities (based on physical mechanisms) was embedded in the linear PLS to describe nonlinear self-absorption and inter-element interference more effectively and accurately. According to the empirical expression of self-absorption and Taylor expansion, nonlinear transformations of atomic and ionic line intensities of C were utilized to model self-absorption. Then, the line intensities of other elements, O and N, were taken into account for inter-element interference, considering the possible recombination of C with O and N particles. The specialty of coal analysis by using laser-induced breakdown spectroscopy (LIBS) was also discussed and considered in the multivariate dominant factor construction. The proposed model achieved a much better prediction performance than conventional PLS. Compared with our previous, already improved dominant factor-based PLS model, the present PLS model obtained the same calibration quality while decreasing the root mean square error of prediction (RMSEP) from 4.47 to 3.77%. Furthermore, with the leave-one-out cross-validation and L-curve methods, which avoid the overfitting issue in determining the number of principal components instead of minimum RMSEP criteria, the present PLS model also showed better performance for different splits of calibration and prediction samples, proving the robustness of the present PLS model.

  20. A meta-analytic review of the relationships between the five-factor model and DSM-IV-TR personality disorders: a facet level analysis.

    Science.gov (United States)

    Samuel, Douglas B; Widiger, Thomas A

    2008-12-01

    Theory and research have suggested that the personality disorders contained within the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR) can be understood as maladaptive variants of the personality traits included within the five-factor model (FFM). The current meta-analysis of FFM personality disorder research both replicated and extended the 2004 work of Saulsman and Page (The five-factor model and personality disorder empirical literature: A meta-analytic review. Clinical Psychology Review, 23, 1055-1085) through a facet level analysis that provides a more specific and nuanced description of each DSM-IV-TR personality disorder. The empirical FFM profiles generated for each personality disorder were generally congruent at the facet level with hypothesized FFM translations of the DSM-IV-TR personality disorders. However, notable exceptions to the hypotheses did occur and even some findings that were consistent with FFM theory could be said to be instrument specific.

  1. An innovation resistance factor model

    Directory of Open Access Journals (Sweden)

    Siti Salwa Mohd Ishak

    2016-09-01

    The process and implementation strategy of information technology in construction are generally considered through the limiting prism of theoretical contexts generated from innovation diffusion and acceptance. This research argues that more attention should be given to understanding the positive effects of resistance. The study develops a theoretical framing for the Integrated Resistance Factor Model (IRFM). The framing uses a combination of diffusion of innovation theory, the technology acceptance model and a social network perspective. The model is tested to identify the most significant resistance factors using the Partial Least Squares (PLS) technique. All constructs proposed in the model are found to be significant, valid and consistent with the theoretical framework. IRFM is shown to be an effective and appropriate model of user resistance factors. The most critical factors influencing technology resistance in the online project information management system (OPIMS) context are: support from leaders and peers, complexity of the technology, compatibility with key work practices, and pre-trial of the technology before it is actually deployed. The study provides a new model for further research in technology innovation specific to the construction industry.

  2. ANALYSIS OF THE DYNAMICS OF AGGREGATIVE TRANSFER FACTORS OF CESIUM-137 TO MUSHROOMS AFTER THE CHERNOBYL ACCIDENT AS A BASIS FOR CONSTRUCTING PREDICTIVE MODELS

    Directory of Open Access Journals (Sweden)

    K. V. Shilova

    2014-01-01

    The present paper analyses the available data on concentration levels and aggregative transfer factors (TFag) of 137Cs from soil to different species of mushrooms growing in the contaminated areas of the Bryansk region; the analysis is used to improve predictive models for estimating the expected concentration levels and TFag values, and to plan further research.

  3. Models of Economic Analysis

    OpenAIRE

    Adrian Ioana; Tiberiu Socaciu

    2013-01-01

    The article presents specific aspects of management and models for economic analysis. We present the main types of economic analysis: statistical analysis, dynamic analysis, static analysis, mathematical analysis, and psychological analysis. We also present the main objects of analysis: the technological activity analysis of a company, the analysis of production costs, the economic activity analysis of a company, the analysis of equipment, the analysis of labor productivity, the anal...

  4. Generalised linear mixed models analysis of risk factors for contamination of Danish broiler flocks with Salmonella typhimurium

    DEFF Research Database (Denmark)

    Chriél, Mariann; Stryhn, H.; Dauphin, G.

    1999-01-01

    of rearing, and the sampling method are significant. Epidemiological control would seem most efficient when started at the top levels of the production hierarchy, from which a major part of the ST contamination is derived. A secondary purpose of the study is to evaluate different statistical approaches...... and software for the analysis of a moderately sized data set of veterinary origin. We compare the results from five analyses of the generalised linear mixed model (GLMM) type. The first observation is that the results agree reasonably well and lead to similar conclusions. A closer look reveals certain patterns...

  5. Analysis of the flood extent extraction model and the natural flood influencing factors: A GIS-based and remote sensing analysis

    Science.gov (United States)

    Lawal, D. U.; Matori, A. N.; Yusuf, K. W.; Hashim, A. M.; Balogun, A. L.

    2014-02-01

    Serious floods hit the State of Perlis in 2005, 2010, and 2011. Perlis is situated in the northern part of Peninsular Malaysia. The floods caused great damage to property and human lives. Various methods have been used in an attempt to reduce flood risk and damage to the optimum level by identifying flood-vulnerable zones. The purpose of this paper is to develop a flood extent extraction model based on the Minimum Distance Algorithm and to overlay it with the natural flood-influencing factors considered herein, in order to examine the effect of each factor on flood generation. A GIS spatial database was created from a geological map, a SPOT satellite image, and a topographical map. An attribute database was likewise created from field investigations and reports of historical flood areas in the study area. The results show a strong correlation between the flood extent extraction model and the flood factors.
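    Minimum Distance classification, the algorithm named above, assigns each pixel to the class whose training-sample mean is nearest in spectral feature space. A minimal sketch with illustrative two-band values, not SPOT data:

```python
# Minimum Distance classification: nearest class mean in feature space.
import numpy as np

def minimum_distance_classify(pixels, class_means):
    """pixels: (n, bands); class_means: (k, bands) -> class index per pixel."""
    # Euclidean distance from every pixel to every class mean
    d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return d.argmin(axis=1)

# Two classes in a 2-band feature space: 0 = water/flooded, 1 = dry land.
class_means = np.array([[0.10, 0.05],    # flooded: low reflectance
                        [0.40, 0.35]])   # dry land
pixels = np.array([[0.12, 0.06],
                   [0.38, 0.30],
                   [0.20, 0.15]])
labels = minimum_distance_classify(pixels, class_means)
print(labels)   # -> [0 1 0]
```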

  6. Analysis of the impact of economic growth factors to resources and environment in Jiangsu Province – Based on Commoner model

    Directory of Open Access Journals (Sweden)

    Zhang Min

    2016-01-01

    In order to respond to the increasingly polluted environment and maintain sustainable economic and social development in Jiangsu province, the author calculated the resource-environment index for Jiangsu using the LMDI (logarithmic-mean Divisia index) decomposition method based on the Commoner model (formulas (2), (5), (6) and (7)) to reflect the cumulative effects of the three major influencing factors. As shown in Table 2 and Figure 3, the research results show that the expansion of the size of the economy and the growth of population increase resource consumption and aggravate environmental pollution, while technological progress reduces the pressure on resources and the environment. According to the findings, the paper proposes policy recommendations, such as developing a circular economy, promoting technological innovation and strengthening regional cooperation mechanisms, to reduce environmental pollution while the economy develops. These will be useful to policymakers.
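    The LMDI method used above attributes the change in an impact I = P × A × T (population × affluence × technology, the Commoner-style identity) to each factor via the logarithmic mean of the endpoint impacts. A minimal sketch with invented numbers, not Jiangsu data:

```python
# LMDI (logarithmic-mean Divisia index) decomposition of I = P * A * T.
import math

def log_mean(a, b):
    return (a - b) / (math.log(a) - math.log(b)) if a != b else a

# Base year and end year: population, GDP per capita, impact per unit GDP.
P0, A0, T0 = 70.0, 2.0, 1.0
P1, A1, T1 = 80.0, 3.5, 0.6

I0, I1 = P0 * A0 * T0, P1 * A1 * T1
L = log_mean(I1, I0)

effect_population = L * math.log(P1 / P0)
effect_affluence  = L * math.log(A1 / A0)
effect_technology = L * math.log(T1 / T0)

# LMDI is exact: the three effects sum to the total change in impact.
print(effect_population + effect_affluence + effect_technology, I1 - I0)
```

Here the economy- and population-driven effects are positive while the technology effect is negative, mirroring the qualitative finding reported above.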

  7. Transforming Rubrics Using Factor Analysis

    Science.gov (United States)

    Baryla, Ed; Shelley, Gary; Trainor, William

    2012-01-01

    Student learning and program effectiveness is often assessed using rubrics. While much time and effort may go into their creation, it is equally important to assess how effective and efficient the rubrics actually are in terms of measuring competencies over a number of criteria. This study demonstrates the use of common factor analysis to identify…

  8. A Comparison of Pseudo-Maximum Likelihood and Asymptotically Distribution-Free Dynamic Factor Analysis Parameter Estimation in Fitting Covariance-Structure Models to Block-Toeplitz Representing Single-Subject Multivariate Time-Series

    NARCIS (Netherlands)

    Molenaar, P.C.M.; Nesselroade, J.R.

    1998-01-01

    The study of intraindividual variability pervades empirical inquiry in virtually all subdisciplines of psychology. The statistical analysis of multivariate time-series data - a central product of intraindividual investigations - requires special modeling techniques. The dynamic factor model (DFM), w

  9. Evaluating WAIS-IV structure through a different psychometric lens: structural causal model discovery as an alternative to confirmatory factor analysis.

    Science.gov (United States)

    van Dijk, Marjolein J A M; Claassen, Tom; Suwartono, Christiany; van der Veld, William M; van der Heijden, Paul T; Hendriks, Marc P H

    2017-07-20

    Since the publication of the WAIS-IV in the U.S. in 2008, efforts have been made to explore its structural validity by applying factor analysis to various samples. This study aims to achieve a more fine-grained understanding of the structure of the Dutch language version of the WAIS-IV (WAIS-IV-NL) by applying an alternative analysis based on causal modeling in addition to confirmatory factor analysis (CFA). The Bayesian Constraint-based Causal Discovery (BCCD) algorithm learns underlying network structures directly from data and assesses more complex structures than is possible with factor analysis. WAIS-IV-NL profiles of two clinical samples of 202 patients (i.e., patients with temporal lobe epilepsy and a mixed psychiatric outpatient group) were analyzed and contrasted with a matched control group (N = 202) selected from the Dutch standardization sample of the WAIS-IV-NL to investigate internal structure by means of CFA and BCCD. With CFA, the four-factor structure as proposed by Wechsler demonstrates acceptable fit in all three subsamples. However, BCCD revealed three consistent clusters (verbal comprehension, visual processing, and processing speed) in all three subsamples. The combination of Arithmetic and Digit Span as a coherent working memory factor could not be verified, and Matrix Reasoning appeared to be isolated. With BCCD, some discrepancies from the proposed four-factor structure are thus exemplified. Furthermore, these results fit the CHC theory of intelligence more closely. Consistent clustering patterns indicate these results are robust. The structural causal discovery approach may be helpful in better interpreting existing tests, developing new tests, and refining diagnostic instruments.

  10. Advances and challenges in PBPK modeling--Analysis of factors contributing to the oral absorption of atazanavir, a poorly soluble weak base.

    Science.gov (United States)

    Berlin, Mark; Ruff, Aaron; Kesisoglou, Filippos; Xu, Wei; Wang, Michael Hong; Dressman, Jennifer B

    2015-06-01

    Many active pharmaceutical ingredients (APIs) exhibit a highly variable pharmacokinetic (PK) profile. This behavior may be attributable to pre-absorptive, absorptive and/or post-absorptive factors. Pre-absorptive factors are those related to dosage form disintegration, drug dissolution, supersaturation, precipitation and gastric emptying. Absorptive factors are involved with drug absorption and efflux mechanisms, while drug distribution and clearance are post-absorptive factors. This study aimed to investigate the relative influence of the aforementioned parameters on the pharmacokinetic profile of atazanavir, a poorly soluble weakly basic compound with highly variable pharmacokinetics. The pre-absorptive behavior of the drug was examined by applying biorelevant in vitro tests to reflect upper gastrointestinal behavior in the fasted and fed states. The in vitro results were implemented, along with permeability and post-absorptive data obtained from the literature, into physiologically based pharmacokinetic (PBPK) models. Sensitivity analysis of the resulting plasma profiles revealed that the pharmacokinetic profile of atazanavir is affected by an array of factors rather than one standout factor. According to the in silico model, pre-absorptive and absorptive factors had less impact on atazanavir bioavailability compared to post-absorptive parameters, although active drug efflux and extraction appear to account for the sub-proportional pharmacokinetic response to lower atazanavir doses in the fasted state. From the PBPK models it was concluded that further enhancement of the formulation would bring little improvement in the pharmacokinetic response to atazanavir. This approach may prove useful in assessing the potential benefits of formulation enhancement of other existing drug products on the market.

  11. Bayesian Estimation of Categorical Dynamic Factor Models

    Science.gov (United States)

    Zhang, Zhiyong; Nesselroade, John R.

    2007-01-01

    Dynamic factor models have been used to analyze continuous time series behavioral data. We extend 2 main dynamic factor model variations--the direct autoregressive factor score (DAFS) model and the white noise factor score (WNFS) model--to categorical DAFS and WNFS models in the framework of the underlying variable method and illustrate them with…

  12. Spatial Analysis of Dengue Seroprevalence and Modeling of Transmission Risk Factors in a Dengue Hyperendemic City of Venezuela

    Science.gov (United States)

    Vincenti-Gonzalez, Maria F.; Grillet, María-Eugenia; Velasco-Salas, Zoraida I.; Lizarazo, Erley F.; Amarista, Manuel A.; Sierra, Gloria M.; Comach, Guillermo

    2017-01-01

    Background: Dengue virus (DENV) transmission is spatially heterogeneous. Hence, stratifying dengue prevalence in space may be an efficacious strategy to target surveillance and control efforts in a cost-effective manner, particularly in Venezuela, where dengue is hyperendemic and public health resources are scarce. Here, we determine hot spots of dengue seroprevalence and the risk factors associated with these clusters using local spatial statistics and a regression modeling approach. Methodology/Principal Findings: From August 2010 to January 2011, a community-based cross-sectional study of 2012 individuals in 840 households was performed in high-incidence neighborhoods of a dengue hyperendemic city in Venezuela. Local spatial statistics conducted at household and block level identified clusters of recent dengue seroprevalence (39 hot spot households and 9 hot spot blocks) in all neighborhoods. However, no clusters were found for past dengue seroprevalence. Clustering of infection was detected at a very small scale (20-110 m), suggesting a high focal aggregation of the disease. Factors associated with living in a hot spot household were occupation (being a domestic worker/housewife, P = 0.002), lower socio-economic status (living in a shack, P < 0.001; sharing a household with <7 people, P = 0.004), promoting potential vector breeding sites (storing water in containers, P = 0.024; having litter outdoors, P = 0.002) and mosquito preventive measures (such as using repellent, P = 0.011). Similarly, low socio-economic status (living in crowded conditions, P < 0.001), having an occupation of domestic worker/housewife (P = 0.012) and not using certain preventive measures against mosquitoes (P < 0.05) were directly associated with living in a hot spot block. Conclusions/Significance: Our findings contribute to a better comprehension of the spatial dynamics of dengue by assessing the relationship between disease clusters and their risk factors. 
These results can inform health authorities

  13. Spatial Analysis of Dengue Seroprevalence and Modeling of Transmission Risk Factors in a Dengue Hyperendemic City of Venezuela.

    Science.gov (United States)

    Vincenti-Gonzalez, Maria F; Grillet, María-Eugenia; Velasco-Salas, Zoraida I; Lizarazo, Erley F; Amarista, Manuel A; Sierra, Gloria M; Comach, Guillermo; Tami, Adriana

    2017-01-01

    Dengue virus (DENV) transmission is spatially heterogeneous. Hence, stratifying dengue prevalence in space may be an efficacious strategy to target surveillance and control efforts in a cost-effective manner, particularly in Venezuela, where dengue is hyperendemic and public health resources are scarce. Here, we determine hot spots of dengue seroprevalence and the risk factors associated with these clusters using local spatial statistics and a regression modeling approach. From August 2010 to January 2011, a community-based cross-sectional study of 2012 individuals in 840 households was performed in high-incidence neighborhoods of a dengue hyperendemic city in Venezuela. Local spatial statistics conducted at household and block level identified clusters of recent dengue seroprevalence (39 hot spot households and 9 hot spot blocks) in all neighborhoods. However, no clusters were found for past dengue seroprevalence. Clustering of infection was detected at a very small scale (20-110 m), suggesting a high focal aggregation of the disease. Factors associated with living in a hot spot household were occupation (being a domestic worker/housewife, P = 0.002), lower socio-economic status (living in a shack, P < 0.001; sharing a household with <7 people, P = 0.004), promoting potential vector breeding sites (storing water in containers, P = 0.024; having litter outdoors, P = 0.002) and mosquito preventive measures (such as using repellent, P = 0.011). Similarly, low socio-economic status (living in crowded conditions, P < 0.001), having an occupation of domestic worker/housewife (P = 0.012) and not using certain preventive measures against mosquitoes (P < 0.05) were directly associated with living in a hot spot block. Our findings contribute to a better comprehension of the spatial dynamics of dengue by assessing the relationship between disease clusters and their risk factors. These results can inform health authorities in the design of surveillance and control activities. Focalizing
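    Household-level hot spot detection of the kind described above is typically done with a local statistic such as Getis-Ord Gi*. A minimal sketch with binary distance weights follows; the coordinates, values, and distance band are invented for illustration:

```python
from math import dist, sqrt

def getis_ord_gstar(points, values, d):
    """Local Gi* z-scores with binary weights within distance d
    (each point counts as its own neighbour, as in the Gi* variant).
    Large positive z flags a hot spot; large negative z a cold spot."""
    n = len(values)
    xbar = sum(values) / n
    s = sqrt(sum(v * v for v in values) / n - xbar ** 2)
    zscores = []
    for pi in points:
        w = [1.0 if dist(pi, pj) <= d else 0.0 for pj in points]
        wsum = sum(w)
        num = sum(wi * v for wi, v in zip(w, values)) - xbar * wsum
        den = s * sqrt((n * sum(wi * wi for wi in w) - wsum ** 2) / (n - 1))
        zscores.append(num / den)
    return zscores

# Invented layout: a cluster of high-seroprevalence households near the
# origin and a low-seroprevalence cluster around (10, 10).
points = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (10, 11), (20, 0), (0, 20)]
values = [9.0, 8.0, 9.0, 1.0, 1.0, 1.0, 1.0, 1.0]
z = getis_ord_gstar(points, values, d=2.0)
```

In practice one would use a spatial statistics library and compare the z-scores against a significance threshold corrected for multiple testing.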

  14. Sensitivity analysis, dominant factors, and robustness of the ECETOC TRA v3, Stoffenmanager 4.5, and ART 1.5 occupational exposure models.

    Science.gov (United States)

    Riedmann, R A; Gasic, B; Vernez, D

    2015-02-01

    Occupational exposure modeling is widely used in the context of the E.U. regulation on the registration, evaluation, authorization, and restriction of chemicals (REACH). First tier tools, such as the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) targeted risk assessment (TRA) or Stoffenmanager, are used to screen a wide range of substances. Those of concern are investigated further using second tier tools, e.g., the Advanced REACH Tool (ART). Local sensitivity analysis (SA) methods are used here to determine dominant factors for three models commonly used within the REACH framework: ECETOC TRA v3, Stoffenmanager 4.5, and ART 1.5. Based on the results of the SA, the robustness of the models is assessed. For ECETOC, the process category (PROC) is the most important factor. A failure to identify the correct PROC has severe consequences for the exposure estimate. Stoffenmanager is the most balanced model, and decision-making uncertainties in one modifying factor are less severe in Stoffenmanager. ART requires a careful evaluation of the decisions in the source compartment since it constitutes ∼75% of the total exposure range, which corresponds to an exposure estimate of 20-22 orders of magnitude. Our results indicate that there is a trade-off between the accuracy and the precision of the models. Previous studies suggested that ART may lead to more accurate results in well-documented exposure situations. However, the choice of the adequate model should ultimately be determined by the quality of the available exposure data: if the practitioner is uncertain concerning two or more decisions in the entry parameters, Stoffenmanager may be more robust than ART. © 2015 Society for Risk Analysis.
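    A local one-at-a-time sensitivity analysis of a multiplicative exposure model can be sketched as below. The factor names and ranges are invented and are not the actual ECETOC TRA, Stoffenmanager, or ART parameters:

```python
from math import log10

def oat_log_range_shares(ranges):
    """One-at-a-time sensitivity for a multiplicative model
    E = f1 * f2 * ... * fn: each factor's share of the total log10
    output range when it alone is varied over its plausible range."""
    spans = {name: log10(hi) - log10(lo) for name, (lo, hi) in ranges.items()}
    total = sum(spans.values())
    return {name: span / total for name, span in spans.items()}

# Invented modifying factors with (low, high) plausible ranges
ranges = {
    "substance_emission": (0.01, 10.0),  # spans 3 orders of magnitude
    "local_controls":     (0.1, 1.0),    # 1 order of magnitude
    "dilution":           (0.3, 3.0),    # 1 order of magnitude
}
shares = oat_log_range_shares(ranges)
```

Because the model is multiplicative, the log-range shares add to one, which is the sense in which the abstract attributes ∼75% of ART's total exposure range to the source compartment.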

  15. Flash flood susceptibility analysis and its mapping using different bivariate models in Iran: a comparison between Shannon's entropy, statistical index, and weighting factor models.

    Science.gov (United States)

    Khosravi, Khabat; Pourghasemi, Hamid Reza; Chapi, Kamran; Bahri, Masoumeh

    2016-12-01

    Flooding is a very common worldwide natural hazard causing large-scale casualties every year, and Iran is not immune to this threat. Comprehensive flood susceptibility mapping is very important to reduce losses of lives and properties. Thus, the aim of this study is to map susceptibility to flooding by different bivariate statistical methods, including Shannon's entropy (SE), statistical index (SI), and weighting factor (Wf). In this regard, model performance evaluation is also carried out in Haraz Watershed, Mazandaran Province, Iran. In the first step, 211 flood locations were identified from documentary sources and field inventories, of which 70% (151 positions) were used for flood susceptibility modeling and 30% (60 positions) for evaluation and verification of the models. In the second step, ten factors influencing flooding were chosen, namely slope angle, plan curvature, altitude, topographic wetness index (TWI), stream power index (SPI), distance from river, rainfall, geology, land use, and normalized difference vegetation index (NDVI). In the next step, flood susceptibility maps were prepared by these methods in ArcGIS. As the last step, the receiver operating characteristic (ROC) curve was drawn and the area under the curve (AUC) was calculated for a quantitative assessment of each model. The results showed that the best model to estimate the susceptibility to flooding in Haraz Watershed was the SI model, with prediction and success rates of 99.71 and 98.72%, respectively, followed by the Wf and SE models with AUC values of 98.1 and 96.57% for the success rate, and 97.6 and 92.42% for the prediction rate, respectively. In the SI and Wf models, the most and least important parameters were the distance from river and geology, respectively. Flood susceptibility maps are informative for managers and decision makers in Haraz Watershed in order to contemplate measures to reduce human and financial losses.
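    The AUC used above for model validation can be computed directly from the rank order of susceptibility scores. The scores and flood labels below are invented for illustration:

```python
def roc_auc(scores, labels):
    """AUC via the rank-sum (Mann-Whitney) identity: the probability
    that a randomly chosen positive outranks a randomly chosen
    negative; ties count half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented susceptibility scores for known flood (1) / non-flood (0) cells
scores = [0.9, 0.8, 0.75, 0.6, 0.4, 0.3, 0.2]
labels = [1,   1,   0,    1,   0,   0,   0]
auc = roc_auc(scores, labels)
```

An AUC of 0.5 corresponds to random ranking and 1.0 to perfect separation, which is why the abstract reports success and prediction rates as AUC percentages.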

  16. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA, as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle

  17. Validating the European Health Literacy Survey Questionnaire in people with type 2 diabetes: Latent trait analyses applying multidimensional Rasch modelling and confirmatory factor analysis.

    Science.gov (United States)

    Finbråten, Hanne Søberg; Pettersen, Kjell Sverre; Wilde-Larsson, Bodil; Nordström, Gun; Trollvik, Anne; Guttersrud, Øystein

    2017-11-01

    To validate the European Health Literacy Survey Questionnaire (HLS-EU-Q47) in people with type 2 diabetes mellitus. The HLS-EU-Q47 latent variable is outlined in a framework with four cognitive domains integrated in three health domains, implying 12 theoretically defined subscales. Valid and reliable health literacy measures are crucial to effectively adapting health communication and education to individuals and groups of patients. Cross-sectional study applying confirmatory latent trait analyses. Using a paper-and-pencil self-administered approach, 388 adults responded in March 2015. The data were analysed using Rasch methodology and confirmatory factor analysis. Response violation (response dependency) and trait violation (multidimensionality) of local independence were identified. Fitting the "multidimensional random coefficients multinomial logit" model, 1-, 3- and 12-dimensional Rasch models were applied and compared. Poor model fit and differential item functioning were present in some items, and several subscales suffered from poor targeting and low reliability. Despite multidimensional data, we did not observe any unordered response categories. Interpreting the domains as distinct but related latent dimensions, the data fit a 12-dimensional Rasch model and a 12-factor confirmatory factor model best. Therefore, the analyses did not support the estimation of one overall "health literacy score". To support the plausibility of claims based on the HLS-EU score(s), we suggest: removing the health care aspect to reduce the magnitude of multidimensionality; rejecting redundant items to avoid response dependency; adding "harder" items and applying a six-point rating scale to improve subscale targeting and reliability; and revising items to improve model fit and avoid bias owing to person factors. © 2017 John Wiley & Sons Ltd.
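    The targeting argument above (adding "harder" items to improve measurement at the upper end of the scale) follows directly from the item information function of the dichotomous Rasch model, sketched here with illustrative ability and difficulty values:

```python
from math import exp

def rasch_p(theta, b):
    """Dichotomous Rasch model: probability of an affirmative
    response for person ability theta and item difficulty b."""
    return 1.0 / (1.0 + exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item; maximal where theta == b."""
    p = rasch_p(theta, b)
    return p * (1.0 - p)

# An easy item (b = -1) measures low-ability respondents well;
# a "harder" item (b = +2) targets the upper end of the scale.
easy_info_high = item_information(2.0, -1.0)
hard_info_high = item_information(2.0, 2.0)
```

Because information peaks at theta = b, a scale whose items are all easy is imprecise for high-scoring respondents, which is the targeting problem the abstract reports.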

  18. Structural modeling and DNA binding autoinhibition analysis of Ergp55, a critical transcription factor in prostate cancer.

    Directory of Open Access Journals (Sweden)

    Shanti P Gangwar

    Full Text Available BACKGROUND: The Ergp55 protein belongs to the Ets family of transcription factors. Ets proteins are highly conserved in their DNA binding domain and are involved in various developmental processes and the regulation of cancer metabolism. To study the structure and the DNA binding autoinhibition mechanism of the Ergp55 protein, we produced the full-length protein and smaller polypeptides of Ergp55 in E. coli and characterized them using various biophysical techniques. RESULTS: The Ergp55 polypeptides contain a large amount of α-helix and random coil structure, as measured by circular dichroism spectroscopy. Full-length Ergp55 forms a flexible and elongated molecule, as revealed by molecular modeling, dynamics simulations and structure prediction algorithms. Binding analyses of Ergp55 polypeptides with target DNA sequences of the E74 and cfos promoters indicate that longer fragments of Ergp55 (extending beyond the Ets domain) showed evidence of autoinhibition. This study also revealed the parts of the Ergp55 protein that mediate autoinhibition. SIGNIFICANCE: The current study will aid in designing compounds that stabilize the inhibited form of Ergp55 and inhibit its binding to promoter DNA. It will contribute to the development of drugs targeting Ergp55 for prostate cancer treatment.

  19. Evaluating the influence of physical, economic and managerial factors on sheet erosion in rangelands of SW Spain by performing a sensitivity analysis on an integrated dynamic model.

    Science.gov (United States)

    Ibáñez, J; Lavado Contador, J F; Schnabel, S; Martínez Valderrama, J

    2016-02-15

    An integrated dynamic model was used to evaluate the influence of climatic, soil, pastoral, economic and managerial factors on sheet erosion in rangelands of SW Spain (dehesas). This was achieved by means of a variance-based sensitivity analysis. Topsoil erodibility, climate change and a combined factor related to soil water storage capacity and the pasture production function were the factors which influenced water erosion the most. Of them, climate change is the main source of uncertainty, though in this study it caused a reduction in the mean and the variance of long-term erosion rates. The economic and managerial factors showed scant influence on soil erosion, meaning that it is unlikely to find such influence in the study area for the time being. This is because the low profitability of the livestock business maintains stocking rates at low levels. However, the potential impact of livestock, through which economic and managerial factors affect soil erosion, proved to be greater in absolute value than the impact of climate change. Therefore, if changes in some economic or managerial factors led to higher stocking rates in the future, significant increases in erosion rates would be expected.
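    A variance-based first-order sensitivity index of the kind applied to the dehesa model can be sketched with a simple binning estimator. The toy response function below is a stand-in for illustration, not the paper's integrated dynamic model:

```python
import random

def first_order_indices(f, n_inputs, n=20000, bins=20, seed=1):
    """Crude first-order (Sobol-type) sensitivity indices for a model
    with independent U(0,1) inputs: S_i = Var(E[Y|X_i]) / Var(Y),
    estimating the conditional expectation by binning X_i."""
    rng = random.Random(seed)
    xs = [[rng.random() for _ in range(n_inputs)] for _ in range(n)]
    ys = [f(x) for x in xs]
    ybar = sum(ys) / n
    var_y = sum((y - ybar) ** 2 for y in ys) / n
    indices = []
    for i in range(n_inputs):
        sums = [0.0] * bins
        counts = [0] * bins
        for x, y in zip(xs, ys):
            b = min(int(x[i] * bins), bins - 1)
            sums[b] += y
            counts[b] += 1
        var_cond = sum(c * (s / c - ybar) ** 2
                       for s, c in zip(sums, counts) if c > 0) / n
        indices.append(var_cond / var_y)
    return indices

# Toy model: three drivers with decreasing influence on the output.
S = first_order_indices(lambda x: 3 * x[0] + 2 * x[1] + x[2], 3)
```

For this additive toy model the exact indices are 9/14, 4/14 and 1/14, so the estimator's ranking of "dominant factors" can be checked directly.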

  20. Safety Evaluation Model of Airport Operation Based on Factor Analysis

    Institute of Scientific and Technical Information of China (English)

    靳慧斌; 洪远; 蔡亚敏

    2015-01-01

    To address the subjectivity and collinearity problems in civil airport safety assessment methods, this paper analyzes unsafe events in airport operations using the Human Factors Analysis and Classification System (HFACS) and, on this basis, objectively proposes a system of safety evaluation indices. Factor analysis is then applied to construct comprehensive evaluation factors, resolving the collinearity (information overlap) among the indices, and a safety evaluation model is built by combining these factors with the variance weighting method. Finally, taking a large domestic airport as the evaluation object, the results of the factor analysis method are compared by correlation analysis with those of a fuzzy evaluation method, verifying the effectiveness of the proposed approach.

  1. Multiple dimensions of health locus of control in a representative population sample: ordinal factor analysis and cross-validation of an existing three and a new four factor model

    Directory of Open Access Journals (Sweden)

    Hapke Ulfert

    2011-08-01

    Full Text Available Abstract Background Based on the general approach of locus of control, health locus of control (HLOC) concerns control beliefs regarding illness, sickness and health. HLOC research results provide an improved understanding of health-related behaviour and patients' compliance in medical care. HLOC research distinguishes between beliefs due to Internality, Externality powerful Others (POs) and Externality Chance. However, evidence for differentiating the POs dimension was found. Previous factor analyses used selected and predominantly clinical samples, while non-clinical studies are rare. The present study is the first analysis of the HLOC structure based on a large representative general population sample, providing important information for non-clinical research and public health care. Methods The standardised German questionnaire which assesses HLOC was used in a representative adult general population sample for a region in Northern Germany (N = 4,075). Data analyses used ordinal factor analyses in LISREL and Mplus. Alternative theory-driven models with one to four latent variables were compared using confirmatory factor analysis. Fit indices, chi-square difference tests, residuals and factor loadings were considered for model comparison. Exploratory factor analysis was used for further model development. Results were cross-validated by splitting the total sample randomly and using the cross-validation index. Results A model with four latent variables (Internality, Formal Help, Informal Help and Chance) best represented the HLOC construct (three-dimensional model: normed chi-square = 9.55; RMSEA = 0.066; CFI = 0.931; SRMR = 0.075; four-dimensional model: normed chi-square = 8.65; RMSEA = 0.062; CFI = 0.940; SRMR = 0.071; chi-square difference test: p Conclusions Future non-clinical HLOC studies in western cultures should consider four dimensions of HLOC: Internality, Formal Help, Informal Help and Chance. However, the standardised German instrument

  2. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis". The objective of this

  3. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...

  4. Full Information Item Factor Analysis of the FCI

    Science.gov (United States)

    Hagedorn, Eric

    2010-02-01

    Traditional factor analytical methods, such as principal factors or principal components analysis, are inappropriate techniques for analyzing dichotomously scored responses to standardized tests or concept inventories because they lead to artifactual factors often referred to as "difficulty factors." Full information item factor analysis (Bock, Gibbons and Muraki, 1988), based on Thurstone's multiple factor model and calculated using marginal maximum likelihood estimation, is an appropriate technique for such analyses. Force Concept Inventory (Hestenes, Wells and Swackhamer, 1992) data from 1582 university students completing an introductory physics course were analyzed using the full information item factor analysis software TESTFACT v. 4. Analyzing the statistical significance of successive factors added to the model, using chi-squared statistics, led to a six-factor model interpretable in terms of the conceptual dimensions of the FCI.

  5. Ranking factors involved in product design using a hybrid model of Quality Function Deployment, Data Envelopment Analysis and TOPSIS technique

    Directory of Open Access Journals (Sweden)

    Davood Feiz

    2014-08-01

    Full Text Available Quality function deployment (QFD) is an extremely important quality management tool, which is useful in product design and development. Traditionally, QFD rates the design requirements (DRs) with respect to customer requirements (CRs) and aggregates the ratings to obtain the relative importance scores of the DRs. An increasing number of studies emphasize the need to incorporate additional factors, such as cost and environmental impact, while calculating the relative importance of DRs. However, there are different methodologies for deriving the relative importance of DRs when several additional factors are considered. TOPSIS (technique for order preference by similarity to ideal solution) is suggested for the purpose of this research. This research proposes a new TOPSIS-based approach for considering the ratings of DRs with respect to CRs and several additional factors simultaneously. The proposed method is illustrated with a step-by-step procedure and was applied to the Sanam Electronic Company in Iran.

  6. DSM-5 alternative personality disorder model traits as maladaptive extreme variants of the five-factor model: An item-response theory analysis.

    Science.gov (United States)

    Suzuki, Takakuni; Samuel, Douglas B; Pahlen, Shandell; Krueger, Robert F

    2015-05-01

    Over the past two decades, evidence has suggested that personality disorders (PDs) can be conceptualized as extreme, maladaptive variants of general personality dimensions, rather than discrete categorical entities. Recognizing this literature, the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) alternative PD model in Section III defines PDs partially through 25 maladaptive traits that fall within 5 domains. Empirical evidence based on the self-report measure of these traits, the Personality Inventory for DSM-5 (PID-5), suggests that these five higher-order domains share a structure and correlate in meaningful ways with the five-factor model (FFM) of general personality. In the current study, item response theory was used to compare the DSM-5 alternative PD model traits to those from a normative FFM inventory (the International Personality Item Pool-NEO [IPIP-NEO]) in terms of their measurement precision along the latent dimensions. Within a combined sample of 3,517 participants, results strongly supported the conclusion that the DSM-5 alternative PD model traits and IPIP-NEO traits are complementary measures of 4 of the 5 FFM domains (with perhaps the exception of openness to experience vs. psychoticism). Importantly, the two measures yield largely overlapping information curves on these four domains. Differences that did emerge suggested that the PID-5 scales generally have higher thresholds and provide more information at the upper levels, whereas the IPIP-NEO generally had an advantage at the lower levels. These results support the general conceptualization that 4 domains of the DSM-5 alternative PD model traits are maladaptive, extreme versions of the FFM. (PsycINFO Database Record

  7. An environmental analysis of genes associated with schizophrenia: hypoxia and vascular factors as interacting elements in the neurodevelopmental model.

    Science.gov (United States)

    Schmidt-Kastner, R; van Os, J; Esquivel, G; Steinbusch, H W M; Rutten, B P F

    2012-12-01

    Investigating and understanding gene-environment interaction (G × E) in a neurodevelopmentally and biologically plausible manner is a major challenge for schizophrenia research. Hypoxia during neurodevelopment is one of several environmental factors related to the risk of schizophrenia, and links between schizophrenia candidate genes and hypoxia regulation or vascular expression have been proposed. Given the availability of a wealth of complex genetic information on schizophrenia in the literature without knowledge on the connections to environmental factors, we now systematically collected genes from candidate studies (using SzGene), genome-wide association studies (GWAS) and copy number variation (CNV) analyses, and then applied four criteria to test for a (theoretical) link to ischemia-hypoxia and/or vascular factors. In all, 55% of the schizophrenia candidate genes (n=42 genes) met the criteria for a link to ischemia-hypoxia and/or vascular factors. Genes associated with schizophrenia showed a significant, threefold enrichment among genes that were derived from microarray studies of the ischemia-hypoxia response (IHR) in the brain. Thus, the finding of a considerable match between genes associated with the risk of schizophrenia and IHR and/or vascular factors is reproducible. An additional survey of genes identified by GWAS and CNV analyses suggested novel genes that match the criteria. Findings for interactions between specific variants of genes proposed to be IHR and/or vascular factors with obstetric complications in patients with schizophrenia have been reported in the literature. Therefore, the extended gene set defined here may form a reasonable and evidence-based starting point for hypothesis-based testing of G × E interactions in clinical genetic and translational neuroscience studies.

  8. An easy guide to factor analysis

    CERN Document Server

    Kline, Paul

    2014-01-01

    Factor analysis is a statistical technique widely used in psychology and the social sciences. With the advent of powerful computers, factor analysis and other multivariate methods are now available to many more people. An Easy Guide to Factor Analysis presents and explains factor analysis as clearly and simply as possible. The author, Paul Kline, carefully defines all statistical terms and demonstrates step-by-step how to work out a simple example of principal components analysis and rotation. He further explains other methods of factor analysis, including confirmatory and path analysis, a
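The workflow Kline walks through, principal components followed by rotation, can be sketched in a few lines. This is a generic illustration (function names and the Kaiser varimax variant are our choices, not code from the book):

```python
import numpy as np

def principal_components(data, n_factors):
    """Principal-component loadings from a (cases x variables) matrix."""
    corr = np.corrcoef(data, rowvar=False)          # correlation matrix
    eigvals, eigvecs = np.linalg.eigh(corr)         # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_factors]   # largest first
    # Loading = eigenvector scaled by sqrt(eigenvalue)
    return eigvecs[:, order] * np.sqrt(eigvals[order])

def varimax(loadings, n_iter=100, tol=1e-6):
    """Orthogonal varimax rotation (Kaiser) of a loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    crit_old = 0.0
    for _ in range(n_iter):
        rotated = loadings @ rotation
        # Gradient step of the varimax criterion
        b = rotated ** 3 - rotated @ np.diag(np.mean(rotated ** 2, axis=0))
        u, s, vt = np.linalg.svd(loadings.T @ b)
        rotation = u @ vt
        crit_new = s.sum()
        if crit_new - crit_old < tol:
            break
        crit_old = crit_new
    return loadings @ rotation
```

Because the rotation is orthogonal, the communalities (row sums of squared loadings) are unchanged; only the interpretability of the columns improves.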

  9. Brooding and Pondering: Isolating the Active Ingredients of Depressive Rumination with Exploratory Factor Analysis and Structural Equation Modeling

    Science.gov (United States)

    Armey, Michael F.; Fresco, David M.; Moore, Michael T.; Mennin, Douglas S.; Turk, Cynthia L.; Heimberg, Richard G.; Kecmanovic, Jelena; Alloy, Lauren B.

    2009-01-01

    Depressive rumination, as assessed by Nolen-Hoeksema's Response Styles Questionnaire (RSQ), predicts the onset, chronicity, and duration of depressed mood. However, some RSQ items contain depressive content and result in a heterogeneous factor structure. After the a priori elimination of items potentially confounded with depressed item content,…

  10. Evaluation of chemical transport model predictions of primary organic aerosol for air masses classified by particle component-based factor analysis

    Directory of Open Access Journals (Sweden)

    C. A. Stroud

    2012-09-01

    Full Text Available Observations from the 2007 Border Air Quality and Meteorology Study (BAQS-Met 2007) in Southern Ontario, Canada, were used to evaluate predictions of primary organic aerosol (POA) and two other carbonaceous species, black carbon (BC) and carbon monoxide (CO), made for this summertime period by Environment Canada's AURAMS regional chemical transport model. Particle component-based factor analysis was applied to aerosol mass spectrometer measurements made at one urban site (Windsor, ON) and two rural sites (Harrow and Bear Creek, ON) to derive hydrocarbon-like organic aerosol (HOA) factors. A novel diagnostic model evaluation was performed by investigating model POA bias as a function of HOA mass concentration and indicator ratios (e.g. BC/HOA). Eight case studies were selected based on factor analysis and back trajectories to help classify model bias for certain POA source types. By considering model POA bias in relation to co-located BC and CO biases, a plausible story is developed that explains the model biases for all three species.

    At the rural sites, daytime mean PM1 POA mass concentrations were under-predicted compared to observed HOA concentrations. POA under-predictions were accentuated when the transport arriving at the rural sites was from the Detroit/Windsor urban complex and during short-term periods of biomass burning influence. Interestingly, the daytime CO concentrations were only slightly under-predicted at both rural sites. At the urban Windsor site, CO was over-predicted with a normalized mean bias of 134%, whereas good agreement was observed at Windsor for the comparison of daytime PM1 POA and HOA mean values, 1.1 μg m−3 and 1.2 μg m−3, respectively. Biases in model POA predictions also trended from positive to negative with increasing HOA values. Periods of POA over-prediction were most evident at the urban site on calm nights due to an overly-stable model surface layer
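The normalized mean bias quoted above (e.g. 134% for CO at Windsor) is a standard model-evaluation metric; a minimal sketch of its computation (our own illustration, not the study's code):

```python
def normalized_mean_bias(model, obs):
    """NMB = sum(model - obs) / sum(obs), reported here as a percentage."""
    total_model, total_obs = sum(model), sum(obs)
    return 100.0 * (total_model - total_obs) / total_obs

# Model values that are on average twice the observations give NMB = 100%.
print(normalized_mean_bias([2.0, 4.0, 6.0], [1.0, 2.0, 3.0]))  # → 100.0
```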

  11. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...

  12. The Five Cs Model of Positive Youth Development: A Longitudinal Analysis of Confirmatory Factor Structure and Measurement Invariance

    Science.gov (United States)

    Bowers, Edmond P.; Li, Yibing; Kiely, Megan K.; Brittian, Aerika; Lerner, Jacqueline V.; Lerner, Richard M.

    2010-01-01

    The understanding of positive development across adolescence rests on having a valid and equivalent measure of this construct across the breadth of this period of life. Does the Positive Youth Development (PYD) construct based on the Five Cs model have satisfactory psychometric properties for such longitudinal measurement invariance? Using…

  13. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Full Text Available The developed risk analysis model supports deciding whether control measures are suitable for implementation: the analysis determines whether the benefits of a given control option outweigh its implementation cost.

  14. Analytic standard errors for exploratory process factor analysis.

    Science.gov (United States)

    Zhang, Guangjian; Browne, Michael W; Ong, Anthony D; Chow, Sy Miin

    2014-07-01

    Exploratory process factor analysis (EPFA) is a data-driven latent variable model for multivariate time series. This article presents analytic standard errors for EPFA. Unlike standard errors for exploratory factor analysis with independent data, the analytic standard errors for EPFA take into account the time dependency in time series data. In addition, factor rotation is treated as the imposition of equality constraints on model parameters. Properties of the analytic standard errors are demonstrated using empirical and simulated data.

  15. Modeling Relational Data via Latent Factor Blockmodel

    CERN Document Server

    Gao, Sheng; Gallinari, Patrick

    2012-01-01

    In this paper we address the problem of modeling relational data, which appear in many applications such as social network analysis, recommender systems and bioinformatics. Previous studies either consider latent-feature-based models while disregarding local structure in the network, or focus exclusively on capturing local structure of objects based on latent blockmodels without coupling it with latent characteristics of objects. To combine the benefits of the previous work, we propose a novel model that can simultaneously incorporate the effect of latent features and covariates, if any, as well as the effect of latent structure that may exist in the data. To achieve this, we model the relation graph as a function of both latent feature factors and latent cluster memberships of objects, to collectively discover globally predictive intrinsic properties of objects and capture latent block structure in the network to improve prediction performance. We also develop an optimization transfer algorithm based on the general...

  16. Source Apportionment of Ambient PM10 in the Urban Area of Longyan City, China: a Comparative Study Based on Chemical Mass Balance Model and Factor Analysis Method

    Institute of Scientific and Technical Information of China (English)

    QIU Li-min; LIU Miao; WANG Ju; ZHANG Sheng-nan; FANG Chun-sheng

    2012-01-01

    In order to identify the daytime and nighttime pollution sources of PM10 in ambient air in Longyan City, the authors analyzed the elemental composition of respirable particulate matter in day and night ambient air samples and various pollution sources, collected in January 2010 in Longyan, with inductively coupled plasma-mass spectrometry (ICP-MS). A chemical mass balance (CMB) model and a factor analysis (FA) method were then applied to comparatively study the inorganic components in the source and receptor samples. The results of the factor analysis show that during the daytime the major sources were road dust, waste incineration and mixed sources containing automobile exhaust, soil dust/secondary dust and coal dust, while during the night the two major sources were soil dust and a mixture of automobile exhaust and secondary dust. The results of CMB show that the major daytime sources were secondary dust, automobile exhaust and road dust, and the major nighttime sources were secondary dust, soil dust and automobile exhaust. The results of the two methods are similar to each other and will guide planning to control the PM10 pollution sources in Longyan.
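At its core, chemical mass balance expresses each receptor sample as a linear combination of source profiles and solves for the source contributions. A minimal noise-free sketch (species fractions and contribution values below are invented for illustration; real CMB uses measured profiles with effective-variance weighting):

```python
import numpy as np

# Hypothetical source profiles: rows are chemical species fractions,
# columns are sources (e.g. road dust, vehicle exhaust).
profiles = np.array([
    [0.60, 0.05],
    [0.10, 0.50],
    [0.30, 0.45],
])
true_contributions = np.array([5.0, 2.0])   # ug/m3 per source (assumed)
receptor = profiles @ true_contributions    # simulated receptor sample

# CMB solves receptor = profiles @ contributions in a least-squares sense;
# plain least squares suffices for this noise-free sketch.
contributions, *_ = np.linalg.lstsq(profiles, receptor, rcond=None)
```

With real data the system is overdetermined and noisy, so the recovered contributions only approximate the truth; here they are exact.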

  17. Frailty Models in Survival Analysis

    CERN Document Server

    Wienke, Andreas

    2010-01-01

    The concept of frailty offers a convenient way to introduce unobserved heterogeneity and associations into models for survival data. In its simplest form, frailty is an unobserved random proportionality factor that modifies the hazard function of an individual or a group of related individuals. "Frailty Models in Survival Analysis" presents a comprehensive overview of the fundamental approaches in the area of frailty models. The book extensively explores how univariate frailty models can represent unobserved heterogeneity. It also emphasizes correlated frailty models as extensions of

  18. Robust and Sparse Factor Modelling

    DEFF Research Database (Denmark)

    Croux, Christophe; Exterkate, Peter

    Factor construction methods are widely used to summarize a large panel of variables by means of a relatively small number of representative factors. We propose a novel factor construction procedure that enjoys the properties of robustness to outliers and of sparsity; that is, having relatively few nonzero factor loadings. Compared to the traditional factor construction method, we find that this procedure leads to a favorable forecasting performance in the presence of outliers and to better interpretable factors. We investigate the performance of the method in a Monte Carlo experiment...

  19. Components of quality of life in a sample of patients with lupus: a confirmatory factor analysis and Rasch modeling of the LupusQoL.

    Science.gov (United States)

    Meseguer-Henarejos, Ana-Belén; Gascón-Cánovas, Juan-José; López-Pina, José-Antonio

    2017-08-01

    The objective of this study was to test different exploratory solutions to the LupusQoL scale in a sample of Spanish patients with SLE using confirmatory factor analysis (CFA) and Rasch modeling, as well as to estimate the convergent validity. The χ² test, RMSEA, CFI, and TLI were used to test the fit of the different exploratory structures with CFA. To estimate the parameters in the dimensions found, a rating scale Rasch multidimensional random coefficient multinomial logit model was used. The reliability of the scores was obtained with coefficient alpha and coefficient omega. The convergent validity was calculated using Spearman's rho. Four hundred and fifty patients participated, but complete data were available for 223 subjects. The original version (UK) and the French version obtained the best fit, showing that the proposed original structure was the best solution for the structure of the LupusQoL scale in the Spanish sample. The multidimensional solution of eight dimensions was adequate, but item 8 in physical health, item 16 in intimate relations, and items 29 and 30 obtained mean squares >1.6. Coefficient alpha and coefficient omega of the scores in the eight domains were high. The Spanish version of LupusQoL correlated strongly with the corresponding SLAQ, EQ5D analogic scale, and EQ5D domains. This analysis confirmed the structure of eight dimensions of the original version in patients with SLE.
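Coefficient alpha, used above for score reliability, has a compact closed form; a generic sketch (not the authors' code):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (subjects x items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)
```

For perfectly parallel items (every item an exact copy of the same score) alpha reaches its maximum of 1; coefficient omega instead weights items by their factor loadings, which alpha ignores.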

  20. Human Factors Analysis in Software Engineering

    Institute of Scientific and Technical Information of China (English)

    Xu Ren-zuo; Ma Ruo-feng; Liu Li-na; Xiong Zhong-wei

    2004-01-01

    In the broad sense, human factors analysis examines human functions, effects and influence in a system; in a narrow sense, it analyzes the human influence upon the reliability of a system, including traditional human reliability analysis, human error analysis, man-machine interface analysis, human character analysis, and others. Whether a software development project in software engineering succeeds is largely determined by human factors. In this paper, we discuss the meaning of human factors, illustrate the importance of human factors analysis for software engineering with several instances, and finally offer a preliminary discussion of the mentality that a practitioner in software engineering should possess.

  1. Competing Factor Models of Child and Adolescent Psychopathology.

    Science.gov (United States)

    Doyle, Mark M; Murphy, Jamie; Shevlin, Mark

    2016-11-01

    Co-occurring psychological disorders are highly prevalent among children and adolescents. To date, the most widely utilised factor model used to explain this co-occurrence is the two factor model of internalising and externalising (Achenbach 1966). Several competing models of general psychopathology have since been reported as alternatives, including a recent three factor model of Distress, Fear and Externalising dimensions (Krueger 1999). Evidence for the three factor model suggests there are advantages to utilising a more complex model. Using the British Child and Adolescent Mental Health Survey 2004 data (B-CAMHS; N = 7997), confirmatory factor analysis was used to test competing factor structure models of child and adolescent psychopathology. The B-CAMHS was an epidemiological survey of children between the ages of 5 and 16 in Great Britain. Child psychological disorders were assessed using the Strengths and Difficulties Questionnaire (Goodman 1997) and the Development and Wellbeing Assessment (Goodman et al. 2000). A range of covariates and risk variables including trauma, parent mental health and family functioning were subsequently utilised within a MIMIC model framework to predict each dimension of the two and three factor structure models. Two models demonstrated acceptable fit. The first corresponded to Achenbach's Internalising and Externalising structure. The three factor model was found to have highly comparable fit indices to the two factor model. The second order models did not accurately represent the data, nor did an alternative three factor model of Internalising, Externalising and ADHD. The two factor and three factor MIMIC models observed unique profiles of risk for each dimension. The findings suggest that child and adolescent psychopathology may also be accurately conceptualised in terms of distress, fear and externalising dimensions. The MIMIC models demonstrated that the Distress and Fear dimensions have their own unique etiological profile of

  2. Analysis of CD45- [CD34+/KDR+] endothelial progenitor cells as juvenile protective factors in a rat model of ischemic-hemorrhagic stroke.

    Directory of Open Access Journals (Sweden)

    Julius L Decano

    Full Text Available BACKGROUND: Identification of juvenile protective factors (JPFs) which are altered with age and contribute to adult-onset diseases could identify novel pathways for reversing the effects of age, an accepted non-modifiable risk factor for adult-onset diseases. Since endothelial progenitor cells (EPCs) have been observed to be altered in stroke, hypertension and hypercholesterolemia, said EPCs are candidate JPFs for adult-onset stroke. A priori, if EPC aging plays a 'master-switch JPF-role' in stroke pathogenesis, juvenile EPC therapy alone should delay stroke onset. Using a hypertensive, transgenic-hyperlipidemic rat model of spontaneous ischemic-hemorrhagic stroke, spTg25, we tested the hypothesis that freshly isolated juvenile EPCs are JPFs that can attenuate stroke progression and delay stroke onset. METHODOLOGY/PRINCIPAL FINDINGS: FACS analysis revealed that CD45-[CD34+/KDR+] EPCs decrease with progression to stroke in spTg25 rats, exhibit differential expression of the dual endothelin-1/VEGFsp receptor (DEspR), and undergo differential DEspR-subtype specific changes in number and in vitro angiogenic tube-incorporation. In vivo infusion of male, juvenile, non-expanded CD45-[CD34+/KDR+] EPCs into female stroke-prone rats prior to stroke attenuated progression and delayed stroke onset (P<0.003). Detection of Y-chromosome DNA in brain microvessels of EPC-treated female spTg25 rats indicates integration of male EPCs into female rat brain microvessels. Gradient-echo MRI showed delay of ischemic-hemorrhagic lesions in EPC-treated rats. Real-time RT-PCR pathway-specific array analysis revealed age-associated gene expression changes in CD45-[CD34+/KDR+] EPC subtypes, which were accelerated in stroke-prone rats. Pro-angiogenic genes implicated in intimal hyperplasia were increased in stroke-prone rat EPCs (P<0.0001), suggesting a maladaptive endothelial repair system which acts like a double-edged sword, repairing while predisposing to age

  3. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Full Text Available Developing a national brand is one of the most important issues for the development of a brand. In this study, we present a factor analysis to detect the most important factors in building a national brand. The proposed study uses factor analysis to extract the most influential factors, and the sample has been chosen from two major auto makers in Iran called Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is calculated as 0.84, which is well above the minimum desirable limit of 0.70. The implementation of factor analysis provides six factors including “cultural image of customers”, “exciting characteristics”, “competitive pricing strategies”, “perception image” and “previous perceptions”.

  4. Evaluation of chemical transport model predictions of primary organic aerosol for air masses classified by particle-component-based factor analysis

    Directory of Open Access Journals (Sweden)

    C. A. Stroud

    2012-02-01

    Full Text Available Observations from the 2007 Border Air Quality and Meteorology Study (BAQS-Met 2007) in southern Ontario (ON), Canada, were used to evaluate Environment Canada's regional chemical transport model predictions of primary organic aerosol (POA). Environment Canada's operational numerical weather prediction model and the 2006 Canadian and 2005 US national emissions inventories were used as input to the chemical transport model (named AURAMS). Particle-component-based factor analysis was applied to aerosol mass spectrometer measurements made at one urban site (Windsor, ON) and two rural sites (Harrow and Bear Creek, ON) to derive hydrocarbon-like organic aerosol (HOA) factors. Co-located carbon monoxide (CO), PM2.5 black carbon (BC), and PM1 SO4 measurements were also used for evaluation and interpretation, permitting a detailed diagnostic model evaluation.

    At the urban site, good agreement was observed for the comparison of daytime campaign PM1 POA and HOA mean values: 1.1 μg m−3 vs. 1.2 μg m−3, respectively. However, a POA overprediction was evident on calm nights due to an overly-stable model surface layer. Biases in model POA predictions trended from positive to negative with increasing HOA values. This trend has several possible explanations, including (1) underweighting of urban locations in particulate matter (PM) spatial surrogate fields, (2) overly-coarse model grid spacing for resolving urban-scale sources, and (3) lack of a model particle POA evaporation process during dilution of vehicular POA tail-pipe emissions to urban scales. Furthermore, a trend in POA bias was observed at the urban site as a function of the BC/HOA ratio, suggesting possible POA underprediction for diesel combustion sources. For several time periods, POA overprediction was also observed for sulphate-rich plumes, suggesting that our model POA fractions for the PM2.5 chemical

  5. Multigroup Confirmatory Factor Analysis: Locating the Invariant Referent Sets

    Science.gov (United States)

    French, Brian F.; Finch, W. Holmes

    2008-01-01

    Multigroup confirmatory factor analysis (MCFA) is a popular method for the examination of measurement invariance and specifically, factor invariance. Recent research has begun to focus on using MCFA to detect invariance for test items. MCFA requires certain parameters (e.g., factor loadings) to be constrained for model identification, which are…

  6. Analytic Couple Modeling Introducing Device Design Factor, Fin Factor, Thermal Diffusivity Factor, and Inductance Factor

    Science.gov (United States)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    A set of convenient thermoelectric device solutions have been derived in order to capture a number of factors which are previously only resolved with numerical techniques. The concise conversion efficiency equations derived from governing equations provide intuitive and straight-forward design guidelines. These guidelines allow for better device design without requiring detailed numerical modeling. The analytical modeling accounts for factors such as i) variable temperature boundary conditions, ii) lateral heat transfer, iii) temperature variable material properties, and iv) transient operation. New dimensionless parameters, similar to the figure of merit, are introduced including the device design factor, fin factor, thermal diffusivity factor, and inductance factor. These new device factors allow for the straight-forward description of phenomenon generally only captured with numerical work otherwise. As an example a device design factor of 0.38, which accounts for thermal resistance of the hot and cold shoes, can be used to calculate a conversion efficiency of 2.28 while the ideal conversion efficiency based on figure of merit alone would be 6.15. Likewise an ideal couple with efficiency of 6.15 will be reduced to 5.33 when lateral heat is accounted for with a fin factor of 1.0.
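The baseline that the device design factor discounts is the standard figure-of-merit efficiency of an ideal couple; a sketch with illustrative operating temperatures (the abstract's 6.15 vs. 2.28 figures depend on device parameters not reproduced here):

```python
import math

def ideal_te_efficiency(t_hot, t_cold, zt):
    """Textbook maximum conversion efficiency of an ideal thermoelectric
    couple with mean figure of merit ZT:
    eta = (dT/T_hot) * (sqrt(1+ZT) - 1) / (sqrt(1+ZT) + T_cold/T_hot)."""
    carnot = (t_hot - t_cold) / t_hot
    m = math.sqrt(1.0 + zt)
    return carnot * (m - 1.0) / (m + t_cold / t_hot)

# Illustrative values only (temperatures in kelvin).
eta = ideal_te_efficiency(t_hot=500.0, t_cold=300.0, zt=1.0)
```

The efficiency is always bounded by the Carnot limit and grows monotonically with ZT, which is why the non-ideal factors above matter: they shrink the achievable fraction of this bound.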

  7. Measurement Bias Detection through Factor Analysis

    Science.gov (United States)

    Barendse, M. T.; Oort, F. J.; Werner, C. S.; Ligtvoet, R.; Schermelleh-Engel, K.

    2012-01-01

    Measurement bias is defined as a violation of measurement invariance, which can be investigated through multigroup factor analysis (MGFA), by testing across-group differences in intercepts (uniform bias) and factor loadings (nonuniform bias). Restricted factor analysis (RFA) can also be used to detect measurement bias. To also enable nonuniform…

  8. Exploratory Tobit factor analysis for multivariate censored data

    NARCIS (Netherlands)

    Kamakura, WA; Wedel, M

    2001-01-01

    We propose Multivariate Tobit models with a factor structure on the covariance matrix. Such models are particularly useful in the exploratory analysis of multivariate censored data and the identification of latent variables from behavioral data. The factor structure provides a parsimonious

  9. EXPLORATORY FACTOR ANALYSIS (EFA) IN CONSUMER BEHAVIOR AND MARKETING RESEARCH

    Directory of Open Access Journals (Sweden)

    Marcos Pascual Soler

    2012-06-01

    Full Text Available Exploratory Factor Analysis (EFA) is one of the most widely used statistical procedures in social research. The main objective of this work is to describe the most common practices used by researchers in the consumer behavior and marketing area. Through a literature review methodology, the practices of EFA in five consumer behavior and marketing journals (2000-2010) were analyzed. Then, the choices made by the researchers concerning factor model, retention criteria, rotation, factor interpretation and other issues relevant to factor analysis were examined. The results suggest that researchers routinely conduct analyses using questionable methods. Suggestions for improving the use of factor analysis and the reporting of results are presented, and a checklist (Exploratory Factor Analysis Checklist, EFAC) is provided to help editors, reviewers, and authors improve the reporting of exploratory factor analysis.

  10. Housing price forecastability: A factor analysis

    DEFF Research Database (Denmark)

    Bork, Lasse; Møller, Stig Vinther

    of the model stays high at longer horizons. The estimated factors are strongly statistically significant according to a bootstrap resampling method which takes into account that the factors are estimated regressors. The simple three-factor model also contains substantial out-of-sample predictive power...

  11. Socio-Economic Factors Affecting Adoption of Modern Information and Communication Technology by Farmers in India: Analysis Using Multivariate Probit Model

    Science.gov (United States)

    Mittal, Surabhi; Mehar, Mamta

    2016-01-01

    Purpose: The paper analyzes factors that affect the likelihood of adoption of different agriculture-related information sources by farmers. Design/Methodology/Approach: The paper links the theoretical understanding of the existing multiple sources of information that farmers use, with the empirical model to analyze the factors that affect the…

  12. Shape Factor Modeling and Simulation

    Science.gov (United States)

    2016-06-01

    Report front matter only (table of contents and list of figures): 3. Shape Factor Distributions for Natural Fragments; 3.1 Platonic Solids and Uniform Viewing from All Viewpoints; 3.2 Natural Fragments from…; Fig. 9, the 5 Platonic solids; Fig. 10, mean shape factor of the 5 Platonic solids; Table 3, sequence of viewing angles in Icosahedron Gage.

  13. Pilots’ occupational stress analysis model based on factor analysis

    Institute of Scientific and Technical Information of China (English)

    骆杭飞; 肖英杰; 周伟; 白响恩

    2015-01-01

    To provide measures for reducing pilots’ occupational stress, factor analysis is applied to the analysis of that stress. The influencing factors behind pilots’ occupational stress are analyzed in terms of the nature of work, job involvement, job fatigue and job performance, and a corresponding index system model is constructed. Data are collected by questionnaires, and the main stress factors are determined with SPSS 20.0. According to the main factors, strategies are established to reduce the adverse effects caused by these factors. Through the dimension-reduction idea of factor analysis, this provides a reference for relieving pilots’ psychological stress.

  14. Skewed factor models using selection mechanisms

    KAUST Repository

    Kim, Hyoung-Moon

    2015-12-21

    Traditional factor models explicitly or implicitly assume that the factors follow a multivariate normal distribution; that is, only moments up to order two are involved. However, it may happen in real data problems that the first two moments cannot explain the factors. Based on this motivation, here we devise three new skewed factor models, the skew-normal, the skew-t, and the generalized skew-normal factor models depending on a selection mechanism on the factors. The ECME algorithms are adopted to estimate related parameters for statistical inference. Monte Carlo simulations validate our new models and we demonstrate the need for skewed factor models using the classic open/closed book exam scores dataset.
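The selection mechanism behind the skew-normal case has a simple stochastic representation that can be simulated directly: a standard normal conditioned on the sign of a correlated latent normal, equivalently Z = δ|U0| + √(1−δ²)·U1. A minimal sketch (our illustration of the hidden-truncation construction, not the paper's factor model):

```python
import numpy as np

def skew_normal_sample(alpha, size, rng):
    """Standard skew-normal draws via the selection (hidden truncation)
    mechanism: Z = delta*|U0| + sqrt(1 - delta^2)*U1 with U0, U1 iid N(0,1)
    and delta = alpha / sqrt(1 + alpha^2)."""
    delta = alpha / np.sqrt(1.0 + alpha**2)
    u0 = np.abs(rng.standard_normal(size))
    u1 = rng.standard_normal(size)
    return delta * u0 + np.sqrt(1.0 - delta**2) * u1

rng = np.random.default_rng(0)
z = skew_normal_sample(alpha=5.0, size=100_000, rng=rng)
# The standard skew-normal has mean delta * sqrt(2/pi).
```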

  15. Modeling Ability Differentiation in the Second-Order Factor Model

    Science.gov (United States)

    Molenaar, Dylan; Dolan, Conor V.; van der Maas, Han L. J.

    2011-01-01

    In this article we present factor models to test for ability differentiation. Ability differentiation predicts that the size of IQ subtest correlations decreases as a function of the general intelligence factor. In the Schmid-Leiman decomposition of the second-order factor model, we model differentiation by introducing heteroscedastic residuals,…

  16. Modeling ability differentiation in the second-order factor model

    NARCIS (Netherlands)

    Molenaar, D.; Dolan, C.V.; van der Maas, H.L.J.

    2011-01-01

    In this article we present factor models to test for ability differentiation. Ability differentiation predicts that the size of IQ subtest correlations decreases as a function of the general intelligence factor. In the Schmid-Leiman decomposition of the second-order factor model, we model

  17. Statistical Mechanical Models of Integer Factorization Problem

    Science.gov (United States)

    Nakajima, Chihiro H.; Ohzeki, Masayuki

    2017-01-01

    We formulate the integer factorization problem via a formulation of the searching problem for the ground state of a statistical mechanical Hamiltonian. The first passage time required to find a correct divisor of a composite number signifies the exponential computational hardness. The analysis of the density of states of two macroscopic quantities, i.e., the energy and the Hamming distance from the correct solutions, leads to the conclusion that the ground state (correct solution) is completely isolated from the other low-energy states, with the distance being proportional to the system size. In addition, the profile of the microcanonical entropy of the model has two peculiar features that are each related to two marked changes in the energy region sampled via Monte Carlo simulation or simulated annealing. Hence, we find a peculiar first-order phase transition in our model.
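The Hamiltonian view can be made concrete at toy scale: take E(x, y) = (N − xy)² as the energy and look for its zero-energy ground state. The exhaustive search below is only an illustration of the landscape (the paper analyzes its statistical mechanics; it does not propose brute force):

```python
def factor_by_ground_state(n):
    """Return a divisor pair (x, y) with x*y == n by locating the
    zero-energy ground state of E(x, y) = (n - x*y)**2.
    Exponential-time toy illustration, not an efficient algorithm."""
    for x in range(2, n):
        for y in range(x, n):
            if (n - x * y) ** 2 == 0:   # ground state reached
                return x, y
    return None                          # no nontrivial divisors: n is prime

print(factor_by_ground_state(35))  # → (5, 7)
```

For a prime input every state has positive energy, which mirrors the paper's point that the correct solution is an isolated minimum in an otherwise unhelpful landscape.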

  18. THE SAFETY FACTOR ANALYSIS OF THE MARINE SLOPE STABILITY MODEL ON THE ACCESS CHANNEL OF MARINE CENTRE PLAN CIREBON, WEST JAVA.

    Directory of Open Access Journals (Sweden)

    Franto Novico

    2017-07-01

Full Text Available This study focuses on an access-channel model in which the safety factors of several slope configurations are investigated. Plaxis version 8 is applied to analyze the safety factors and displacements of three access-channel slopes: 30°, 45° and 60°. Parameters are adopted from geotechnical drilling and laboratory tests, and a simple finite element model with the Mohr-Coulomb constitutive law is used. The soil data analyses for the Marine Center Plan indicate a low safety factor and high deformation: the models yield slope deformations of 10 to 40 meters and safety factors of 0.80 to 2.34. For that reason, a combination of channel slope design and supporting infrastructure must be considered.
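
As a back-of-the-envelope companion to the finite-element results, the classic infinite-slope Mohr-Coulomb factor of safety can be evaluated for the three slope angles studied. The soil parameters below are illustrative assumptions, not the paper's laboratory values.

```python
import math

# assumed soil properties: cohesion (kPa), friction angle (deg),
# unit weight (kN/m^3), failure-surface depth (m)
c, phi_deg, gamma, z = 10.0, 25.0, 18.0, 5.0

def factor_of_safety(beta_deg):
    """Infinite-slope FS for a dry slope with Mohr-Coulomb strength:
    FS = (c + gamma*z*cos^2(beta)*tan(phi)) / (gamma*z*sin(beta)*cos(beta))."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c + gamma * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

fs = {b: factor_of_safety(b) for b in (30, 45, 60)}   # the paper's slope angles
```

Even this hand formula reproduces the qualitative finding that steeper access-channel slopes have lower safety factors.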

  19. Model-free data analysis for source separation based on Non-Negative Matrix Factorization and k-means clustering (NMFk)

    Science.gov (United States)

    Vesselinov, V. V.; Alexandrov, B.

    2014-12-01

The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressures, etc. The source identification and separation can be crucial for conceptualization of the hydrological conditions and characterization of system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS) coupled with the k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, and the physical mechanisms and properties controlling the signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem, where several microphones are recording the sounds in a ballroom (music, conversations, noise, etc.). Each of the microphones records a mixture of the sounds. The goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similarly to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r, and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site. We identify the sources as barometric-pressure and water-supply pumping effects and estimate their impacts. We also estimate the
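
The NMFk idea — multiple NMF restarts whose recovered factors are clustered with k-means, with cluster robustness used to select the number of sources r — can be sketched with off-the-shelf scikit-learn pieces. The synthetic two-source mixture below is an assumption for illustration, not the LANL dataset.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import NMF
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
S = np.vstack([np.abs(np.sin(6 * t)), np.exp(-3 * t)])   # 2 hidden sources
A = rng.uniform(0.2, 1.0, size=(6, 2))                   # mixing at 6 sensors
X = (A @ S).T                                            # 200 times x 6 signals

scores = {}
for r in (2, 3, 4):
    # several NMF restarts; NMFk clusters the recovered mixing vectors
    H = np.vstack([NMF(n_components=r, init='random', random_state=s,
                       max_iter=500).fit(X).components_ for s in range(5)])
    H /= np.linalg.norm(H, axis=1, keepdims=True)        # scale-invariant
    labels = KMeans(n_clusters=r, n_init=10, random_state=0).fit_predict(H)
    scores[r] = silhouette_score(H, labels)              # cluster robustness

best_r = max(scores, key=scores.get)                     # NMFk's estimate of r
```

When restarts of the correct r recover the same sources up to permutation, the clusters are tight and the silhouette is high; overfactored solutions cluster inconsistently.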

  20. Dynamic Factor Models for the Volatility Surface

    DEFF Research Database (Denmark)

    van der Wel, Michel; Ozturk, Sait R.; Dijk, Dick van

The implied volatility surface is the collection of volatilities implied by option contracts for different strike prices and time-to-maturity. We study factor models to capture the dynamics of this three-dimensional implied volatility surface. Three model types are considered to examine desirable features for representing the surface and its dynamics: a general dynamic factor model, restricted factor models designed to capture the key features of the surface along the moneyness and maturity dimensions, and in-between spline-based methods. Key findings are that: (i) the restricted and spline-based models are both rejected against the general dynamic factor model, (ii) the factors driving the surface are highly persistent, (iii) for the restricted models option Delta is preferred over the more often used strike relative to spot price as measure for moneyness.

  1. Cardinality constrained portfolio selection via factor models

    OpenAIRE

    Monge, Juan Francisco

    2017-01-01

In this paper we propose and discuss different 0-1 linear models to solve the cardinality-constrained portfolio problem using factor models. Factor models are used to build portfolios that track indexes, among other objectives, and also require fewer parameters to estimate than the classical Markowitz model. The addition of cardinality constraints limits the number of securities in the portfolio. Restricting the number of securities in the portfolio allows us to o...
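
For intuition about what a cardinality constraint does, the following toy sketch brute-forces all K-asset subsets of a small synthetic universe and picks the subset with minimal tracking-error variance against an equal-weight index. The paper instead formulates this as 0-1 linear programs; the returns data here are invented.

```python
import itertools

import numpy as np

rng = np.random.default_rng(0)
n_assets, K = 8, 3                                  # cardinality limit K
R = rng.normal(0.001, 0.02, size=(250, n_assets))   # synthetic daily returns
index = R.mean(axis=1)                              # equal-weight index to track

best_err, chosen = np.inf, None
for subset in itertools.combinations(range(n_assets), K):
    # least-squares tracking weights restricted to this subset
    w, *_ = np.linalg.lstsq(R[:, subset], index, rcond=None)
    err = float(np.var(R[:, subset] @ w - index))   # tracking-error variance
    if err < best_err:
        best_err, chosen = err, subset
```

Brute force is only viable for tiny universes (here 56 subsets); the 0-1 formulations scale this selection to realistic index-tracking problems.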

  2. Statistical inference of Minimum Rank Factor Analysis

    NARCIS (Netherlands)

    Shapiro, A; Ten Berge, JMF

    2002-01-01

    For any given number of factors, Minimum Rank Factor Analysis yields optimal communalities for an observed covariance matrix in the sense that the unexplained common variance with that number of factors is minimized, subject to the constraint that both the diagonal matrix of unique variances and the

  4. Seeing Perfectly Fitting Factor Models That Are Causally Misspecified: Understanding That Close-Fitting Models Can Be Worse

    Science.gov (United States)

    Hayduk, Leslie

    2014-01-01

    Researchers using factor analysis tend to dismiss the significant ill fit of factor models by presuming that if their factor model is close-to-fitting, it is probably close to being properly causally specified. Close fit may indeed result from a model being close to properly causally specified, but close-fitting factor models can also be seriously…

  6. Personalized Predictive Modeling and Risk Factor Identification using Patient Similarity.

    Science.gov (United States)

    Ng, Kenney; Sun, Jimeng; Hu, Jianying; Wang, Fei

    2015-01-01

Personalized predictive models are customized for an individual patient and trained using information from similar patients. Compared to global models trained on all patients, they have the potential to produce more accurate risk scores and capture more relevant risk factors for individual patients. This paper presents an approach for building personalized predictive models and generating personalized risk factor profiles. A locally supervised metric learning (LSML) similarity measure is trained for diabetes onset and used to find clinically similar patients. Personalized risk profiles are created by analyzing the parameters of the trained personalized logistic regression models. A 15,000-patient data set, derived from electronic health records, is used to evaluate the approach. The predictive results show that the personalized models can outperform the global model. Cluster analysis of the risk profiles shows groups of patients with similar risk factors, differences in the top risk factors across patient groups, and differences between the individual and global risk factors.
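
A stripped-down version of the personalized-modeling recipe — find similar patients, fit a local model, read its coefficients as a risk profile — might look as follows. Plain Euclidean nearest neighbours stand in for the paper's learned LSML metric, and the cohort is synthetic.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# synthetic cohort standing in for EHR-derived features and onset labels
X, y = make_classification(n_samples=2000, n_features=10, class_sep=0.5,
                           random_state=0)
X_train, y_train, x_new = X[:-1], y[:-1], X[-1]     # x_new is the index patient

global_model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

K = 500                                             # similar patients to use
idx = NearestNeighbors(n_neighbors=K).fit(X_train).kneighbors(
    [x_new], return_distance=False)[0]
personal_model = LogisticRegression(max_iter=1000).fit(X_train[idx], y_train[idx])

p_global = global_model.predict_proba([x_new])[0, 1]
p_personal = personal_model.predict_proba([x_new])[0, 1]
risk_profile = personal_model.coef_[0]              # patient-specific risk factors
```

The coefficients of the locally trained model play the role of the paper's personalized risk factor profile, and can differ from the global model's coefficients.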

  7. PROGNOSTIC FACTORS ANALYSIS FOR STAGE Ⅰ RECTAL CANCER

    Institute of Scientific and Technical Information of China (English)

    武爱文; 顾晋; 薛钟麒; 王怡; 徐光炜

    2001-01-01

Objective: To explore death-related factors in stage Ⅰ rectal cancer patients. Methods: 89 stage Ⅰ rectal cancer patients treated between 1985 and 2000 were retrospectively studied for prognostic factors. Factors including age, gender, tumor size, circumferential occupation, gross type, pathological type, depth of tumor invasion, surgical procedure, adjuvant chemotherapy and postoperative complication were chosen for Cox multivariate analysis (forward procedure) using SPSS software (version 10.0). Results: Multivariate analysis demonstrated that muscular invasion was an independent negative prognostic factor for stage Ⅰ rectal cancer patients (P=0.003). Conclusion: Muscular invasion is a negative prognostic factor for stage Ⅰ rectal cancer patients.

  8. A model combining spectrum standardization and dominant factor based partial least square method for carbon analysis in coal by laser-induced breakdown spectroscopy

    CERN Document Server

    Li, Xiongwei; Fu, Yangting; Li, Zheng; Ni, Weidou

    2014-01-01

Quantitative measurement of carbon content in coal using laser-induced breakdown spectroscopy (LIBS) suffers from relatively low precision and accuracy. In the present work, the spectrum standardization method was combined with the dominant-factor-based partial least squares (PLS) method to improve the measurement accuracy of carbon content in coal by LIBS. The combination model employs the spectrum standardization method to convert the carbon line intensity into a standard state in order to calculate the dominant carbon concentration more accurately, and then applies PLS with full-spectrum information to correct the residual errors. The combination model was applied to the measurement of carbon content in 24 bituminous coal samples. The results demonstrate that the combination model could further improve the measurement accuracy compared with both our previously established spectrum standardization model and the dominant-factor-based PLS model using spectral-area-normalized intensity for the dominant fa...

  9. A Second Generation Nonlinear Factor Analysis.

    Science.gov (United States)

    Etezadi-Amoli, Jamshid; McDonald, Roderick P.

    1983-01-01

    Nonlinear common factor models with polynomial regression functions, including interaction terms, are fitted by simultaneously estimating the factor loadings and common factor scores, using maximum likelihood and least squares methods. A Monte Carlo study gives support to a conjecture about the form of the distribution of the likelihood ratio…

  10. Kernel Factor Analysis Algorithm with Varimax

    Institute of Scientific and Technical Information of China (English)

    Xia Guoen; Jin Weidong; Zhang Gexiang

    2006-01-01

Kernel factor analysis (KFA) with varimax rotation was proposed, using a Mercer kernel function that maps the data in the original space to a high-dimensional feature space, and was compared with kernel principal component analysis (KPCA). The results show that the best error rate in handwritten digit recognition obtained by kernel factor analysis with varimax (4.2%) was superior to that of KPCA (4.4%); KFA with varimax thus achieved more accurate handwritten digit recognition.
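
A rough analogue of this comparison can be run with scikit-learn on the digits data: factor analysis with varimax rotation (linear, standing in for the kernelized KFA) versus RBF kernel PCA, each followed by a 1-nearest-neighbour classifier. The component count and kernel width are arbitrary choices, so the error rates will not match the paper's.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import FactorAnalysis, KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

def knn_error(transformer):
    """1-NN test error after projecting onto the learned components."""
    Ztr = transformer.fit_transform(Xtr)
    Zte = transformer.transform(Xte)
    return 1.0 - KNeighborsClassifier(n_neighbors=1).fit(Ztr, ytr).score(Zte, yte)

err_fa = knn_error(FactorAnalysis(n_components=20, rotation='varimax',
                                  random_state=0))
err_kpca = knn_error(KernelPCA(n_components=20, kernel='rbf', gamma=1e-3))
```

Both feature extractors reduce 64 pixel features to 20 components before classification, mirroring the recognition setup described in the abstract.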

  11. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

This report describes and illustrates several modelling techniques proposed by Communication Analysis; namely Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated with communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...

  12. Multiple factor analysis by example using R

    CERN Document Server

    Pagès, Jérôme

    2014-01-01

    Multiple factor analysis (MFA) enables users to analyze tables of individuals and variables in which the variables are structured into quantitative, qualitative, or mixed groups. Written by the co-developer of this methodology, Multiple Factor Analysis by Example Using R brings together the theoretical and methodological aspects of MFA. It also includes examples of applications and details of how to implement MFA using an R package (FactoMineR).The first two chapters cover the basic factorial analysis methods of principal component analysis (PCA) and multiple correspondence analysis (MCA). The

  13. Psychometric properties of the SDM-Q-9 questionnaire for shared decision-making in multiple sclerosis: item response theory modelling and confirmatory factor analysis.

    Science.gov (United States)

    Ballesteros, Javier; Moral, Ester; Brieva, Luis; Ruiz-Beato, Elena; Prefasi, Daniel; Maurino, Jorge

    2017-04-22

Shared decision-making is a cornerstone of patient-centred care. The 9-item Shared Decision-Making Questionnaire (SDM-Q-9) is a brief self-assessment tool for measuring patients' perceived level of involvement in decision-making related to their own treatment and care. Information related to the psychometric properties of the SDM-Q-9 for multiple sclerosis (MS) patients is limited. The objective of this study was to assess the performance of the items composing the SDM-Q-9 and its dimensional structure in patients with relapsing-remitting MS. A non-interventional, cross-sectional study in adult patients with relapsing-remitting MS was conducted in 17 MS units throughout Spain. A nonparametric item response theory (IRT) analysis was used to assess the latent construct and dimensional structure underlying the observed responses. A parametric IRT model, the General Partial Credit Model, was fitted to obtain estimates of the relationship between the latent construct and item characteristics. The unidimensionality of the SDM-Q-9 instrument was assessed by confirmatory factor analysis. A total of 221 patients were studied (mean age = 42.1 ± 9.9 years, 68.3% female). Median Expanded Disability Status Scale score was 2.5 ± 1.5. Most patients reported taking part in each step of the decision-making process. Internal reliability of the instrument was high (Cronbach's α = 0.91) and the overall scale scalability score was 0.57, indicative of a strong scale. All items, except for item 1, showed scalability indices higher than 0.30. Four items (items 6 through to 9) conveyed more than half of the SDM-Q-9 overall information (67.3%). The SDM-Q-9 was a good fit for a unidimensional latent structure (comparative fit index = 0.98, root-mean-square error of approximation = 0.07). All freely estimated parameters were statistically significant, with loadings above 0.40, with the exception of item 1, which presented the lowest loading (0.26). Items 6 through to 8 were the
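
The Cronbach's α reliability statistic quoted above (α = 0.91 for the SDM-Q-9) is straightforward to compute from an items matrix. A sketch with synthetic single-trait responses, not the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)      # respondents x items
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
trait = rng.normal(size=(221, 1))               # one latent trait, n = 221
X = trait + 0.5 * rng.normal(size=(221, 9))     # nine correlated items
alpha = cronbach_alpha(X)
```

With items sharing one latent trait, α comes out high, illustrating why a strongly unidimensional scale like the SDM-Q-9 reports α above 0.9.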

  14. Confirmatory Factor Analysis of the Procrastination Assessment Scale for Students

    Directory of Open Access Journals (Sweden)

    Ronald D. Yockey

    2015-10-01

    Full Text Available The relative fit of one- and two-factor models of the Procrastination Assessment Scale for Students (PASS was investigated using confirmatory factor analysis on an ethnically diverse sample of 345 participants. The results indicated that although the two-factor model provided better fit to the data than the one-factor model, neither model provided optimal fit. However, a two-factor model which accounted for common item theme pairs used by Solomon and Rothblum in the creation of the scale provided good fit to the data. In addition, a significant difference by ethnicity was also found on the fear of failure subscale of the PASS, with Whites having significantly lower scores than Asian Americans or Latino/as. Implications of the results are discussed and recommendations made for future work with the scale.
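
Comparing one- and two-factor solutions can be mimicked with held-out log-likelihood in scikit-learn's `FactorAnalysis`, a rough stand-in for CFA fit indices. The simulated items below have a true two-factor simple structure, so the two-factor model should win.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 345                                        # sample size as in the study
F = rng.normal(size=(n, 2))                    # two latent factors
L = np.zeros((12, 2))
L[:6, 0], L[6:, 1] = 0.8, 0.8                  # simple-structure loadings
X = F @ L.T + 0.6 * rng.normal(size=(n, 12))   # 12 synthetic items

# mean held-out log-likelihood for one- vs two-factor solutions
ll = {k: cross_val_score(FactorAnalysis(n_components=k, random_state=0), X).mean()
      for k in (1, 2)}
```

Held-out likelihood rewards the better-specified factor structure without the capitalization-on-chance risk of in-sample fit comparisons.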

  15. EFFECTS OF CONTINUOUS STEM-CELL FACTOR ADMINISTRATION ON NORMAL AND ERYTHROPOIETIN-STIMULATED MURINE HEMATOPOIESIS - EXPERIMENTAL RESULTS AND MODEL ANALYSIS

    NARCIS (Netherlands)

    DEHAAN, G; DONTJE, B; NIJHOF, W; LOEFFLER, M

    1995-01-01

The aim of this study was to determine how stem cell factor (SCF) modifies hemopoietic cell production. First, we determined the effects of a prolonged SCF administration on murine hemopoiesis and analyzed the results by a mathematical simulation model of hemopoiesis in order to explain the data. Sub

  16. Local versus systemic anti-tumour necrosis factor-α effects of adalimumab in rheumatoid arthritis: pharmacokinetic modelling analysis of interaction between a soluble target and a drug.

    Science.gov (United States)

    Stepensky, David

    2012-07-01

    The pharmacokinetic models that are applied to describe the disposition of therapeutic antibodies assume that the interaction between an antibody and its target takes place in the central compartment. However, an increasing number of therapeutic antibodies are directed towards soluble/mobile targets. A flawed conclusion can be reached if the pharmacokinetic and pharmacodynamic analysis assumes that the interaction between the therapeutic antibody and its target takes place in the central compartment. The objective of this study was to assess the relative importance of local versus systemic interactions between adalimumab and tumour necrosis factor (TNF)-α in rheumatoid arthritis (RA), identify localization of the site of adalimumab action and assess the efficacy of local (intra-articular) versus systemic adalimumab administration for treatment of RA. The clinical and preclinical data on adalimumab and TNFα disposition were analysed using a pharmacokinetic modelling and simulation approach. The disposition of adalimumab and TNFα and the interaction between them at the individual compartments (the synovial fluid of the affected joints, central and peripheral compartments) following different routes of adalimumab administration were studied. Outcomes of modelling and simulation using the pharmacokinetic model developed indicate that adalimumab can efficiently permeate from the diseased joints to the central circulation in RA patients. Permeability of TNFα, which is excessively secreted in the joints, is even higher than that of adalimumab. As a result, subcutaneous, intravenous and intra-articular administration of the clinically used dose of adalimumab (40 mg) exert similar effects on the time course of TNFα concentrations at different locations in the body and efficiently deplete the TNFα in all of the compartments for a prolonged period of time (8-10 weeks). At this dose, adalimumab exhibits predominantly systemic anti-TNFα effects at the central and
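
The compartmental machinery underlying such models can be sketched as a minimal two-compartment ODE system for antibody disposition after an IV dose. The rate constants below are illustrative assumptions, not adalimumab's fitted parameters, and the TNFα binding interaction and synovial compartment are omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

# assumed first-order rates (1/day): elimination and inter-compartment transfer
k10, k12, k21 = 0.05, 0.03, 0.02

def rhs(t, y):
    central, peripheral = y
    dc = -(k10 + k12) * central + k21 * peripheral
    dp = k12 * central - k21 * peripheral
    return [dc, dp]

# 40 mg bolus into the central compartment, followed for 70 days (~10 weeks)
sol = solve_ivp(rhs, (0, 70), [40.0, 0.0], t_eval=np.linspace(0, 70, 71))
central = sol.y[0]
```

The paper's analysis adds the drug-target binding terms and a joint compartment to this kind of skeleton, which is what lets it compare intra-articular with systemic dosing.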

  17. The Factor Structure of the Values in Action Inventory of Strengths (VIA-IS): An Item-Level Exploratory Structural Equation Modeling (ESEM) Bifactor Analysis.

    Science.gov (United States)

    Ng, Vincent; Cao, Mengyang; Marsh, Herbert W; Tay, Louis; Seligman, Martin E P

    2016-10-13

The factor structure of the Values in Action Inventory of Strengths (VIA-IS; Peterson & Seligman, 2004) has not been well established as a result of methodological challenges primarily attributable to a global positivity factor, item cross-loading across character strengths, and questions concerning the unidimensionality of the scales assessing character strengths. We sought to overcome these methodological challenges by applying exploratory structural equation modeling (ESEM) at the item level using a bifactor analytic approach to a large sample of 447,573 participants who completed the VIA-IS with all 240 character strengths items and a reduced set of 107 unidimensional character strength items. It was found that a 6-factor bifactor structure generally held for the reduced set of unidimensional character strength items; these dimensions were justice, temperance, courage, wisdom, transcendence, humanity, and an overarching general factor that is best described as dispositional positivity.

  18. Correspondence factor analysis of steroid libraries.

    Science.gov (United States)

    Ojasoo, T; Raynaud, J P; Doré, J C

    1995-06-01

The receptor binding of a library of 187 steroids to five steroid hormone receptors (estrogen, progestin, androgen, mineralocorticoid, and glucocorticoid) has been analyzed by correspondence factor analysis (CFA) in order to illustrate how the method could be used to derive structure-activity relationships from much larger libraries. CFA is a cartographic multivariate technique that provides objective distribution maps of the data after reduction and filtering of redundant information and noise. The key to the analysis of very complex data tables is the formation of barycenters (steroids with one or more common structural fragments) that can be introduced into CFA analyses used as mathematical models. This is possible in CFA because the method uses χ²-metrics and is based on the distributional equivalence of the rows and columns of the transformed data matrix. We have thus demonstrated, in purely objective statistical terms, the general conclusions on the specificity of various functional and other groups derived from prior analyses by expert intuition and reasoning. A finer analysis was made of a series of A-ring phenols showing the high degree of glucocorticoid receptor and progesterone receptor binding that can be generated by certain C-11 substitutions despite the presence of the phenolic A-ring characteristic of estrogen receptor-specific binding.

  19. Comparison of Transcription Factor Binding Site Models

    KAUST Repository

    Bhuyan, Sharifulislam

    2012-05-01

    Modeling of transcription factor binding sites (TFBSs) and TFBS prediction on genomic sequences are important steps to elucidate transcription regulatory mechanism. Dependency of transcription regulation on a great number of factors such as chemical specificity, molecular structure, genomic and epigenetic characteristics, long distance interaction, makes this a challenging problem. Different experimental procedures generate evidence that DNA-binding domains of transcription factors show considerable DNA sequence specificity. Probabilistic modeling of TFBSs has been moderately successful in identifying patterns from a family of sequences. In this study, we compare performances of different probabilistic models and try to estimate their efficacy over experimental TFBSs data. We build a pipeline to calculate sensitivity and specificity from aligned TFBS sequences for several probabilistic models, such as Markov chains, hidden Markov models, Bayesian networks. Our work, containing relevant statistics and evaluation for the models, can help researchers to choose the most appropriate model for the problem at hand.
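
The simplest probabilistic TFBS model in the family surveyed here — a position weight matrix scored by log-odds against a background distribution — can be sketched directly. The counts below are hypothetical, not from any real transcription factor.

```python
import math

# hypothetical aligned binding-site counts per position (bases A, C, G, T)
counts = [
    {'A': 8, 'C': 1, 'G': 1, 'T': 0},
    {'A': 0, 'C': 9, 'G': 1, 'T': 0},
    {'A': 1, 'C': 0, 'G': 8, 'T': 1},
]
BACKGROUND = 0.25            # uniform background base frequency

def log_odds(col, base, pseudo=0.5):
    """Per-position log2 odds of observing `base` under the motif vs background."""
    p = (col[base] + pseudo) / (sum(col.values()) + 4 * pseudo)
    return math.log2(p / BACKGROUND)

def score(site):
    """Sum of per-position log-odds: higher means more binding-site-like."""
    return sum(log_odds(col, b) for col, b in zip(counts, site))

best_site = max(('ACG', 'TTT', 'GCA'), key=score)
```

Markov-chain and HMM models generalize this by letting the probability of each base depend on its neighbours, which is one axis along which the study compares model performance.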

  20. System Identification by Dynamic Factor Models

    NARCIS (Netherlands)

    C. Heij (Christiaan); W. Scherrer; M. Destler

    1996-01-01

This paper concerns the modelling of stochastic processes by means of dynamic factor models. In such models the observed process is decomposed into a structured part called the latent process, and a remainder that is called noise. The observed variables are treated in a symmetric way, so

  1. Text mining factor analysis (TFA) in green tea patent data

    Science.gov (United States)

    Rahmawati, Sela; Suprijadi, Jadi; Zulhanif

    2017-03-01

Factor analysis has become one of the most widely used multivariate statistical procedures in applied research across a multitude of domains. There are two main types of analysis based on factor analysis: Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA). Both EFA and CFA aim to model the relationships among a group of indicators and a latent variable, but they differ fundamentally in the a priori assumptions and restrictions placed on the factor model. This method is applied to patent data in the green tea technology sector to characterize the development of green tea technology worldwide. Patent analysis is useful in identifying future technological trends in a specific field of technology. The patent database is obtained from the European Patent Organization (EPO). In this paper, the CFA model is applied to nominal data obtained from a presence-absence matrix; for nominal data, the CFA is based on the tetrachoric correlation matrix. Meanwhile, the EFA model is applied to titles from the dominant technology sector; the titles are first pre-processed using text mining.
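
The EFA-on-titles step can be sketched as follows: build a binary presence/absence term matrix from patent titles and extract latent topical factors. The titles are invented (not EPO records), and the tetrachoric-based CFA step is omitted for brevity.

```python
from sklearn.decomposition import FactorAnalysis
from sklearn.feature_extraction.text import CountVectorizer

titles = [                      # invented patent titles, not EPO records
    "green tea extract beverage production",
    "tea polyphenol extraction method",
    "beverage packaging apparatus",
    "packaging machine for bottled beverage",
    "green tea fermentation process",
    "apparatus for tea leaf drying",
]
vec = CountVectorizer(binary=True)              # term presence/absence matrix
Xbin = vec.fit_transform(titles).toarray()
loadings = FactorAnalysis(n_components=2, random_state=0).fit(Xbin).components_
```

Each row of `loadings` is a latent factor over the vocabulary; terms loading together (here, e.g. processing terms versus packaging terms) suggest topical groupings in the patent set.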

  2. Human factors engineering program review model

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

The staff of the Nuclear Regulatory Commission is performing nuclear power plant design certification reviews based on a design process plan that describes the human factors engineering (HFE) program elements that are necessary and sufficient to develop an acceptable detailed design specification and an acceptable implemented design. There are two principal reasons for this approach. First, the initial design certification applications submitted for staff review did not include detailed design information. Second, since human performance literature and industry experiences have shown that many significant human factors issues arise early in the design process, review of the design process activities and results is important to the evaluation of an overall design. However, current regulations and guidance documents do not address the criteria for design process review. Therefore, the HFE Program Review Model (HFE PRM) was developed as a basis for performing design certification reviews that include design process evaluations as well as review of the final design. A central tenet of the HFE PRM is that the HFE aspects of the plant should be developed, designed, and evaluated on the basis of a structured top-down system analysis using accepted HFE principles. The HFE PRM consists of ten component elements. Each element is divided into four sections: Background, Objective, Applicant Submittals, and Review Criteria. This report describes the development of the HFE PRM and gives a detailed description of each HFE review element.

  3. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  5. Investigating product development strategy in beverage industry using factor analysis

    Directory of Open Access Journals (Sweden)

    Naser Azad

    2013-03-01

Full Text Available Selecting a product development strategy that is associated with the company's current service or product innovation, based on customers' needs and a changing environment, plays an important role in increasing demand, market share, sales and profits. It is therefore important to extract the variables associated with product development in order to improve firms' performance measurement. This paper investigates important factors influencing product development strategies using factor analysis. The proposed model examines 36 variables and, using factor analysis, extracts the six most influential factors: information sharing (partnership), intelligence information, exposure (introducing) strategy, differentiation, research and development strategy and market survey. The first factor, partnership, includes sub-factors such as product development partnership, partnership with foreign firms, customers' perception of competitors' products, customer involvement in product development, inter-agency coordination, a customer-oriented approach to innovation and transmission of product development change, among which inter-agency coordination is considered the most important. Internal strengths are the most influential component of the second factor, intelligence information. The third factor, introducing strategy, includes four sub-criteria, of which consumer buying behavior is the most influential. Differentiation is the next factor, with five components, of which knowledge and expertise in product innovation is the most important. Research and development strategy has four sub-criteria, among which reducing the product development cycle is the most influential, and finally, market survey is the last factor, with three components, among which finding new markets plays the most important role.

  6. Analysis of regional total factor energy efficiency in China under environmental constraints: based on the Undesirable-MinDS and DEA window models

    Science.gov (United States)

    Zhang, Shuying; Li, Deshan; Li, Shuangqiang; Jiang, Hanyu; Shen, Yuqing

    2017-06-01

With China's entrance into the new economy, the improvement of energy efficiency has become an important indicator of the quality of ecological civilization construction and economic development. Using panel data for Chinese regions over 1996-2014, the Undesirable-MinDS model (based on the nearest distance to the efficient frontier) and the DEA window model are used to calculate the total factor energy efficiency of China's regions. The study finds that, under environmental constraints, China's total factor energy efficiency first declined and then increased over 1996-2014, and that the differences between regions are very large, showing a pattern in which the east is highest, the west is lower, and the central region is lowest. Finally, the paper puts forward relevant policy suggestions.
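
The DEA building block behind such efficiency scores can be sketched as an input-oriented CCR efficiency computed by linear programming (the envelopment form), using hypothetical regional inputs and outputs. The paper's Undesirable-MinDS and window refinements are omitted.

```python
import numpy as np
from scipy.optimize import linprog

# hypothetical regional data: inputs (energy, labour) and one output (GDP)
X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 6.0], [5.0, 5.0]])
Y = np.array([[4.0], [5.0], [4.0], [6.0]])

def ccr_efficiency(i):
    """Input-oriented CCR efficiency of DMU i, envelopment form:
    min theta  s.t.  X'lambda <= theta * x_i,  Y'lambda >= y_i,  lambda >= 0."""
    n = len(X)
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # variables: (theta, lambda_1..n)
    A = np.zeros((X.shape[1] + Y.shape[1], n + 1))
    A[:X.shape[1], 0] = -X[i]                   # input constraints
    A[:X.shape[1], 1:] = X.T
    A[X.shape[1]:, 1:] = -Y.T                   # output constraints (flipped sign)
    b = np.concatenate([np.zeros(X.shape[1]), -Y[i]])
    return linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1)).fun

scores = [ccr_efficiency(i) for i in range(len(X))]
```

Each score lies in (0, 1], with 1 marking regions on the efficient frontier; undesirable-output models modify the output constraints so that pollution cannot be freely expanded.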

  7. A risk-benefit analysis of factor V Leiden testing to improve pregnancy outcomes: a case study of the capabilities of decision modeling in genomics.

    Science.gov (United States)

    Bajaj, Preeti S; Veenstra, David L

    2013-05-01

    We sought to assess the benefits, risks, and personal utility of factor V Leiden mutation testing to improve pregnancy outcomes and to assess the utility of decision-analytic modeling for complex outcomes in genomics. We developed a model to evaluate factor V Leiden testing among women with a history of recurrent pregnancy loss, including heparin therapy during pregnancy in mutation-positive women. Outcomes included venous thromboembolism, major bleeds, pregnancy loss, maternal mortality, and quality-adjusted life-years. Factor V Leiden testing in a hypothetical cohort of 10,000 women led to 7 fewer venous thromboembolic events, 90 fewer pregnancy losses, and an increase of 17 major bleeding events. Small improvements in quality-adjusted life-years were largely attributable to reduced mortality but also to improvements in health-related quality of life. However, sensitivity analyses indicate large variance in results due to data uncertainty. Furthermore, the complexity of outcomes limited our ability to fully capture the repercussions of testing in the quality-adjusted life-year measure. Factor V Leiden testing involves tradeoffs between clinical and personal utility, and additional effectiveness data are needed for heparin use to prevent pregnancy loss. Decision-analytic methods offer somewhat limited value in assessing these tradeoffs, suggesting that evaluation of complex outcomes will require novel approaches to appropriately capture patient-centered outcomes.Genet Med 2013:15(5):374-381.
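
The abstract's per-10,000 event differences can be turned into a crude net-QALY tally, which is the arithmetic skeleton of such a decision model. The per-event QALY losses below are hypothetical placeholders, not the study's calibrated values.

```python
cohort = 10_000
events_avoided = {'vte': 7, 'pregnancy_loss': 90}   # per 10,000 tested women
events_caused = {'major_bleed': 17}
# hypothetical per-event QALY losses (placeholders, not the study's values)
qaly_loss = {'vte': 0.2, 'pregnancy_loss': 0.15, 'major_bleed': 0.1}

qalys_gained = sum(n * qaly_loss[e] for e, n in events_avoided.items())
qalys_lost = sum(n * qaly_loss[e] for e, n in events_caused.items())
net_qalys = qalys_gained - qalys_lost               # cohort-level net benefit
per_woman = net_qalys / cohort
```

The study's point is precisely that such tallies are sensitive to the uncertain inputs, and that quality-of-life weights struggle to capture the personal utility of testing.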

  8. Modelling Public Security Operations: Analysis of the Effect of Key Social, Cognitive, and Informational Factors with Security System Relationship Configurations for Goal Achievement

    Science.gov (United States)

    2012-12-01

    More so than engineered systems, human factors, and specifically having humans-in…response to a harbour fire; the second, [3], presented social, cognitive, and informational models that could be used to extend the initial simulator…Override Sharing Events: the second chart type shows the total number of sharing events that take place in the system. These come in two flavours: the

  9. Kernel parameter dependence in spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2010-01-01

    feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply a kernel version of maximum autocorrelation factor (MAF) [7, 8] analysis to irregularly sampled stream sediment geochemistry data from South Greenland and illustrate the dependence...

  10. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis : A Cross-National Investigation of Schwartz Values

    NARCIS (Netherlands)

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the

  11. Continuous utility factor in segregation models

    Science.gov (United States)

    Roy, Parna; Sen, Parongama

    2016-02-01

    We consider the constrained Schelling model of social segregation in which the utility factor of agents strictly increases and nonlocal jumps of the agents are allowed. In the present study, the utility factor u is defined in a way such that it can take continuous values and depends on the tolerance threshold as well as the fraction of unlike neighbors. Two models are proposed: in model A the jump probability is determined by the sign of u only, which makes it equivalent to the discrete model. In model B the actual values of u are considered. Model A and model B are shown to differ drastically as far as segregation behavior and phase transitions are concerned. In model A, although segregation can be achieved, the cluster sizes are rather small. Also, a frozen state is obtained in which steady states comprise many unsatisfied agents. In model B, segregated states with much larger cluster sizes are obtained. The correlation function is calculated to show quantitatively that larger clusters occur in model B. Moreover for model B, no frozen states exist even for very low dilution and small tolerance parameter. This is in contrast to the unconstrained discrete model considered earlier where agents can move even when utility remains the same. In addition, we also consider a few other dynamical aspects which have not been studied in segregation models earlier.
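
    The model-B rule described above can be sketched in a few lines. This is a minimal, illustrative implementation assuming a torus grid and a simple continuous utility u = T − f of the unlike-neighbour fraction f and tolerance threshold T (a stand-in for the paper's exact definition); as in the constrained model, a jump is accepted only if u strictly increases:

```python
import numpy as np

rng = np.random.default_rng(0)

def unlike_fraction(grid, i, j):
    """Fraction of occupied Moore neighbours (torus) of unlike type."""
    n, same, occupied = grid.shape[0], 0, 0
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue
            v = grid[(i + di) % n, (j + dj) % n]
            if v != 0:
                occupied += 1
                if v == grid[i, j]:
                    same += 1
    return 0.0 if occupied == 0 else 1.0 - same / occupied

def utility(f, T=0.5):
    # Illustrative continuous utility: positive below the tolerance
    # threshold T, negative above it (not the paper's exact form).
    return T - f

def step_model_B(grid, T=0.5):
    """One constrained move: a random agent jumps to a random vacancy
    only if the move strictly increases its continuous utility."""
    agents = np.argwhere(grid != 0)
    empties = np.argwhere(grid == 0)
    i, j = agents[rng.integers(len(agents))]
    k, l = empties[rng.integers(len(empties))]
    u_here = utility(unlike_fraction(grid, i, j), T)
    grid[k, l], grid[i, j] = grid[i, j], 0        # tentative jump
    if utility(unlike_fraction(grid, k, l), T) <= u_here:
        grid[i, j], grid[k, l] = grid[k, l], 0    # reject: revert
        return False
    return True

n = 20
grid = rng.choice([0, 1, -1], size=(n, n), p=[0.1, 0.45, 0.45])
n_agents0 = int((grid != 0).sum())
moves = sum(step_model_B(grid) for _ in range(2000))
```

Model A would replace the comparison of actual utilities by a comparison of their signs only, which is the distinction the abstract shows to matter for cluster sizes and frozen states.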

  12. Reliability assessment of offshore platforms exposed to wave-in-deck loading. Appendix F: Reliability analysis of offshore jacket structures with wave load on deck using the model correction factor method

    Energy Technology Data Exchange (ETDEWEB)

    Dalsgaard Soerensen, J. [Aalborg Univ., Aalborg (Denmark); Friis-Hansen, P. [Technical Univ. Denmark, Lyngby (Denmark); Bloch, A.; Svejgaard Nielsen, J. [Ramboell, Esbjerg (Denmark)

    2004-08-01

    Different simple stochastic models for failure related to pushover collapse are investigated. Next, a method is proposed to estimate the reliability of real offshore jacket structures. The method is based on the Model Correction Factor Method and can be used very efficiently to estimate the reliability for total failure/collapse of jacket-type platforms with wave-in-deck loads. A realistic example is evaluated and it is seen that it is possible to perform probabilistic reliability analysis for collapse of a jacket-type platform using the model correction factor method. The total number of deterministic, complicated, non-linear (RONJA) analyses is typically as low as 10. Such reliability analyses are recommended for use in practical applications, especially for cases with wave-in-deck load, where the traditional RSR analyses give poor measures of the structural reliability. (au)

  13. Advances in Behavioral Genetics Modeling Using Mplus: Applications of Factor Mixture Modeling to Twin Data

    National Research Council Canada - National Science Library

    Muthen, Bengt; Asparouhov, Tihomir; Rebollo, Irene

    2006-01-01

    This article discusses new latent variable techniques developed by the authors. As an illustration, a new factor mixture model is applied to the monozygotic-dizygotic twin analysis of binary items measuring alcohol-use disorder...

  14. Meta analysis of risk factors for colorectal cancer

    Institute of Scientific and Technical Information of China (English)

    Kun Chen; Jiong-Liang Qiu; Yang Zhang; Yu-Wan Zhao

    2003-01-01

    AIM: To study the risk factors for colorectal cancer in China. METHODS: A meta-analysis of the risk factors of colorectal cancer was conducted for 14 case-control studies, reviewing 14 reports within 13 years which included 5034 cases and 5205 controls. DerSimonian and Laird random-effects models were used to process the results. RESULTS: Meta-analysis of the 14 studies demonstrated that proper physical activities and dietary fibers were protective factors (pooled OR<0.8), while fecal mucohemorrhage, chronic diarrhea and polyposis were highly associated with colorectal cancer (all pooled OR>4). The stratified results showed that different OR values of some factors were due to geographic factors or different sources. CONCLUSION: Risks of colorectal cancer are significantly associated with histories of intestinal diseases or related symptoms, high-lipid diet, emotional trauma and family history of cancers. Suitable physical activities and dietary fibers are protective factors.
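
    The pooling step named in the abstract, a DerSimonian-Laird random-effects model, can be sketched directly. The odds ratios and confidence intervals below are made up for illustration; they are not the data of the 14 studies:

```python
import numpy as np

def dersimonian_laird(or_, lo, hi):
    """Pool odds ratios with the DerSimonian-Laird random-effects model,
    given per-study ORs and 95% CI bounds."""
    y = np.log(or_)                               # log odds ratios
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE from the 95% CI width
    w = 1 / se**2                                 # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q heterogeneity
    k = len(y)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (se**2 + tau2)                     # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se_pooled = np.sqrt(1 / np.sum(w_re))
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * se_pooled),
            np.exp(pooled + 1.96 * se_pooled))

# Three hypothetical case-control studies of a protective factor (OR < 1)
or_hat, ci_lo, ci_hi = dersimonian_laird(
    np.array([0.6, 0.8, 0.7]),
    np.array([0.4, 0.6, 0.5]),
    np.array([0.9, 1.1, 1.0]),
)
```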

  15. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2000-12-21

    The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of the different exposure pathways' contributions to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater, allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.
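
    The probabilistic-realization approach (propagating input-parameter uncertainty into the output) can be illustrated with a generic Monte Carlo sketch. The toy dose model and parameter distributions below are placeholders, not GENII-S inputs:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for a biosphere dose model: one BDCF realization as the
# product of an intake term, a soil-to-plant transfer term and a dose
# coefficient. The distributions are illustrative placeholders.
n = 10_000
intake = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # relative intake rate
transfer = rng.uniform(0.5, 1.5, size=n)              # transfer factor
coeff = rng.normal(1.0, 0.1, size=n)                  # dose coefficient

realizations = intake * transfer * coeff              # one BDCF per realization
median, p95 = np.percentile(realizations, [50, 95])   # summary statistics
```

Each realization samples every uncertain input once; the spread of the resulting BDCF distribution is what the report's probabilistic realizations quantify.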

  16. Analysis of rainfall-triggered landslide hazards through the dynamic integration of remotely sensed, modeled and in situ environmental factors in El Salvador

    Science.gov (United States)

    Anderson, Eric Ross

    Landslides pose a persistent threat to El Salvador's population, economy and environment. Government officials share responsibility in managing this hazard by alerting populations when and where landslides may occur as well as developing and enforcing proper land use and zoning practices. This thesis addresses gaps in current knowledge between identifying precisely when and where slope failures may initiate and outlining the extent of the potential debris inundation areas. Improvements on hazard maps are achieved by considering a series of environmental variables to determine causal factors through spatial and temporal analysis techniques in Geographic Information Systems and remote sensing. The output is a more dynamic tool that links high resolution geomorphic and hydrological factors to daily precipitation. Directly incorporable into existing decision support systems, this allows for better disaster management and is transferable to other developing countries.

  17. The asset pricing model of musharakah factors

    Science.gov (United States)

    Simon, Shahril; Omar, Mohd; Lazam, Norazliani Md

    2015-02-01

    The existing three-factor model developed by Fama and French for conventional investment was formulated based on a risk-free-rate element, which contradicts Shariah principles. We note that the underlying principles that govern Shariah investment are mutual risk and profit sharing between parties, the assurance of fairness for all, and transactions based on an underlying asset. In addition, the three-factor model did not exclude stocks that are not permissible by Shariah, such as financial services based on riba (interest), gambling operators, the manufacture or sale of non-halal products or related products, and other activities deemed non-permissible according to Shariah. Our approach to constructing the factor model for Shariah investment was based on the basic tenets of musharakah in tabulating the factors. We start by noting that Islamic stocks with similar characteristics should have similar returns and risks. This similarity between Islamic stocks was defined by the similarity of musharakah attributes such as business, management, profitability and capital. These attributes define factor exposures (or betas) to factors. The main takeaways were that the musharakah attributes we chose explained stock returns well in cross section and were significant in different market environments. The management factor seemed to be responsible for the general dynamics of the explanatory power.

  18. Analysis of Economic Factors Affecting Stock Market

    OpenAIRE

    Xie, Linyin

    2010-01-01

    This dissertation concentrates on analysis of economic factors affecting the Chinese stock market through examining the relationship between the stock market index and economic factors. Six economic variables are examined: industrial production, money supply 1, money supply 2, exchange rate, long-term government bond yield and real estate total value. The stock market comprises fixed interest stocks and equity shares. In this dissertation, the stock market is restricted to the equity market. The stock price in thi...

  19. Factor Copula Models for Replicated Spatial Data

    KAUST Repository

    Krupskii, Pavel

    2016-12-19

    We propose a new copula model that can be used with replicated spatial data. Unlike the multivariate normal copula, the proposed copula is based on the assumption that a common factor exists and affects the joint dependence of all measurements of the process. Moreover, the proposed copula can model tail dependence and tail asymmetry. The model is parameterized in terms of a covariance function that may be chosen from the many models proposed in the literature, such as the Matérn model. For some choice of common factors, the joint copula density is given in closed form and therefore likelihood estimation is very fast. In the general case, one-dimensional numerical integration is needed to calculate the likelihood, but estimation is still reasonably fast even with large data sets. We use simulation studies to show the wide range of dependence structures that can be generated by the proposed model with different choices of common factors. We apply the proposed model to spatial temperature data and compare its performance with some popular geostatistics models.

  20. SDI CFD MODELING ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.

    2011-05-05

    The Savannah River Remediation (SRR) Organization requested that Savannah River National Laboratory (SRNL) develop a Computational Fluid Dynamics (CFD) method to mix and blend the miscible contents of the blend tanks to ensure the contents are properly blended before they are transferred from the blend tank, such as Tank 50H, to the Salt Waste Processing Facility (SWPF) feed tank. The work described here consists of two modeling areas: the mixing modeling analysis during the miscible liquid blending operation, and the flow pattern analysis during the transfer operation of the blended liquid. The transient CFD governing equations, consisting of three momentum equations, one mass balance, two turbulence transport equations for kinetic energy and dissipation rate, and one species transport equation, were solved by an iterative technique until the species concentrations of the tank fluid were in equilibrium. The steady-state flow solutions for the entire tank fluid were used for flow pattern analysis, for velocity scaling analysis, and as the initial conditions for transient blending calculations. A series of modeling calculations were performed to estimate the blending times for various jet flow conditions, and to investigate the impact of the cooling coils on the blending time of the tank contents. The modeling results were benchmarked against the pilot scale test results. All of the flow and mixing models were performed with the nozzles installed at the mid-elevation, and parallel to the tank wall. From the CFD modeling calculations, the main results are summarized as follows: (1) The benchmark analyses for the CFD flow velocity and blending models demonstrate their consistency with Engineering Development Laboratory (EDL) and literature test results in terms of local velocity measurements and experimental observations. Thus, an application of the established criterion to the SRS full scale tank will provide a better, physically-based estimate of the required mixing time, and

  1. Confirmatory factor analysis of the WAIS-IV/WMS-IV.

    Science.gov (United States)

    Holdnack, James A; Xiaobin Zhou; Larrabee, Glenn J; Millis, Scott R; Salthouse, Timothy A

    2011-06-01

    The Wechsler Adult Intelligence Scale-fourth edition (WAIS-IV) and the Wechsler Memory Scale-fourth edition (WMS-IV) were co-developed to be used individually or as a combined battery of tests. The independent factor structure of each of the tests has been identified; however, the combined factor structure has yet to be determined. Confirmatory factor analysis was applied to the WAIS-IV/WMS-IV Adult battery (i.e., age 16-69 years) co-norming sample (n = 900) to test 13 measurement models. The results indicated that two models fit the data equally well. One model is a seven-factor solution without a hierarchical general ability factor: Verbal Comprehension, Perceptual Reasoning, Processing Speed, Auditory Working Memory, Visual Working Memory, Auditory Memory, and Visual Memory. The second model is a five-factor model composed of Verbal Comprehension, Perceptual Reasoning, Processing Speed, Working Memory, and Memory with a hierarchical general ability factor. Interpretative implications for each model are discussed.

  2. Factor Structure and Reliability of the Revised Conflict Tactics Scales' (CTS2) 10-Factor Model in a Community-Based Female Sample

    Science.gov (United States)

    Yun, Sung Hyun

    2011-01-01

    The present study investigated the factor structure and reliability of the revised Conflict Tactics Scales' (CTS2) 10-factor model in a community-based female sample (N = 261). The underlying factor structure of the 10-factor model was tested by the confirmatory multiple group factor analysis, which demonstrated complex factor cross-loadings…

  3. Comparative factor analysis models for an empirical study of EEG data, II: A data-guided resolution of the rotation indeterminacy.

    Science.gov (United States)

    Rogers, L J; Douglas, R R

    1984-02-01

    In this paper (the second in a series), we consider a (generic) pair of datasets, which have been analyzed by the techniques of the previous paper. Thus, their "stable subspaces" have been established by comparative factor analysis. The pair of datasets must satisfy two confirmable conditions. The first is the "Inclusion Condition," which requires that the stable subspace of one of the datasets is nearly identical to a subspace of the other dataset's stable subspace. On the basis of that, we have assumed the pair to have similar generating signals, with stochastically independent generators. The second verifiable condition is that the (presumed same) generating signals have distinct ratios of variances for the two datasets. Under these conditions a small elaboration of some elementary linear algebra reduces the rotation problem to several eigenvalue-eigenvector problems. Finally, we emphasize that an analysis of each dataset by the method of Douglas and Rogers (1983) is an essential prerequisite for the useful application of the techniques in this paper. Nonempirical methods of estimating the number of factors simply will not suffice, as confirmed by simulations reported in the previous paper.

  4. Determination of effective loss factors in reduced SEA models

    Science.gov (United States)

    Chimeno Manguán, M.; Fernández de las Heras, M. J.; Roibás Millán, E.; Simón Hidalgo, F.

    2017-01-01

    The definition of Statistical Energy Analysis (SEA) models for large complex structures is highly conditioned by the classification of the structure elements into a set of coupled subsystems and the subsequent determination of the loss factors representing both the internal damping and the coupling between subsystems. The accurate definition of the complete system can lead to excessively large models as the size and complexity increase. This fact can also raise practical issues for the experimental determination of the loss factors. This work presents a formulation of reduced SEA models for incomplete systems defined by a set of effective loss factors. This reduced SEA model provides a feasible number of subsystems for the application of the Power Injection Method (PIM). For structures of high complexity, access to their components, for instance internal equipment or panels, can be restricted. For these cases the use of PIM to carry out an experimental SEA analysis is not possible. New methods are presented for this case in combination with the reduced SEA models. These methods allow defining some of the model loss factors that could not be obtained through PIM. The methods are validated with a numerical analysis case and they are also applied to an actual spacecraft structure with accessibility restrictions: a solar wing in folded configuration.

  5. Impact of festival factor on electric quantity multiplication forecast model

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This research aims to improve the forecasting precision of electric quantity. Analysis of the electric quantity time series from 2002 to 2007 in Shandong province revealed that total electricity consumption increased considerably during the Spring Festival. The festival factor is thus ascertained to be one of the important seasonal factors affecting electric quantity fluctuations, and the multiplication model for forecasting is improved by introducing corresponding variables and parameters...
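
    The idea of a multiplicative model with a festival factor can be sketched as a log-linear fit with a festival dummy. The synthetic monthly series and the 1.15 festival multiplier below are invented for illustration, not Shandong data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Six years of synthetic monthly consumption: trend x seasonality x a
# festival-month multiplier (here 1.15; all numbers are hypothetical).
months = np.arange(72)
festival = (months % 12 == 1).astype(float)          # festival-month flag
series = (100 * 1.005**months
          * np.exp(0.1 * np.sin(2 * np.pi * months / 12))
          * 1.15**festival
          * rng.lognormal(0.0, 0.01, size=72))       # multiplicative noise

# Multiplicative model fitted in logs:
# log y = a + b*t + c*sin(2 pi t/12) + d*cos(2 pi t/12) + e*festival
X = np.column_stack([
    np.ones(72), months,
    np.sin(2 * np.pi * months / 12), np.cos(2 * np.pi * months / 12),
    festival,
])
beta, *_ = np.linalg.lstsq(X, np.log(series), rcond=None)
festival_factor = float(np.exp(beta[4]))             # estimate of the multiplier
```

Exponentiating the dummy's coefficient recovers the festival multiplier, which is exactly the "corresponding variable and parameter" role described above.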

  6. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of the different exposure pathways' contributions to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by the eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report, as is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  7. Determining Dimensionality of Exercise Readiness Using Exploratory Factor Analysis

    Directory of Open Access Journals (Sweden)

    Kelley Strohacker, Rebecca A. Zakrajsek

    2016-06-01

    Full Text Available Assessment of “exercise readiness” is a central component of the flexible non-linear periodization (FNLP) method of organizing training workloads, but the underlying factor structure of this construct has not been empirically determined. The purpose of this study was to assess the construct dimensionality of exercise readiness using exploratory factor analysis, the results of which serve as initial steps in developing a brief measure of exercise readiness. Participants consisted of students recruited from undergraduate Kinesiology courses at a racially diverse, southern University. Independent, anonymous online survey data were collected across three stages: 1) generation of item pool (n = 290), 2) assessment of face validity and refinement of item pool (n = 168), and 3) exploratory factor analysis (n = 684). A principal axis factor analysis was conducted with 41 items using oblique rotation (promax). Four statistically significant factors, as determined through parallel analysis, explained 61.5% of the variance in exercise readiness. Factor 1 contained items that represented vitality (e.g., lively, revived). Factor 2 items related to physical fatigue (e.g., tired, drained). Factors 3 and 4 were descriptive of discomfort (e.g., pain, sick) and health (i.e., healthy, fit), respectively. This inductive approach indicates that exercise readiness is comprised of four dimensions: vitality, physical fatigue, discomfort, and health. This finding supports readiness assessment techniques currently recommended for practitioners according to the FNLP model. These results serve as a theoretical foundation upon which to further develop and refine a brief survey instrument to measure exercise readiness.
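
    The retention rule used above, parallel analysis, keeps factors whose observed eigenvalues exceed those of comparable random data. A generic sketch on synthetic data (not the study's items; only the stage-3 sample size of 684 is echoed):

```python
import numpy as np

rng = np.random.default_rng(0)

def parallel_analysis(X, n_iter=100, quantile=95):
    """Horn's parallel analysis: retain factors whose observed correlation-
    matrix eigenvalues exceed the chosen quantile of eigenvalues from random
    normal data of the same shape."""
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rand = np.empty((n_iter, p))
    for b in range(n_iter):
        R = np.corrcoef(rng.standard_normal((n, p)), rowvar=False)
        rand[b] = np.sort(np.linalg.eigvalsh(R))[::-1]
    thresholds = np.percentile(rand, quantile, axis=0)
    return int((obs > thresholds).sum()), obs, thresholds

# Synthetic responses from 684 "participants" on eight items driven by two
# latent factors; parallel analysis should retain exactly two.
n = 684
f1, f2 = rng.standard_normal(n), rng.standard_normal(n)
X = np.column_stack(
    [f1 + 0.5 * rng.standard_normal(n) for _ in range(4)]
    + [f2 + 0.5 * rng.standard_normal(n) for _ in range(4)])
n_factors, obs, thresholds = parallel_analysis(X)
```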

  8. ANALYSIS MODEL FOR INVENTORY MANAGEMENT

    Directory of Open Access Journals (Sweden)

    CAMELIA BURJA

    2010-01-01

    Full Text Available The inventory represents an essential component of the assets of the enterprise, and economic analysis gives it special importance because its accurate management determines the achievement of the activity object and the financial results. The efficient management of inventory requires ensuring an optimum level, which will guarantee the normal functioning of the activity with minimum inventory expenses and immobilised funds. The paper presents an analysis model for inventory management based on rotation speed and the correlation with sales volume, illustrated in an adequate study. Highlighting the influence factors on efficient inventory management provides the useful information needed to justify managerial decisions, which will lead to a balanced financial position and to increased company performance.
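
    The rotation-speed ratios such a model builds on are straightforward; a small sketch with illustrative figures (not the study's data):

```python
# The two basic inventory-rotation ratios, with hypothetical figures.
def rotation_speed(sales, average_inventory):
    """Number of inventory turns in the period."""
    return sales / average_inventory

def rotation_days(sales, average_inventory, period_days=365):
    """Average number of days an item stays in inventory."""
    return period_days / rotation_speed(sales, average_inventory)

turns = rotation_speed(sales=1_200_000, average_inventory=200_000)  # 6.0 turns
days = rotation_days(sales=1_200_000, average_inventory=200_000)    # ~60.8 days
```

A higher rotation speed (shorter rotation period) for the same sales volume means less capital immobilised in inventory, which is the efficiency lever the abstract describes.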

  9. Spatial analysis of Tuberculosis in Rio de Janeiro in the period from 2005 to 2008 and associated socioeconomic factors using micro data and global spatial regression models.

    Science.gov (United States)

    Magalhães, Monica de Avelar Figueiredo Mafra; Medronho, Roberto de Andrade

    2017-03-01

    The present study analyses the spatial pattern of tuberculosis (TB) from 2005 to 2008 by identifying socioeconomic variables relevant to the occurrence of the disease through spatial statistical models. This ecological study was performed in Rio de Janeiro using new cases. The census sector was used as the unit of analysis. Incidence rates were calculated, and the Local Empirical Bayesian method was used. Spatial autocorrelation was verified with Moran's Index and local indicators of spatial association (LISA). Using Spearman's test, variables with significant correlation at 5% were used in the models. In the classic multivariate regression model, the variables that fitted the model best were the proportion of heads of family with an income between 1 and 2 minimum wages, the proportion of illiterate people, the proportion of households with people living alone, and the mean income of the head of family. These variables were inserted in the Spatial Lag and Spatial Error models, and the results were compared. The former exhibited the best parameters: R2 = 0.3215, Log-Likelihood = -9228, Akaike Information Criterion (AIC) = 18,468 and Schwarz Bayesian Criterion (SBC) = 18,512. The statistical methods were effective in the identification of spatial patterns and in the definition of determinants of the disease, providing a view of the heterogeneity in space and allowing actions aimed more at specific populations.
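
    The global autocorrelation check mentioned above, Moran's I, can be computed directly. The tiny weights matrix and rates below are illustrative, not the Rio de Janeiro census data:

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I: (n / S0) * (z' W z) / (z' z) for mean-centred z,
    where W is the spatial weights matrix and S0 the sum of its entries."""
    z = np.asarray(values, dtype=float)
    z = z - z.mean()
    return len(z) * (z @ W @ z) / (W.sum() * (z @ z))

# Four areas on a line with rook contiguity; a smooth gradient in the
# incidence rates should give positive spatial autocorrelation.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rates = [10.0, 12.0, 14.0, 16.0]
I = morans_i(rates, W)   # 1/3 for this configuration
```

Values of I near +1 indicate clustering of similar rates (neighbouring sectors alike), near −1 a checkerboard pattern, and near 0 spatial randomness.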

  10. Personality and coping traits: A joint factor analysis.

    Science.gov (United States)

    Ferguson, Eamonn

    2001-11-01

    OBJECTIVES: The main objective of this paper is to explore the structural similarities between Eysenck's model of personality and the dimensions of the dispositional COPE. Costa et al. {Costa, P., Somerfield, M., & McCrae, R. (1996). Personality and coping: A reconceptualisation. In Handbook of coping: Theory, research and applications (pp. 44-61). New York: Wiley} suggest that personality and coping behaviour are part of a continuum based on adaptation. If this is the case, there should be structural similarities between measures of personality and coping behaviour. This is tested using a joint factor analysis of personality and coping measures. DESIGN: Cross-sectional survey. METHODS: The EPQ-R and the dispositional COPE were administered to 154 participants, and the data were analysed using joint factor analysis and bivariate associations. RESULTS: The joint factor analysis indicated that these data were best explained by a four-factor model. One factor was primarily unrelated to personality. There was a COPE-neurotic-introvert factor (NI-COPE) containing coping behaviours such as denial, a COPE-extroversion (E-COPE) factor containing behaviours such as seeking social support, and a COPE-psychoticism factor (P-COPE) containing behaviours such as alcohol use. This factor pattern, especially for NI- and E-COPE, was interpreted in terms of Gray's model of personality {Gray, J. A. (1987). The psychology of fear and stress. Cambridge: Cambridge University Press}. NI-, E-, and P-COPE were shown to be related, in a theoretically consistent manner, to perceived coping success and perceived coping functions. CONCLUSIONS: The results indicate that there are indeed conceptual links between models of personality and coping. It is argued that future research should focus on identifying coping 'trait complexes'. Implications for practice are discussed.

  11. Modelling non-normal data : The relationship between the skew-normal factor model and the quadratic factor model

    NARCIS (Netherlands)

    Smits, Iris A.M.; Timmerman, Marieke E.; Stegeman, Alwin

    Maximum likelihood estimation of the linear factor model for continuous items assumes normally distributed item scores. We consider deviations from normality by means of a skew-normally distributed factor model or a quadratic factor model. We show that the item distributions under a skew-normal

  12. Study of the factors affecting the karst volume assessment in the Dead Sea sinkhole problem using microgravity field analysis and 3-D modeling

    Directory of Open Access Journals (Sweden)

    L. V. Eppelbaum

    2008-11-01

    Full Text Available Thousands of sinkholes have appeared in the Dead Sea (DS) coastal area in Israel and Jordan during the last two decades. The sinkhole development is recently associated with the buried evaporation karst at a depth of 25–50 m from the earth's surface, caused by the drop of the DS level at a rate of 0.8–1.0 m/yr. The drop in the Dead Sea level has changed hydrogeological conditions in the subsurface and caused the surface to collapse. The pre-existing cavern was detected using microgravity mapping in the Nahal Hever South site, where seven sinkholes of 1–2 m diameter had opened. About 5000 gravity stations were observed in an area of 200×200 m² using a Scintrex CG-3M AutoGrav gravimeter. Besides the conventional set of corrections applied in microgravity investigations, a correction for a strong horizontal gravity gradient (the influence of the DS Transform Zone negative gravity anomaly) was inserted. As a result, a residual gravity anomaly of –(0.08÷0.14) mGal was revealed. The gravity field analysis was supported by resistivity measurements. We applied the Emigma 7.8 gravity software to create 3-D physical-geological models of the sinkhole development area. The modeling was confirmed by application of the GSFC program developed especially for 3-D combined gravity-magnetic modeling in complicated environments. The numerous computed gravity models verified the effective applicability of the microgravity technology for detection of karst cavities and estimation of their physical-geological parameters. The volume of the karst was approximately estimated as 35 000 m³. Visual analysis of the large sinkhole clusters that have been forming at the microgravity anomaly site confirmed the results of microgravity mapping and 3-D modeling.

  13. What Is Rotating in Exploratory Factor Analysis?

    Science.gov (United States)

    Osborne, Jason W.

    2015-01-01

    Exploratory factor analysis (EFA) is one of the most commonly-reported quantitative methodology in the social sciences, yet much of the detail regarding what happens during an EFA remains unclear. The goal of this brief technical note is to explore what "rotation" is, what exactly is rotating, and why we use rotation when performing…
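
    As a concrete answer to "what is rotating": an orthogonal rotation multiplies the loading matrix by an orthogonal matrix R, leaving communalities unchanged while redistributing loadings across factors. A generic sketch of Kaiser's varimax (the standard SVD-based iteration, not code from this note):

```python
import numpy as np

def varimax(L, n_iter=100, tol=1e-8):
    """Kaiser varimax: find an orthogonal R maximising the variance of the
    squared loadings of L @ R (standard SVD-based iteration)."""
    p, k = L.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(n_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr**3 - Lr @ np.diag((Lr**2).sum(axis=0)) / p))
        R = u @ vt                       # nearest orthogonal update
        d_old, d = d, s.sum()
        if d_old != 0 and d < d_old * (1 + tol):
            break                        # criterion stopped improving
    return L @ R

# A simple-structure loading matrix, deliberately rotated 30 degrees away;
# varimax should rotate it back to near simple structure.
S = np.array([[0.9, 0.0], [0.8, 0.1], [0.0, 0.9], [0.1, 0.8]])
theta = np.pi / 6
Rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
L_rot = varimax(S @ Rot)
```

Because R is orthogonal, L_rot @ L_rot.T equals (S @ Rot) @ (S @ Rot).T: the fit is untouched, and only the interpretation (which variable loads on which factor) changes.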

  14. Multilevel exploratory factor analysis of discrete data

    NARCIS (Netherlands)

    Barendse, M.T.; Oort, F.J.; Jak, S.; Timmerman, M.E.

    2013-01-01

    Exploratory factor analysis (EFA) can be used to determine the dimensionality of a set of items. When data come from clustered subjects, such as pupils within schools or children within families, the hierarchical structure of the data should be taken into account. Standard multilevel EFA is only sui

  15. An SPSS R-Menu for Ordinal Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mario Basto

    2012-01-01

    Full Text Available Exploratory factor analysis is a widely used statistical technique in the social sciences. It attempts to identify underlying factors that explain the pattern of correlations within a set of observed variables. A statistical software package is needed to perform the calculations. However, there are some limitations with popular statistical software packages, like SPSS. The R programming language is a free software package for statistical and graphical computing. It offers many packages written by contributors from all over the world and programming resources that allow it to overcome the dialog limitations of SPSS. This paper offers an SPSS dialog written in the R programming language with the help of some packages, so that researchers with little or no knowledge in programming, or those who are accustomed to making their calculations based on statistical dialogs, have more options when applying factor analysis to their data and hence can adopt a better approach when dealing with ordinal, Likert-type data.

  16. An Analysis of Risk Factors for Senile Dementia Using a Logistic Regression Model

    Institute of Scientific and Technical Information of China (English)

    吴伟; 陈骏; 周达生

    2000-01-01

    Objective: To study the risk factors of senile dementia and provide a scientific basis for formulating prevention strategies. Methods: A sampling survey of 158 elderly people (73 male, 85 female) in Nanjing assessed their health status and its influencing factors. The instruments included a self-rated health appraisal, a psychological trauma scale, a mental health scale, a happiness scale, a self-care ability scale, a hopelessness (Bech-H) scale, and a senile dementia screening scale. Epi Info and SAS software were used for univariate and multivariate analysis. Results: Univariate analysis showed that age, sex, occupation, marriage, education, and history of stroke were all closely related to the incidence of senile dementia. Multivariate analysis showed that incidence increased with age, was higher in women than in men, and was higher in manual workers than in mental workers. Conclusion: Senile dementia is related to social and psychological factors.

  17. Aging Successfully: A Four-Factor Model

    Science.gov (United States)

    Lee, Pai-Lin; Lan, William; Yen, Tung-Wen

    2011-01-01

    The study was designed to validate a model for a successful aging process and examine the gender differences in the aging process. Three hundred twelve participants who were 65 or older completed a Taiwan Social Change Survey that measures four factors that define successful aging process: including physical, psychological, social support, and…

  19. A hierarchical model for ordinal matrix factorization

    DEFF Research Database (Denmark)

    Paquet, Ulrich; Thomson, Blaise; Winther, Ole

    2012-01-01

    their ratings for other movies. The Netflix data set is used for evaluation, which consists of around 100 million ratings. Using root mean-squared error (RMSE) as an evaluation metric, results show that the suggested model outperforms alternative factorization techniques. Results also show how Gibbs sampling...
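    The record above reports RMSE for a hierarchical ordinal factorization fitted by Gibbs sampling. As an illustrative sketch only (plain regularized gradient descent in numpy, not the paper's Bayesian model), a minimal rating-matrix factorization with an RMSE check might look like this; the tiny ratings matrix is invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy ratings (users x movies); 0 marks a missing rating
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
mask = R > 0

k = 2                                   # latent dimension
U = 0.1 * rng.standard_normal((R.shape[0], k))
V = 0.1 * rng.standard_normal((R.shape[1], k))
lr, reg = 0.01, 0.02                    # step size, L2 penalty

for _ in range(5000):
    E = mask * (R - U @ V.T)            # error on observed entries only
    U += lr * (E @ V - reg * U)
    V += lr * (E.T @ U - reg * V)

rmse = np.sqrt(((mask * (R - U @ V.T)) ** 2).sum() / mask.sum())
```

    With the regularizer active the fit is not exact, but the RMSE on observed entries drops well below the spread of the raw ratings; a held-out evaluation (as on the Netflix data) would mask a test set instead.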

  20. ANALYSIS OF THE FACTORS AFFECTING THE AVERAGE

    Directory of Open Access Journals (Sweden)

    Carmen BOGHEAN

    2013-12-01

    Full Text Available Productivity in agriculture most relevantly and concisely expresses the economic efficiency of using the factors of production. Labour productivity is affected by a considerable number of variables (including the relationship system and interdependence between factors, which differ in each economic sector and influence it, giving rise to a series of technical, economic and organizational idiosyncrasies. The purpose of this paper is to analyse the underlying factors of the average work productivity in agriculture, forestry and fishing. The analysis will take into account the data concerning the economically active population and the gross added value in agriculture, forestry and fishing in Romania during 2008-2011. The distribution of the average work productivity per factors affecting it is conducted by means of the u-substitution method.

  1. The streptomycin mouse model for Salmonella diarrhea: functional analysis of the microbiota, the pathogen's virulence factors, and the host's mucosal immune response.

    Science.gov (United States)

    Kaiser, Patrick; Diard, Médéric; Stecher, Bärbel; Hardt, Wolf-Dietrich

    2012-01-01

    The mammalian intestine is colonized by a dense microbial community, the microbiota. Homeostatic and symbiotic interactions facilitate the peaceful co-existence between the microbiota and the host, and inhibit colonization by most incoming pathogens ('colonization resistance'). However, if pathogenic intruders overcome colonization resistance, a fierce, innate inflammatory defense can be mounted within hours, the adaptive arm of the immune system is initiated, and the pathogen is fought back. The molecular nature of the homeostatic interactions, the pathogen's ability to overcome colonization resistance, and the triggering of innate and adaptive mucosal immune responses are still poorly understood. To study these mechanisms, the streptomycin mouse model for Salmonella diarrhea is of great value. Here, we review how S. Typhimurium triggers mucosal immune responses by active (virulence factor elicited) and passive (MyD88-dependent) mechanisms and introduce the S. Typhimurium mutants available for focusing on either response. Interestingly, mucosal defense turns out to be a double-edged sword, limiting pathogen burdens in the gut tissue but enhancing pathogen growth in the gut lumen. This model not only allows studying the molecular pathogenesis of Salmonella diarrhea but is also ideally suited for analyzing innate defenses, microbe handling by mucosal phagocytes, adaptive secretory immunoglobulin A responses, probing microbiota function, and homeostatic microbiota-host interactions. Finally, we discuss the general need for defined assay conditions when using animal models for enteric infections and the central importance of littermate controls.

  2. Model Analysis ToolKit

    Energy Technology Data Exchange (ETDEWEB)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using internal Levenberg-Marquardt algorithm
    - model calibration using lmfit package
    - model calibration using levmar package
    - Markov Chain Monte Carlo using pymc package
    MATK facilitates model analysis using:
    - scipy - calibration (scipy.optimize)
    - rpy2 - Python interface to R
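    One of the listed capabilities, Latin-Hypercube sampling, is easy to sketch without MATK itself. The following library-free illustration (assumed uniform parameter ranges; not MATK's actual API) draws one point per equal-probability stratum in each dimension and shuffles the strata across dimensions:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=None):
    """Stratified LHS: split each parameter's range into n_samples
    equal intervals, draw one point per interval, then permute each
    column so rows pair strata at random across dimensions."""
    rng = np.random.default_rng(seed)
    d = len(bounds)
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, d))) / n_samples
    for j in range(d):                     # decouple the dimensions
        u[:, j] = rng.permutation(u[:, j])
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# Two hypothetical parameters with ranges [0, 1] and [10, 20]
samples = latin_hypercube(10, [(0.0, 1.0), (10.0, 20.0)], seed=42)
```

    Each column then contains exactly one sample per stratum, which gives better marginal coverage than plain random sampling at the same sample count.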

  3. Factor selection and structural identification in the interaction ANOVA model.

    Science.gov (United States)

    Post, Justin B; Bondell, Howard D

    2013-03-01

    When faced with categorical predictors and a continuous response, the objective of an analysis often consists of two tasks: finding which factors are important and determining which levels of the factors differ significantly from one another. Oftentimes, these tasks are done separately using Analysis of Variance (ANOVA) followed by a post hoc hypothesis testing procedure such as Tukey's Honestly Significant Difference test. When interactions between factors are included in the model the collapsing of levels of a factor becomes a more difficult problem. When testing for differences between two levels of a factor, claiming no difference would refer not only to equality of main effects, but also to equality of each interaction involving those levels. This structure between the main effects and interactions in a model is similar to the idea of heredity used in regression models. This article introduces a new method for accomplishing both of the common analysis tasks simultaneously in an interaction model while also adhering to the heredity-type constraint on the model. An appropriate penalization is constructed that encourages levels of factors to collapse and entire factors to be set to zero. It is shown that the procedure has the oracle property implying that asymptotically it performs as well as if the exact structure were known beforehand. We also discuss the application to estimating interactions in the unreplicated case. Simulation studies show the procedure outperforms post hoc hypothesis testing procedures as well as similar methods that do not include a structural constraint. The method is also illustrated using a real data example.

  4. Factor Selection and Structural Identification in the Interaction ANOVA Model

    Science.gov (United States)

    Post, Justin B.; Bondell, Howard D.

    2013-01-01

    Summary: When faced with categorical predictors and a continuous response, the objective of analysis often consists of two tasks: finding which factors are important and determining which levels of the factors differ significantly from one another. Oftentimes these tasks are done separately using Analysis of Variance (ANOVA) followed by a post hoc hypothesis testing procedure such as Tukey's Honestly Significant Difference test. When interactions between factors are included in the model the collapsing of levels of a factor becomes a more difficult problem. When testing for differences between two levels of a factor, claiming no difference would refer not only to equality of main effects, but also to equality of each interaction involving those levels. This structure between the main effects and interactions in a model is similar to the idea of heredity used in regression models. This paper introduces a new method for accomplishing both of the common analysis tasks simultaneously in an interaction model while also adhering to the heredity-type constraint on the model. An appropriate penalization is constructed that encourages levels of factors to collapse and entire factors to be set to zero. It is shown that the procedure has the oracle property implying that asymptotically it performs as well as if the exact structure were known beforehand. We also discuss the application to estimating interactions in the unreplicated case. Simulation studies show the procedure outperforms post hoc hypothesis testing procedures as well as similar methods that do not include a structural constraint. The method is also illustrated using a real data example. PMID:23323643

  5. Global Quantitative Modeling of Chromatin Factor Interactions

    Science.gov (United States)

    Zhou, Jian; Troyanskaya, Olga G.

    2014-01-01

    Chromatin is the driver of gene regulation, yet understanding the molecular interactions underlying chromatin factor combinatorial patterns (or the “chromatin codes”) remains a fundamental challenge in chromatin biology. Here we developed a global modeling framework that leverages chromatin profiling data to produce a systems-level view of the macromolecular complex of chromatin. Our model utilizes maximum entropy modeling with regularization-based structure learning to statistically dissect dependencies between chromatin factors and produce an accurate probability distribution of chromatin code. Our unsupervised quantitative model, trained on genome-wide chromatin profiles of 73 histone marks and chromatin proteins from modENCODE, enabled making various data-driven inferences about chromatin profiles and interactions. We provided a highly accurate predictor of chromatin factor pairwise interactions validated by known experimental evidence, and for the first time enabled higher-order interaction prediction. Our predictions can thus help guide future experimental studies. The model can also serve as an inference engine for predicting unknown chromatin profiles — we demonstrated that with this approach we can leverage data from well-characterized cell types to help understand less-studied cell types or conditions. PMID:24675896

  6. Analysis of Influential Factors of Energy Consumption in China Based on a Principal Component Regression Model

    Institute of Scientific and Technical Information of China (English)

    赵建辉

    2014-01-01

    There are many factors affecting energy consumption in China, and strong correlations exist among them. This paper uses principal component analysis to eliminate multicollinearity among the influencing factors and to extract the main factors affecting energy consumption, and then performs a regression analysis of energy consumption on these factors. The study shows that economic development, energy prices, and industrial structure are the main factors influencing energy consumption in China, and that the resulting regression model gives accurate predictions.
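    As a hedged sketch of the principal component regression procedure this abstract describes (standardize the collinear predictors, extract the leading component, regress on it), with synthetic data standing in for the economic series:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic, strongly collinear predictors sharing one common driver z
n = 200
z = rng.standard_normal(n)
X = np.column_stack([z + 0.1 * rng.standard_normal(n) for _ in range(3)])
y = 2.0 * z + 0.1 * rng.standard_normal(n)       # response tied to the driver

# 1) standardize, 2) PCA via SVD, 3) regress y on the first component
Xs = (X - X.mean(0)) / X.std(0)
_, s, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = Xs @ Vt[0]                       # first principal component scores
beta = scores @ y / (scores @ scores)     # least squares on one component
y_hat = beta * scores
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

    With genuinely collinear predictors the first component absorbs the shared variation, so regressing on it sidesteps the multicollinearity that destabilizes ordinary least-squares coefficients.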

  7. Analysis of Factors Influencing the Growth of Export of Agricultural Products in Yunnan Province: An Empirical Study Based on CMS Model

    Institute of Scientific and Technical Information of China (English)

    Dongmei LI; Jianxing LU; Zhiwen XIAO; Bo LI

    2013-01-01

    Yunnan Province is the bridgehead for the opening of the southwest region in China, and the economic frontier of China-ASEAN economic cooperation, where the export of agricultural products plays an important role in promoting the openness of the southwest region and strengthening China-ASEAN economic cooperation. In this paper, we use the CMS model to analyze the causes of the increase in exports of agricultural products from Yunnan Province during the period 2001-2010. It is found that the expanded import scale of agricultural products in the world, the improvement in the export competitiveness of products, and the adaptation of the product export structure to changes in the structure of world import demand are the main factors responsible for the increase in exports of agricultural products from Yunnan Province. In recent years, the improvement of export competitiveness has become a key factor in the growth of agricultural exports from Yunnan Province. In terms of product categories, strong export competitiveness has promoted the export of plant products such as beverages and tobacco, while weak export competitiveness and an unreasonable export structure have impeded the export of animal products and animal and plant oils and fats.

  8. Structural modeling and mutational analysis of yeast eukaryotic translation initiation factor 5A reveal new critical residues and reinforce its involvement in protein synthesis

    Science.gov (United States)

    Dias, Camila A. O.; Cano, Veridiana S. P.; Rangel, Suzana M.; Apponi, Luciano H.; Frigieri, Mariana C.; Muniz, João R. C.; Garcia, Wanius; Park, Myung H.; Garratt, Richard C.; Zanelli, Cleslei F.; Valentini, Sandro R.

    2017-01-01

    Eukaryotic translation initiation factor 5A (eIF5A) is a protein that is highly conserved and essential for cell viability. This factor is the only protein known to contain the unique and essential amino acid residue hypusine. This work focused on the structural and functional characterization of Saccharomyces cerevisiae eIF5A. The tertiary structure of yeast eIF5A was modeled based on the structure of its Leishmania mexicana homologue and this model was used to predict the structural localization of new site-directed and randomly generated mutations. Most of the 40 new mutants exhibited phenotypes that resulted from eIF5A protein-folding defects. Our data provided evidence that the C-terminal α-helix present in yeast eIF5A is an essential structural element, whereas the eIF5A N-terminal 10 amino acid extension, not present in archaeal eIF5A homologs, is not. Moreover, the mutants containing substitutions at or in the vicinity of the hypusine modification site displayed nonviable or temperature-sensitive phenotypes and were defective in hypusine modification. Interestingly, two of the temperature-sensitive strains produced stable mutant eIF5A proteins (eIF5A-K56A and eIF5A-Q22H/L93F) and showed defects in protein synthesis at the restrictive temperature. Our data revealed important structural features of eIF5A that are required for its vital role in cell viability and underscored an essential function of eIF5A in the translation step of gene expression. PMID:18341589

  9. Model Checking as Static Analysis

    DEFF Research Database (Denmark)

    Zhang, Fuyuan

    Both model checking and static analysis are prominent approaches to detecting software errors. Model checking is a successful formal method for verifying properties specified in temporal logics with respect to transition systems. Static analysis is also a powerful method for validating program properties which can predict safe approximations to program behaviors. In this thesis, we have developed several static analysis based techniques to solve model checking problems, aiming at showing the link between static analysis and model checking. We focus on logical approaches to static analysis... -calculus can be encoded as the intended model of SFP. Our research results have strengthened the link between model checking and static analysis. This provides a theoretical foundation for developing a unified tool for both model checking and static analysis techniques.

  10. [Factor models of the Beck Depression Inventory-II. Validation with coronary patients and a critique of Ward's model].

    Science.gov (United States)

    del Pino Pérez, Antonio; Ibáñez Fernández, Ignacio; Bosa Ojeda, Francisco; Dorta González, Ruth; Gaos Miezoso, María Teresa

    2012-02-01

    The objective of this study was to validate in a sample of 205 coronary patients a factor model for the BDI-II, especially a model that would allow for modeling of depressive symptoms after explicitly removing bias related to somatic symptoms of depression that would overlap those of heart disease. Exploratory and confirmatory factor analyses for ordinal data were conducted. A one-factor model, six correlated two-factor models and, derivatives thereof, seven models with a single General Depression factor and two uncorrelated factors, were analyzed. Exploratory analysis extracted two factors, Somatic-affective and Cognitive. Confirmatory factor analyses showed the worst fit for the one-factor model. Two-factor models were surpassed in goodness of fit by the models of general-factor and group factors. Among these, the General, Somatic-affective and Cognitive (G-Sa-C) model of Beck with students is noteworthy. The reduced General, Somatic and Cognitive (G-S-C) model of Ward showed the worst goodness of fit. Our model surpasses the cutoff criteria of all fit indexes. We conclude that the inclusion of a general-factor and group factors in all the models surpasses the results of G-S-C model and, therefore, questions it. The G-Sa-C model is strengthened.

  11. Unascertained Factor Method of Dynamic Characteristic Analysis for Antenna Structures

    Institute of Scientific and Technical Information of China (English)

    ZHU Zeng-qing; LIANG Zhen-tao; CHEN Jian-jun

    2008-01-01

    A dynamic characteristic analysis model of antenna structures is built in which the structural physical parameters and geometrical dimensions are all considered as unascertained variables, and a structural dynamic characteristic analysis method based on the unascertained factor method is given. The computational expression of the structural characteristics is developed from the mathematical expression of unascertained factors and the arithmetic principles of unascertained rational numbers. An example is given in which the possible values and confidence degrees of the unascertained structural characteristics are obtained. The calculated results show that the method is feasible and effective.

  12. Recovery of weak factor loadings when adding the mean structure in confirmatory factor analysis: A simulation study

    Directory of Open Access Journals (Sweden)

    Carmen Ximénez

    2016-01-01

    Full Text Available This article extends previous research on the recovery of weak factor loadings in confirmatory factor analysis by exploring the effects of adding the mean structure. This issue has not been examined in previous research. This study is based on the framework of Yung and Bentler (1999) and aims to examine the conditions that affect the recovery of weak factor loadings when the model includes the mean structure, compared to analyzing the covariance structure alone. A simulation study was conducted in which several constraints were defined for one-, two-, and three-factor models. Results show that adding the mean structure improves the recovery of weak factor loadings and reduces the asymptotic variances for the factor loadings, particularly for the models with a smaller number of factors and a small sample size. Therefore, under certain circumstances, modeling the means should be seriously considered for covariance models containing weak factor loadings.

  13. Modelling non-normal data: The relationship between the skew-normal factor model and the quadratic factor model.

    Science.gov (United States)

    Smits, Iris A M; Timmerman, Marieke E; Stegeman, Alwin

    2016-05-01

    Maximum likelihood estimation of the linear factor model for continuous items assumes normally distributed item scores. We consider deviations from normality by means of a skew-normally distributed factor model or a quadratic factor model. We show that the item distributions under a skew-normal factor are equivalent to those under a quadratic model up to third-order moments. The reverse only holds if the quadratic loadings are equal to each other and within certain bounds. We illustrate that observed data which follow any skew-normal factor model can be so well approximated with the quadratic factor model that the models are empirically indistinguishable, and that the reverse does not hold in general. The choice between the two models to account for deviations of normality is illustrated by an empirical example from clinical psychology. © 2015 The British Psychological Society.
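    The equivalence discussed above is easy to see in simulation. The sketch below (an illustrative data-generation exercise with invented loadings, not the authors' estimation method) draws item scores from a quadratic factor model and confirms that a quadratic loading induces skewness even though the factor and the unique part are both normal:

```python
import numpy as np

rng = np.random.default_rng(7)
f = rng.standard_normal(100_000)          # normal common factor
e = 0.5 * rng.standard_normal(100_000)    # normal unique factor

lam1, lam2 = 1.0, 0.4                     # linear and quadratic loadings
x = lam1 * f + lam2 * f**2 + e            # quadratic factor model item
x_lin = lam1 * f + e                      # same item without the quadratic term

def skewness(v):
    """Moment-based sample skewness."""
    v = v - v.mean()
    return (v**3).mean() / (v**2).mean() ** 1.5
```

    The quadratic term contributes a third central moment of 6*lam1**2*lam2 + 8*lam2**3, so x is right-skewed, while the purely linear item x_lin stays symmetric.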

  14. Performance analysis of parallel supernodal sparse LU factorization

    Energy Technology Data Exchange (ETDEWEB)

    Grigori, Laura; Li, Xiaoye S.

    2004-02-05

    We investigate performance characteristics for the LU factorization of large matrices with various sparsity patterns. We consider supernodal right-looking parallel factorization on a bi-dimensional grid of processors, making use of static pivoting. We develop a performance model and we validate it using the implementation in SuperLU-DIST, the real matrices and the IBM Power3 machine at NERSC. We use this model to obtain performance bounds on parallel computers, to perform scalability analysis and to identify performance bottlenecks. We also discuss the role of load balance and data distribution in this approach.
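    For orientation only: the factorization being benchmarked writes A = LU with unit-lower-triangular L and upper-triangular U. A dense, unpivoted textbook sketch (nothing like the supernodal, statically pivoted parallel algorithm in SuperLU-DIST, but it shows what is being computed):

```python
import numpy as np

def lu_nopivot(A):
    """Doolittle LU without pivoting: returns unit-lower L and upper U
    with A = L @ U (assumes no zero pivots are encountered)."""
    A = A.astype(float)
    n = A.shape[0]
    L, U = np.eye(n), np.zeros((n, n))
    for k in range(n):
        U[k, k:] = A[k, k:] - L[k, :k] @ U[:k, k:]
        L[k+1:, k] = (A[k+1:, k] - L[k+1:, :k] @ U[:k, k]) / U[k, k]
    return L, U

A = np.array([[4., 3., 0.],
              [6., 3., 1.],
              [0., 2., 5.]])
L, U = lu_nopivot(A)
```

    Static pivoting, as used in SuperLU-DIST, permutes and scales A up front so that a pivot-free elimination of this kind stays numerically viable on the permuted matrix.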

  15. Determining Dimensionality of Exercise Readiness Using Exploratory Factor Analysis.

    Science.gov (United States)

    Strohacker, Kelley; Zakrajsek, Rebecca A

    2016-06-01

    Assessment of "exercise readiness" is a central component of the flexible non-linear periodization (FNLP) method of organizing training workloads, but the underlying factor structure of this construct has not been empirically determined. The purpose of this study was to assess the construct dimensionality of exercise readiness using exploratory factor analysis, the results of which serve as an initial step in developing a brief measure of exercise readiness. Participants consisted of students recruited from undergraduate Kinesiology courses at a racially diverse, southern university. Independent, anonymous online survey data were collected across three stages: 1) generation of the item pool (n = 290), 2) assessment of face validity and refinement of the item pool (n = 168), and 3) exploratory factor analysis (n = 684). A principal axis factor analysis was conducted with 41 items using oblique rotation (promax). Four statistically significant factors, as determined through parallel analysis, explained 61.5% of the variance in exercise readiness. Factor 1 contained items that represented vitality (e.g., lively, revived). Factor 2 items related to physical fatigue (e.g., tired, drained). Factors 3 and 4 were descriptive of discomfort (e.g., pain, sick) and health (e.g., healthy, fit), respectively. This inductive approach indicates that exercise readiness comprises four dimensions: vitality, physical fatigue, discomfort, and health. This finding supports the readiness assessment techniques currently recommended for practitioners according to the FNLP model. These results serve as a theoretical foundation upon which to further develop and refine a brief survey instrument to measure exercise readiness.
    Key points: Assessment of exercise readiness is a key component in implementing an exercise program based on flexible nonlinear periodization, but the dimensionality of this concept has not been empirically determined. Based on a series of surveys and a robust exploratory factor analysis
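    The factor-retention step used above, parallel analysis, compares the eigenvalues of the observed correlation matrix against those of random data of the same size. A minimal, library-free sketch of Horn's parallel analysis (simulated data with a known two-factor structure standing in for the 41 survey items):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate 500 respondents on 8 items driven by 2 latent factors
n, p = 500, 8
F = rng.standard_normal((n, 2))
Lo = np.zeros((p, 2))
Lo[:4, 0] = 0.8                     # items 1-4 load on factor 1
Lo[4:, 1] = 0.8                     # items 5-8 load on factor 2
X = F @ Lo.T + 0.5 * rng.standard_normal((n, p))

real_eigs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]

# 95th percentile of eigenvalues from random normal data of the same shape
sims = np.array([
    np.linalg.eigvalsh(np.corrcoef(rng.standard_normal((n, p)),
                                   rowvar=False))[::-1]
    for _ in range(200)
])
threshold = np.percentile(sims, 95, axis=0)

n_factors = int(np.sum(real_eigs > threshold))
```

    Only eigenvalues exceeding the random-data percentile count as factors; on this simulated data the planted two-factor structure is recovered.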

  16. A kernel version of spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2009-01-01

    Schölkopf et al. introduce kernel PCA. Shawe-Taylor and Cristianini is an excellent reference for kernel methods in general. Bishop and Press et al. describe kernel methods among many other subjects. Nielsen and Canty use kernel PCA to detect change in univariate airborne digital camera images. The kernel version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply kernel versions of PCA, maximum autocorrelation factor (MAF) analysis
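    The kernel trick described in the abstract (implicit mapping into feature space via a kernel function, then linear analysis there) can be illustrated for plain kernel PCA; this is a generic RBF-kernel version in numpy, not the spatial factor analysis variant developed in the paper:

```python
import numpy as np

def rbf_kernel_pca(X, n_components, gamma=1.0):
    """Kernel PCA with an RBF kernel: build K, double-center it
    (centering in feature space), eigendecompose, and return the
    projections onto the leading components."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                      # feature-space centering
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]
    return V[:, idx] * np.sqrt(np.clip(w[idx], 0.0, None))

# Two concentric rings: a standard nonlinear test input for kernel PCA
t = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
inner = 0.5 * np.column_stack([np.cos(t), np.sin(t)])
outer = 2.0 * np.column_stack([np.cos(t), np.sin(t)])
Z = rbf_kernel_pca(np.vstack([inner, outer]), n_components=2)
```

    The linear analysis happens only through inner products, so the feature space itself (infinite-dimensional for the RBF kernel) is never constructed; the paper's kernel MAF instead solves a related eigenproblem built from spatially shifted data.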

  17. What Is Rotating in Exploratory Factor Analysis?

    Directory of Open Access Journals (Sweden)

    Jason W. Osborne

    2015-01-01

    Full Text Available Exploratory factor analysis (EFA) is one of the most commonly reported quantitative methodologies in the social sciences, yet much of the detail regarding what happens during an EFA remains unclear. The goal of this brief technical note is to explore what "rotation" is, what exactly is rotating, and why we use rotation when performing EFAs. Some commentary about the relative utility and desirability of different rotation methods concludes the narrative.

  18. Biosphere dose conversion Factor Importance and Sensitivity Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-10-15

    This report presents importance and sensitivity analysis for the environmental radiation model for Yucca Mountain, Nevada (ERMYN). ERMYN is a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis concerns the output of the model, biosphere dose conversion factors (BDCFs) for the groundwater, and the volcanic ash exposure scenarios. It identifies important processes and parameters that influence the BDCF values and distributions, enhances understanding of the relative importance of the physical and environmental processes on the outcome of the biosphere model, includes a detailed pathway analysis for key radionuclides, and evaluates the appropriateness of selected parameter values that are not site-specific or have large uncertainty.

  19. Scale Factor Self-Dual Cosmological Models

    CERN Document Server

    Camara dS, U.; Sotkov, G. M.

    2015-01-01

    We implement a conformal time scale factor duality for Friedmann-Robertson-Walker cosmological models, which is consistent with the weak energy condition. The requirement for self-duality determines the equations of state for a broad class of barotropic fluids. We study the example of a universe filled with two interacting fluids, presenting an accelerated and a decelerated period, with manifest UV/IR duality. The associated self-dual scalar field interaction turns out to coincide with the "radiation-like" modified Chaplygin gas models. We present an equivalent realization of them as gauged Kähler sigma models (minimally coupled to gravity) with very specific and interrelated Kähler and superpotentials. Their applications in the description of hilltop inflation and also as quintessence models for the late universe are discussed.

  20. Analysis of Factors Affecting Chinese Independent Innovation Based on an SVAR Model

    Institute of Scientific and Technical Information of China (English)

    赵严伟

    2012-01-01

    Independent innovation capability is not only the core competitiveness of a country but also an inexhaustible driving force of its prosperity. From a review of the relevant literature, the factors affecting Chinese independent innovation are taken to include potential innovation resources, the input capability of innovation resources, the capacity building of innovation carriers, and the safeguarding capacity of the innovation environment. On this basis, a structural vector autoregressive (SVAR) model was established after stationarity tests, cointegration tests, and Granger causality tests. Impulse response and variance decomposition were then applied to analyze the impact of each factor on independent innovation. Finally, constructive policy recommendations are provided for decision-making departments.

  1. A Beginner’s Guide to Factor Analysis: Focusing on Exploratory Factor Analysis

    Directory of Open Access Journals (Sweden)

    An Gie Yong

    2013-10-01

    Full Text Available The following paper discusses exploratory factor analysis and gives an overview of the statistical technique and how it is used in various research designs and applications. A basic outline of how the technique works and its criteria, including its main assumptions, is discussed, as well as when it should be used. Mathematical theories are explored to enlighten students on how exploratory factor analysis works, an example of how to run an exploratory factor analysis on SPSS is given, and finally a section on how to write up the results is provided. This will allow readers to develop a better understanding of when to employ factor analysis and how to interpret the tables and graphs in the output.

  2. Human Factors Engineering Program Review Model

    Science.gov (United States)

    2004-02-01

NUREG-0711, Rev. 2, Human Factors Engineering Program Review Model. U.S. Nuclear Regulatory Commission, Office of... As of November 1999, you may electronically access NUREG-series publications and other NRC records at NRC's Public Electronic Reading Room at http://www.nrc.gov/reading-rm.html. Publicly released records include, to name a few, NUREG-series publications; Federal Register notices; applicant

  3. Higher-order models versus direct hierarchical models: g as superordinate or breadth factor?

    Directory of Open Access Journals (Sweden)

    GILLES E. GIGNAC

    2008-03-01

    Full Text Available Intelligence research appears to have overwhelmingly endorsed a superordinate (higher-order model conceptualization of g, in comparison to the relatively less well-known breadth conceptualization of g, as represented by the direct hierarchical model. In this paper, several similarities and distinctions between the indirect and direct hierarchical models are delineated. Based on the re-analysis of five correlation matrices, it was demonstrated via CFA that the conventional conception of g as a higher-order superordinate factor was likely not as plausible as a first-order breadth factor. The results are discussed in light of theoretical advantages of conceptualizing g as a first-order factor. Further, because the associations between group-factors and g are constrained to zero within a direct hierarchical model, previous observations of isomorphic associations between a lower-order group factor and g are questioned.

  4. Analysis of nanomechanical properties of Borrelia burgdorferi spirochetes under the influence of lytic factors in an in vitro model using atomic force microscopy.

    Science.gov (United States)

    Tokarska-Rodak, Małgorzata; Kozioł-Montewka, Maria; Skrzypiec, Krzysztof; Chmielewski, Tomasz; Mendyk, Ewaryst; Tylewska-Wierzbanowska, Stanisława

    2015-11-12

    Atomic force microscopy (AFM) is an experimental technique which recently has been used in biology, microbiology, and medicine to investigate the topography of surfaces and in the evaluation of mechanical properties of cells. The aim of this study was to evaluate the influence of the complement system and specific anti-Borrelia antibodies in in vitro conditions on the modification of nanomechanical features of B. burgdorferi B31 cells. In order to assess the influence of the complement system and anti-Borrelia antibodies on B. burgdorferi s.s. B31 spirochetes, the bacteria were incubated together with plasma of identified status. The samples were applied on the surface of mica disks. Young's modulus and adhesive forces were analyzed with a NanoScope V, MultiMode 8 AFM microscope (Bruker) by the PeakForce QNM technique in air using NanoScope Analysis 1.40 software (Bruker). The average value of flexibility of spirochetes' surface expressed by Young's modulus was 10185.32 MPa, whereas the adhesion force was 3.68 nN. AFM is a modern tool with a broad spectrum of observational and measurement abilities. Young's modulus and the adhesion force can be treated as parameters in the evaluation of intensity and changes which take place in pathogenic microorganisms under the influence of various lytic factors. The visualization of the changes in association with nanomechanical features provides a realistic portrayal of the lytic abilities of the elements of the innate and adaptive human immune system.

  5. Gas-Phase Analysis of the Complex of Fibroblast Growth Factor 1 with Heparan Sulfate: A Traveling Wave Ion Mobility Spectrometry (TWIMS) and Molecular Modeling Study

    Science.gov (United States)

    Zhao, Yuejie; Singh, Arunima; Xu, Yongmei; Zong, Chengli; Zhang, Fuming; Boons, Geert-Jan; Liu, Jian; Linhardt, Robert J.; Woods, Robert J.; Amster, I. Jonathan

    2016-09-01

    Fibroblast growth factors (FGFs) regulate several cellular developmental processes by interacting with cell surface heparan proteoglycans and transmembrane cell surface receptors (FGFR). The interaction of FGF with heparan sulfate (HS) is known to induce protein oligomerization and increase the affinity of FGF towards its receptor FGFR, promoting the formation of the HS-FGF-FGFR signaling complex. Although the role of HS in the signaling pathways is well recognized, the details of FGF oligomerization and formation of the ternary signaling complex are still not clear, with several conflicting models proposed in the literature. Here, we examine the effect of size and sulfation pattern of HS upon FGF1 oligomerization, binding stoichiometry and conformational stability, through a combination of ion mobility (IM) and theoretical modeling approaches. Ion mobility-mass spectrometry (IMMS) of FGF1 in the presence of several HS fragments ranging from tetrasaccharide (dp4) to dodecasaccharide (dp12) in length was performed. A comparison of the binding stoichiometry of variably sulfated dp4 HS to FGF1 confirmed the significance of the previously known high-affinity binding motif in FGF1 dimerization, and demonstrated that certain tetrasaccharide-length fragments are also capable of inducing dimerization of FGF1. The degree of oligomerization was found to increase in the presence of dp12 HS, and a general lack of specificity for longer HS was observed. Additionally, collision cross-sections (CCSs) of several FGF1-HS complexes were calculated, and were found to be in close agreement with experimental results. Based on the CCSs, a number of plausible binding modes of 2:1 and 3:1 FGF1-HS complexes are proposed.

  7. Factor analysis identifies subgroups of constipation

    Institute of Scientific and Technical Information of China (English)

    Philip G Dinning; Mike Jones; Linda Hunt; Sergio E Fuentealba; Jamshid Kalanter; Denis W King; David Z Lubowski; Nicholas J Talley; Ian J Cook

    2011-01-01

    AIM: To determine whether distinct symptom groupings exist in a constipated population and whether such groupings might correlate with quantifiable pathophysiological measures of colonic dysfunction. METHODS: One hundred and ninety-one patients presenting to a Gastroenterology clinic with constipation and 32 constipated patients responding to a newspaper advertisement completed a 53-item, wide-ranging self-report questionnaire. One hundred of these patients had colonic transit measured scintigraphically. Factor analysis determined whether constipation-related symptoms grouped into distinct aspects of symptomatology. Cluster analysis was used to determine whether individual patients naturally group into distinct subtypes. RESULTS: Cluster analysis yielded a four-cluster solution, with the presence or absence of pain and laxative unresponsiveness providing the main descriptors. Among all clusters there was a considerable proportion of patients with demonstrable delayed colonic transit, positive irritable bowel syndrome criteria and regular stool frequency. The majority of patients with these characteristics also reported regular laxative use. CONCLUSION: Factor analysis identified four constipation subgroups, based on severity and laxative unresponsiveness, in a constipated population. However, clear stratification into clinically identifiable groups remains imprecise.

  8. A kernel version of spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2009-01-01

    Based on work by Pearson in 1901, Hotelling in 1933 introduced principal component analysis (PCA). PCA is often used for general feature generation and linear orthogonalization or compression by dimensionality reduction of correlated multivariate data; see Jolliffe for a comprehensive description of PCA and related techniques. An interesting dilemma in reduction of dimensionality of data is the desire to obtain simplicity for better understanding, visualization and interpretation of the data on the one hand, and the desire to retain sufficient detail for adequate representation on the other hand. The kernel version of PCA handles nonlinearities by implicitly transforming data into a high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply kernel versions of PCA, maximum autocorrelation factor (MAF) analysis...
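
The kernel trick described in the abstract can be sketched compactly: build a Gram matrix, double-center it, and eigendecompose, which is equivalent to PCA in the implicit feature space. A minimal NumPy illustration on a synthetic two-ring data set, assuming an RBF kernel and unit-eigenvalue score scaling (not the paper's spatial data):

```python
import numpy as np

def kernel_pca(X, n_components, gamma=1.0):
    """Kernel PCA with an RBF kernel: eigendecompose the double-centered Gram matrix."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J  # centering in the implicit feature space
    vals, vecs = np.linalg.eigh(Kc)
    top = np.argsort(vals)[::-1][:n_components]
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))  # component scores

# two concentric rings: not linearly separable in input space
rng = np.random.default_rng(2)
t = rng.uniform(0.0, 2.0 * np.pi, 200)
r = np.repeat([1.0, 3.0], 100)
X = np.c_[r * np.cos(t), r * np.sin(t)] + rng.normal(scale=0.05, size=(200, 2))
Z = kernel_pca(X, 2, gamma=0.5)  # the rings separate along the leading components
```

Linear PCA on the same data would find nothing useful, since both rings share the same mean and covariance structure.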

  9. Actor-Partner Interdependence Model Analysis of Sexual Communication and Relationship/Family Planning Factors Among Immigrant Latino Couples in the United States.

    Science.gov (United States)

    Matsuda, Yui

    2017-05-01

    The Latino population in the United States is quickly growing, and its unintended pregnancy rate is increasing. To decrease unintended pregnancies, couples must mutually agree on family planning. Communication between partners is one key factor identified in successful family planning for couples. Therefore, the purpose of this study was to examine sexual communication and its associations with sexual relationship power, general communication, and views on family planning. The Actor-Partner Interdependence Model was used to analyze dyadic influences of the chosen variables. Forty immigrant Latino couples were recruited from prenatal care clinics. The study results were grouped according to the three types of power structures: exhibition of men's traditional machismo values, exhibition of women's increased power in their relationships, and exhibition of men's and women's own empowerment with sexual communication. There was a negative association between men's views on family planning and women's sexual communication (exhibition of machismo values); a negative association between women's sexual relationship power and their partners' sexual communication (exhibition of women's increased power); and positive associations between men's and women's general communication and sexual communication (exhibition of men's and women's own empowerment). Dyadic influences of sexual communication and associated variables need to be incorporated into interventions to facilitate family planning for couples.

  10. 基于Tobit回归的融资决策影响因素分析%Analysis on Factors of Financing Decisions Based on Tobit Regression Model

    Institute of Scientific and Technical Information of China (English)

    屈宏志

    2012-01-01

    This paper analyzes the factors affecting the financing decisions of Chinese A-share listed companies with a Tobit regression model, using data for 1990-2009. The results show that the financing decision (asset-liability ratio) is positively correlated with the effective income tax rate and with company size, whereas the collateral value of assets, profitability, and growth are negatively related to the asset-liability ratio. These results are consistent with findings for developed countries and other emerging economies.
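
A Tobit model treats the observed dependent variable (here, a ratio bounded below) as a censored version of a latent linear model. A hedged sketch of its log-likelihood on synthetic data, not the paper's company data:

```python
import numpy as np
from math import erf

def norm_logpdf(z):
    return -0.5 * z**2 - 0.5 * np.log(2.0 * np.pi)

def norm_logcdf(z):
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    return np.log(np.clip(cdf, 1e-300, 1.0))

def tobit_loglik(beta, sigma, X, y):
    """Log-likelihood of a Tobit model left-censored at zero:
    density term for uncensored points, probability mass term for censored ones."""
    mu = X @ beta
    uncens = y > 0
    ll = np.sum(norm_logpdf((y[uncens] - mu[uncens]) / sigma) - np.log(sigma))
    ll += np.sum(norm_logcdf(-mu[~uncens] / sigma))
    return ll

rng = np.random.default_rng(3)
X = np.c_[np.ones(400), rng.normal(size=400)]
beta_true = np.array([0.2, 1.0])
y = np.maximum(X @ beta_true + rng.normal(scale=0.8, size=400), 0.0)  # censoring

ll_true = tobit_loglik(beta_true, 0.8, X, y)
ll_bad = tobit_loglik(np.array([2.0, -1.0]), 0.8, X, y)  # clearly wrong coefficients
```

Maximizing this likelihood over beta and sigma (e.g. with a numerical optimizer) yields the Tobit estimates; plain OLS would be biased because the censored zeros are treated as ordinary observations.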

  11. Revisiting the Leadership Scale for Sport: Examining Factor Structure Through Exploratory Structural Equation Modeling.

    Science.gov (United States)

    Chiu, Weisheng; Rodriguez, Fernando M; Won, Doyeon

    2016-10-01

    This study examines the factor structure of the shortened version of the Leadership Scale for Sport, through a survey of 201 collegiate swimmers at National Collegiate Athletic Association Division II and III institutions, using both exploratory structural equation modeling and confirmatory factor analysis. Both exploratory structural equation modeling and confirmatory factor analysis showed that a five-factor solution fit the data adequately. The sizes of factor loadings on target factors substantially differed between the confirmatory factor analysis and exploratory structural equation modeling solutions. In addition, the inter-correlations between factors of the Leadership Scale for Sport and the correlations with athletes' satisfaction were found to be inflated in the confirmatory factor analysis solution. Overall, the findings provide evidence of the factorial validity of the shortened Leadership Scale for Sport.

  12. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    Science.gov (United States)

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis, with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.
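
Among the factor retention methods the survey mentions, parallel analysis is easy to sketch: retain factors whose observed eigenvalues exceed the average eigenvalues obtained from random data of the same dimensions. A minimal illustration on synthetic one-factor data:

```python
import numpy as np

def parallel_analysis(X, n_sims=50, seed=0):
    """Horn's parallel analysis: retain factors whose observed eigenvalues
    exceed the mean eigenvalues of correlation matrices of random data."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rand = np.zeros((n_sims, p))
    for i in range(n_sims):
        Rr = np.corrcoef(rng.normal(size=(n, p)), rowvar=False)
        rand[i] = np.sort(np.linalg.eigvalsh(Rr))[::-1]
    return int(np.sum(obs > rand.mean(axis=0)))

# synthetic one-factor data: six items all loading 0.7 on a single factor
rng = np.random.default_rng(4)
F = rng.normal(size=(500, 1))
X = F @ np.full((1, 6), 0.7) + rng.normal(scale=0.6, size=(500, 6))
k = parallel_analysis(X)  # suggested number of factors to retain
```

Unlike the Kaiser "eigenvalues greater than one" rule, this accounts for the fact that random data also produce leading eigenvalues above one by chance.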

  13. Biological risk factors for suicidal behaviors: a meta-analysis.

    Science.gov (United States)

    Chang, B P; Franklin, J C; Ribeiro, J D; Fox, K R; Bentley, K H; Kleiman, E M; Nock, M K

    2016-09-13

    Prior studies have proposed a wide range of potential biological risk factors for future suicidal behaviors. Although strong evidence exists for biological correlates of suicidal behaviors, it remains unclear if these correlates are also risk factors for suicidal behaviors. We performed a meta-analysis to integrate the existing literature on biological risk factors for suicidal behaviors and to determine their statistical significance. We conducted a systematic search of PubMed, PsycInfo and Google Scholar for studies that used a biological factor to predict either suicide attempt or death by suicide. Inclusion criteria included studies with at least one longitudinal analysis using a biological factor to predict either of these outcomes in any population through 2015. From an initial screen of 2541 studies we identified 94 cases. Random effects models were used for both meta-analyses and meta-regression. The combined effect of biological factors produced statistically significant but relatively weak prediction of suicide attempts (weighted mean odds ratio (wOR)=1.41; CI: 1.09-1.81) and suicide death (wOR=1.28; CI: 1.13-1.45). After accounting for publication bias, prediction was nonsignificant for both suicide attempts and suicide death. Only two factors remained significant after accounting for publication bias: cytokines (wOR=2.87; CI: 1.40-5.93) and low levels of fish oil nutrients (wOR=1.09; CI: 1.01-1.19). Our meta-analysis revealed that currently known biological factors are weak predictors of future suicidal behaviors. This conclusion should be interpreted within the context of the limitations of the existing literature, including long follow-up intervals and a lack of tests of interactions with other risk factors. Future studies addressing these limitations may more effectively test for potential biological risk factors.
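
The random-effects pooling used in such meta-analyses can be sketched with the DerSimonian-Laird estimator; the per-study values below are hypothetical, not taken from the paper:

```python
import numpy as np

def dersimonian_laird(log_or, var):
    """Random-effects pooling of per-study log odds ratios (DerSimonian-Laird)."""
    w = 1.0 / var                                 # fixed-effect weights
    mu_fe = np.sum(w * log_or) / np.sum(w)
    Q = np.sum(w * (log_or - mu_fe) ** 2)         # heterogeneity statistic
    C = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(log_or) - 1)) / C)  # between-study variance
    w_re = 1.0 / (var + tau2)                     # random-effects weights
    mu = np.sum(w_re * log_or) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)

# hypothetical studies: log odds ratios with their sampling variances
log_or = np.log(np.array([1.3, 1.5, 1.2, 1.6, 1.4]))
var = np.array([0.04, 0.06, 0.05, 0.08, 0.03])
wor, lo, hi = dersimonian_laird(log_or, var)  # pooled weighted mean OR and 95% CI
```

When the heterogeneity statistic Q is below its degrees of freedom, tau² is truncated at zero and the estimate collapses to the fixed-effect result.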

  14. Sonar Detection Error Modeling Based on Bayesian Factor Analysis%基于贝叶斯因子分析的声纳探测误差模型研究

    Institute of Scientific and Technical Information of China (English)

    赵宏志; 曹志敏; 韩瑜

    2011-01-01

    The sonar detection error model is a core component of sonar simulation. To model the detection error of a particular sonar, a large set of measured sonar detection data was first analyzed statistically, and a method based on Bayesian factor analysis is then proposed to build the detection error model. The model is used to forecast the detection error within a sonar simulation and is compared with a conventional Gaussian (white) noise model. The simulation results show that the Bayesian factor analysis model captures and predicts the sonar detection error more effectively.

  15. Analysis of significant factors for dengue fever incidence prediction.

    Science.gov (United States)

    Siriyasatien, Padet; Phumee, Atchara; Ongruk, Phatsavee; Jampachaisri, Katechan; Kesorn, Kraisak

    2016-04-16

    Many popular dengue forecasting techniques have been used by several researchers to extrapolate dengue incidence rates, including the K-H model, support vector machines (SVM), and artificial neural networks (ANN). The time series analysis methodology, particularly ARIMA and SARIMA, has been increasingly applied to the field of epidemiological research for dengue fever, dengue hemorrhagic fever, and other infectious diseases. The main drawback of these methods is that they do not consider other variables that are associated with the dependent variable. Additionally, new factors correlated to the disease are needed to enhance the prediction accuracy of the model when it is applied to areas of similar climates, where weather factors such as temperature, total rainfall, and humidity are not substantially different. Such drawbacks may consequently lower the predictive power for the outbreak. The predictive power of the forecasting model, assessed by Akaike's information criterion (AIC), Bayesian information criterion (BIC), and the mean absolute percentage error (MAPE), is improved by including the new parameters for dengue outbreak prediction. This study's selected model outperforms all three other competing models with the lowest AIC, the lowest BIC, and a small MAPE value. The exclusive use of climate factors from similar locations decreases a model's prediction power. The multivariate Poisson regression, however, effectively forecasts even when climate variables are slightly different. Female mosquitoes and seasons were strongly correlated with dengue cases. Therefore, the dengue incidence trends provided by this model will assist the optimization of dengue prevention. The present work demonstrates the important roles of female mosquito infection rates from the previous season and climate factors (represented as seasons) in dengue outbreaks. Incorporating these two factors in the model significantly improves the predictive power of dengue hemorrhagic fever forecasting.
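
The multivariate Poisson regression highlighted above can be sketched as a log-link GLM fitted by iteratively reweighted least squares; this toy example uses synthetic counts, not the study's dengue data:

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Poisson regression with a log link, fitted by iteratively reweighted
    least squares (equivalent to Newton's method on the log-likelihood)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        z = X @ beta + (y - mu) / mu  # working response for the log link
        W = mu                        # IRLS weights equal the Poisson mean
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# synthetic count data with a known coefficient vector
rng = np.random.default_rng(5)
X = np.c_[np.ones(1000), rng.normal(size=1000)]
beta_true = np.array([1.0, 0.5])
y = rng.poisson(np.exp(X @ beta_true))
beta = poisson_irls(X, y)  # recovers beta_true up to sampling error
```

In an epidemiological application the design matrix would hold the lagged infection rates and seasonal indicators the abstract describes, and model comparison would proceed via AIC/BIC as in the paper.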

  16. Analysis of Ultra Linguistic Factors in Interpretation

    Institute of Scientific and Technical Information of China (English)

    姚嘉

    2015-01-01

    The quality of interpretation is a dynamic conception involving a great number of variables, such as the participants, the situation, working conditions, and cultures. Therefore, in interpretation, static elements such as traditional grammar and fixed linguistic rules cannot be counted as the only criteria of quality. That is, many other non-language elements, ultra-linguistic factors, play an important role in interpretation. Ultra-linguistic factors go beyond the bounds of traditional grammar and parole and reveal facts in an indirect way. This paper gives a brief analysis of ultra-linguistic elements in interpretation in order to achieve better results in interpretation practice.

  17. Chiral analysis of baryon form factors

    Energy Technology Data Exchange (ETDEWEB)

    Gail, T.A.

    2007-11-08

    This work presents an extensive theoretical investigation of the structure of the nucleon within the standard model of elementary particle physics. In particular, the long range contributions to a number of various form factors parametrizing the interactions of the nucleon with an electromagnetic probe are calculated. The theoretical framework for those calculations is chiral perturbation theory, the exact low energy limit of Quantum Chromodynamics, which describes such long range contributions in terms of a pion-cloud. In this theory, a nonrelativistic leading one-loop order calculation of the form factors parametrizing the vector transition of a nucleon to its lowest lying resonance, the Δ, a covariant calculation of the isovector and isoscalar vector form factors of the nucleon at next-to-leading one-loop order, and a covariant calculation of the isoscalar and isovector generalized vector form factors of the nucleon at leading one-loop order are performed. In order to perform consistent loop calculations in the covariant formulation of chiral perturbation theory an appropriate renormalization scheme is defined in this work. All theoretical predictions are compared to phenomenology and results from lattice QCD simulations. These comparisons allow for a determination of the low energy constants of the theory. Furthermore, the possibility of chiral extrapolation, i.e. the extrapolation of lattice data from simulations at large pion masses down to the small physical pion mass, is studied in detail. Statistical as well as systematic uncertainties are estimated for all results throughout this work. (orig.)

  18. Factor Rotation and Standard Errors in Exploratory Factor Analysis

    Science.gov (United States)

    Zhang, Guangjian; Preacher, Kristopher J.

    2015-01-01

    In this article, we report a surprising phenomenon: Oblique CF-varimax and oblique CF-quartimax rotation produced similar point estimates for rotated factor loadings and factor correlations but different standard error estimates in an empirical example. Influences of factor rotation on asymptotic standard errors are investigated using a numerical…

  19. Space Station crew safety - Human factors model

    Science.gov (United States)

    Cohen, M. M.; Junge, M. K.

    1984-01-01

    A model of the various human factors issues and interactions that might affect crew safety is developed. The first step addressed systematically the central question: How is this Space Station different from all other spacecraft? A wide range of possible issues was identified and researched. Five major topics of human factors issues that interacted with crew safety resulted: Protocols, Critical Habitability, Work Related Issues, Crew Incapacitation and Personal Choice. Second, an interaction model was developed that would show some degree of cause and effect between objective environmental or operational conditions and the creation of potential safety hazards. The intermediary steps between these two extremes of causality were the effects on human performance and the results of degraded performance. The model contains three milestones: stressor, human performance (degraded) and safety hazard threshold. Between these milestones are two countermeasure intervention points. The first opportunity for intervention is the countermeasure against stress. If this countermeasure fails, performance degrades. The second opportunity for intervention is the countermeasure against error. If this second countermeasure fails, the threshold of a potential safety hazard may be crossed.

  20. Workplace Innovation: Exploratory and Confirmatory Factor Analysis for Construct Validation

    Directory of Open Access Journals (Sweden)

    Wipulanusat Warit

    2017-06-01

    Full Text Available Workplace innovation enables the development and improvement of products, processes and services, leading simultaneously to improvement in organisational performance. This study has the purpose of examining the factor structure of workplace innovation. Survey data, extracted from the 2014 APS employee census, comprising 3,125 engineering professionals in the Commonwealth of Australia’s departments, were analysed using exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). EFA returned a two-factor structure explaining 69.1% of the variance of the construct. CFA revealed that a two-factor structure was indicated as a validated model (GFI = 0.98, AGFI = 0.95, RMSEA = 0.08, RMR = 0.02, IFI = 0.98, NFI = 0.98, CFI = 0.98, and TLI = 0.96). Both factors showed good reliability of the scale (Individual creativity: α = 0.83, CR = 0.86, and AVE = 0.62; Team innovation: α = 0.82, CR = 0.88, and AVE = 0.61). These results confirm that the two factors extracted for characterising workplace innovation included individual creativity and team innovation.
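
The reliability statistics reported above (Cronbach's alpha, composite reliability CR, and average variance extracted AVE) have simple closed forms. A sketch on synthetic data and illustrative loadings, not the study's values:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha from an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

def composite_reliability(loadings):
    """CR and AVE from standardized loadings of a single factor."""
    lam = np.asarray(loadings, dtype=float)
    err = 1.0 - lam**2                  # error variances under standardization
    cr = lam.sum() ** 2 / (lam.sum() ** 2 + err.sum())
    ave = float(np.mean(lam**2))
    return cr, ave

# synthetic scale: four items driven by one factor with loadings of 0.8
rng = np.random.default_rng(6)
f = rng.normal(size=(300, 1))
items = f @ np.full((1, 4), 0.8) + rng.normal(scale=0.6, size=(300, 4))
alpha = cronbach_alpha(items)
cr, ave = composite_reliability([0.80, 0.80, 0.75, 0.70])  # illustrative loadings
```

Conventional cutoffs are alpha and CR above 0.7 and AVE above 0.5, which is the standard the study's reported values clear.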

  1. Shape Modelling Using Maximum Autocorrelation Factors

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2001-01-01

    This paper addresses the problem of generating a low dimensional representation of the shape variation present in a training set after alignment using Procrustes analysis and projection into shape tangent space. We will extend the use of principal components analysis in the original formulation of Active Shape Models by Timothy Cootes and Christopher Taylor by building new information into the model. This new information consists of two types of prior knowledge. First, in many situations we will be given an ordering of the shapes of the training set. This situation occurs when the shapes of the training set are in reality a time series, e.g. snapshots of a beating heart during the cardiac cycle, or when the shapes are slices of a 3D structure, e.g. the spinal cord. Second, in almost all applications a natural order of the landmark points along the contour of the shape is introduced...

  2. A quality metric for homology modeling: the H-factor

    Science.gov (United States)

    2011-01-01

    Background The analysis of protein structures provides fundamental insight into most biochemical functions and consequently into the cause and possible treatment of diseases. As the structures of most known proteins cannot be solved experimentally, for technical reasons or sometimes simply because of time constraints, in silico protein structure prediction is expected to step in and generate a more complete picture of the protein structure universe. Molecular modeling of protein structures is a fast growing field and a tremendous amount of work has been done since the publication of the very first model. The growth of modeling techniques, and more specifically of those that rely on the existing experimental knowledge of protein structures, is intimately linked to the developments of high resolution experimental techniques such as NMR, X-ray crystallography and electron microscopy. This strong connection between experimental and in silico methods is however not devoid of criticisms and concerns among modelers as well as among experimentalists. Results In this paper, we focus on homology modeling and more specifically, we review how it is perceived by the structural biology community and what can be done to impress on the experimentalists that it can be a valuable resource to them. We review the common practices and provide a set of guidelines for building better models. For that purpose, we introduce the H-factor, a new indicator for assessing the quality of homology models, mimicking the R-factor in X-ray crystallography. The method for computing the H-factor is fully described and validated on a series of test cases. Conclusions We have developed a web service for computing the H-factor for models of a protein structure. This service is freely accessible at http://koehllab.genomecenter.ucdavis.edu/toolkit/h-factor. PMID:21291572

  3. Functional dynamic factor models with application to yield curve forecasting

    KAUST Repository

    Hays, Spencer

    2012-09-01

    Accurate forecasting of zero coupon bond yields for a continuum of maturities is paramount to bond portfolio management and derivative security pricing. Yet a universal model for yield curve forecasting has been elusive, and prior attempts often resulted in a trade-off between goodness of fit and consistency with economic theory. To address this, herein we propose a novel formulation which connects the dynamic factor model (DFM) framework with concepts from functional data analysis: a DFM with functional factor loading curves. This results in a model capable of forecasting functional time series. Further, in the yield curve context we show that the model retains economic interpretation. Model estimation is achieved through an expectation-maximization algorithm, where the time series parameters and factor loading curves are simultaneously estimated in a single step. Efficient computing is implemented and a data-driven smoothing parameter is nicely incorporated. We show that our model performs very well on forecasting actual yield data compared with existing approaches, especially in regard to profit-based assessment for an innovative trading exercise. We further illustrate the viability of our model to applications outside of yield forecasting.

  4. Analysis of transfer reactions: determination of spectroscopic factors

    Energy Technology Data Exchange (ETDEWEB)

    Keeley, N. [CEA Saclay, Dept. d'Astrophysique, de Physique des Particules, de Physique Nucleaire et de l'Instrumentation Associee (DSM/DAPNIA/SPhN), 91 - Gif-sur-Yvette (France); The Andrzej Soltan Institute for Nuclear Studies, Dept. of Nuclear Reactions, Warsaw (Poland)

    2007-07-01

    An overview of the most popular models used for the analysis of direct reaction data is given, concentrating on practical aspects. The following four models (in order of increasing sophistication) are briefly described: the distorted wave Born approximation (DWBA), the adiabatic model, the coupled channels Born approximation, and the coupled reaction channels. As a concrete example, the ¹²C(d,p)¹³C reaction at an incident deuteron energy of 30 MeV is analysed with progressively more physically sophisticated models. The effect of the choice of the reaction model on the spectroscopic information extracted from the data is investigated and other sources of uncertainty in the derived spectroscopic factors are discussed. We have shown that the choice of the reaction model can significantly influence the nuclear structure information, particularly the spectroscopic factors or amplitudes but occasionally also the spin-parity, that we wish to extract from direct reaction data. We have also demonstrated that the DWBA can fail to give a satisfactory description of transfer data, but when the tenets of the theory are fulfilled DWBA can work very well and will yield the same results as the most sophisticated models. The use of global rather than fitted optical potentials can also lead to important differences in the extracted spectroscopic factors.

  5. Five-Factor Model of Personality, Work Behavior Self-Efficacy, and Length of Prior Employment for Individuals with Disabilities: An Exploratory Analysis

    Science.gov (United States)

    O'Sullivan, Deirdre; Strauser, David R.; Wong, Alex W. K.

    2012-01-01

    With the continued lower employment rate for persons with disabilities, researchers are focusing more on barriers to employment that reach beyond functional impairment. Personality and self-efficacy have consistently been important factors when considering employment outcomes for persons without disability; less is known about these factors as…

  7. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis: A Cross-National Investigation of Schwartz Values

    Science.gov (United States)

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the model parameters and demonstrates the consequences…

  9. A Markov Chain Monte Carlo Approach to Confirmatory Item Factor Analysis

    Science.gov (United States)

    Edwards, Michael C.

    2010-01-01

    Item factor analysis has a rich tradition in both the structural equation modeling and item response theory frameworks. The goal of this paper is to demonstrate a novel combination of various Markov chain Monte Carlo (MCMC) estimation routines to estimate parameters of a wide variety of confirmatory item factor analysis models. Further, I show…

  10. The Interpersonal Style Inventory and the five-factor model.

    Science.gov (United States)

    Lorr, M; Youniss, R P; Kluth, C

    1992-03-01

    The study examined relations between the 15 scale scores of the Interpersonal Style Inventory (Lorr & Youniss, 1985) and the domain measures of the five-factor model provided by the NEO Personality Inventory (Costa & McCrae, 1985). A sample of 236 college students was administered both inventories. A principal component analysis of the 5 NEO-PI domain scores and the 15 ISI scale scores followed by a Varimax rotation disclosed the expected five higher-order factors. Four factors, Neuroticism, Extraversion, Conscientiousness and Agreeableness, were defined by both NEO and ISI scales. Openness to Experience, however, was represented in the ISI by Independence and Directiveness, which define its Autonomy dimension. Thus, the ISI measures four of the five factors assessed by the NEO-PI.

  11. Analysis of Business Models

    Directory of Open Access Journals (Sweden)

    Slavik Stefan

    2014-12-01

    Full Text Available The term business model has been used in practice for a few years, but companies have created, defined, and innovated their models intuitively since the start of business. Our paper aims to clarify the theory of the business model, i.e., its definition and all the components that form each business. In the second part, we create an analytical tool and analyze real business models in Slovakia, defining the characteristics of each part of the business model, i.e., customers, distribution, value, resources, activities, costs, and revenue. In the last part of the paper, we discuss the most common characteristics, extremes, discrepancies, and the most important facts detected in our research.

  12. Analysis on some factors affecting MIMO in tunnel

    Science.gov (United States)

    Zheng, Hong-dang; Nie, Xiao-Yan; Xu, Zhao

    2009-07-01

    Based on the 3D-GBSB (three-dimensional Geometrically Based Single-Bounce) model and the MIMO channel capacity function, the impact of the transceiver antenna arrays, antenna spacing, antenna array angle, SNR, Rician K-factor, and related parameters on the frequency-nonselective fading MIMO channel capacity is analyzed geometrically. The Monte Carlo method is applied to simulate the wireless fading channel and to obtain the cumulative distribution function of the capacity.
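
    As a rough illustration of the Monte Carlo part, a frequency-flat Rician MIMO channel can be simulated and the empirical CDF of its capacity accumulated. The array sizes, SNR, and K-factor below are arbitrary assumptions, and the geometric 3D-GBSB correlation structure is not modeled.

```python
import numpy as np

rng = np.random.default_rng(1)

def mimo_capacity_samples(nt=4, nr=4, snr_db=10.0, k_factor=3.0, n=2000):
    """Monte Carlo draws of flat-fading MIMO capacity (bits/s/Hz), Rician channel."""
    snr = 10.0 ** (snr_db / 10.0)
    h_los = np.ones((nr, nt), dtype=complex)          # idealized LOS component
    caps = np.empty(n)
    for i in range(n):
        h_nlos = (rng.standard_normal((nr, nt))
                  + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2.0)
        H = (np.sqrt(k_factor / (k_factor + 1.0)) * h_los
             + np.sqrt(1.0 / (k_factor + 1.0)) * h_nlos)
        # Equal power allocation: C = log2 det(I + (snr/nt) H H^H)
        caps[i] = np.log2(np.linalg.det(
            np.eye(nr) + (snr / nt) * H @ H.conj().T).real)
    return caps

caps = mimo_capacity_samples()
cdf_x = np.sort(caps)                                 # empirical CDF support
cdf_y = np.arange(1, caps.size + 1) / caps.size       # empirical CDF values
```

    Sweeping `k_factor` or `snr_db` and re-plotting `cdf_x` against `cdf_y` reproduces the kind of capacity-CDF comparison the abstract describes.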

  13. A model for sigma factor competition in bacterial cells.

    Science.gov (United States)

    Mauri, Marco; Klumpp, Stefan

    2014-10-01

    Sigma factors control global switches of the genetic expression program in bacteria. Different sigma factors compete for binding to a limited pool of RNA polymerase (RNAP) core enzymes, providing a mechanism for cross-talk between genes or gene classes via the sharing of expression machinery. To analyze the contribution of sigma factor competition to global changes in gene expression, we develop a theoretical model that describes binding between sigma factors and core RNAP, transcription, non-specific binding to DNA and the modulation of the availability of the molecular components. The model is validated by comparison with in vitro competition experiments, with which excellent agreement is found. Transcription is affected via the modulation of the concentrations of the different types of holoenzymes, so saturated promoters are only weakly affected by sigma factor competition. However, in the case of overlapping promoters or promoters recognized by two types of sigma factors, we find that even saturated promoters are strongly affected. Active transcription effectively lowers the affinity between the sigma factor driving it and the core RNAP, resulting in complex cross-talk effects. Sigma factor competition is not strongly affected by non-specific binding of core RNAPs, sigma factors and holoenzymes to DNA. Finally, we analyze the role of increased core RNAP availability upon the shut-down of ribosomal RNA transcription during the stringent response. We find that passive up-regulation of alternative sigma-dependent transcription is not only possible, but also displays hypersensitivity based on sigma factor competition. Our theoretical analysis thus provides support for a significant role of passive control during that global switch of the gene expression program.
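
    The core of such a competition model is a conservation law for the shared RNAP pool. A minimal equilibrium sketch (two sigma species with assumed abundances and dissociation constants, no transcription or non-specific DNA binding) solves for the free core concentration:

```python
import numpy as np
from scipy.optimize import brentq

# Toy parameters (assumed, arbitrary units): core RNAP E binds sigma factor i
# with dissociation constant K[i]; holoenzyme_i = E_free * sigma_free_i / K[i].
E_tot = 1.0                        # total core RNAP
sigma_tot = np.array([2.0, 0.5])   # abundant "housekeeping" vs scarce alternative
K = np.array([0.10, 0.05])         # the alternative sigma binds more tightly

def core_balance(E_free):
    # Free sigma follows from each sigma's own conservation law.
    s_free = sigma_tot / (1.0 + E_free / K)
    bound = np.sum(E_free * s_free / K)      # total core tied up in holoenzymes
    return E_free + bound - E_tot            # zero at the equilibrium point

E_free = brentq(core_balance, 1e-12, E_tot)
s_free = sigma_tot / (1.0 + E_free / K)
holo = E_free * s_free / K                   # holoenzyme level per sigma type
```

    Even with these toy numbers, the more abundant sigma captures most of the core enzyme despite the alternative sigma binding more tightly, which is the basic competition effect the full model builds on.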

  14. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

    Survival analysis concerns sequential occurrences of events governed by probabilistic laws. Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading on to some of the most advanced topics in survival analysis. Assumes only a minimal knowledge of SAS whilst enabling…

  15. Heterogeneity of Capital Stocks in Japan: Classification by Factor Analysis

    Directory of Open Access Journals (Sweden)

    Konomi Tonogi

    2014-04-01

    Full Text Available This paper examines the heterogeneity of capital stocks using financial statement data of publicly listed Japanese firms. We conduct factor analysis on investment rates among various capital goods and estimate factor loadings of each as its reactions to common factors like total factor productivity (TFP shocks. Then we estimate the uniqueness for each investment rate, which is the percentage of its variance that is not explained by the common factors. If the estimated factor loadings are similar between some of the heterogeneous capital goods, it may well imply that the adjustment cost structure of these investments is also similar. Further, if some of the estimated values of uniqueness are small, it suggests that certain theoretical models may track the dynamics of the investment rates well. Our estimation results show that Building and Structure have similar factor loadings as do Machinery & Equipment, Vehicles & Delivery Equipment, and Tools, Furniture, & Fixture. This suggests that we could remedy the Curse of Dimensionality by bundling the investments that have similar factor loadings together and that identifying the functional structures of each group of capital goods can greatly improve the performance of empirical investment equations.
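
    The uniqueness estimate described here can be reproduced in miniature with an off-the-shelf factor analysis. The panel below is synthetic (two common shocks, five hypothetical capital-good types), so the loadings and uniqueness values only illustrate the mechanics, not the paper's results.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)

# Hypothetical firm-year panel of investment rates for 5 capital-good types.
# Two common shocks load differently on "structure" vs "equipment" type goods.
n = 300
shocks = rng.normal(size=(n, 2))
true_load = np.array([[0.9, 0.1],    # buildings
                      [0.8, 0.2],    # structures
                      [0.2, 0.9],    # machinery & equipment
                      [0.1, 0.8],    # vehicles & delivery equipment
                      [0.2, 0.7]])   # tools, furniture & fixtures
X = shocks @ true_load.T + rng.normal(0, 0.4, size=(n, 5))

fa = FactorAnalysis(n_components=2, random_state=0).fit(X)

# Uniqueness: share of each variable's variance NOT explained by common factors.
communality = np.sum(fa.components_.T ** 2, axis=1)
uniqueness = fa.noise_variance_ / (communality + fa.noise_variance_)
```

    Goods with similar rows of estimated loadings would, per the paper's argument, be candidates for bundling to remedy the curse of dimensionality.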

  16. Gas-Phase Analysis of the Complex of Fibroblast GrowthFactor 1 with Heparan Sulfate : A Traveling Wave Ion Mobility Spectrometry (TWIMS) and Molecular Modeling Study

    NARCIS (Netherlands)

    Zhao, Yuejie; Singh, Arunima; Xu, Yongmei; Zong, Chengli; Zhang, Fuming; Boons, Geert-Jan; Liu, Jian; Linhardt, Robert J; Woods, Robert J; Amster, I Jonathan

    2016-01-01

    Fibroblast growth factors (FGFs) regulate several cellular developmental processes by interacting with cell surface heparan proteoglycans and transmembrane cell surface receptors (FGFR). The interaction of FGF with heparan sulfate (HS) is known to induce protein oligomerization, increase the affinit

  17. Nonparametric Bayesian Sparse Factor Models with application to Gene Expression modelling

    CERN Document Server

    Knowles, David

    2010-01-01

    A nonparametric Bayesian extension of Factor Analysis (FA) is proposed where observed data Y is modeled as a linear superposition, G, of a potentially infinite number of hidden factors, X. The Indian Buffet Process (IBP) is used as a prior on G to incorporate sparsity and to allow the number of latent features to be inferred. The model's utility for modeling gene expression data is investigated using randomly generated datasets based on a known sparse connectivity matrix for E. Coli, and on three biological datasets of increasing complexity.

  18. Attitude Exploration Using Factor Analysis Technique

    Directory of Open Access Journals (Sweden)

    Monika Raghuvanshi

    2016-12-01

    Full Text Available Attitude is a psychological variable that contains positive or negative evaluations of people or an environment. The growing generation possesses learning skills, so if a positive attitude is inculcated at the right age, it might become habitual. Students in the age group 14-20 years from the city of Bikaner, India, are the target population for this study. An inventory of 30 Likert-type scale statements was prepared in order to measure attitude towards the environment and matters related to conservation. The primary data was collected through a structured questionnaire, using a cluster sampling technique, and analyzed using the IBM SPSS 23 statistical tool. Factor analysis is used to reduce the 30 variables to a smaller number of more identifiable groups of variables. Results show that students “need more regulation and voluntary participation to protect the environment”, “need conservation of water and electricity”, “are concerned for undue wastage of water”, “need visible actions to protect the environment”, “need strengthening of the public transport system”, “are a little bit ignorant about the consequences of global warming”, “want prevention of water pollution by industries”, “need changing of personal habits to protect the environment”, and “don’t have firsthand experience of global warming”. Analysis revealed that the nine factors obtained could explain about 58.5% of the variance in the attitude of secondary school students towards the environment in the city of Bikaner, India. The remaining 41.5% of the variance is attributed to other elements not explained by this analysis. A global campaign for improvement in attitudes about environmental issues and their utility in daily life may boost positive youth attitudes, with potential worldwide impact. A cross-disciplinary approach may be developed by teaching environmental topics alongside other related disciplines such as science, economics, and social studies.

  19. Analysis on Factors Influencing Unmarried Adolescent Sex Behavior in Shanghai City under the Multilevel Model

    Institute of Scientific and Technical Information of China (English)

    张鹏; 楼超华; 高尔生

    2012-01-01

    Objective To explore the influencing factors of sex behavior among unmarried adolescents in Shanghai under the multilevel model. Methods We chose unmarried adolescents aged 15 to 24 from 60 neighborhood/village committees in Shanghai through stratified random sampling. A two-level logistic model was then fitted. Results The prevalence of sex behavior among unmarried adolescents was 12.9%, including 16.7% for males and 8.9% for females. According to the results of the two-level logistic model analysis, the protective factors against sex behavior among unmarried adolescents were education, student status, and the school sex-information score. The risk factors were age, gender, dangerous-behavior score, peer and media sex-information scores, sex pressure from peers, and contact with pornographic information. Conclusion Based on this multilevel analysis of the factors influencing sex behavior among unmarried adolescents in Shanghai, customized reproductive health interventions could be designed for adolescents.
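
    The two-level logistic model referred to in the abstract has the standard random-intercept form (notation assumed here, not taken from the paper):

```latex
\operatorname{logit}\,\Pr(y_{ij}=1 \mid \mathbf{x}_{ij}, u_j)
  = \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij} + u_j,
\qquad u_j \sim \mathcal{N}(0,\sigma_u^2),
```

    where $i$ indexes adolescents and $j$ the neighborhood/village committees; the random intercept $u_j$ with variance $\sigma_u^2$ captures between-committee variation that a single-level logistic regression would ignore.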

  20. Factorized molecular wave functions: Analysis of the nuclear factor

    Energy Technology Data Exchange (ETDEWEB)

    Lefebvre, R., E-mail: roland.lefebvre@u-psud.fr [Institut des Sciences Moléculaires d'Orsay, Bâtiment 350, UMR8214, CNRS-Université Paris-Sud, 91405 Orsay, France and Sorbonne Universités, UPMC Univ Paris 06, UFR925, F-75005 Paris (France)

    2015-06-07

    The exact factorization of molecular wave functions leads to nuclear factors which should be nodeless functions. We reconsider the case of vibrational perturbations in a diatomic species, a situation usually treated by combining Born-Oppenheimer products. It was shown [R. Lefebvre, J. Chem. Phys. 142, 074106 (2015)] that it is possible to derive, from the solutions of coupled equations, the form of the factorized function. By increasing artificially the interstate coupling in the usual approach, the adiabatic regime can be reached, whereby the wave function can be reduced to a single product. The nuclear factor of this product is determined by the lowest of the two potentials obtained by diagonalization of the potential matrix. By comparison with the nuclear wave function of the factorized scheme, it is shown that by a simple rectification, an agreement is obtained between the modified nodeless function and that of the adiabatic scheme.

  1. CMS Analysis School Model

    Energy Technology Data Exchange (ETDEWEB)

    Malik, S. [Nebraska U.; Shipsey, I. [Purdue U.; Cavanaugh, R. [Illinois U., Chicago; Bloom, K. [Nebraska U.; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.; D'Hondt, J. [Vrije U., Brussels; Klima, B. [Fermilab; Narain, M. [Brown U.; Palla, F. [INFN, Pisa; Rolandi, G. [CERN; Schörner-Sadenius, T. [DESY

    2014-01-01

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Center) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data taking, the nature of the training also evolved to include more analysis tools, software tutorials, and physics analysis. This effort, epitomized as CMSDAS, has proven key for new and young physicists to jump-start their contributions to the physics goals of CMS by looking for new physics in the collision data. With over 400 physicists trained in six CMSDAS schools around the globe, CMS is engaging the collaboration in its discovery potential and maximizing physics output. As a larger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  2. Comparing factor analytic models of the DSM-IV personality disorders.

    Science.gov (United States)

    Huprich, Steven K; Schmitt, Thomas A; Richard, David C S; Chelminski, Iwona; Zimmerman, Mark A

    2010-01-01

    There is little agreement about the latent factor structure of the Diagnostic and Statistical Manual of Mental Disorders (DSM) personality disorders (PDs). Factor analytic studies over the past 2 decades have yielded different results, in part reflecting differences in factor analytic technique, the measure used to assess the PDs, and the changing DSM criteria. In this study, we explore the latent factor structure of the DSM (4th ed.; IV) PDs in a sample of 1200 psychiatric outpatients evaluated with the Structured Interview for DSM-IV PDs (B. Pfohl, N. Blum, & M. Zimmerman, 1997). We first evaluated 2 a priori models of the PDs with confirmatory factor analysis (CFA), reflecting their inherent organization in the DSM-IV: a 3-factor model and a 10-factor model. Fit statistics did not suggest that these models yielded an adequate fit. We then evaluated the latent structure with exploratory factor analysis (EFA). Multiple solutions produced more statistically and theoretically reasonable results, as well as providing clinically useful findings. On the basis of fit statistics and theory, 3 models were evaluated further--the 4-, 5-, and 10-factor models. The 10-factor model, which did not resemble the 10-factor model of the CFA, was determined to be the strongest of all 3 models. Future research should use contemporary methods of evaluating factor analytic results in order to more thoroughly compare various factor solutions.

  3. Design of financial results on the basis of factor analysis

    Directory of Open Access Journals (Sweden)

    V. M. Sidorov

    2016-01-01

    Full Text Available The article presents approaches that characterize the modeling of financial results on the basis of factor analysis. A connection is noted between the development of enterprises and scientific and technical progress, and it is shown that the potential of an enterprise can be assessed through the modeling of its financial results. The relevance of this line of research is underlined against the background of the sanctions imposed on Russia. A controlling role is assigned to the effective use of resources, and it is argued that the modeling of financial results is a key element in the formation of enterprise profit. Factor analysis is justified for evaluating the activity of an organization, and the domain of application of the deterministic model is identified. The role of the financial system and its separate elements in financial and economic activity is shown. A connection is established between the model and the structure of the investigated object: modeling cannot proceed without determining the structure of the investigated object, because the structure characterizes the stable connections between its elements, and the components are able, in combination, to counteract external influences. To extend the model, the most significant factors are distinguished, and integrating connections are separated from incidental ones. Factors, through their interaction, can create a synergistic effect, extinguish one another, or neutralize each other, and can organize or disorganize the work of an organization. The modeling of objects is built on definite conditions, and particular aspects of an object are examined while setting aside the less significant elements of the system. The role of the empirical and theoretical levels of research in modeling the activity of an organization is shown. The author suggests using the deterministic model to determine functional connections between a resulting index and factor attributes.

  4. Factors Affecting Unemployment: A Cross Country Analysis

    Directory of Open Access Journals (Sweden)

    Aurangzeb

    2013-01-01

    Full Text Available This paper investigates macroeconomic determinants of unemployment for India, China, and Pakistan for the period 1980 to 2009. The investigation was conducted through cointegration, Granger causality, and regression analysis. The variables selected for the study are unemployment, inflation, gross domestic product, the exchange rate, and the population growth rate. The results of the regression analysis showed a significant impact of all the variables for all three countries. The GDP of Pakistan showed a positive relation with the unemployment rate, the reasons being the poverty level and the underutilization of foreign investment. The Granger causality results showed that bidirectional causality does not exist between any of the variables for any of the three countries. The cointegration results revealed that a long-term relationship does exist among the variables for all the models. It is recommended that the distribution of income in Pakistan be improved in order for growth to have a positive impact on the employment rate.

  5. [Cultural regionalization for Notopterygium incisum based on 3S technology platform. I. Evaluation for growth suitability for N. incisum based on ecological factors analysis by Maxent and ArcGIS model].

    Science.gov (United States)

    Sun, Hong-bing; Sun, Hui; Jiang, Shun-yuan; Zhou, Yi; Cao, Wen-long; Ji, Ming-chang; Zhy, Wen-tao; Yan, Han-jing

    2015-03-01

    Growth suitability as an assessment indicator for medicinal plant cultivation was proposed based on chemical quality determination and ecological factor analysis using the Maxent and ArcGIS models. Notopterygium incisum, an endangered Chinese medicinal plant, was analyzed as a case; its potential distribution areas at different suitability grades and a regionalization map were formulated based on growth suitability theory. The results showed that the most suitable habitat is Sichuan province, and more than 60% of the most suitable area was located in western Sichuan, such as the Aba and Ganzi prefectures. The results indicated that habitat altitude, average air temperature in September, and vegetation type were the dominant factors contributing to the grade of plant growth; precipitation and slope were the major factors contributing to notopterol accumulation in the underground parts, while isoimperatorin in the underground parts was negatively correlated with the precipitation and slope of the habitat. However, slope as a factor influencing chemical components appeared to be a spurious correlation. Therefore, there are clear differences between growth suitability and quality suitability for medicinal plants, which is helpful for further research and practice in cultivation regionalization, wild resource monitoring, and large-scale cultivation of traditional Chinese medicinal plants.

  6. Examining the factor structure and convergent and discriminant validity of the Levenson self-report psychopathy scale: is the two-factor model the best fitting model?

    Science.gov (United States)

    Salekin, Randall T; Chen, Debra R; Sellbom, Martin; Lester, Whitney S; MacDougall, Emily

    2014-07-01

    The Levenson, Kiehl, and Fitzpatrick (1995) Self-Report Psychopathy Scale (LSRP) was introduced in the mid-1990s as a brief measure of psychopathy and has since gained considerable popularity. Despite its attractiveness as a brief psychopathy tool, the LSRP has received limited research regarding its factor structure and convergent and discriminant validity. The present study examined the construct validity of the LSRP, testing both its factor structure and the convergent and discriminant validity. Using a community sample of 1,257 undergraduates (869 females; 378 males), we tested whether a 1-, 2-, or 3-factor model best fit the data and examined the links between the resultant factor structures and external correlates. Confirmatory factor analysis (CFA) findings revealed a 3-factor model best matched the data, followed by an adequate-fitting original 2-factor model. Next, comparisons were made regarding the convergent and discriminant validity of the competing 2- and 3-factor models. Findings showed the LSRP traditional primary and secondary factors had meaningful relations with extratest variables such as neuroticism, stress tolerance, and lack of empathy. The 3-factor model showed particular problems with the Callousness scale. These findings underscore the importance of examining not only CFA fit statistics but also convergent and discriminant validity when testing factor structure models. The current findings suggest that the 2-factor model might still be the best way to interpret the LSRP. (c) 2014 APA, all rights reserved.

  7. Factorial invariance in multilevel confirmatory factor analysis.

    Science.gov (United States)

    Ryu, Ehri

    2014-02-01

    This paper presents a procedure to test factorial invariance in multilevel confirmatory factor analysis. When the group membership is at level 2, multilevel factorial invariance can be tested by a simple extension of the standard procedure. However level-1 group membership raises problems which cannot be appropriately handled by the standard procedure, because the dependency between members of different level-1 groups is not appropriately taken into account. The procedure presented in this article provides a solution to this problem. This paper also shows Muthén's maximum likelihood (MUML) estimation for testing multilevel factorial invariance across level-1 groups as a viable alternative to maximum likelihood estimation. Testing multilevel factorial invariance across level-2 groups and testing multilevel factorial invariance across level-1 groups are illustrated using empirical examples. SAS macro and Mplus syntax are provided.

  8. Physiological Factors Analysis in Unpressurized Aircraft Cabins

    Science.gov (United States)

    Patrao, Luis; Zorro, Sara; Silva, Jorge

    2016-11-01

    Amateur and sports flight is an activity with growing numbers worldwide. However, the main cause of flight incidents and accidents is increasingly pilot error, for a number of reasons. Fatigue, sleep issues and hypoxia, among many others, are some that can be avoided, or, at least, mitigated. This article describes the analysis of psychological and physiological parameters during flight in unpressurized aircraft cabins. It relates cerebral oximetry and heart rate with altitude, as well as with flight phase. The study of those parameters might give clues on which variations represent a warning sign to the pilot, thus preventing incidents and accidents due to human factors. Results show that both cerebral oximetry and heart rate change along the flight and altitude in the alert pilot. The impaired pilot might not reveal these variations and, if this is detected, he can be warned in time.

  9. Practical Considerations for Using Exploratory Factor Analysis in Educational Research

    Science.gov (United States)

    Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.

    2013-01-01

    The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…

  10. Comparison of One-, Two-, and Three-Factor Models of Personal Resiliency Using the Resiliency Scales for Children and Adolescents

    Science.gov (United States)

    Prince-Embury, Sandra; Courville, Troy

    2008-01-01

    This article examines the scale structure of the Resiliency Scales for Children and Adolescents (RSCA). Confirmatory factor analysis reveals that a three-factor model is a better fit than one- or two-factor models for the normative sample. These findings lend support to the construct validity of the RSCA. The three-factor model is discussed as a…

  11. Studying Effective Factors on Corporate Entrepreneurship: Representing a Model

    Directory of Open Access Journals (Sweden)

    Maryam Soleimani

    2013-02-01

    Full Text Available The development and advancement of current organizations depend considerably on corporate entrepreneurship (CE) and its antecedents. The purpose of this survey is therefore to study factors affecting corporate entrepreneurship (personal entrepreneurial characteristics, human resource practices, organizational culture, and employees' satisfaction). The survey was conducted using a descriptive field methodology. The statistical population included managers and experts of the Hexa Consulting Engineers Company (Tehran, Iran), and the sample consisted of forty-seven of them. A questionnaire was the data collection tool. Data were collected in cross-sectional form in July-August 2011. Descriptive and inferential (Spearman correlation) statistical methods were used for data analysis. According to the results, there is a positive, significant relationship between each factor (personal entrepreneurial characteristics, human resource practices, organizational culture, and employees' satisfaction) and corporate entrepreneurship. In other words, the variables proposed as factors affecting corporate entrepreneurship were confirmed in the conceptual model of the survey.

  12. Systems pharmacology of the nerve growth factor pathway: use of a systems biology model for the identification of key drug targets using sensitivity analysis and the integration of physiology and pharmacology.

    Science.gov (United States)

    Benson, Neil; Matsuura, Tomomi; Smirnov, Sergey; Demin, Oleg; Jones, Hannah M; Dua, Pinky; van der Graaf, Piet H

    2013-04-06

    The nerve growth factor (NGF) pathway is of great interest as a potential source of drug targets, for example in the management of certain types of pain. However, selecting targets from this pathway either by intuition or by non-contextual measures is likely to be challenging. An alternative approach is to construct a mathematical model of the system and via sensitivity analysis rank order the targets in the known pathway, with respect to an endpoint such as the diphosphorylated extracellular signal-regulated kinase concentration in the nucleus. Using the published literature, a model was created and, via sensitivity analysis, it was concluded that, after NGF itself, tropomyosin receptor kinase A (TrkA) was one of the most sensitive druggable targets. This initial model was subsequently used to develop a further model incorporating physiological and pharmacological parameters. This allowed the exploration of the characteristics required for a successful hypothetical TrkA inhibitor. Using these systems models, we were able to identify candidates for the optimal drug targets in the known pathway. These conclusions were consistent with clinical and human genetic data. We also found that incorporating appropriate physiological context was essential to drawing accurate conclusions about important parameters such as the drug dose required to give pathway inhibition. Furthermore, the importance of the concentration of key reactants such as TrkA kinase means that appropriate contextual data are required before clear conclusions can be drawn. Such models could be of great utility in selecting optimal targets and in the clinical evaluation of novel drugs.
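
    Local sensitivity analysis of the kind described can be sketched on a toy two-species cascade. This is not the published NGF pathway model: the ODEs and parameter values below are invented for illustration. Each parameter is bumped by 1% and the relative change in a steady-state endpoint is recorded, which is how druggable targets can be rank-ordered.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy two-step cascade (invented, NOT the published NGF model):
# ligand L activates receptor R; active R activates downstream species P.
def cascade(t, y, k_act, k_deact, k_p, k_dp, L):
    R, P = y
    dR = k_act * L * (1.0 - R) - k_deact * R
    dP = k_p * R * (1.0 - P) - k_dp * P
    return [dR, dP]

base = dict(k_act=1.0, k_deact=0.5, k_p=2.0, k_dp=1.0, L=1.0)

def endpoint(params):
    sol = solve_ivp(cascade, (0.0, 50.0), [0.0, 0.0],
                    args=tuple(params.values()), rtol=1e-8, atol=1e-10)
    return sol.y[1, -1]                 # near-steady-state activation of P

# Normalized local sensitivities: relative endpoint change per 1% parameter bump.
base_val = endpoint(base)
sens = {}
for name in base:
    bumped = dict(base)
    bumped[name] *= 1.01
    sens[name] = (endpoint(bumped) - base_val) / (0.01 * base_val)
```

    Ranking the absolute values in `sens` mimics the target-prioritization step; in the real model the endpoint was nuclear diphosphorylated ERK and the parameter set was far larger.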

  13. Analysis of random factors of the self-education process

    Directory of Open Access Journals (Sweden)

    A. A. Solodov

    2016-01-01

    Full Text Available The aim of the study is the statistical description of the random factors of the self-education process, that is, the stage of continuous education in which the educational organization exerts no purposeful influence on the student, and the development of algorithms for estimating these factors. It is assumed that the motivations for self-education comprise intrinsic factors, which characterize the individual learner, and external factors, associated with a changing environment and emerging challenges. The phenomena available for analysis of the self-learning process (the observed data) are events relevant to this process, modeled as points on the time axis whose number and positions are assumed to be random. Each point can be associated with an unknown and unobserved factor (parameter), random or nonrandom, which affects the intensity with which points appear. The purpose is to describe the observed and unobserved data and to develop algorithms for their optimal estimation; such estimates can then be used to characterize an individual's self-study process or to compare different students. The statistical characteristics of the self-education process are analyzed with the mathematical apparatus of the theory of point random processes, which makes it possible to determine the key statistical characteristics of the unknown random factors of the process. The work constitutes a logically complete model that includes the following components: • a basic statistical model of the appearance of points in the self-education process in the form of a Poisson process, whose only characteristic is the intensity of occurrence of events; • methods for testing the hypothesis of a Poisson distribution of the observed events; • a generalization of the basic model to the case where the intensity function depends on time and on an unknown factor (random or nonrandom). Such factors are interpreted as
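    The Poisson-hypothesis check listed among the model components can be sketched with a dispersion test. The data and thresholds below are invented for illustration; under a homogeneous Poisson process the variance-to-mean ratio of counts in equal windows is near 1.

```python
# Minimal sketch (not the paper's algorithm) of checking the Poisson
# hypothesis for event counts: for counts k_i in n equal time windows,
# D = (n - 1) * var / mean is approximately chi-square with n - 1 degrees
# of freedom under a homogeneous Poisson process, so D / (n - 1) near 1
# is consistent with Poisson. Data below are invented.

def dispersion_index(counts):
    n = len(counts)
    mean = sum(counts) / n
    var = sum((k - mean) ** 2 for k in counts) / (n - 1)
    return (n - 1) * var / mean

def roughly_poisson(counts, lo=0.5, hi=2.0):
    """Crude screen: variance-to-mean ratio of counts near 1."""
    return lo < dispersion_index(counts) / (len(counts) - 1) < hi

counts = [2, 5, 4, 7, 3, 4, 6, 1]  # events per window (illustrative)
```

    A formal test would compare D with chi-square quantiles rather than the fixed bounds used here.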

  14. Multiscale Signal Analysis and Modeling

    CERN Document Server

    Zayed, Ahmed

    2013-01-01

    Multiscale Signal Analysis and Modeling presents recent advances in multiscale analysis and modeling using wavelets and other systems. This book also presents applications in digital signal processing using sampling theory and techniques from various function spaces, filter design, feature extraction and classification, signal and image representation/transmission, coding, nonparametric statistical signal processing, and statistical learning theory. This book also: Discusses recently developed signal modeling techniques, such as the multiscale method for complex time series modeling, multiscale positive density estimations, Bayesian Shrinkage Strategies, and algorithms for data adaptive statistics Introduces new sampling algorithms for multidimensional signal processing Provides comprehensive coverage of wavelets with presentations on waveform design and modeling, wavelet analysis of ECG signals and wavelet filters Reviews features extraction and classification algorithms for multiscale signal and image proce...

  15. Tensor-Dictionary Learning with Deep Kruskal-Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, Andrew J.; Pu, Yunchen; Sun, Yannan; Spell, Gregory; Carin, Lawrence

    2017-04-20

    We introduce new dictionary learning methods for tensor-variate data of any order. We represent each data item as a sum of Kruskal decomposed dictionary atoms within the framework of beta-process factor analysis (BPFA). Our model is nonparametric and can infer the tensor-rank of each dictionary atom. This Kruskal-Factor Analysis (KFA) is a natural generalization of BPFA. We also extend KFA to a deep convolutional setting and develop online learning methods. We test our approach on image processing and classification tasks achieving state of the art results for 2D & 3D inpainting and Caltech 101. The experiments also show that atom-rank impacts both overcompleteness and sparsity.

  16. ANALYSIS OF FACTORS WHICH AFFECTING THE ECONOMIC GROWTH

    Directory of Open Access Journals (Sweden)

    Suparna Wijaya

    2017-03-01

    Full Text Available High economic growth and a sustainable growth process are the main conditions for the sustainability of a country's economic development, and they also serve as measures of the success of the country's economy. The factors tested in this study are economic and non-economic factors that affect economic development. The study aims to explain the factors that influence the Indonesian macroeconomy, using a linear regression modeling approach. The analysis showed that Tax Amnesty, the exchange rate, inflation, and the interest rate jointly account for 77.6% of the variation in economic growth, while the remaining 22.4% is influenced by other variables not observed in this study. Keywords: tax amnesty, exchange rates, inflation, SBI and economic growth
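    A figure like the 77.6% above is the coefficient of determination of a multiple linear regression. The sketch below, with made-up data and hypothetical predictors, shows how such an R² is computed from ordinary least squares.

```python
# Hedged sketch of the kind of regression behind an R^2 of 77.6%: ordinary
# least squares with several predictors. Data are invented; rows of X carry
# a leading 1 for the intercept.

def ols(X, y):
    """Solve the normal equations (X'X) b = X'y by Gaussian elimination."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)] for r in range(k)]
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(k)]
    for col in range(k):                              # elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

def r_squared(X, y, beta):
    yhat = [sum(x * b for x, b in zip(row, beta)) for row in X]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

X = [[1, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1], [1, 2, 1]]  # intercept, x1, x2
y = [1, 3, 0, 2, 4]                                          # exactly 1 + 2*x1 - x2
beta = ols(X, y)
```

    With noise-free data R² is exactly 1; with real data the unexplained share (here, 22.4%) is ss_res / ss_tot.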

  17. ROCK PROPERTIES MODEL ANALYSIS MODEL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Clinton Lum

    2002-02-04

    The purpose of this Analysis and Model Report (AMR) is to document Rock Properties Model (RPM) 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties models are intended principally for use as input to numerical physical-process modeling, such as of ground-water flow and/or radionuclide transport. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. This work was conducted in accordance with the following planning documents: WA-0344, ''3-D Rock Properties Modeling for FY 1998'' (SNL 1997); WA-0358, ''3-D Rock Properties Modeling for FY 1999'' (SNL 1999); and the technical development plan, Rock Properties Model Version 3.1 (CRWMS M&O 1999c). The Interim Change Notices (ICNs), ICN 02 and ICN 03, of this AMR were prepared as part of activities being conducted under the Technical Work Plan, TWP-NBS-GS-000003, ''Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01'' (CRWMS M&O 2000b). The purpose of ICN 03 is to record changes in data input status due to data qualification and verification activities. These work plans describe the scope, objectives, tasks, methodology, and implementing procedures for model construction. The work scope for this activity consists of the following: (1) Conversion of the input data (laboratory measured porosity data, x-ray diffraction mineralogy, petrophysical calculations of bound water, and petrophysical calculations of porosity) for each borehole into stratigraphic coordinates; (2) Re-sampling and merging of data sets; (3

  18. The Italian version of the Junior Eysenck Personality Questionnaire: a confirmatory factor analysis.

    Science.gov (United States)

    Vidotto, Giulio; Cioffi, Raffaele; Saggino, Aristide; Wilson, Glenn

    2008-12-01

    An experimental version of the Italian Junior Eysenck Personality Questionnaire with a 5-point scale was administered to a group of 1,000 high school students, 200 within each age group from 11 to 15 years. Following a previous exploratory factor analysis, which yielded a fourth factor in addition to the original three, the aim of the present research was to study the factor structure of the Italian version using confirmatory factor analysis. Three models were tested, a three-factor orthogonal model, a three-factor oblique model, and a four-factor model based on an a priori separation of extraversion items into two sets. None of the considered models converged satisfactorily. An interpretation of the results was proposed.
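    The kind of model comparison described rests on the model-implied covariance matrix. As a minimal sketch (a one-factor model with invented loadings, not the Junior EPQ structure), the implied matrix Sigma = lambda * lambda' + diag(psi) can be compared with an observed matrix via the root mean square residual:

```python
# Illustrative sketch of the core computation in confirmatory factor
# analysis: a hypothesised one-factor model implies the covariance matrix
# Sigma = lambda * lambda' + diag(psi), and fit can be screened by the root
# mean square residual (RMR) against the observed matrix. The loadings and
# "observed" correlations below are invented, not the Junior EPQ data.

def implied_cov(loadings, uniquenesses):
    p = len(loadings)
    return [[loadings[i] * loadings[j] + (uniquenesses[i] if i == j else 0.0)
             for j in range(p)] for i in range(p)]

def rmr(observed, implied):
    p = len(observed)
    terms = [(observed[i][j] - implied[i][j]) ** 2
             for i in range(p) for j in range(i + 1)]
    return (sum(terms) / len(terms)) ** 0.5

loadings = [0.8, 0.7, 0.6]
uniquenesses = [1.0 - l * l for l in loadings]   # keeps implied variances at 1
observed = [[1.00, 0.56, 0.48],
            [0.56, 1.00, 0.42],
            [0.48, 0.42, 1.00]]                  # exact one-factor structure
fit = rmr(observed, implied_cov(loadings, uniquenesses))
```

    A full CFA estimates the loadings by maximizing fit rather than fixing them, and judges models with indices such as RMSEA and CFI; this sketch only shows what "fit" measures.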

  19. Constrained three-mode factor analysis as a tool for parameter estimation with second-order instrumental data

    NARCIS (Netherlands)

    Kiers, Henk A.L.; Smilde, Age K.

    1998-01-01

    Three-mode factor analysis models are often used in exploratory analysis of three-way data. However, in some situations it is a priori known that a particular constrained three-mode factor analysis (C3MFA) model describes an underlying process exactly. In such situations, fitting a C3MFA model to a

  20. How Do Executive Functions Fit with the Cattell-Horn-Carroll Model? Some Evidence from a Joint Factor Analysis of the Delis-Kaplan Executive Function System and the Woodcock-Johnson III Tests of Cognitive Abilities

    Science.gov (United States)

    Floyd, Randy G.; Bergeron, Renee; Hamilton, Gloria; Parra, Gilbert R.

    2010-01-01

    This study investigated the relations among executive functions and cognitive abilities through a joint exploratory factor analysis and joint confirmatory factor analysis of 25 test scores from the Delis-Kaplan Executive Function System and the Woodcock-Johnson III Tests of Cognitive Abilities. Participants were 100 children and adolescents…

  1. Confirmatory factor analysis of the Sport Organizational Effectiveness Scale.

    Science.gov (United States)

    Karteroliotis, Konstantinos; Papadimitriou, Dimitra

    2004-08-01

    The purpose of this study was to examine the factorial validity of the 5-factor model of sport organizational effectiveness developed by Papadimitriou and Taylor. This questionnaire has 33 items which assess five composite effectiveness dimensions pertinent to the operation of sport organizations: calibre of the board and external liaisons, interest in athletes, internal procedures, long term planning, and sport science support. The multiple constituency approach was used as a theoretical framework for developing this scale. Data were obtained by questionnaire from respondents affiliated with 20 Greek national sport organizations. Analysis indicated that the 5-factor model of effectiveness is workable in assessing the organizational performance of nonprofit sport organizations. The application of the multiple constituency approach in studying sport organizational effectiveness was also suggested.

  2. Confirmatory Factor Analysis of the Social Interest Index

    Directory of Open Access Journals (Sweden)

    Gary K. Leak

    2011-10-01

    Full Text Available Social interest was Alfred Adler’s most important personality trait, and it reflects one’s genuine concern for the welfare of all individuals. Several measures of social interest are available, and the Social Interest Index (SII is one of the most popular in current use. This study is the first to report the results of a confirmatory factor analysis of the SII. Using college students, three models were tested in an effort to find support for the factorial validity of this scale. All analyses showed a poor fit between the theoretical model and scale items. The results paint a fairly negative picture of the factor structure of this important scale.

  3. Evidence for a General Factor Model of ADHD in Adults

    Science.gov (United States)

    Gibbins, Christopher; Toplak, Maggie E.; Flora, David B.; Weiss, Margaret D.; Tannock, Rosemary

    2012-01-01

    Objective: To examine factor structures of "Diagnostic and Statistical Manual of Mental Disorders" (4th ed.) symptoms of ADHD in adults. Method: Two sets of models were tested: (a) models with inattention and hyperactivity/impulsivity as separate but correlated latent constructs and (b) hierarchical general factor models with a general factor for…

  4. Life sciences domain analysis model.

    Science.gov (United States)

    Freimuth, Robert R; Freund, Elaine T; Schick, Lisa; Sharma, Mukesh K; Stafford, Grace A; Suzek, Baris E; Hernandez, Joyce; Hipp, Jason; Kelley, Jenny M; Rokicki, Konrad; Pan, Sue; Buckler, Andrew; Stokes, Todd H; Fernandez, Anna; Fore, Ian; Buetow, Kenneth H; Klemm, Juli D

    2012-01-01

    Meaningful exchange of information is a fundamental challenge in collaborative biomedical research. To help address this, the authors developed the Life Sciences Domain Analysis Model (LS DAM), an information model that provides a framework for communication among domain experts and technical teams developing information systems to support biomedical research. The LS DAM is harmonized with the Biomedical Research Integrated Domain Group (BRIDG) model of protocol-driven clinical research. Together, these models can facilitate data exchange for translational research. The content of the LS DAM was driven by analysis of life sciences and translational research scenarios and the concepts in the model are derived from existing information models, reference models and data exchange formats. The model is represented in the Unified Modeling Language and uses ISO 21090 data types. The LS DAM v2.2.1 is comprised of 130 classes and covers several core areas including Experiment, Molecular Biology, Molecular Databases and Specimen. Nearly half of these classes originate from the BRIDG model, emphasizing the semantic harmonization between these models. Validation of the LS DAM against independently derived information models, research scenarios and reference databases supports its general applicability to represent life sciences research. The LS DAM provides unambiguous definitions for concepts required to describe life sciences research. The processes established to achieve consensus among domain experts will be applied in future iterations and may be broadly applicable to other standardization efforts. The LS DAM provides common semantics for life sciences research. Through harmonization with BRIDG, it promotes interoperability in translational science.

  5. Factor analysis and multiple regression between topography and precipitation on Jeju Island, Korea

    Science.gov (United States)

    Um, Myoung-Jin; Yun, Hyeseon; Jeong, Chang-Sam; Heo, Jun-Haeng

    2011-11-01

    In this study, new factors that influence precipitation were extracted from geographic variables using factor analysis, which allow for an accurate estimation of orographic precipitation. Correlation analysis was also used to examine the relationship between nine topographic variables from digital elevation models (DEMs) and the precipitation in Jeju Island. In addition, a spatial analysis was performed in order to verify the validity of the regression model. From the results of the correlation analysis, it was found that all of the topographic variables had a positive correlation with the precipitation. The relations between the variables also changed in accordance with a change in the precipitation duration. However, upon examining the correlation matrix, no significant relationship between the latitude and the aspect was found. According to the factor analysis, eight topographic variables (latitude being the exception) were found to have a direct influence on the precipitation. Three factors were then extracted from the eight topographic variables. By directly comparing the multiple regression model with the factors (model 1) to the multiple regression model with the topographic variables (model 3), it was found that model 1 did not violate the limits of statistical significance and multicollinearity. As such, model 1 was considered to be appropriate for estimating the precipitation when taking into account the topography. In the study of model 1, the multiple regression model using factor analysis was found to be the best method for estimating the orographic precipitation on Jeju Island.
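    The factor-extraction step of such an analysis can be sketched with power iteration on a correlation matrix. The matrix below is invented, not the Jeju Island topographic data:

```python
# Hedged sketch of the factor-extraction step: the leading principal
# component of a correlation matrix found by power iteration. The matrix
# is invented, not the Jeju Island topographic data.

def power_iteration(R, iters=200):
    """Leading eigenvalue and unit eigenvector of a symmetric matrix R."""
    p = len(R)
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(R[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    eigval = sum(v[i] * sum(R[i][j] * v[j] for j in range(p)) for i in range(p))
    return eigval, v

R = [[1.0, 0.6, 0.5],
     [0.6, 1.0, 0.4],
     [0.5, 0.4, 1.0]]       # pairwise correlations of three variables
lam, comp = power_iteration(R)
```

    An eigenvalue above 1 (Kaiser's criterion) marks a component worth retaining; repeating on the deflated matrix R - lam*vv' yields further factors, which can then enter a regression as in model 1 above.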

  6. Operational modal analysis by updating autoregressive model

    Science.gov (United States)

    Vu, V. H.; Thomas, M.; Lakis, A. A.; Marcouiller, L.

    2011-04-01

    This paper presents improvements of a multivariable autoregressive (AR) model for applications in operational modal analysis considering simultaneously the temporal response data of multi-channel measurements. The parameters are estimated by using the least squares method via the implementation of the QR factorization. A new noise rate-based factor called the Noise rate Order Factor (NOF) is introduced for use in the effective selection of model order and noise rate estimation. For the selection of structural modes, an orderwise criterion called the Order Modal Assurance Criterion (OMAC) is used, based on the correlation of mode shapes computed from two successive orders. Specifically, the algorithm is updated with respect to model order from a small value to produce a cost-effective computation. Furthermore, the confidence intervals of each natural frequency, damping ratio and mode shapes are also computed and evaluated with respect to model order and noise rate. This method is thus very effective for identifying the modal parameters in case of ambient vibrations dealing with modern output-only modal analysis. Simulations and discussions on a steel plate structure are presented, and the experimental results show good agreement with the finite element analysis.
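    The least-squares estimation at the heart of the method can be illustrated on a scalar AR(2) model; the multivariable, QR-based estimator in the paper solves the same least-squares problem more stably at high orders. The data below are synthetic and noise-free:

```python
# Minimal sketch (not the paper's multivariable algorithm): fitting a scalar
# AR(2) model by least squares via the 2x2 normal equations.

def fit_ar2(x):
    """Estimate a1, a2 in x[t] = a1*x[t-1] + a2*x[t-2] + e[t]."""
    rows = [(x[t - 1], x[t - 2]) for t in range(2, len(x))]
    y = [x[t] for t in range(2, len(x))]
    s11 = sum(r0 * r0 for r0, _ in rows)
    s12 = sum(r0 * r1 for r0, r1 in rows)
    s22 = sum(r1 * r1 for _, r1 in rows)
    b1 = sum(r0 * yi for (r0, _), yi in zip(rows, y))
    b2 = sum(r1 * yi for (_, r1), yi in zip(rows, y))
    det = s11 * s22 - s12 * s12
    return (s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det

# Synthetic, noise-free data from a known AR(2) process, so the least-squares
# estimates recover the coefficients exactly (up to rounding).
x = [1.0, 0.5]
for _ in range(50):
    x.append(1.2 * x[-1] - 0.5 * x[-2])
a1, a2 = fit_ar2(x)
```

    From the fitted coefficients, the roots of z² - a1·z - a2 = 0 give the frequency and damping of the decaying oscillation, which is the quantity output-only modal analysis ultimately extracts.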

  7. Exploring Factor Model Parameters across Continuous Variables with Local Structural Equation Models.

    Science.gov (United States)

    Hildebrandt, Andrea; Lüdtke, Oliver; Robitzsch, Alexander; Sommer, Christopher; Wilhelm, Oliver

    2016-01-01

    Using an empirical data set, we investigated variation in factor model parameters across a continuous moderator variable and demonstrated three modeling approaches: multiple-group mean and covariance structure (MGMCS) analyses, local structural equation modeling (LSEM), and moderated factor analysis (MFA). We focused on how to study variation in factor model parameters as a function of continuous variables such as age, socioeconomic status, ability levels, acculturation, and so forth. Specifically, we formalized the LSEM approach in detail as compared with previous work and investigated its statistical properties with an analytical derivation and a simulation study. We also provide code for the easy implementation of LSEM. The illustration of methods was based on cross-sectional cognitive ability data from individuals ranging in age from 4 to 23 years. Variations in factor loadings across age were examined with regard to the age differentiation hypothesis. LSEM and MFA converged with respect to the conclusions. When there was a broad age range within groups and varying relations between the indicator variables and the common factor across age, MGMCS produced distorted parameter estimates. We discuss the pros of LSEM compared with MFA and recommend using the two tools as complementary approaches for investigating moderation in factor model parameters.
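    The core idea of LSEM, replacing discrete groups with kernel weights around each focal value of the moderator, can be sketched as follows. The synthetic "loading" that declines with age and the Gaussian bandwidth are illustrative assumptions:

```python
# Sketch of the locally weighted idea behind LSEM: each focal value of a
# continuous moderator (age) gets a kernel-weighted estimate instead of a
# discrete-group estimate. Here a simple weighted mean stands in for the
# full factor-model fit; data and bandwidth are invented.
import math

def local_weighted_mean(ages, values, focal_age, bandwidth=2.0):
    w = [math.exp(-0.5 * ((a - focal_age) / bandwidth) ** 2) for a in ages]
    return sum(wi * v for wi, v in zip(w, values)) / sum(w)

# Synthetic "loading" that declines with age, as the age differentiation
# hypothesis predicts.
ages = list(range(4, 24))
loadings = [0.9 - 0.02 * (a - 4) for a in ages]
young = local_weighted_mean(ages, loadings, focal_age=6)
old = local_weighted_mean(ages, loadings, focal_age=20)
```

    In actual LSEM the weighted quantity is the whole structural equation model refit at each focal age, not a single mean, but the weighting scheme is the same.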

  8. Taking the Error Term of the Factor Model into Account: The Factor Score Predictor Interval

    Science.gov (United States)

    Beauducel, Andre

    2013-01-01

    The problem of factor score indeterminacy implies that the factor and the error scores cannot be completely disentangled in the factor model. It is therefore proposed to compute Harman's factor score predictor that contains an additive combination of factor and error variance. This additive combination is discussed in the framework of classical…

  9. A multilevel analysis of the demands-control model: Is stress at work determined by factors at the group level or the individual level?

    NARCIS (Netherlands)

    Van Yperen, N.W.; Snijders, T.A.B.

    2000-01-01

    This study explored the extent to which negative health-related outcomes are associated with differences between work groups and with differences between individuals within work groups using R. A. Karasek's (1979) demands-control model. The sample consisted of 260 employees in 31 working groups of a

  10. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se

  11. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  12. Advances in behavioral genetics modeling using Mplus: applications of factor mixture modeling to twin data.

    Science.gov (United States)

    Muthén, Bengt; Asparouhov, Tihomir; Rebollo, Irene

    2006-06-01

    This article discusses new latent variable techniques developed by the authors. As an illustration, a new factor mixture model is applied to the monozygotic-dizygotic twin analysis of binary items measuring alcohol-use disorder. In this model, heritability is simultaneously studied with respect to latent class membership and within-class severity dimensions. Different latent classes of individuals are allowed to have different heritability for the severity dimensions. The factor mixture approach appears to have great potential for the genetic analyses of heterogeneous populations. Generalizations for longitudinal data are also outlined.
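    The latent-class machinery underlying a factor mixture model can be illustrated with the simplest case, a one-dimensional two-component Gaussian mixture fit by EM. A real factor mixture model adds within-class factor (severity) dimensions; the data and start values here are invented:

```python
# Toy sketch of the latent-class machinery in a factor mixture model: a
# one-dimensional two-component Gaussian mixture fit by EM. The data, the
# fixed unit standard deviation, and the start values are all invented.
import math

def em_two_gaussians(x, iters=100):
    pi, mu1, mu2, sd = 0.5, min(x), max(x), 1.0
    for _ in range(iters):
        # E-step: responsibility of component 1 for each observation.
        r = []
        for xi in x:
            p1 = pi * math.exp(-0.5 * ((xi - mu1) / sd) ** 2)
            p2 = (1.0 - pi) * math.exp(-0.5 * ((xi - mu2) / sd) ** 2)
            r.append(p1 / (p1 + p2))
        # M-step: update class weight and means (sd kept fixed for brevity).
        pi = sum(r) / len(x)
        mu1 = sum(ri * xi for ri, xi in zip(r, x)) / sum(r)
        mu2 = sum((1.0 - ri) * xi for ri, xi in zip(r, x)) / (len(x) - sum(r))
    return pi, mu1, mu2

x = [0.1, -0.2, 0.3, 0.0, 5.1, 4.8, 5.3, 5.0]   # two well-separated "classes"
pi, mu1, mu2 = em_two_gaussians(x)
```

    A factor mixture model applies the same E/M alternation, but each class's density comes from a within-class factor model rather than a single Gaussian mean.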

  13. Factor Analysis on the Measuring Characteristic Parameters of the Teaching Model Effect's Primary Component

    Institute of Scientific and Technical Information of China (English)

    姚聪

    2015-01-01

    To simplify the latent dimensions of the observed variables of multiple criterion-referenced group tests within a teaching model, it is necessary to clarify the meaning of the abstract concepts of factors and covariance structure, establish theoretical hypotheses, and examine the conditions for applying the factor analysis method. From the large set of measurement parameters obtained, a small number of principal components is sought that explains the factors underlying the many observed variables. The number of factors is determined by the magnitude of the eigenvalues; after orthogonal rotation yields new parameters, the simplest factor-structure equation and an accompanying theoretical account are established. These results meet practical teaching needs and provide additional feedback and reference material for the construction of tests and scales.

  14. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM currently is a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on the overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that reduce to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  15. The application of spectral distribution of product of two random matrices in the factor analysis

    Institute of Scientific and Technical Information of China (English)

    Bai-suo JIN; Bai-qi MIAO; Wu-yi YE; Zhen-xiang WU

    2007-01-01

    In the factor analysis model with large cross-section and time-series dimensions, we propose a new method to estimate the number of factors. Specifically, if the idiosyncratic terms satisfy a linear time series model, the estimators of the parameters can be obtained in the time series model. The theoretical properties of the estimators are also explored. A simulation study and an empirical analysis are conducted.
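    One common, simple estimator of the number of factors, not the spectral-distribution method of this paper, picks the largest gap in the eigenvalue spectrum via the eigenvalue ratio:

```python
# Hedged sketch of one simple way to estimate the number of factors from a
# sample eigenvalue spectrum: the eigenvalue-ratio criterion (choose k
# maximizing lam_k / lam_{k+1}). This is not the spectral-distribution
# estimator of the paper; the eigenvalues below are invented.

def num_factors_by_ratio(eigvals):
    """eigvals sorted in decreasing order; returns the estimated factor count."""
    ratios = [eigvals[i] / eigvals[i + 1] for i in range(len(eigvals) - 1)]
    return 1 + ratios.index(max(ratios))

# Two dominant "factor" eigenvalues followed by a flat noise bulk.
eigvals = [9.1, 4.3, 0.9, 0.8, 0.7, 0.6]
k = num_factors_by_ratio(eigvals)
```

    Random-matrix approaches like the paper's instead model the shape of the noise bulk itself, which matters when cross-section and time dimensions grow together.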


  17. Factors influencing crime rates: an econometric analysis approach

    Science.gov (United States)

    Bothos, John M. A.; Thomopoulos, Stelios C. A.

    2016-05-01

    The scope of the present study is to research the dynamics that determine the commission of crimes in US society. Our study is part of a model we are developing to understand urban crime dynamics and to enhance citizens' "perception of security" in large urban environments. The main targets of our research are to highlight the dependence of crime rates on certain social and economic factors and on basic elements of state anticrime policies. In conducting our research, we use as guides previous relevant studies on crime dependence that have been performed with similar quantitative analyses in mind, regarding the dependence of crime on certain social and economic factors using statistics and econometric modelling. Our first approach consists of conceptual state space dynamic cross-sectional econometric models that incorporate a feedback loop describing crime as a feedback process. In order to define the model variables dynamically, we use statistical analysis of crime records and of records about social and economic conditions and policing characteristics (like police force and policing results, such as crime arrests) to determine their influence as independent variables on crime, the dependent variable of our model. The econometric models we apply in this first approach are an exponential log-linear model and a logit model. In a second approach, we try to study the evolution of violent crime through time in the US, independently as an autonomous social phenomenon, using autoregressive and moving-average time-series econometric models. Our findings show that there are certain social and economic characteristics that affect the formation of crime rates in the US, either positively or negatively. Furthermore, the results of our time-series econometric modelling show that violent crime, viewed solely and independently as a social phenomenon, correlates with previous years' crime rates and depends on the social and economic environment's conditions during previous years.
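    The logit model mentioned in the first approach can be sketched with a plain gradient-ascent fit. The single predictor and the binary outcomes below are invented and do not reproduce the study's covariates or estimation procedure:

```python
# Sketch of the logit-model idea: logistic regression fit by plain gradient
# ascent on the log-likelihood. Predictor and outcomes are invented.
import math

def fit_logit(xs, ys, lr=0.1, iters=5000):
    """Fit y ~ Bernoulli(sigmoid(b0 + b1*x)) by gradient ascent."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = 0.0
        for xv, yv in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xv)))
            g0 += yv - p            # gradient w.r.t. intercept
            g1 += (yv - p) * xv     # gradient w.r.t. slope
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]   # e.g. a hypothetical stressor
ys = [0, 0, 1, 0, 1, 0, 1, 1]                   # higher x, higher probability
b0, b1 = fit_logit(xs, ys)
```

    A positive fitted slope means the modeled probability rises with the predictor, which is the qualitative statement such studies report for crime-inducing factors.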

  18. A multilevel analysis of the demands--control model: is stress at work determined by factors at the group level or the individual level?

    Science.gov (United States)

    Van Yperen, N W; Snijders, T A

    2000-01-01

    This study explored the extent to which negative health-related outcomes are associated with differences between work groups and with differences between individuals within work groups using R.A. Karasek's (1979) demands-control model. The sample consisted of 260 employees in 31 working groups of a national bank in the Netherlands. Results suggest that job demands and job control should be conceptualized as having both group- and individual-level foundations. Support for Karasek's demands-control model was found only when these variables were split into the 2 parts, reflecting shared perceptions and employees' subjective assessment, respectively. One of the most appealing practical implications is that absence rates among homogeneous work groups can be reduced by enhancing actual control on the job.

  19. Developing a Business Model for a Podcast Streaming Service : Case Study Analysis of Online Streaming Businesses and Identification of Success Factors

    OpenAIRE

    Schmitz, Simon

    2015-01-01

    This study examines characteristics of successful online streaming businesses and proposes with the new gained insights a scalable business model solution for a podcast streaming service. As podcasts have just recently regained popularity after ten years of existence, there is an open opportunity to capitalize on the growing market of listeners. With the emergence of new formats and high-end studio productions such as ‘Serial’, the most accessed podcast to date, the podcast industry is becomi...

  20. Multi-model Integrated Analysis and Control Strategies of Human Factors in Aviation Accidents

    Institute of Scientific and Technical Information of China (English)

    徐璇; 王华伟; 王祥

    2016-01-01

    Human factors are the leading causes of modern aviation accidents, so analyzing their characteristics and proposing preventive measures helps to improve flight safety and achieve intrinsic safety. In this paper, a multi-model integrated analysis and control process is put forward that combines fault tree analysis (FTA) and the Human Factors Analysis and Classification System (HFACS) to find the direct and deep causes of accidents and to fully recognize the human factors, accident mechanisms, and evolutionary processes involved. A quantitative method is also applied to identify key factors, so that targeted strategies can be put forward to prevent similar accidents caused by human factors. In addition, an associative hazard analysis is used to uncover the potential unsafe factors behind accidents, realizing accident prevention in an active way.

  1. Factor Model Forecasts of Exchange Rates

    OpenAIRE

    Charles Engel; Nelson C. Mark; Kenneth D. West

    2012-01-01

    We construct factors from a cross section of exchange rates and use the idiosyncratic deviations from the factors to forecast. In a stylized data generating process, we show that such forecasts can be effective even if there is essentially no serial correlation in the univariate exchange rate processes. We apply the technique to a panel of bilateral U.S. dollar rates against 17 OECD countries. We forecast using factors, and using factors combined with any of fundamentals suggested by Taylor r...

  2. PENGUJIAN FAMA-FRENCH THREE-FACTOR MODEL DI INDONESIA

    Directory of Open Access Journals (Sweden)

    Damar Hardianto

    2017-03-01

    Full Text Available This study empirically examined the Fama-French three-factor model of stock returns for Indonesia over the period 2000-2004. We found evidence for pervasive market, size, and book-to-market factors in Indonesian stock returns. We found that cross-sectional mean returns were explained by exposures to these three factors, and not by the market factor alone. The empirical results were reasonably consistent with the Fama-French three-factor model.
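The three-factor model tested above is an OLS regression of a stock's excess return on market (MKT), size (SMB) and value (HML) factor returns. A minimal sketch in NumPy on synthetic data (the factor series, loadings and noise level are invented for illustration, not the study's estimates):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # number of (e.g. monthly) observations

# Simulated factor returns: market excess (MKT), size (SMB), value (HML).
mkt = rng.normal(0.005, 0.04, n)
smb = rng.normal(0.002, 0.03, n)
hml = rng.normal(0.003, 0.03, n)

# A stock's excess return generated with known loadings plus noise:
# r - rf = alpha + b*MKT + s*SMB + h*HML + e
true_betas = np.array([0.0, 1.1, 0.4, 0.7])  # alpha, b, s, h
X = np.column_stack([np.ones(n), mkt, smb, hml])
r_excess = X @ true_betas + rng.normal(0, 0.02, n)

# OLS estimate of alpha and the three factor loadings.
betas, *_ = np.linalg.lstsq(X, r_excess, rcond=None)
```

A near-zero estimated alpha together with significant loadings on all three factors is the pattern the study reports: the three factors, not the market factor alone, explain cross-sectional mean returns.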

  3. A confirmatory factor analysis of the Self-Directed Learning Readiness Scale.

    Science.gov (United States)

    Williams, Brett; Brown, Ted

    2013-12-01

    The Self-Directed Learning Readiness Scale measures readiness for self-directed learning among undergraduate healthcare students. While several exploratory factor analyses and one confirmatory factor analysis have examined the psychometric properties of the Self-Directed Learning Readiness Scale, questions have been raised regarding the underlying latent constructs being measured. The objective of this study was to determine the best-fitting Self-Directed Learning Readiness Scale factorial structure among three models published in the literature. Data from the three-factor 40-item Self-Directed Learning Readiness Scale completed by 233 undergraduate paramedic students from four Australian universities (response rate of 26%) were analyzed using maximum likelihood confirmatory factor analysis. Comparison of model fit from the 40-item version was undertaken with the previously documented four-factor 36-item and three-factor 29-item Self-Directed Learning Readiness Scales. The model fit indices of the three one-factor congeneric models with maximum likelihood analysis demonstrate that the 40-item Self-Directed Learning Readiness Scale does not fit the data well. The best fitting model was the four-factor 36-item Self-Directed Learning Readiness Scale followed by the three-factor 29-item models. The confirmatory factor analysis results did not support the overall construct validity of the original 40-item Self-Directed Learning Readiness Scale. © 2013 Wiley Publishing Asia Pty Ltd.

  4. Improvement and Affecting-Factor Analysis of a Steam Flooding Production Model

    Institute of Scientific and Technical Information of China (English)

    范英才; 赵杰

    2011-01-01

    Jones' steam flooding production calculation model is improved and, on this basis, oil production performance is analyzed and predicted for the Qi-40 block in Liaohe Oilfield. The results show: (1) through history matching, the results predicted by the Jones model agree well with the production history, so the model can be used to predict trends in production performance; (2) the oil production rate increases as the steam injection rate and steam quality increase; (3) a larger initial oil saturation markedly increases the oil production rate.

  5. Gene and protein analysis of brain derived neurotrophic factor expression in relation to neurological recovery induced by an enriched environment in a rat stroke model.

    Science.gov (United States)

    Hirata, Kenji; Kuge, Yuji; Yokota, Chiaki; Harada, Akina; Kokame, Koichi; Inoue, Hiroyasu; Kawashima, Hidekazu; Hanzawa, Hiroko; Shono, Yuji; Saji, Hideo; Minematsu, Kazuo; Tamaki, Nagara

    2011-05-20

    Although an enriched environment enhances functional recovery after ischemic stroke, the mechanism underlying this effect remains unclear. We previously reported that brain derived neurotrophic factor (BDNF) gene expression decreased in rats housed in an enriched environment for 4 weeks compared to those housed in a standard cage for the same period. To further clarify the relationship between the decrease in BDNF and functional recovery, we investigated the effects of differential 2-week housing conditions on the mRNA of BDNF and protein levels of proBDNF and mature BDNF (matBDNF). After transient occlusion of the right middle cerebral artery of male Sprague-Dawley rats, we divided the rats into two groups: (1) an enriched group housed multiply in large cages equipped with toys, and (2) a standard group housed alone in small cages without toys. Behavioral tests before and after 2-week differential housing showed better neurological recovery in the enriched group than in the standard group. Synaptophysin immunostaining demonstrated that the density of synapses in the peri-infarct area was increased in the enriched group compared to the standard group, while infarct volumes were not significantly different. Real-time reverse transcription polymerase chain reaction, Western blotting and immunostaining all revealed no significant difference between the groups. The present results suggest that functional recovery cannot be ascribed to an increase in matBDNF or a decrease in proBDNF but rather to other underlying mechanisms.

  6. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis : A Cross-National Investigation of Schwartz Values

    NARCIS (Netherlands)

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the mo

  7. Developing Multidimensional Likert Scales Using Item Factor Analysis: The Case of Four-Point Items

    Science.gov (United States)

    Asún, Rodrigo A.; Rdz-Navarro, Karina; Alvarado, Jesús M.

    2016-01-01

    This study compares the performance of two approaches in analysing four-point Likert rating scales with a factorial model: the classical factor analysis (FA) and the item factor analysis (IFA). For FA, maximum likelihood and weighted least squares estimations using Pearson correlation matrices among items are compared. For IFA, diagonally weighted…

  8. Phenotypic factor analysis of psychopathology reveals a new body-related transdiagnostic factor.

    Science.gov (United States)

    Pezzoli, Patrizia; Antfolk, Jan; Santtila, Pekka

    2017-01-01

    Comorbidity challenges the notion of mental disorders as discrete categories. An increasing body of literature shows that symptoms cut across traditional diagnostic boundaries and interact in shaping the latent structure of psychopathology. Using exploratory and confirmatory factor analysis, we reveal the latent sources of covariation among nine measures of psychopathological functioning in a population-based sample of 13024 Finnish twins and their siblings. By implementing unidimensional, multidimensional, second-order, and bifactor models, we illustrate the relationships between observed variables, specific, and general latent factors. We also provide the first investigation to date of measurement invariance of the bifactor model of psychopathology across gender and age groups. Our main result is the identification of a distinct "Body" factor, alongside the previously identified Internalizing and Externalizing factors. We also report relevant cross-disorder associations, especially between body-related psychopathology and trait anger, as well as substantial sex and age differences in observed and latent means. The findings expand the meta-structure of psychopathology, with implications for empirical and clinical practice, and demonstrate shared mechanisms underlying attitudes towards nutrition, self-image, sexuality and anger, with gender- and age-specific features.

  9. Accelerated Gibbs Sampling for Infinite Sparse Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Andrzejewski, D M

    2011-09-12

    The Indian Buffet Process (IBP) gives a probabilistic model of sparse binary matrices with an unbounded number of columns. This construct can be used, for example, to model a fixed number of observed data points (rows) associated with an unknown number of latent features (columns). Markov Chain Monte Carlo (MCMC) methods are often used for IBP inference, and in this technical note, we provide a detailed review of the derivations of collapsed and accelerated Gibbs samplers for the linear-Gaussian infinite latent feature model. We also discuss and explain update equations for hyperparameter resampling in a 'full Bayesian' treatment and present a novel slice sampler capable of extending the accelerated Gibbs sampler to the case of infinite sparse factor analysis by allowing the use of real-valued latent features.
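The IBP prior described above can be simulated from its "restaurant" construction: customer i takes each previously sampled dish k with probability m_k/i (m_k = number of earlier takers) and then orders Poisson(alpha/i) new dishes. A small sketch of that generative process (the parameter values are arbitrary):

```python
import numpy as np

def sample_ibp(n_customers, alpha, rng):
    """Draw a binary feature matrix Z (customers x dishes) from the IBP prior."""
    dishes = []   # one list of 0/1 entries per dish (column)
    counts = []   # number of takers per dish
    for i in range(1, n_customers + 1):
        # Revisit existing dishes with popularity-proportional probability.
        for k in range(len(counts)):
            take = int(rng.random() < counts[k] / i)
            dishes[k].append(take)
            counts[k] += take
        # Order a Poisson(alpha / i) number of brand-new dishes.
        for _ in range(rng.poisson(alpha / i)):
            dishes.append([0] * (i - 1) + [1])
            counts.append(1)
    if not dishes:
        return np.zeros((n_customers, 0), dtype=int)
    return np.array(dishes, dtype=int).T  # every column has length n_customers

rng = np.random.default_rng(42)
Z = sample_ibp(50, alpha=2.0, rng=rng)
```

The number of columns is unbounded a priori; its expectation is alpha times the 50th harmonic number (roughly 9 for these values).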

  10. Using a knowledge elicitation method to specify the business model of a human factors organization

    NARCIS (Netherlands)

    Schraagen, J.M.C.; Ven, J. van de; Hoffman, R.R.; Moon, B.M.

    2009-01-01

    Concept Mapping was used to structure knowledge elicitation interviews with a group of human factors specialists whose goal was to describe the business model of their Department. This novel use of cognitive task analysis to describe the business model of a human factors organization resulted in a n

  13. Old and new ideas for data screening and assumption testing for exploratory and confirmatory factor analysis

    Directory of Open Access Journals (Sweden)

    David B. Flora

    2012-03-01

    Full Text Available We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables.

  14. Old and new ideas for data screening and assumption testing for exploratory and confirmatory factor analysis.

    Science.gov (United States)

    Flora, David B; Labrish, Cathy; Chalmers, R Philip

    2012-01-01

    We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables.
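The common factor model reviewed here treats each observed variable as a linear regression on unobserved factors, x = Λf + e. As a minimal numerical illustration (synthetic two-factor data with invented loadings), loadings can be extracted from a sample correlation matrix by the principal-component method:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, m = 2000, 6, 2  # observations, observed variables, factors

# Simulate x = Lambda f + e with unit-variance observed variables.
Lambda = np.array([[0.8, 0.0], [0.7, 0.0], [0.6, 0.0],
                   [0.0, 0.8], [0.0, 0.7], [0.0, 0.6]])
f = rng.normal(size=(n, m))
e = rng.normal(size=(n, p)) * np.sqrt(1 - (Lambda**2).sum(axis=1))
X = f @ Lambda.T + e

# Principal-component extraction: scale the top-m eigenvectors of the
# sample correlation matrix by the square roots of their eigenvalues.
R = np.corrcoef(X, rowvar=False)
vals, vecs = np.linalg.eigh(R)              # eigenvalues in ascending order
idx = np.argsort(vals)[::-1][:m]            # indices of the top-m eigenpairs
loadings = vecs[:, idx] * np.sqrt(vals[idx])
communalities = (loadings**2).sum(axis=1)   # variance explained per variable
```

Each communality is the share of a variable's variance explained by the retained factors; the remainder is the "unique part" the abstract refers to.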

  15. Assessing the Dimensionality of the GMAT Verbal and Quantitative Measures Using Full Information Factor Analysis.

    Science.gov (United States)

    Kingston, Neal

    When the three-parameter logistic model and item response theory are used to analyze Graduate Management Admission Test (GMAT) data, there are problems with the assumption of unidimensionality. Linear factor analytic models, exploratory factor analysis programs, and the comparison of item parameter estimates for heterogeneous and homogeneous…

  16. Confirmatory factor analysis of the Child Oral Health Impact Profile (Korean version).

    Science.gov (United States)

    Cho, Young Il; Lee, Soonmook; Patton, Lauren L; Kim, Hae-Young

    2016-04-01

    Empirical support for the factor structure of the Child Oral Health Impact Profile (COHIP) has not been fully established. The purposes of this study were to evaluate the factor structure of the Korean version of the COHIP (COHIP-K) empirically using confirmatory factor analysis (CFA) based on the theoretical framework and then to assess whether any of the factors in the structure could be grouped into a simpler single second-order factor. Data were collected through self-reported COHIP-K responses from a representative community sample of 2,236 Korean children, 8-15 yr of age. Because a large inter-factor correlation of 0.92 was estimated in the original five-factor structure, the two strongly correlated factors were combined into one factor, resulting in a four-factor structure. The revised four-factor model showed a reasonable fit with appropriate inter-factor correlations. Additionally, the second-order model with four sub-factors was reasonable with sufficient fit and showed equal fit to the revised four-factor model. A cross-validation procedure confirmed the appropriateness of the findings. Our analysis empirically supported a four-factor structure of COHIP-K, a summarized second-order model, and the use of an integrated summary COHIP score.

  17. Comparison of Statistical Models for Regional Crop Trial Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qun-yuan; KONG Fan-ling

    2002-01-01

    Based on a review and comparison of the main statistical models for estimating variety-environment cell means in regional crop trials, a new statistical model, the LR-PCA composite model, is proposed, and the predictive precision of these models is compared by cross validation on example data. Results showed that the order of model precision was LR-PCA model > AMMI model > PCA model > Treatment Means (TM) model > Linear Regression (LR) model > Additive Main Effects ANOVA model. The precision gain factor of the LR-PCA model was 1.55, an increase of 8.4% over AMMI.

  18. Dispersion-theoretical analysis of the nucleon electromagnetic form factors

    Energy Technology Data Exchange (ETDEWEB)

    Belushkin, M.

    2007-09-29

    The structure of the proton and the neutron is of fundamental importance for the study of the strong interaction dynamics over a wide range of momentum transfers. The nucleon form factors encode information on the internal structure of the nucleon as probed by the electromagnetic interaction, and, to a certain extent, reflect the charge and magnetisation distributions within the proton and the neutron. In this thesis we report on our investigation of the electromagnetic form factors of the proton and the neutron with dispersion relation techniques, including known experimental input on the ππ, KK̄ and ρπ continua and perturbative QCD constraints. We include new experimental data on the pion form factor and the nucleon form factors in our simultaneous analysis of all four form factors in both the space- and the timelike regions for all momentum transfers, and perform Monte Carlo sampling in order to obtain theoretical uncertainty bands. Finally, we discuss the implications of our results on the pion cloud of the nucleon, the nucleon radii and the Okubo-Zweig-Iizuka rule, and present our results of a model-independent approach to estimating two-photon effects in elastic electron-proton scattering. (orig.)

  20. Liquidity and Fama-French Three-Factor Model

    Institute of Scientific and Technical Information of China (English)

    陈政

    2012-01-01

    The Fama-French three-factor model was proposed to explain expected returns. In this paper, the author uses recent data on NYSE, AMEX and NASDAQ stocks to examine whether the Fama-French three-factor model can explain expected returns well, after reviewing the importance of liquidity and criticisms of the Fama-French three-factor model. It turns out that the three-factor model can still capture the factors in asset pricing to a certain degree.

  1. Analysis of Factors Influencing South Korean Tourist Demand for China Based on a Double Logarithmic Model

    Institute of Scientific and Technical Information of China (English)

    赵陶钧; 杨丽琼; 和亚君

    2012-01-01

    South Korea is one of the main source countries in China's inbound tourism market, so it is important to research the factors influencing South Korean tourist demand for China. Using a double logarithmic model and SPSS 20 to analyze the data, this article concludes that the ratio of the consumer price indices (CPI) of South Korea and China, South Korea's per capita GDP and the SARS event are the main factors influencing South Korean tourist demand for China, while the won-RMB exchange rate, the Asian financial crisis and the global financial crisis do not have significant impacts on demand. Among these factors, the CPI ratio, South Korea's per capita GDP and the won-RMB exchange rate are positively correlated with demand, while the other factors are negatively correlated with it. The article also argues that, under certain conditions, a factor that fails the t-test should not be removed from the model, and explores methods of data selection.
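In a double logarithmic model the slope coefficients are elasticities: a 1% change in a driver is associated with a b% change in demand. A sketch on fabricated data (the variable names and elasticity values are illustrative, not the study's estimates):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200

# Hypothetical demand drivers: origin-country income and a relative price index.
gdp_pc = rng.lognormal(mean=10.0, sigma=0.3, size=n)
rel_cpi = rng.lognormal(mean=0.0, sigma=0.2, size=n)

# Demand with known elasticities (+1.5 income, +0.8 relative CPI) and noise.
arrivals = 0.01 * gdp_pc**1.5 * rel_cpi**0.8 * rng.lognormal(0.0, 0.1, n)

# ln(arrivals) = b0 + b1*ln(gdp_pc) + b2*ln(rel_cpi): b1 and b2 are elasticities.
X = np.column_stack([np.ones(n), np.log(gdp_pc), np.log(rel_cpi)])
b, *_ = np.linalg.lstsq(X, np.log(arrivals), rcond=None)
```

Taking logs of both sides turns the multiplicative demand relationship into a linear regression whose slopes can be read directly as elasticities.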

  2. Correlations of MMPI factor scales with measures of the five factor model of personality.

    Science.gov (United States)

    Costa, P T; Busch, C M; Zonderman, A B; McCrae, R R

    1986-01-01

    Two recent item factor analyses of the Minnesota Multiphasic Personality Inventory (MMPI) classified the resulting factors according to a conceptual scheme offered by Norman's (1963) five factor model. The present article empirically evaluates those classifications by correlating MMPI factor scales with self-report and peer rating measures of the five factor model in a sample of 153 adult men and women. Both sets of predictions were generally supported, although MMPI factors derived in a normal sample showed closer correspondences with the five normal personality dimensions. MMPI factor scales were also correlated with 18 scales measuring specific traits within the broader domains of Neuroticism, Extraversion, and Openness. The nine Costa, Zonderman, McCrae, and Williams (1985) MMPI factor scales appear to give useful global assessments of four of the five factors; other instruments are needed to provide detailed information on more specific aspects of normal personality. The use of the five factor model in routine clinical assessment is discussed.

  3. A new method for simultaneous estimation of the factor model parameters, factor scores, and unique parts

    NARCIS (Netherlands)

    Stegeman, Alwin

    2016-01-01

    In the common factor model the observed data is conceptually split into a common covariance producing part and an uncorrelated unique part. The common factor model is fitted to the data itself and a new method is introduced for the simultaneous estimation of loadings, unique variances, factor scores

  5. Factor analysis of serogroups botanica and aurisina of Leptospira biflexa.

    Science.gov (United States)

    Cinco, M

    1977-11-01

    Factor analysis was performed on serovars of the Botanica and Aurisina serogroups of Leptospira biflexa. The results show the arrangement of the main serovar- and serogroup-specific factors, as well as the antigens shared with serovars of heterologous serogroups.

  6. An Item Factor Analysis of the Mooney Problem Check List

    Science.gov (United States)

    Stewart, David W.; Deiker, Thomas

    1976-01-01

    Explores the factor structure of the Mooney Problem Check List (MPCL) at the junior and senior high school level by undertaking a large obverse factor analysis of item responses in three adolescent criterion groups. (Author/DEP)

  7. Flavor Analysis of Nucleon, Δ , and Hyperon Electromagnetic Form Factors

    Science.gov (United States)

    Rohrmoser, Martin; Choi, Ki-Seok; Plessas, Willibald

    2017-03-01

    Through analysis of the world database of elastic electron scattering on the proton and the neutron (for the latter, in fact, on ^2H and ^3He), important experimental insights have recently been gained into the flavor compositions of nucleon electromagnetic form factors. We report on testing the Graz Goldstone-boson-exchange relativistic constituent-quark model against the flavor contents of low-energy nucleons, as revealed by electron-scattering phenomenology. A satisfactory agreement between theory and experiment is achieved for momentum transfers up to Q^2 ≈ 4 GeV^2, relying on three-quark configurations only. Analogous studies have been extended to the Δ and hyperon electromagnetic form factors; for these we show only some sample results in comparison to data from lattice quantum chromodynamics.

  8. Customer Segmentation Modeling Based on Factor Analysis and K-MEANS Clustering

    Institute of Scientific and Technical Information of China (English)

    彭凯; 秦永彬; 许道云

    2011-01-01

    To develop customers' potential demand for data services, customer segmentation has become a fundamental task for telecommunications operators seeking differentiated marketing. Using a clustering algorithm, this paper presents a segmentation model for customers of short messaging services at telecommunications operators. First, factor analysis is used to eliminate redundant fields in data mining over complex parameter variables, improving the quality and efficiency of model construction; the customer segments are then obtained through the unsupervised K-MEANS clustering algorithm. Verification showed that the resulting SMS segments have clearly differentiated characteristics. In 2009, a communications enterprise in western China achieved significant benefits by applying the model to differentiated data-service marketing.
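The two-stage approach described above (variable reduction, then unsupervised clustering) can be sketched directly in NumPy; here a principal-component reduction stands in for the factor-analysis step, Lloyd's k-means is written out explicitly, and the two "customer segments" are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two latent customer segments, observed through 6 correlated usage variables.
seg_a = rng.normal(0, 1, size=(100, 2)) + np.array([4.0, 0.0])
seg_b = rng.normal(0, 1, size=(100, 2)) + np.array([-4.0, 0.0])
latent = np.vstack([seg_a, seg_b])
W = rng.normal(size=(2, 6))
X = latent @ W + rng.normal(0, 0.5, size=(200, 6))

# Step 1: reduce the 6 observed variables to 2 component scores.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Step 2: Lloyd's k-means with k=2 on the reduced scores.
centers = scores[rng.choice(len(scores), size=2, replace=False)]
for _ in range(20):
    dist = ((scores[:, None, :] - centers[None, :, :])**2).sum(axis=-1)
    labels = dist.argmin(axis=1)
    centers = np.array([scores[labels == k].mean(axis=0)
                        if (labels == k).any() else centers[k]
                        for k in range(2)])
```

Reducing first both removes redundant fields and makes the clustering distance more meaningful, which is the rationale the abstract gives for combining the two steps.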

  9. Exploratory matrix factorization for PET image analysis.

    Science.gov (United States)

    Kodewitz, A; Keck, I R; Tomé, A M; Lang, E W

    2010-01-01

    Features are extracted from PET images employing exploratory matrix factorization techniques such as nonnegative matrix factorization (NMF). Appropriate features are fed into classifiers such as a support vector machine or a random forest tree classifier. An automatic feature extraction and classification is achieved with a high classification rate, which is robust and reliable and can help in an early diagnosis of Alzheimer's disease.
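NMF approximates a nonnegative data matrix V by a product of two nonnegative factors, V ≈ WH, so each observation is an additive combination of parts. A generic sketch using the classic Lee-Seung multiplicative updates on synthetic data (this illustrates the technique, not the paper's pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Nonnegative data: 100 "images" of 64 pixels built from 4 nonnegative parts.
W_true = rng.random((100, 4))
H_true = rng.random((4, 64))
V = W_true @ H_true

# Multiplicative updates for min ||V - WH||_F subject to W, H >= 0.
k = 4
W = rng.random((100, k)) + 0.1
H = rng.random((k, 64)) + 0.1
err = [np.linalg.norm(V - W @ H)]
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-9)   # update H; stays nonnegative
    W *= (V @ H.T) / (W @ H @ H.T + 1e-9)   # update W; stays nonnegative
    err.append(np.linalg.norm(V - W @ H))
```

The rows of H play the role of the extracted image features; the rows of W (one per image) are what would be fed to a downstream classifier.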

  10. Analysis of Factors Affecting the Quality of an E-commerce Website Using Factor Analysis

    Directory of Open Access Journals (Sweden)

    Saurabh Mishra

    2014-12-01

    Full Text Available The purpose of this study is to identify factors that affect the quality and effectiveness of an e-commerce website, which in turn strongly affect customer satisfaction and ultimately customer retention and loyalty. This paper examines a set of 23 variables and integrates them into 4 factors that affect the quality of a website. An online questionnaire survey was conducted to gather statistics on the preferences of e-commerce website users. The 23 variables from the customer survey are reduced to 4 major factors using exploratory factor analysis: content, navigation, services and interface design. The sample mainly consists of responses from students aged 18-25 years and considers different B2C commercial websites. The identified variables are important given current market competition, as the services of an e-commerce website also play a major role in ensuring customer satisfaction. Further research in this domain can address websites’ versions for mobile devices.

  11. The Butterfly Effect: Correlations Between Modeling in Nuclear-Particle Physics and Socioeconomic Factors

    CERN Document Server

    Pia, Maria Grazia; Bell, Zane W.; Dressendorfer, Paul V.

    2010-01-01

    A scientometric analysis has been performed on selected physics journals to estimate the presence of simulation and modeling in the physics literature of the past fifty years. Correlations between the observed trends and several social and economic factors have been evaluated.

  12. Exploring Technostress: Results of a Large Sample Factor Analysis

    OpenAIRE

    Steponas Jonušauskas; Agota Giedre Raisiene

    2016-01-01

    With reference to the results of a large sample factor analysis, the article aims to propose a frame for examining technostress in a population. The survey and principal component analysis of a sample of 1013 individuals who use ICT in their everyday work were implemented in the research. 13 factors combine 68 questions and explain 59.13 per cent of the answer dispersion. Based on the factor analysis, the questionnaire was reframed and prepared to reasonably analyze the respondents’ an...

  13. Identification of noise in linear data sets by factor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Roscoe, B.A.; Hopke, Ph.K. (Illinois Univ., Urbana (USA))

    1982-01-01

    A technique which has the ability to identify bad data points after the data have been generated is classical factor analysis. The ability of classical factor analysis to identify two different types of data errors makes it ideally suited for scanning large data sets. Since the results yielded by factor analysis indicate correlations between parameters, one must know something about the nature of the data set and the analytical techniques used to obtain it to confidently isolate errors.

  14. Risk Factors of Losses of Commercial Medical Insurance: An Empirical Analysis Using Generalized Linear Models

    Institute of Scientific and Technical Information of China (English)

    仇春涓; 陈滔

    2012-01-01

    The risk factors of commercial medical insurance losses are investigated in this paper. We conduct an empirical analysis by fitting a Gamma generalized linear model to a commercial medical insurer's claims data. The results indicate that among the many candidate risk factors for medical insurance losses, the length of hospital stay, the hospital level, the region where the insurance business operates and the coverage level are significant, whereas gender and age (within the under-60 age bands) are not. Finally, some suggestions are presented that we believe will be helpful for medical insurance operations and risk control.
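A Gamma GLM with a log link, as fitted in the study, can be estimated by iteratively reweighted least squares (IRLS); for this particular link-distribution pair the working weights are constant, so each IRLS step reduces to an ordinary least-squares fit of a working response. A sketch on simulated claims data (the covariates and coefficient values are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5000

# Hypothetical rating factors: length of hospital stay and hospital level.
days = rng.integers(1, 30, n)
level = rng.integers(1, 4, n)
X = np.column_stack([np.ones(n), days, level]).astype(float)
beta_true = np.array([6.0, 0.05, 0.3])

# Gamma-distributed losses with log link: E[y] = exp(X beta).
mu = np.exp(X @ beta_true)
y = rng.gamma(5.0, mu / 5.0)  # shape 5, mean mu

# IRLS: initialize from OLS on log(y), then iterate on the working
# response z = eta + (y - mu)/mu (unit working weights for Gamma + log link).
beta, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
for _ in range(25):
    eta = X @ beta
    mu_hat = np.exp(eta)
    z = eta + (y - mu_hat) / mu_hat
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
```

With a log link, each fitted coefficient has a multiplicative reading: one extra hospital day here scales the expected loss by exp(0.05), about 5%.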

  15. [Cultural regionalization for Coptis chinensis based on 3S technology platform Ⅰ. Study on growth suitability for Coptis chinensis based on ecological factors analysis by Maxent and ArcGIS model].

    Science.gov (United States)

    Liu, Xin; Yang, Yan-Fang; Song, Hong-Ping; Zhang, Xiao-Bo; Huang, Lu-Qi; Wu, He-Zhen

    2016-09-01

    To meet the urgent needs of Coptis chinensis planting, growth suitability was proposed and analyzed in this paper as an assessment indicator for C. chinensis cultivation, based on chemical quality determination and ecological factor analysis with the Maxent and ArcGIS models. Potential distribution areas at different suitability grades and a regionalization map were formulated based on statistical theory and growth suitability theory. The results showed that the most suitable habitats are in parts of Chongqing and Hubei provinces, such as Shizhu, Lichuan, Wulong, Wuxi and Enshi. Seven ecological factors, including altitude and precipitation in February and September, are the main factors affecting the growth of Coptidis Rhizoma, and increases in precipitation and altitude are conducive to the accumulation of total alkaloid content in C. chinensis. The results of the study not only illustrate the most suitable surroundings for Coptidis Rhizoma, but are also helpful for further research and practice in cultivation regionalization, wild resource monitoring and large-scale cultivation of traditional Chinese medicine plants. Copyright© by the Chinese Pharmaceutical Association.

  16. Effect Factors of Liquid Scintillation Analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Over the past decades, the liquid scintillation analysis (LSA) technique has remained one of the most popular experimental tools for the quantitative analysis of radionuclides, especially low-energy β

  17. Heteroscedastic one-factor models and marginal maximum likelihood estimation

    NARCIS (Netherlands)

    Hessen, D.J.; Dolan, C.V.

    2009-01-01

    In the present paper, a general class of heteroscedastic one-factor models is considered. In these models, the residual variances of the observed scores are explicitly modelled as parametric functions of the one-dimensional factor score. A marginal maximum likelihood procedure for parameter estimation…

  18. Stochastic Analysis Method of Sea Environment Simulated by Numerical Models

    Institute of Scientific and Technical Information of China (English)

    刘德辅; 焦桂英; 张明霞; 温书勤

    2003-01-01

    This paper proposes a stochastic analysis method for sea environment variables simulated by numerical models, such as wave height, current field, design sea levels and longshore sediment transport. Uncertainty and sensitivity analysis of the input and output factors of the numerical models, their long-term distributions and confidence intervals are described in this paper.

  19. Learning From Hidden Traits: Joint Factor Analysis and Latent Clustering

    Science.gov (United States)

    Yang, Bo; Fu, Xiao; Sidiropoulos, Nicholas D.

    2017-01-01

    Dimensionality reduction techniques play an essential role in data analytics, signal processing and machine learning. Dimensionality reduction is usually performed in a preprocessing stage that is separate from subsequent data analysis, such as clustering or classification. Finding reduced-dimension representations that are well-suited for the intended task is more appealing. This paper proposes a joint factor analysis and latent clustering framework, which aims at learning cluster-aware low-dimensional representations of matrix and tensor data. The proposed approach leverages matrix and tensor factorization models that produce essentially unique latent representations of the data to unravel latent cluster structure -- which is otherwise obscured because of the freedom to apply an oblique transformation in latent space. At the same time, latent cluster structure is used as prior information to enhance the performance of factorization. Specific contributions include several custom-built problem formulations, corresponding algorithms, and discussion of associated convergence properties. Besides extensive simulations, real-world datasets such as Reuters document data and MNIST image data are also employed to showcase the effectiveness of the proposed approaches.
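
    A toy version of the joint idea (an editorial sketch under strong simplifying assumptions, not the authors' algorithm) alternates a least-squares factorization update with a k-means update, with a penalty pulling each low-dimensional score toward its cluster centroid:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data: 3 latent clusters in a 2-D factor space, observed
# through a random 2 -> 20 dimensional loading matrix.
centers = np.array([[0.0, 4.0], [4.0, 0.0], [-4.0, -4.0]])
labels_true = np.repeat([0, 1, 2], 50)
W_true = centers[labels_true] + 0.3 * rng.normal(size=(150, 2))
H_true = rng.normal(size=(2, 20))
X = W_true @ H_true + 0.1 * rng.normal(size=(150, 20))

def kmeans(W, k, iters=20):
    # Deterministic init: one center per equal block of rows
    # (adequate for this synthetic layout).
    C = W[::max(len(W) // k, 1)][:k].copy()
    a = np.zeros(len(W), dtype=int)
    for _ in range(iters):
        d = ((W[:, None, :] - C[None]) ** 2).sum(-1)
        a = d.argmin(1)
        for j in range(k):
            if (a == j).any():
                C[j] = W[a == j].mean(0)
    return a, C

# Alternate a factorization update with a clustering update; lam pulls
# each score toward its current centroid so the low-dimensional
# representation stays cluster-aware.
k, r, lam = 3, 2, 1.0
W = np.linalg.svd(X, full_matrices=False)[0][:, :r]   # initial scores
for _ in range(10):
    a, C = kmeans(W, k)
    H = np.linalg.lstsq(W, X, rcond=None)[0]          # loading update
    A = H @ H.T + lam * np.eye(r)
    W = np.linalg.solve(A, H @ X.T + lam * C[a].T).T  # score update
a, C = kmeans(W, k)
```

    The centroid penalty is what makes the factorization "cluster-aware": without it, any oblique transformation of the scores would fit the data equally well, which is exactly the ambiguity the abstract describes.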

  20. IDENTIFICATION OF CRITICAL SSCM ACTIVITIES THROUGH CONFIRMATORY FACTOR ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. Narasimham

    2013-06-01

    Full Text Available As a developing country, India has to balance economic and environmental performance. Green supply chain management (GSCM) is emerging as an important proactive approach for Indian enterprises to improve the environmental performance of processes and products in accordance with the requirements of environmental regulations. This study examines consistency through confirmatory factor analysis, which determines the construct validity, convergent validity, construct reliability and internal consistency of the items of the sustainable supply chain management (SSCM) requirements, and their adoption and implementation in small- and medium-scale industries. The requirements include management commitment, customer coordination, sustainable design and production, green procurement and eco-logistics for sustainable supply chains. The study suggests that the five-factor model with eighteen items of the sustainable supply chain design had a good fit. Further, the study provides a valid and reliable measurement for identifying critical items among the requirements of sustainable supply chains.

  1. Using Multilevel Factor Analysis with Clustered Data: Investigating the Factor Structure of the Positive Values Scale

    Science.gov (United States)

    Huang, Francis L.; Cornell, Dewey G.

    2016-01-01

    Advances in multilevel modeling techniques now make it possible to investigate the psychometric properties of instruments using clustered data. Factor models that overlook the clustering effect can lead to underestimated standard errors, incorrect parameter estimates, and model fit indices. In addition, factor structures may differ depending on…

  2. A structural dynamic factor model for the effects of monetary policy estimated by the EM algorithm

    DEFF Research Database (Denmark)

    Bork, Lasse

    This paper applies the maximum likelihood based EM algorithm to a large-dimensional factor analysis of US monetary policy. Specifically, economy-wide effects of shocks to the US federal funds rate are estimated in a structural dynamic factor model in which 100+ US macroeconomic and financial time series are driven by the joint dynamics of the federal funds rate and a few correlated dynamic factors. This paper contains a number of methodological contributions to the existing literature on data-rich monetary policy analysis. Firstly, the identification scheme allows for correlated factor dynamics, as opposed to the orthogonal factors resulting from the popular principal component approach to structural factor models. Correlated factors are economically more sensible and important for a richer monetary policy transmission mechanism. Secondly, I consider both static factor loadings as well as dynamic…

  4. Model selection for amplitude analysis

    CERN Document Server

    Guegan, Baptiste; Stevens, Justin; Williams, Mike

    2015-01-01

    Model complexity in amplitude analyses is often a priori under-constrained since the underlying theory permits a large number of amplitudes to contribute to most physical processes. The use of an overly complex model results in reduced predictive power and worse resolution on unknown parameters of interest. Therefore, it is common to reduce the complexity by removing from consideration some subset of the allowed amplitudes. This paper studies a data-driven method for limiting model complexity through regularization during regression in the context of a multivariate (Dalitz-plot) analysis. The regularization technique applied greatly improves the performance. A method is also proposed for obtaining the significance of a resonance in a multivariate amplitude analysis.
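
    The data-driven complexity reduction described here is in the spirit of sparsity-inducing regularization. As a generic stand-in for the paper's amplitude-analysis setup (not its actual method or data), an L1-penalized least-squares fit, solved by the ISTA proximal-gradient iteration, zeroes out candidate components with little support in the data:

```python
import numpy as np

rng = np.random.default_rng(3)

# 20 candidate "amplitudes" (regressors), only 3 truly contribute.
n, p = 400, 20
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[[2, 7, 11]] = [1.5, -2.0, 1.0]
y = X @ beta_true + 0.1 * rng.normal(size=n)

# ISTA: proximal-gradient descent for the objective
#   0.5 * ||y - X b||^2 + lam * ||b||_1.
# The soft threshold sets weakly supported components to exactly zero.
lam = 20.0
L = np.linalg.norm(X, 2) ** 2      # Lipschitz constant of the gradient
b = np.zeros(p)
for _ in range(500):
    g = X.T @ (X @ b - y)
    b = b - g / L
    b = np.sign(b) * np.maximum(np.abs(b) - lam / L, 0.0)

selected = np.flatnonzero(np.abs(b) > 1e-6)
print(selected)  # indices of retained components
```

    Sweeping lam traces out a family of models of decreasing complexity, which is the regularization path one would scan when deciding how many amplitudes the data actually support.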

  5. FACTOR ANALYSIS OF THE ROLLING PROCESS TECHNOLOGY (Part 2)

    Directory of Open Access Journals (Sweden)

    A. N. Chichko

    2011-01-01

    Full Text Available A mathematical model for carrying out multivariate regression analysis is presented, together with its practical application to rolled production. The adequacy of the model is analyzed.

  6. Factor Model Forecasting of Inflation in Croatia

    Directory of Open Access Journals (Sweden)

    Davor Kunovac

    2007-12-01

    Full Text Available This paper tests whether information derived from 144 economic variables, represented by only a few constructed factors, can be used to forecast consumer prices in Croatia. The results obtained show that the use of one factor enhances the precision of the benchmark model’s inflation forecasts. The methodology used is sufficiently general to be applied directly to the forecasting of other economic variables.
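
    The forecasting idea can be sketched generically: extract one factor from a panel of series by principal components, then regress the target on the lagged factor and compare against a univariate benchmark. Everything below is simulated and has no connection to the Croatian data:

```python
import numpy as np

rng = np.random.default_rng(7)

# One persistent common factor drives a panel of series; the target
# (think: inflation) loads on the lagged factor plus noise.
T, N = 240, 30
f = np.zeros(T)
for t in range(1, T):
    f[t] = 0.8 * f[t - 1] + rng.normal()
panel = np.outer(f, rng.normal(size=N)) + rng.normal(size=(T, N))
target = 0.5 * np.roll(f, 1) + 0.3 * rng.normal(size=T)
target[0] = 0.0

# Extract one factor as the first principal component of the panel.
Z = (panel - panel.mean(0)) / panel.std(0)
fhat = np.linalg.svd(Z, full_matrices=False)[0][:, 0] * np.sqrt(T)

# Predict target[t] from fhat[t-1]; benchmark: the target's own lag.
X1 = np.column_stack([np.ones(T - 1), fhat[:-1]])
b1 = np.linalg.lstsq(X1, target[1:], rcond=None)[0]
e_factor = target[1:] - X1 @ b1
X0 = np.column_stack([np.ones(T - 1), target[:-1]])
b0 = np.linalg.lstsq(X0, target[1:], rcond=None)[0]
e_bench = target[1:] - X0 @ b0

print((e_factor ** 2).mean(), (e_bench ** 2).mean())
```

    The comparison here is in-sample for brevity; a proper evaluation, as in the paper, would re-estimate the factor and the regression recursively and compare out-of-sample forecast errors.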

  7. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  8. Accelerated life models modeling and statistical analysis

    CERN Document Server

    Bagdonavicius, Vilijandas

    2001-01-01

    Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Varia…

  9. Supervision in Factor Models Using a Large Number of Predictors

    DEFF Research Database (Denmark)

    Boldrini, Lorenzo; Hillebrand, Eric Tobias

    In this paper we investigate the forecasting performance of a particular factor model (FM) in which the factors are extracted from a large number of predictors. We use a semi-parametric state-space representation of the FM in which the forecast objective, as well as the factors, is included in the state vector. The factors are informed of the forecast target (supervised) through the state equation dynamics. We propose a way to assess the contribution of the forecast objective on the extracted factors that exploits the Kalman filter recursions. We forecast one target at a time based… e.g. a standard dynamic factor model with separate forecast and state equations.

  10. A model for equivalent axle load factors

    OpenAIRE

    Amorim, Sara I.R.; Pais, Jorge; Vale, Aline C.; Minhoto, Manuel

    2014-01-01

    Most design methods for road pavements require the design traffic to be calculated by transforming the traffic spectrum into a number of equivalent passages of a standard axle, using equivalent axle load factors. Generally, these factors only consider the type of axle (single, tandem or tridem), but not the type of wheel on the axle, i.e., single or dual. The type of wheel has an important influence on the calculation of the design traffic. The exis...

  11. A Dynamic Multi-Level Factor Model with Long-Range Dependence

    DEFF Research Database (Denmark)

    Ergemen, Yunus Emre; Rodríguez-Caballero, Carlos Vladimir

    A dynamic multi-level factor model with stationary or nonstationary global and regional factors is proposed. In the model, persistence in global and regional common factors as well as in innovations allows for the study of fractional cointegrating relationships. Estimation of global and regional… The model is then applied to the Nord Pool power market for the analysis of price comovements among different regions within the power grid. We find that the global factor can be interpreted as the system price of the power grid, and we find a fractional cointegration relationship between prices and the global factor.

  12. FACTOR ANALYSIS OF THE ELKINS HYPNOTIZABILITY SCALE

    Science.gov (United States)

    Elkins, Gary; Johnson, Aimee K.; Johnson, Alisa J.; Sliwinski, Jim

    2015-01-01

    Assessment of hypnotizability can provide important information for hypnosis research and practice. The Elkins Hypnotizability Scale (EHS) consists of 12 items and was developed to provide a time-efficient measure for use in both clinical and laboratory settings. The EHS has been shown to be a reliable measure with support for convergent validity with the Stanford Hypnotic Susceptibility Scale, Form C (r = .821, p < .001). The current study examined the factor structure of the EHS, which was administered to 252 adults (51.3% male; 48.7% female). Average time of administration was 25.8 minutes. Four factors selected on the basis of the best theoretical fit accounted for 63.37% of the variance. The results of this study provide an initial factor structure for the EHS. PMID:25978085

  13. Economic modeling and sensitivity analysis.

    Science.gov (United States)

    Hay, J W

    1998-09-01

    The field of pharmacoeconomics (PE) faces serious concerns of research credibility and bias. The failure of researchers to reproduce similar results in similar settings, the inappropriate use of clinical data in economic models, the lack of transparency, and the inability of readers to make meaningful comparisons across published studies have greatly contributed to skepticism about the validity, reliability, and relevance of these studies to healthcare decision-makers. Using a case study in the field of lipid PE, two suggestions are presented for generally applicable reporting standards that will improve the credibility of PE. Health economists and researchers should be expected to provide either the software used to create their PE model or a multivariate sensitivity analysis of their PE model. Software distribution would allow other users to validate the assumptions and calculations of a particular model and apply it to their own circumstances. Multivariate sensitivity analysis can also be used to present results in a consistent and meaningful way that will facilitate comparisons across the PE literature. Using these methods, broader acceptance and application of PE results by policy-makers would become possible. To reduce the uncertainty about what is being accomplished with PE studies, it is recommended that these guidelines become requirements of both scientific journals and healthcare plan decision-makers. The standardization of economic modeling in this manner will increase the acceptability of pharmacoeconomics as a practical, real-world science.
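
    A multivariate (probabilistic) sensitivity analysis of the kind recommended here draws all model inputs jointly from their assumed distributions and reports the induced distribution of the output, rather than a single base-case number. The toy cost-effectiveness model and every parameter value below are illustrative assumptions, not taken from any published study:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy cost-effectiveness model: incremental cost-effectiveness ratio
# ICER = (cost_new - cost_old) / (effect_new - effect_old).
# All distributions and parameter values are illustrative assumptions.
n = 10_000
cost_new = rng.normal(12_000, 1_000, n)
cost_old = rng.normal(9_000, 800, n)
effect_new = rng.normal(4.0, 0.15, n)   # e.g. QALYs
effect_old = rng.normal(3.2, 0.15, n)

# Draw all inputs jointly and summarize the induced ICER distribution
# instead of reporting a single base-case number.
icer = (cost_new - cost_old) / (effect_new - effect_old)
lo, med, hi = np.percentile(icer, [2.5, 50, 97.5])
print(med, lo, hi)
```

    Publishing the interval (lo, hi) alongside the point estimate is precisely the kind of transparency the article argues would make pharmacoeconomic models comparable across studies.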

  14. Degenerate solutions obtained from several variants of factor analysis

    NARCIS (Netherlands)

    Zijlstra, Bonne J.H.; Kiers, Henk A.L.

    2002-01-01

    Considerable research has been performed concerning degenerate solutions from the Parafac model. However, degenerate solutions have also been reported to occur with the shifted multiplicative model and a model for component analysis of multitrait multimethod matrices. Furthermore, we obtained

  15. ANALYSIS OF EXTERNAL FACTORS AFFECTING THE PRICING

    Directory of Open Access Journals (Sweden)

    Irina A. Kiseleva

    2013-01-01

    Full Text Available The external factors influencing the formation of tariffs for commercial services are considered in this article. The external environment is known to be diverse and changeable. Currently, pricing has become one of the key processes in a company's strategic development. Pricing in the service sector, in turn, is highly susceptible to changes in the external environment: its components directly or indirectly affect the market for services, changing its established economic processes. As a rule, firms providing services cannot influence changes in external factors. However, the service market is very flexible, which enables businesses to reshape their pricing strategy and adapt it to the new environment.

  16. Adaptation of a 3-factor model for the Pittsburgh Sleep Quality Index in Portuguese older adults.

    Science.gov (United States)

    Becker, Nathália Brandolim; de Neves Jesus, Saul

    2017-05-01

    The present study examined the factor structure of the Pittsburgh Sleep Quality Index (PSQI) in a sample of older Portuguese adults using a cross-validation approach. The design is cross-sectional. A convenience sample of 204 community-dwelling older adults (M=70.05, SD=7.15) was included. The global sleep quality (GSQ) score ranged from 0 to 18 with a mean of 5.98 (SD=3.45). The distribution showed that gender and perception of oneself as healthy influence GSQ in this sample. Cronbach's α was 0.69, but increased to 0.70 if the "use of sleep medication" component was deleted. Exploratory factor analysis (EFA) demonstrated that a two-factor model is better than a one-factor model, with good fit indices (chi-square=8.649, df=8, p=0.373). Confirmatory factor analysis (CFA) was performed on the single-factor, two-factor, and three-factor models, with and without the "use of sleep medications" component. The best model was the 3-factor model without the "use of sleep medications" component (chi-square=1.214, df=6, GFI=0.997, AGFI=0.918, CFI=0.986, RMSEA=0.046). The adapted model is similar to the original model, with the only change being the exclusion of the "use of sleep medications" component. We suggest using that component as a complementary qualitative assessment of health. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  17. Confirmatory factor analysis of the Oral Health Impact Profile.

    Science.gov (United States)

    John, M T; Feuerstahler, L; Waller, N; Baba, K; Larsson, P; Celebić, A; Kende, D; Rener-Sitar, K; Reissmann, D R

    2014-09-01

    Previous exploratory analyses suggest that the Oral Health Impact Profile (OHIP) consists of four correlated dimensions and that individual differences in OHIP total scores reflect an underlying higher-order factor. The aim of this report is to corroborate these findings in the Dimensions of Oral Health-Related Quality of Life (DOQ) Project, an international study of general population subjects and prosthodontic patients. Using the project's Validation Sample (n = 5022), we conducted confirmatory factor analyses in a sample of 4993 subjects with sufficiently complete data. In particular, we compared the psychometric performance of three models: a unidimensional model, a four-factor model and a bifactor model that included one general factor and four group factors. Using model-fit criteria and factor interpretability as guides, the four-factor model was deemed best in terms of strong item loadings, model fit (RMSEA = 0·05, CFI = 0·99) and interpretability. These results corroborate our previous findings that four highly correlated factors - which we have named Oral Function, Oro-facial Pain, Oro-facial Appearance and Psychosocial Impact - can be reliably extracted from the OHIP item pool. However, the good fit of the unidimensional model and the high interfactor correlations in the four-factor solution suggest that OHRQoL can also be sufficiently described with one score.

  18. Factor Analysis for Spectral Reconnaissance and Situational Understanding

    Science.gov (United States)

    2016-07-11

    Final Report: Factor Analysis for Spectral Reconnaissance and Situational Understanding. The Army has a critical need for enhancing situational understanding for dismounted soldiers and rapidly deployed tactical… …based NP-hard design problems, by associating them with corresponding estimation problems.

  19. A Factor Analysis of the BSRI and the PAQ.

    Science.gov (United States)

    Edwards, Teresa A.; And Others

    Factor analysis of the Bem Sex Role Inventory (BSRI) and the Personality Attributes Questionnaire (PAQ) was undertaken to study the independence of the masculine and feminine scales within each instrument. Both instruments were administered to undergraduate education majors. Analysis of primary first and second order factors of the BSRI indicated…

  20. Exploratory Factor Analysis of African Self-Consciousness Scale Scores

    Science.gov (United States)

    Bhagwat, Ranjit; Kelly, Shalonda; Lambert, Michael C.

    2012-01-01

    This study replicates and extends prior studies of the dimensionality, convergent, and external validity of African Self-Consciousness Scale scores with appropriate exploratory factor analysis methods and a large gender balanced sample (N = 348). Viable one- and two-factor solutions were cross-validated. Both first factors overlapped significantly…

  1. A replication of a factor analysis of motivations for trapping

    Science.gov (United States)

    Schroeder, Susan; Fulton, David C.

    2015-01-01

    Using a 2013 sample of Minnesota trappers, we employed confirmatory factor analysis to replicate an exploratory factor analysis of trapping motivations conducted by Daigle, Muth, Zwick, and Glass (1998). We employed the same 25 items used by Daigle et al. and tested the same five-factor structure using our recent sample of Minnesota trappers. We also compared motivations in our sample to those reported by Daigle et al.

  2. Chou-Yang model and PHI form factor

    Energy Technology Data Exchange (ETDEWEB)

    Fazal-e-Aleem; Saleem, M.; Rafique, M.

    1988-03-01

    By using the deduced differential cross-section data for PHIp elastic scattering at 175 GeV/c in the Chou-Yang model, the PHI form factor has been computed and parametrized. Then, in conjunction with the proton form factor, this form factor is used in the pristine Chou-Yang model to obtain differential cross-section predictions at Fermilab energies. The theoretical results agree with the experimental measurements, endorsing the conjecture that the hadronic form factor of a neutral particle is proportional to its magnetic form factor.

  3. Timing analysis by model checking

    Science.gov (United States)

    Naydich, Dimitri; Guaspari, David

    2000-01-01

    The safety of modern avionics relies on high integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines) and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed: if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.

  4. A Factor Analysis for Time Series.

    Science.gov (United States)

    1984-07-01

  5. Logistic regression for risk factor modelling in stuttering research.

    Science.gov (United States)

    Reed, Phil; Wu, Yaqiong

    2013-06-01

    To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.
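
    The logistic-regression workflow outlined above can be sketched end to end: fit the model by Newton-Raphson and report exponentiated coefficients as odds ratios. The predictor names below ("family_history", "age_at_onset") are purely illustrative and the data are simulated, not drawn from any stuttering study:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic risk-factor data: binary outcome (e.g. persist = 1) driven
# by two hypothetical predictors; all names are illustrative only.
n = 5000
family_history = rng.integers(0, 2, n).astype(float)
age_at_onset = rng.normal(3.5, 1.0, n)
X = np.column_stack([np.ones(n), family_history, age_at_onset])
beta_true = np.array([-2.0, 0.9, 0.3])
p = 1.0 / (1.0 + np.exp(-(X @ beta_true)))
y = (rng.uniform(size=n) < p).astype(float)

# Newton-Raphson on the logistic log-likelihood.
beta = np.zeros(3)
for _ in range(25):
    eta = X @ beta
    mu = 1.0 / (1.0 + np.exp(-eta))
    W = mu * (1.0 - mu)                     # IRLS weights
    grad = X.T @ (y - mu)
    hess = X.T @ (X * W[:, None])
    beta = beta + np.linalg.solve(hess, grad)

odds_ratios = np.exp(beta[1:])
print(odds_ratios)  # estimated odds ratios for the two predictors
```

    An odds ratio above 1 marks a predictor associated with increased risk, which is the quantity typically reported in the prognosis studies the abstract mentions.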

  6. Analysis of Interaction Factors Between Two Piles

    Institute of Scientific and Technical Information of China (English)

    CAO Ming; CHEN Long-zhu

    2008-01-01

    A rigorous analytical method is presented for calculating the interaction factor between two identical piles subjected to vertical loads. Following the technique proposed by Muki and Sternberg, the problem is decomposed into an extended soil mass and two fictitious piles characterized respectively by Young's modulus of the soil and that of the difference between the pile and soil. The unknown axial forces along the fictitious piles are determined by solving a Fredholm integral equation of the second kind, which imposes the compatibility condition that the axial strains of the fictitious piles are equal to those corresponding to the centroidal axes of the extended soil. The real pile forces and displacements can subsequently be calculated from the determined fictitious pile forces, and finally the desired pile interaction factors may be obtained. Results confirm the validity of the proposed approach and portray the influence of the governing parameters on the pile interaction.

  7. Analysis of factors affecting fattening of chickens

    OpenAIRE

    OBERMAJEROVÁ, Barbora

    2013-01-01

    Poultry meat belongs to the basic assortment of human nutrition. The meat of intensively fattened poultry is a source of easily digestible proteins, lipids, mineral substances and vitamins. The aim of this bachelor's thesis was to write a literature review focused on the intensity of growth, carcass yield, and the quality and composition of broiler chicken meat. It then describes the internal and external factors that affect them, i.e. genetic foundation, hybrid combination, s...

  8. Study on neural network model for calculating subsidence factor

    Institute of Scientific and Technical Information of China (English)

    GUO Wen-bing; ZHANG Jie

    2007-01-01

    The major factors influencing the subsidence factor were comprehensively analyzed, and an artificial neural network (ANN) model for calculating the subsidence factor was set up. A large amount of data from observation stations in China was collected and used as learning and training samples to train and test the model. The calculated results of the ANN model and the observed values were compared and analyzed in this paper. The results demonstrate that many factors can be considered in this model and that the subsidence factor calculated by the ANN model is more precise and closer to the observed values. It can satisfy the needs of engineering.
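
    A minimal stand-in for such an ANN (a one-hidden-layer network trained by plain gradient descent, not the authors' model or data) can be written directly in NumPy; the "subsidence factor" target below is a synthetic smooth function of two scaled inputs:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical training set: a smooth "subsidence factor" as a function
# of two scaled inputs (say, depth and seam thickness); purely synthetic.
X = rng.uniform(-1, 1, size=(200, 2))
y = 0.6 + 0.3 * np.tanh(X[:, 0] - 0.5 * X[:, 1])

# One-hidden-layer network trained by plain gradient descent on MSE.
h = 8
W1 = 0.5 * rng.normal(size=(2, h))
b1 = np.zeros(h)
W2 = 0.5 * rng.normal(size=h)
b2 = 0.0
lr = 0.05

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

_, pred0 = forward(X)
mse0 = float(np.mean((pred0 - y) ** 2))

for _ in range(2000):
    H, pred = forward(X)
    err = pred - y                           # output-layer error
    gW2 = H.T @ err / len(X)
    gb2 = err.mean()
    dH = np.outer(err, W2) * (1.0 - H ** 2)  # back-prop through tanh
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
mse = float(np.mean((pred - y) ** 2))
print(mse0, mse)
```

    The appeal the abstract describes, that "many factors can be considered", comes simply from widening the input layer; the training loop is unchanged.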

  9. Confirmatory Factor Analysis of the ISB - Burnout Syndrome Inventory

    Directory of Open Access Journals (Sweden)

    Ana Maria T. Benevides-Pereira

    2017-05-01

    Full Text Available Aim: Burnout is a dysfunctional reaction to chronic occupational stress. The present study analyses the psychometric qualities of the Burnout Syndrome Inventory (ISB) through confirmatory factor analysis (CFA). Method: Empirical study in a multi-centre and multi-occupational sample (n = 701) using the ISB. Part I assesses antecedent factors: positive organizational conditions (PC) and negative organizational conditions (NC). Part II assesses the syndrome: emotional exhaustion (EE), dehumanization (DE), emotional distancing (ED) and personal accomplishment (PA). Results: The highest means occurred in the positive scales PC (M = 23.29, SD = 5.89) and PA (M = 14.84, SD = 4.71). Negative conditions showed the greatest variability (SD = 6.03). Reliability indexes were reasonable, with the lowest at .77 for DE and the highest at .91 for PA. The CFA revealed RMSEA = .057 and CFI = .90, with all scale regressions showing significant values (β = .73 to β = .92). Conclusion: The ISB proved a plausible instrument for evaluating burnout. The two parts maintained the initial model and confirmed the theoretical presupposition. The instrument provides a more comprehensive picture of the labour context, and either part may be used separately according to the needs and aims of the assessor.

  10. Difficult mask ventilation in obese patients: analysis of predictive factors.

    Science.gov (United States)

    Leoni, A; Arlati, S; Ghisi, D; Verwej, M; Lugani, D; Ghisi, P; Cappelleri, G; Cedrati, V; El Tantawi Ali Alsheraei, A; Pocar, M; Ceriani, V; Aldegheri, G

    2014-02-01

    This study aimed to determine the accuracy of commonly used preoperative difficult airway indices as predictors of difficult mask ventilation (DMV) in obese patients (BMI >30 kg/m2). In 309 consecutive obese patients undergoing general surgery, the modified Mallampati test, patient's Height/Thyromental distance ratio, Inter-Incisor Distance, Protruding Mandible (PM), history of Obstructive Sleep Apnea and Neck Circumference (NC) were recorded preoperatively. DMV was defined as Grade 3 mask ventilation (MV) by the Han's scale (MV inadequate, unstable or requiring two practitioners). Data are shown as means±SD or number and proportions. Independent DMV predictors were identified by multivariate analysis. The discriminating capacity of the model (ROC curve area) and adjusted weights for the risk factors (odds ratios) were also determined. BMI averaged 42.5±8.3 kg/m2. DMV was reported in 27 out of 309 patients (8.8%; 95%CI 5.6-11.9%). The multivariate analysis retained NC (OR 1.17; P2 associated factors as the best discriminating point for DMV. Obese patients show increased incidence of DMV with respect to the undifferentiated surgical population. Limited PM, Mallampati test and NC are important DMV predictors.

  11. Further insights on the French WISC-IV factor structure through Bayesian structural equation modeling.

    Science.gov (United States)

    Golay, Philippe; Reverte, Isabelle; Rossier, Jérôme; Favez, Nicolas; Lecerf, Thierry

    2013-06-01

    The interpretation of the Wechsler Intelligence Scale for Children--Fourth Edition (WISC-IV) is based on a 4-factor model, which is only partially compatible with the mainstream Cattell-Horn-Carroll (CHC) model of intelligence measurement. The structure of cognitive batteries is frequently analyzed via exploratory factor analysis and/or confirmatory factor analysis. With classical confirmatory factor analysis, almost all cross-loadings between latent variables and measures are fixed to zero in order to allow the model to be identified. However, inappropriate zero cross-loadings can contribute to poor model fit, distorted factors, and biased factor correlations; most important, they do not necessarily faithfully reflect theory. To deal with these methodological and theoretical limitations, we used a new statistical approach, Bayesian structural equation modeling (BSEM), among a sample of 249 French-speaking Swiss children (8-12 years). With BSEM, zero-fixed cross-loadings between latent variables and measures are replaced by approximate zeros, based on informative, small-variance priors. Results indicated that a direct hierarchical CHC-based model with 5 factors plus a general intelligence factor better represented the structure of the WISC-IV than did the 4-factor structure and the higher order models. Because a direct hierarchical CHC model was more adequate, it was concluded that the general factor should be considered as a breadth rather than a superordinate factor. Because it was possible for us to estimate the influence of each of the latent variables on the 15 subtest scores, BSEM allowed improvement of the understanding of the structure of intelligence tests and the clinical interpretation of the subtest scores. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  12. Analysis of Factors Influencing Vocational School Students' Mathematics Achievement Based on a Cumulative Logistic Model

    Institute of Scientific and Technical Information of China (English)

    张旭; 刘玉春; 杨志红; 霍素凤

    2012-01-01

    A cumulative logistic model was used to analyze empirically the factors influencing vocational school students' mathematics achievement. The results show that the main factors are students' computational ability, learning initiative, and parents' attention to their children's studies, which relates to the diverse backgrounds of vocational school students and the special nature of mathematics education in vocational schools. ROC analysis shows that computational ability, learning initiative, and parental attention are appropriate for diagnosing students' mathematics achievement, suggesting that teachers should pay particular attention to cultivating students' computational and learning abilities and should urge parents to attend to their children's studies.
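A cumulative (proportional-odds) logistic model of the kind this record describes can be sketched numerically; the predictors, cut-points and coefficients below are purely hypothetical illustrations, not values estimated in the study:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cumulative_logit_probs(X, alphas, beta):
    """Category probabilities under a proportional-odds (cumulative logit) model.

    alphas are the increasing cut-points (K-1 of them) and beta the common
    slope vector: P(Y <= k | x) = sigmoid(alpha_k - x @ beta).
    """
    X = np.atleast_2d(X)
    cum = sigmoid(np.asarray(alphas)[None, :] - (X @ np.asarray(beta))[:, None])
    cum = np.hstack([cum, np.ones((X.shape[0], 1))])  # P(Y <= K-1) = 1
    return np.diff(cum, prepend=0.0, axis=1)          # one column per grade band

# Hypothetical illustration: three predictors (computational ability,
# learning initiative, parental attention) and grades in {low, middle, high}.
alphas = [-1.0, 1.5]              # cut-points between the three grade bands
beta = [0.8, 0.6, 0.4]            # all positive: more of each factor helps
weak, strong = [0.0, 0.0, 0.0], [2.0, 2.0, 2.0]
p_weak = cumulative_logit_probs(weak, alphas, beta)
p_strong = cumulative_logit_probs(strong, alphas, beta)
```

With positive slopes, a student stronger on all three factors shifts probability mass toward the higher grade bands, which is the qualitative pattern the abstract reports.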

  13. Multivariate factor analysis of Girgentana goat milk composition

    Directory of Open Access Journals (Sweden)

    Pietro Giaccone

    2010-01-01

    Full Text Available The interpretation of the several variables that contribute to defining milk quality is difficult due to the high degree of correlation among them. In this case, one of the best methods of statistical processing is factor analysis, which belongs to the multivariate group; this particular statistical approach was employed for our study. A total of 1485 individual goat milk samples from 117 Girgentana goats were collected fortnightly from January to July and analysed for physical and chemical composition and clotting properties. Milk pH and titratable acidity were within the normal range for fresh goat milk. Morning milk yield was 704 ± 323 g, with fat and protein percentages of 3.93 ± 1.23% and 3.48 ± 0.38%, respectively. The milk urea content was 43.70 ± 8.28 mg/dl. The clotting ability of Girgentana milk was quite good, with a renneting time of 16.96 ± 3.08 minutes, a rate of curd formation of 2.01 ± 1.63 minutes and a curd firmness of 25.08 ± 7.67 millimetres. Factor analysis was performed by applying orthogonal axis rotation (rotation type VARIMAX); the analysis grouped the milk components into three latent or common factors. The first, which explained 51.2% of the total covariance, was defined as “slow milks”, because it was linked to r and pH. The second latent factor, which explained 36.2% of the total covariance, was defined as “milk yield”, because it is positively correlated with the morning milk yield and the urea content, whilst negatively correlated with the fat percentage. The third latent factor, which explained 12.6% of the total covariance, was defined as “curd firmness”, because it is linked to protein percentage, a30 and titratable acidity. With the aim of evaluating the influence of environmental effects (stage of kidding, parity and type of kidding), factor scores were analysed with the mixed linear model. Results showed significant effects of the season of
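The VARIMAX rotation step this record mentions can be sketched with Kaiser's classic algorithm; this is a minimal NumPy implementation, not the authors' software, and the example loading matrix is invented:

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-8):
    """Rotate a loading matrix with Kaiser's varimax criterion.

    The rotation is orthogonal, so each variable's communality
    (row sum of squared loadings) is unchanged by the rotation.
    """
    L = np.asarray(loadings, dtype=float)
    p, k = L.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - (gamma / p) * Lr @ np.diag((Lr ** 2).sum(axis=0)))
        )
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):   # criterion no longer improving
            break
        d = d_new
    return L @ R

# Invented 5-variable, 2-factor loading matrix for illustration.
L = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.85], [0.2, 0.9], [0.7, 0.5]])
rotated = varimax(L)
```

After rotation each variable tends to load strongly on one factor and weakly on the others, which is what makes the rotated factors interpretable as "slow milks", "milk yield" and "curd firmness".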

  14. Housing Price Forecastability: A Factor Analysis

    DEFF Research Database (Denmark)

    Bork, Lasse; Møller, Stig Vinther

    2016-01-01

    We examine U.S. housing price forecastability using principal component analysis (PCA), partial least squares (PLS), and sparse PLS (SPLS). We incorporate information from a large panel of 128 economic time series and show that macroeconomic fundamentals have strong predictive power for future...

  15. Housing price forecastability: A factor analysis

    DEFF Research Database (Denmark)

    Møller, Stig Vinther; Bork, Lasse

    2017-01-01

    We examine U.S. housing price forecastability using principal component analysis (PCA), partial least squares (PLS), and sparse PLS (SPLS). We incorporate information from a large panel of 128 economic time series and show that macroeconomic fundamentals have strong predictive power for future...
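The diffusion-index approach these two records describe, extracting principal components from a large panel and using them in a predictive regression, can be sketched as follows; the panel is synthetic and the "housing price growth" target purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic panel: T periods of N macro series driven by one common factor.
T, N = 200, 128
factor = rng.standard_normal(T).cumsum() * 0.1          # slow-moving common factor
panel = factor[:, None] * rng.uniform(0.5, 1.5, N) + 0.3 * rng.standard_normal((T, N))
target = 0.8 * factor + 0.1 * rng.standard_normal(T)     # stand-in for housing price growth

# Standardize, then extract the leading principal components via SVD.
Z = (panel - panel.mean(0)) / panel.std(0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
k = 3
F = U[:, :k] * S[:k]                                     # T x k estimated factors

# Predictive regression of the next-period target on current factors.
X = np.hstack([np.ones((T - 1, 1)), F[:-1]])
coef, *_ = np.linalg.lstsq(X, target[1:], rcond=None)
pred = X @ coef
r2 = 1 - ((target[1:] - pred) ** 2).sum() / ((target[1:] - target[1:].mean()) ** 2).sum()
```

Because the target is driven by the panel's common factor, the estimated components carry genuine predictive power; PLS and sparse PLS refine this idea by extracting components with the target in mind.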

  16. MODAL ANALYSIS OF QUARTER CAR MODEL SUSPENSION SYSTEM

    OpenAIRE

    Viswanath. K. Allamraju *

    2016-01-01

    Suspension systems are very important for the comfort of the driver and passengers. This study therefore provides a numerical tool for modeling and analyzing a two-degree-of-freedom quarter car model suspension system. Modal analysis plays a vital role in designing the suspension system. This paper presents the modal analysis of a quarter car model suspension system considering the undamped and damped cases. The modal and vertical equations of motion describing the su...
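A minimal undamped modal analysis of a two-degree-of-freedom quarter-car model can be sketched as follows; the mass and stiffness values are generic textbook-style assumptions, not the paper's parameters:

```python
import numpy as np

# Assumed quarter-car parameters (SI units): sprung mass, unsprung mass,
# suspension stiffness, tyre stiffness.
ms, mu = 250.0, 40.0
ks, kt = 28_000.0, 180_000.0

M = np.diag([ms, mu])
K = np.array([[ks, -ks],
              [-ks, ks + kt]])

# Undamped modal analysis: solve K v = w^2 M v via the symmetric
# transform M^{-1/2} K M^{-1/2}, whose eigenvalues are the w^2.
Minv_half = np.diag(1.0 / np.sqrt(np.diag(M)))
w2, modes = np.linalg.eigh(Minv_half @ K @ Minv_half)
freqs_hz = np.sqrt(w2) / (2 * np.pi)
# The two modes are the body bounce (low frequency, order 1-2 Hz for these
# values) and the wheel hop (order 10 Hz).
```

Damped modal analysis extends this to a quadratic eigenvalue problem in the damping matrix, but the undamped frequencies above already locate the two resonances the suspension must manage.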

  17. Impact Factors of Energy Productivity in China: An Empirical Analysis

    Institute of Scientific and Technical Information of China (English)

    Wei Chu; Shen Manhong

    2007-01-01

    This article developed a decomposition model of energy productivity on the basis of the economic growth model. According to this model, four factors may influence China's energy productivity: technology improvement, resource allocation structure, industrial structure and institutional arrangement. An econometric model was then employed to test the four factors empirically against China's statistical data from 1978 to 2004. Results indicated that capital deepening contributes the most (207%) to energy efficiency improvement, while the impact of labor forces (13%) is the weakest among the resource factors; industrial structure (7%) and institutional innovation (9.5%) positively improve energy productivity.

  18. Ventilation Model and Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    V. Chipman

    2003-07-18

    This model and analysis report develops, validates, and implements a conceptual model for heat transfer in and around a ventilated emplacement drift. This conceptual model includes thermal radiation between the waste package and the drift wall, convection from the waste package and drift wall surfaces into the flowing air, and conduction in the surrounding host rock. These heat transfer processes are coupled and vary both temporally and spatially, so numerical and analytical methods are used to implement the mathematical equations which describe the conceptual model. These numerical and analytical methods predict the transient response of the system, at the drift scale, in terms of spatially varying temperatures and ventilation efficiencies. The ventilation efficiency describes the effectiveness of the ventilation process in removing radionuclide decay heat from the drift environment. An alternative conceptual model is also developed which evaluates the influence of water and water vapor mass transport on the ventilation efficiency. These effects are described using analytical methods which bound the contribution of latent heat to the system, quantify the effects of varying degrees of host rock saturation (and hence host rock thermal conductivity) on the ventilation efficiency, and evaluate the effects of vapor and enhanced vapor diffusion on the host rock thermal conductivity.

  19. Signs and symptoms of acute mania: a factor analysis

    Directory of Open Access Journals (Sweden)

    de Silva Varuni A

    2011-08-01

    Full Text Available Abstract Background The major diagnostic classifications consider mania as a uni-dimensional illness. Factor analytic studies of acute mania are fewer compared to schizophrenia and depression. Evidence from factor analysis suggests more categories or subtypes than are included in the classification systems. Studies have found that these factors can predict differences in treatment response and prognosis. Methods The sample included 131 patients consecutively admitted to an acute psychiatry unit over a period of one year, including 76 (58%) males. The mean age was 44.05 years (SD = 15.6). Patients met International Classification of Diseases-10 (ICD-10) clinical diagnostic criteria for a manic episode. Patients with a diagnosis of mixed bipolar affective disorder were excluded. Participants were evaluated using the Young Mania Rating Scale (YMRS). Exploratory factor analysis (principal component analysis) was carried out and factors with an eigenvalue > 1 were retained. The significance level for interpretation of factor loadings was 0.40. The unrotated component matrix identified five factors. Oblique rotation was then carried out to identify three factors which were clinically meaningful. Results Unrotated principal component analysis extracted five factors, which explained 65.36% of the total variance. Oblique rotation extracted 3 factors. Factor 1, corresponding to 'irritable mania', had significant loadings of irritability, increased motor activity/energy and disruptive aggressive behaviour. Factor 2, corresponding to 'elated mania', had significant loadings of elevated mood, language abnormalities/thought disorder, increased sexual interest and poor insight. Factor 3, corresponding to 'psychotic mania', had significant loadings of abnormalities in thought content, appearance, poor sleep and speech abnormalities. 
Conclusions Our findings identified three clinically meaningful factors corresponding to 'elated mania', 'irritable mania
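The retention rule used in the Methods, keeping principal components of the correlation matrix whose eigenvalue exceeds 1 (the Kaiser criterion), can be sketched as follows; the rating data below are synthetic, not the study's YMRS scores:

```python
import numpy as np

def kaiser_retained_factors(data):
    """Eigenvalues of the item correlation matrix, the number retained under
    the Kaiser criterion (eigenvalue > 1), and the variance they explain."""
    R = np.corrcoef(data, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
    n_retained = int((eigvals > 1.0).sum())
    explained = eigvals[:n_retained].sum() / eigvals.sum()
    return eigvals, n_retained, explained

# Synthetic ratings: 8 items driven by two independent latent factors.
rng = np.random.default_rng(1)
f1, f2 = rng.standard_normal((2, 500))
items = np.hstack([
    f1[:, None] + 0.3 * rng.standard_normal((500, 4)),
    f2[:, None] + 0.3 * rng.standard_normal((500, 4)),
])
eigvals, n_retained, explained = kaiser_retained_factors(items)
```

On data generated from two latent factors, the criterion recovers exactly two components; an oblique rotation of the retained loadings would then allow the factors to correlate, as in the study.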

  20. TANK48 CFD MODELING ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.

    2011-05-17

    The process of recovering the waste in storage tanks at the Savannah River Site (SRS) typically requires mixing the contents of the tank to ensure uniformity of the discharge stream. Mixing is accomplished with one to four dual-nozzle slurry pumps located within the tank liquid. For the work, a Tank 48 simulation model with a maximum of four slurry pumps in operation has been developed to estimate flow patterns for efficient solid mixing. The modeling calculations were performed by using two modeling approaches. One approach is a single-phase Computational Fluid Dynamics (CFD) model to evaluate the flow patterns and qualitative mixing behaviors for a range of different modeling conditions since the model was previously benchmarked against the test results. The other is a two-phase CFD model to estimate solid concentrations in a quantitative way by solving the Eulerian governing equations for the continuous fluid and discrete solid phases over the entire fluid domain of Tank 48. The two-phase results should be considered as the preliminary scoping calculations since the model was not validated against the test results yet. A series of sensitivity calculations for different numbers of pumps and operating conditions has been performed to provide operational guidance for solids suspension and mixing in the tank. In the analysis, the pump was assumed to be stationary. Major solid obstructions including the pump housing, the pump columns, and the 82 inch central support column were included. The steady state and three-dimensional analyses with a two-equation turbulence model were performed with FLUENT™ for the single-phase approach and CFX for the two-phase approach. Recommended operational guidance was developed assuming that local fluid velocity can be used as a measure of sludge suspension and spatial mixing under single-phase tank model. For quantitative analysis, a two-phase fluid-solid model was developed for the same modeling conditions as the single

  1. Factor Structure Analysis of the Schutte Self-Report Emotional Intelligence Scale on International Students

    Science.gov (United States)

    Ng, Kok-Mun; Wang, Chuang; Kim, Do-Hong; Bodenhorn, Nancy

    2010-01-01

    The authors investigated the factor structure of the Schutte Self-Report Emotional Intelligence (SSREI) scale on international students. Via confirmatory factor analysis, the authors tested the fit of the models reported by Schutte et al. and five other studies to data from 640 international students in the United States. Results show that…

  2. MULTIDIMENSIONAL RELIABILITY OF INSTRUMENT STUDENTS’ SATISFACTION USING CONFIRMATORY FACTOR ANALYSIS

    Directory of Open Access Journals (Sweden)

    Gaguk Margono

    2014-11-01

    Full Text Available The purpose of this paper is to compare the unidimensional and multidimensional reliability of an instrument measuring students' satisfaction as internal customers. Multidimensional reliability measurement is rarely used in research. Multidimensional reliability is estimated using Confirmatory Factor Analysis (CFA) within the Structural Equation Model (SEM). The measurements and calculations are described in this article using the students'-satisfaction instrument. The study used a survey method with simple random sampling, and the instrument was tried out on 173 students. It is concluded that the multidimensional reliability coefficient of the instrument has higher accuracy than the unidimensional reliability coefficient. Further research is expected to apply other multidimensional reliability formulas, including when using SEM.
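One common pair of coefficients behind such comparisons is McDonald's omega, computed from CFA loadings, versus Cronbach's alpha, computed from raw scores. The record does not name its exact formulas, so the sketch below is an illustration of these general estimators with invented loadings, not the authors' computation:

```python
import numpy as np

def mcdonald_omega(loadings, error_vars):
    """Congeneric composite reliability from CFA estimates:
    omega = (sum lambda)^2 / ((sum lambda)^2 + sum theta)."""
    lam = np.asarray(loadings, float)
    theta = np.asarray(error_vars, float)
    return lam.sum() ** 2 / (lam.sum() ** 2 + theta.sum())

def cronbach_alpha(data):
    """Classical (unidimensional) reliability from raw item scores."""
    data = np.asarray(data, float)
    k = data.shape[1]
    item_vars = data.var(axis=0, ddof=1)
    total_var = data.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Invented CFA solution: four items, standardized loadings 0.8,
# error variances 1 - 0.8^2 = 0.36.
omega = mcdonald_omega([0.8] * 4, [0.36] * 4)

# Raw-score alpha on synthetic data generated from that same model.
rng = np.random.default_rng(2)
f = rng.standard_normal(1000)
scores = 0.8 * f[:, None] + 0.6 * rng.standard_normal((1000, 4))
alpha = cronbach_alpha(scores)
```

The model-based coefficient uses the estimated loadings and error variances directly, which is why it can be more accurate than alpha when the items are not strictly parallel.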

  3. ANALYSIS OF RISK FACTORS ECTOPIC PREGNANCY

    Directory of Open Access Journals (Sweden)

    Budi Santoso

    2017-04-01

    Full Text Available Introduction: Ectopic pregnancy is a pregnancy with extrauterine implantation. This situation is a gynecologic emergency that contributes to maternal mortality. Therefore, early recognition, based on identification of the risk factors for ectopic pregnancy, is needed. Methods: The design was descriptive observational. The samples were pregnant women who had ectopic pregnancy at the Maternity Room, Emergency Unit, Dr. Soetomo Hospital, Surabaya, from 1 July 2008 to 1 July 2010. The sampling technique was total sampling using medical records. Result: Patients with ectopic pregnancy numbered 99 out of 2090 pregnant women who sought treatment in Dr. Soetomo Hospital. However, only 29 patients were accompanied with traceable risk factors. Discussion: Most ectopic pregnancies were in the age group of 26-30 years, comprising 32 patients (32.32%), then in the age group of 31-35 years with 25 patients (25.25%), 18 patients in the age group 21-25 years (18.18%), 17 patients in the age group 36-40 years (17.17%), 4 patients in the age group 41 years and older (4.04%), and the fewest in the age group of 16-20 years with 3 patients (3.03%). A total of 12 patients with ectopic pregnancy (41.38%) had a history of abortion, and 6 patients (20.69%) each in the groups of ectopic pregnancy patients who used family planning and those with a history of surgery. There were 2 patients (6.90%) in the group of ectopic pregnancy patients with both a history of surgery and a history of abortion. The incidence rate of ectopic pregnancy was 4.73%, mostly in the second gravidity (34.34%), whereas the nulliparous had the highest prevalence of 39.39%. Acquired risk factors: history of operations 10.34%, family planning 20.69%, history of abortion 41.38%, history of abortion and operation 6.90%, family planning and history of abortion 20.69%.

  4. Bayesian modeling in conjoint analysis

    Directory of Open Access Journals (Sweden)

    Janković-Milić Vesna

    2010-01-01

    Full Text Available Statistical analysis in marketing is largely influenced by the availability of various types of data. There has been a sudden increase in the number and types of information available to market researchers in the last decade. In such conditions, traditional statistical methods have limited ability to solve problems related to the expression of market uncertainty. The aim of this paper is to highlight the advantages of Bayesian inference as an alternative approach to classical inference. Multivariate statistical methods offer extremely powerful tools to achieve many goals of marketing research. One of these methods is conjoint analysis, which provides a quantitative measure of the relative importance of product or service attributes in relation to other attributes. The application of this method involves interviewing consumers, who express their preferences, and statistical analysis provides numerical indicators of each attribute's utility. One of the main objections to the discrete choice method in conjoint analysis is that it estimates utility only at the aggregate level, expressing the average utility across all respondents in the survey. The application of hierarchical Bayesian models enables the capture of individual utility ratings for each attribute level.

  5. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first findings supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to have a better risk reward it was more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student and infrastructure based) and the business impact. Lastly, although business cycles vary considerably depending on the industry or the institution, the study revealed that most impacts in the HEI (university) fell within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  6. Instrumental Variable Bayesian Model Averaging via Conditional Bayes Factors

    OpenAIRE

    Karl, Anna; Lenkoski, Alex

    2012-01-01

    We develop a method to perform model averaging in two-stage linear regression systems subject to endogeneity. Our method extends an existing Gibbs sampler for instrumental variables to incorporate a component of model uncertainty. Direct evaluation of model probabilities is intractable in this setting. We show that by nesting model moves inside the Gibbs sampler, model comparison can be performed via conditional Bayes factors, leading to straightforward calculations. This new Gibbs sampler is...

  7. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2002-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures.Distribution System Modeling and Analysis helps prevent those errors. It gives re

  8. Nucleon form factors in the canonically quantized Skyrme model

    Energy Technology Data Exchange (ETDEWEB)

    Acus, A.; Norvaisas, E. [Lithuanian Academy of Sciences, Vilnius (Lithuania). Inst. of Theoretical Physics and Astronomy; Riska, D.O. [Helsinki Univ. (Finland). Dept. of Physics; Helsinki Univ. (Finland). Helsinki Inst. of Physics

    2001-08-01

    The explicit expressions for the electric, magnetic, axial and induced pseudoscalar form factors of the nucleons are derived in the ab initio quantized Skyrme model. The canonical quantization procedure ensures the existence of stable soliton solutions with good quantum numbers. The form factors are derived for representations of arbitrary dimension of the SU(2) group. After fixing the two parameters of the model, f_π and e, by the empirical mass and electric mean square radius of the proton, the calculated electric and magnetic form factors are fairly close to the empirical ones, whereas the axial and induced pseudoscalar form factors fall off too slowly with momentum transfer. (orig.)

  9. Nucleon form factors in the canonically quantized Skyrme model

    CERN Document Server

    Acus, A; Riska, D O

    2001-01-01

    The explicit expressions for the electric, magnetic, axial and induced pseudoscalar form factors of the nucleons are derived in the {\it ab initio} quantized Skyrme model. The canonical quantization procedure ensures the existence of stable soliton solutions with good quantum numbers. The form factors are derived for representations of arbitrary dimension of the SU(2) group. After fixing the two parameters of the model, $f_\pi$ and $e$, by the empirical mass and electric mean square radius of the proton, the calculated electric and magnetic form factors are fairly close to the empirical ones, whereas the axial and induced pseudoscalar form factors fall off too slowly with momentum transfer.

  10. Parent Ratings of the Strengths and Difficulties Questionnaire: What Is the Optimum Factor Model?

    Science.gov (United States)

    Gomez, Rapson; Stavropoulos, Vasilis

    2017-07-01

    To date, at least 12 different models have been suggested for the Strengths and Difficulties Questionnaire (SDQ). The current study used confirmatory factor analysis to examine the relative support for these models. In all, 1,407 Malaysian parents completed SDQ ratings of their children (age range = 5-13 years). Although the findings showed some degree of support for all 12 models, there was most support for an oblique six-factor model that included the five SDQ domains (emotional problems, conduct problems, hyperactivity, peer problems, and low prosocial behavior) and a positive construal factor comprising all the 10 SDQ positive worded items. The original proposed five-factor oblique model also showed good fit. The implications of the findings for understanding the results of past studies of the structural models of the parent version of the SDQ, and for clinical and research practice involving the SDQ are discussed.

  11. Stability Analysis of Train Movement with Uncertain Factors

    Directory of Open Access Journals (Sweden)

    JingJing Ye

    2015-01-01

    Full Text Available We propose a new traffic model which is based on the traditional OV (optimal velocity) car-following model. Here, some realistic factors, such as the headway distance, are regarded as uncertain quantities. Our aim is to analyze and discuss the stability of the car-following model under the constraint of uncertain factors. According to the principle of expected value in fuzzy theory, an improved OV traffic model is then constructed. Simulation results show that our proposed model can avoid collisions effectively under an uncertain environment, and its stability can also be improved. Moreover, we discuss its stability as some parameters, such as the relaxation time, change.
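The underlying OV (optimal velocity) car-following dynamics, before the fuzzy extension this record adds, can be sketched with a simple Euler integration; the parameter values and the Bando-style velocity function below are illustrative assumptions:

```python
import numpy as np

def V(h, vmax=2.0, hc=2.0):
    """Optimal velocity as a function of the headway h (Bando-type form)."""
    return vmax / 2.0 * (np.tanh(h - hc) + np.tanh(hc))

# One follower behind a leader cruising at constant speed: the OV model
# relaxes the follower's speed toward V(headway) with sensitivity a.
a, dt, steps = 1.0, 0.01, 20_000
x_lead, v_lead = 50.0, 1.0
x, v = 0.0, 0.0
for _ in range(steps):
    h = x_lead - x
    v += a * (V(h) - v) * dt          # dv/dt = a * [V(h) - v]
    x += v * dt
    x_lead += v_lead * dt
headway = x_lead - x
# After relaxation the follower matches the leader's speed and the headway
# settles where V(headway) equals that speed.
```

The fuzzy-expected-value model of the paper replaces the crisp headway h with an uncertain quantity; the deterministic fixed point above is the baseline whose stability it improves.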

  12. Evaluation of the Thermodynamic Models for the Thermal Diffusion Factor

    DEFF Research Database (Denmark)

    Gonzalez-Bagnoli, Mariana G.; Shapiro, Alexander; Stenby, Erling Halfdan

    2003-01-01

    Over the years, several thermodynamic models for the thermal diffusion factors for binary mixtures have been proposed. The goal of this paper is to test some of these models in combination with different equations of state. We tested the following models: those proposed by Rutherford and Drickame...

  13. FACTOR ANALYSIS OF THE ROLLING PROCESS TECHNOLOGY (part 1

    Directory of Open Access Journals (Sweden)

    A. N. Chichko

    2011-01-01

    Full Text Available The mathematical model of multidimensional regression analysis is presented. Its practical application for rolling production is examined. The algorithm of special characteristics determination is developed.

  14. Dynamic Factor Method of Computing Dynamic Mathematical Model for System Simulation

    Institute of Scientific and Technical Information of China (English)

    老大中; 吴娟; 杨策; 蒋滋康

    2003-01-01

    The computational methods for a typical dynamic mathematical model that can describe the differential element and the inertial element for system simulation are researched, together with the stability of the numerical solutions of the model. By means of theoretical analysis, the error formulas, the error sign criteria and the error relationship criterion of the implicit Euler method and the trapezoidal method are given. The dynamic factor affecting the computational accuracy is identified, and the formula and methods for computing it are given. The computational accuracy of such dynamic mathematical models can be improved by use of the dynamic factor.
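The accuracy gap between the two integrators the record compares can be illustrated on a first-order inertial element; the time constant and step size below are arbitrary choices, not the paper's:

```python
import numpy as np

# First-order inertial element dy/dt = -y/tau, y(0) = 1, integrated to t = 1.
tau, h, y0 = 0.5, 0.01, 1.0
n = int(round(1.0 / h))

y_ie, y_tr = y0, y0
for _ in range(n):
    y_ie = y_ie / (1 + h / tau)                              # implicit (backward) Euler
    y_tr = y_tr * (1 - h / (2 * tau)) / (1 + h / (2 * tau))  # trapezoidal rule

exact = y0 * np.exp(-1.0 / tau)
err_ie, err_tr = abs(y_ie - exact), abs(y_tr - exact)
# Implicit Euler is first-order accurate, the trapezoidal rule second-order,
# so err_tr is much smaller than err_ie at the same step size.
```

Both update formulas follow from substituting the element's equation into the respective discretization, which is the kind of error analysis the abstract describes.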

  15. Generalized Dynamic Factor Model + GARCH Exploiting Multivariate Information for Univariate Prediction

    OpenAIRE

    Alessi, Lucia; Barigozzi, Matteo; Capasso, Marco

    2006-01-01

    We propose a new model for multivariate forecasting which combines the Generalized Dynamic Factor Model (GDFM)and the GARCH model. The GDFM, applied to a huge number of series, captures the multivariate information and disentangles the common and the idiosyncratic part of each series of returns. In this financial analysis, both these components are modeled as a GARCH. We compare GDFM+GARCH and standard GARCH performance on samples up to 475 series, predicting both levels and volatility of ret...

  16. The determinant factors of open business model

    Directory of Open Access Journals (Sweden)

    Juan Mejía-Trejo

    2017-01-01

    Full Text Available Introduction: Since the beginning of the 21st century, several authors have argued that open business models (OBM) allow an organization to be more effective in creating and capturing value, this being a prerequisite for the success of co-development partnerships. As a result of the trends of rising development costs and shorter product/service life cycles, firms find it increasingly difficult to justify investments in innovation. The OBM addresses both trends, underlining the terms "industry ecosystem" and/or "collaborative business model". It not only changes the innovation process but also modifies the organizations themselves by reconfiguring their value chains and networks. For firms, it creates a heuristic logic based on the current business model and technology to extend them, strategically, to the development of innovation in order to create value and increase revenues and profits. It emphasizes both external relationships and governance as valuable resources with several roles that promote corporate competitiveness. Therefore, for a specialized high-technology sector such as information technologies in the Guadalajara metropolitan zone (ITSMZG), we pose the following research problem: What are the determinant factors of the OBM as an empirical model applied in the ITSMZG? Method: This research aims to establish the determinant factors of the OBM as an empirical model applied in the ITSMZG. It is a documentary study to select the main variables among ITSMZG specialists who practice the OBM process, using the Analytic Hierarchy Process (AHP) and a Delphi panel in order to contrast the academic terms with the specialists' experience. It is a

  17. A dynamic factor model of the evaluation of the financial crisis in Turkey.

    Science.gov (United States)

    Sezgin, F; Kinay, B

    2010-01-01

    Factor analysis has been widely used in economics and finance in situations where a relatively large number of variables are believed to be driven by a few common sources of variation. Dynamic factor analysis (DFA), which combines factor analysis with time-series analysis, involves autocorrelation matrices calculated from multivariate time series. Dynamic factor models were traditionally used to construct economic indicators and for macroeconomic analysis, business-cycle analysis and forecasting. In recent years, dynamic factor models have become more popular in empirical macroeconomics, as they offer advantages over other methods in several respects. Factor models can, for instance, cope with many variables without running into the scarce-degrees-of-freedom problems often faced in regression-based analysis. In this study, a model that measures the effect of the global crisis on Turkey is proposed. The main aim of the paper is to analyze how several macroeconomic quantities change before the onset of the crisis and to determine whether a crisis can be forecast.
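The one-factor case can be illustrated with a toy simulation, assuming nothing beyond the Python standard library (the seed, loadings, and noise levels are invented for the sketch, not drawn from the paper): a single AR(1) common factor drives several observed series, and a crude estimate of the factor, the cross-sectional average, tracks it closely.

```python
import random

# Hedged toy sketch of a one-factor dynamic factor model:
# x_it = lambda_i * f_t + e_it, with f_t following an AR(1) process.
random.seed(42)

T, N = 500, 8          # time periods, observed series
phi = 0.8              # AR(1) persistence of the common factor
loadings = [1.0 + 0.1 * i for i in range(N)]  # hypothetical loadings

f = [0.0]
for _ in range(1, T):
    f.append(phi * f[-1] + random.gauss(0, 1))  # latent common factor

# Observed panel: factor times loading plus idiosyncratic noise.
x = [[loadings[i] * f[t] + random.gauss(0, 0.5) for i in range(N)]
     for t in range(T)]

# Crude factor estimate: cross-sectional mean at each date.
f_hat = [sum(row) / N for row in x]

def corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a) ** 0.5
    vb = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (va * vb)

print(round(corr(f, f_hat), 3))  # close to 1: the average tracks the factor
```

Production implementations estimate the factors by principal components or state-space maximum likelihood rather than simple averaging, but the averaging intuition is why many series help rather than hurt.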

  18. Loss Given Default Modelling: Comparative Analysis

    OpenAIRE

    Yashkir, Olga; Yashkir, Yuriy

    2013-01-01

    In this study we investigated several of the most popular Loss Given Default (LGD) models (LSM, Tobit, Three-Tiered Tobit, Beta Regression, Inflated Beta Regression, Censored Gamma Regression) in order to compare their performance. We show that, for a given input data set, the quality of the model calibration depends mainly on the proper choice (and availability) of explanatory variables (model factors), not on the fitting model. Model factors were chosen based on the amplitude of their correlati...
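To see why censored specifications such as Tobit appear in this model list, here is a minimal sketch (all distribution parameters are invented): a latent loss variable can fall outside the observable [0, 1] LGD range, and clipping it there biases the naive sample mean relative to the latent mean.

```python
import random

# Hedged sketch: latent LGD can fall outside [0, 1] (e.g. workout costs,
# over-recovery), but observed LGD is censored to [0, 1], which is what
# Tobit-style models account for.
random.seed(7)

# Hypothetical latent losses, normally distributed around 0.4.
latent = [random.gauss(0.4, 0.35) for _ in range(100_000)]
observed = [min(max(v, 0.0), 1.0) for v in latent]  # censor to [0, 1]

latent_mean = sum(latent) / len(latent)
observed_mean = sum(observed) / len(observed)

# Censoring from below (at 0) removes more mass than censoring from
# above (at 1) here, so the observed mean overstates the latent mean.
print(round(latent_mean, 3), round(observed_mean, 3))
```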

  19. Probabilistic Model-Based Safety Analysis

    CERN Document Server

    Güdemann, Matthias; 10.4204/EPTCS.28.8

    2010-01-01

    Model-based safety analysis approaches aim at finding critical failure combinations by analyzing models of the whole system (i.e. software, hardware, failure modes and environment). The advantage of these methods over traditional approaches is that analyzing the whole system gives more precise results. Only a few model-based approaches have been applied to answer quantitative questions in safety analysis, and they are often limited to specific failure propagation models or types of failure modes, or exclude system dynamics and behavior, because direct quantitative analysis consumes large amounts of computing resources. New achievements in the domain of (probabilistic) model checking now allow this problem to be overcome. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...
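The core quantitative question in probabilistic model checking, the probability of eventually reaching a failure state in a discrete-time Markov chain, can be sketched on a small hypothetical four-state chain (the transition probabilities are invented for illustration) solved by fixed-point iteration:

```python
# Hedged sketch: probability of eventually reaching the "failed" state
# in a tiny discrete-time Markov chain, via fixed-point iteration.
# States: 0 = ok, 1 = degraded, 2 = failed (absorbing), 3 = shutdown (absorbing).

P = [
    [0.90, 0.08, 0.01, 0.01],  # ok
    [0.20, 0.60, 0.15, 0.05],  # degraded
    [0.00, 0.00, 1.00, 0.00],  # failed (absorbing)
    [0.00, 0.00, 0.00, 1.00],  # safe shutdown (absorbing)
]
TARGET = 2  # "failed"

# p[s] = probability of eventually reaching TARGET from state s.
# Start from the indicator of the target and iterate p <- P @ p;
# since TARGET is absorbing, p converges monotonically to the
# reachability probabilities.
p = [1.0 if s == TARGET else 0.0 for s in range(4)]
for _ in range(10_000):
    p = [sum(P[s][t] * p[t] for t in range(4)) for s in range(4)]

print(round(p[0], 3))  # reach probability from the "ok" state
```

Dedicated probabilistic model checkers solve the same linear system exactly and scale it to millions of states; this sketch only shows the property being computed.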

  20. A systematic review of the main factors that determine agility in sport using structural equation modeling.

    Science.gov (United States)

    Hojka, Vladimir; Stastny, Petr; Rehak, Tomas; Gołas, Artur; Mostowik, Aleksandra; Zawart, Marek; Musálek, Martin

    2016-09-01

    While tests of basic motor abilities such as speed, maximum strength or endurance are well established, testing of complex motor functions such as agility remains unresolved in the current literature. Therefore, the aim of this review was to evaluate which main factor or factor structure quantitatively determines agility. Methodologically, this review focused on research that explained or described the relationships between latent variables in a factorial model of agility using approaches such as principal component analysis, factor analysis and structural equation modeling. Four research studies met the defined inclusion criteria. No quantitative empirical research was found that attempted to verify the quality of the whole suggested model of the main factors determining agility through a structural equation modeling (SEM) approach or a confirmatory factor analysis. Of the whole structure of agility, only change-of-direction speed (CODS) and some of its subtests were appropriately analyzed. The combination of common CODS tests is reliable and useful for estimating performance in sub-elite athletes; for elite athletes, however, CODS tests must be specific to the needs of a particular sport discipline. Sprinting and jumping tests are stronger predictors of CODS than explosive-strength and maximum-strength tests. The authors suggest the need to verify the agility factorial model with a second-generation data analysis technique such as SEM.
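One way to quantify the claim that "the combination of common CODS tests is reliable" is internal-consistency reliability (Cronbach's alpha), sketched here in pure Python on a small hypothetical battery of correlated test scores (the data are simulated, not taken from the reviewed studies):

```python
import random

# Hedged sketch: Cronbach's alpha for a hypothetical battery of three
# change-of-direction-speed (CODS) tests sharing a common latent ability.
random.seed(1)

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Simulate 200 athletes: common "CODS ability" plus test-specific noise.
n_athletes, n_tests = 200, 3
ability = [random.gauss(0, 1) for _ in range(n_athletes)]
scores = [[a + random.gauss(0, 0.5) for _ in range(n_tests)] for a in ability]

item_vars = [variance([row[j] for row in scores]) for j in range(n_tests)]
total_var = variance([sum(row) for row in scores])

# alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
k = n_tests
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(round(alpha, 3))  # high alpha: the battery measures one common trait
```

A confirmatory factor analysis or SEM, as the authors recommend, would go further and test whether one latent factor actually reproduces the observed covariance structure.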