WorldWideScience

Sample records for factor analysis method

  1. Deterministic factor analysis: methods of integro-differentiation of non-integral order

    Directory of Open Access Journals (Sweden)

    Valentina V. Tarasova

    2016-12-01

    Full Text Available Objective: to summarize the methods of deterministic factor economic analysis, namely the differential calculus and the integral method. Methods: mathematical methods for integro-differentiation of non-integral order; the theory of derivatives and integrals of fractional (non-integral) order. Results: the basic concepts are formulated and new methods are developed that take into account the memory and non-locality effects in the quantitative description of the influence of individual factors on the change in the effective economic indicator. Two methods are proposed for integro-differentiation of non-integral order for the deterministic factor analysis of economic processes with memory and non-locality. It is shown that the method of integro-differentiation of non-integral order can give more accurate results compared with standard methods (the method of differentiation using first-order derivatives and the integral method using first-order integration) for a wide class of functions describing effective economic indicators. Scientific novelty: new methods of deterministic factor analysis are proposed: the method of differential calculus of non-integral order and the integral method of non-integral order. Practical significance: the basic concepts and formulas of the article can be used in scientific and analytical activity for factor analysis of economic processes. The proposed method of integro-differentiation of non-integral order extends the capabilities of deterministic factor economic analysis. The new quantitative method of deterministic factor analysis may become the beginning of quantitative studies of the behavior of economic agents with memory (hereditarity) and spatial non-locality. The proposed methods of deterministic factor analysis can be used in the study of economic processes which follow the exponential law, in which the indicators (endogenous variables) are power functions of the factors (exogenous variables), including the processes…
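
    For orientation, the classical differential method of deterministic factor analysis attributes the change in an effective indicator y(x1, …, xn) to its factors through first-order derivatives; the paper's generalization replaces these with derivatives of fractional (non-integral) order, which depend on the whole history of a factor rather than its local rate of change. A minimal sketch in standard notation (the Caputo derivative shown is one common choice of fractional operator; the paper itself may use a different one):

        \Delta y \;\approx\; \sum_{i=1}^{n} \frac{\partial y}{\partial x_i}\,\Delta x_i
        \qquad \text{(first-order differential method)}

        ({}^{C}D^{\alpha} y)(x) \;=\; \frac{1}{\Gamma(1-\alpha)} \int_{0}^{x} \frac{y'(t)}{(x-t)^{\alpha}}\,dt,
        \qquad 0 < \alpha < 1 \quad \text{(memory enters via the fractional order } \alpha\text{)}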

  2. Quantitative EDXS analysis of organic materials using the ζ-factor method

    International Nuclear Information System (INIS)

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. - Highlights: • The ζ-factor method is used for quantitative EDXS analysis of light elements. • We describe the process of determining ζ-factors from a single standard in detail. • Organic semiconducting materials are successfully quantified
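
    As a rough illustration of the quantification step (not the authors' implementation): in the ζ-factor method, each element's concentration is proportional to ζ_i·I_i, where I_i is the measured characteristic X-ray intensity, so normalizing these products gives a first-pass composition. The sketch below omits the absorption correction that is central to the paper, and the ζ values and intensities are hypothetical placeholders:

        # Minimal zeta-factor quantification sketch (absorption correction omitted).
        # C_i is proportional to zeta_i * I_i; normalize so the weight fractions sum to 100%.
        # The zeta factors and intensities below are hypothetical placeholders.

        zeta = {"C": 1.1e3, "N": 9.0e2, "O": 7.5e2}        # sensitivity factors (arbitrary units)
        counts = {"C": 50400.0, "N": 6200.0, "O": 9100.0}  # background-subtracted intensities

        products = {el: zeta[el] * counts[el] for el in zeta}
        total = sum(products.values())
        composition = {el: 100.0 * p / total for el, p in products.items()}  # wt%

        for el, c in composition.items():
            print(f"{el}: {c:.1f} wt%")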

  3. Methods of selecting factors in the analysis of the real estates market

    OpenAIRE

    Jasińska, Elżbieta; Preweda, Edward

    2006-01-01

    In the paper, the problem of selecting the method of choosing factors in factor analysis is presented. For a database of 61 real estates, the process of singling out the factors was carried out with the use of all the methods proposed in the STATISTICA 6.0 package. Particular attention was paid to the number of differentiated factors and the efficiency of the respective methods for the analysis of the real estate market.

  4. [A factor analysis method for contingency table data with unlimited multiple choice questions].

    Science.gov (United States)

    Toyoda, Hideki; Haiden, Reina; Kubo, Saori; Ikehara, Kazuya; Isobe, Yurie

    2016-02-01

    The purpose of this study is to propose a method of factor analysis for analyzing contingency tables developed from the data of unlimited multiple-choice questions. This method assumes that the element of each cell of the contingency table has a binomial distribution, and a factor analysis model is applied to the logit of the selection probability. A scree plot and WAIC are used to decide the number of factors, and the standardized residual (the standardized difference between the sample proportion and the model-implied proportion) is used to select items. The proposed method was applied to real product impression research data on advertised chips and energy drinks. The results of the analysis showed that this method could be used in conjunction with the conventional factor analysis model and that the extracted factors were fully interpretable, suggesting the usefulness of the proposed method in the study of psychology using unlimited multiple-choice questions.
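
    A plausible formalization of the model described (notation mine, not the authors'): with n_i respondents in group i and y_ij the count of respondents selecting option j,

        y_{ij} \sim \mathrm{Binomial}(n_i,\, p_{ij}), \qquad
        \mathrm{logit}(p_{ij}) \;=\; \log\frac{p_{ij}}{1-p_{ij}} \;=\; \mu_j + \boldsymbol{\lambda}_j^{\top}\mathbf{f}_i,

    where μ_j is an item intercept, λ_j the factor loadings of item j, and f_i the factor scores of group i.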

  5. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    Science.gov (United States)

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors using non-distribution-free (parametric) methods, factors can also be extracted using distribution-free (non-parametric) methods. The nature of the data dictates the method…

  6. Identification of advanced human factors engineering analysis, design and evaluation methods

    International Nuclear Information System (INIS)

    Plott, C.; Ronan, A. M.; Laux, L.; Bzostek, J.; Milanski, J.; Scheff, S.

    2006-01-01

    NUREG-0711 Rev. 2, 'Human Factors Engineering Program Review Model,' provides comprehensive guidance to the Nuclear Regulatory Commission (NRC) in assessing the human factors practices employed by license applicants for nuclear power plant control room designs. As software-based human-system interface (HSI) technologies supplant traditional hardware-based technologies, the NRC may encounter new HSI technologies, or seemingly unconventional approaches to human factors design, analysis, and evaluation methods, which NUREG-0711 does not anticipate. A comprehensive survey was performed to identify advanced human factors engineering analysis, design and evaluation methods, tools, and technologies that the NRC may encounter in near-term future licensee applications. A review was conducted to identify human factors methods, tools, and technologies relevant to each review element of NUREG-0711. Additionally, emerging trends in technology with the potential to impact review elements, such as Augmented Cognition and various wireless tools and technologies, were identified. The purpose of this paper is to provide an overview of the survey results and to highlight issues that could be revised or adapted to address emerging trends. (authors)

  7. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

    Full Text Available Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.
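
    The workflow is SPSS-specific, but the underlying idea (estimate the covariance structure from incomplete data, then run factor analysis on the completed information) can be sketched in Python. A rough analog, not the authors' macros; scikit-learn's IterativeImputer is used here as a stand-in for EM estimation:

        # Rough Python analog of the impute-then-EFA workflow (not the SPSS macros).
        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 6))
        X[:, 3:] += X[:, :3]                       # induce correlation among items
        X[rng.random(X.shape) < 0.1] = np.nan      # ~10% of values missing

        X_completed = IterativeImputer(max_iter=25, random_state=0).fit_transform(X)

        fa = FactorAnalysis(n_components=2).fit(X_completed)
        print(fa.components_.round(2))             # factor loadings (2 x 6)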

  8. Comparative Analysis Of Dempster Shafer Method With Certainty Factor Method For Diagnose Stroke Diseases

    Directory of Open Access Journals (Sweden)

    Erwin Kuit Panggabean

    2018-02-01

    Full Text Available The development of artificial intelligence technology has allowed expert systems to be applied to disease detection using programming languages, in this case to provide information about a disease that has recently become a serious concern in Indonesian society, namely stroke. The expert system methods used are Dempster-Shafer and the certainty factor method, and the two methods were compared in their analysis of stroke. Based on the analysis results, the certainty factor method was found to be better than Dempster-Shafer and more accurate in handling the knowledge representation of stroke disease, according to the disease symptoms obtained from one hospital in Medan city and the characteristics of the algorithms used in each method.
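
    For context, a generic sketch of the two evidence-combination rules being compared (not the paper's system): the certainty factor model combines two positive CFs as CF = CF1 + CF2(1 − CF1), while Dempster's rule combines basic probability masses and renormalizes by the conflict. A minimal two-hypothesis example:

        # Generic sketch of the two combination rules (not the paper's implementation).

        def combine_cf(cf1: float, cf2: float) -> float:
            """MYCIN-style combination of two positive certainty factors."""
            return cf1 + cf2 * (1.0 - cf1)

        def combine_dempster(m1: dict, m2: dict) -> dict:
            """Dempster's rule over the frame {'stroke', 'not_stroke'};
            'theta' stands for the full frame (ignorance)."""
            def meet(a, b):
                if a == "theta":
                    return b
                if b == "theta":
                    return a
                return a if a == b else None     # None = empty intersection (conflict)
            combined, conflict = {}, 0.0
            for a, pa in m1.items():
                for b, pb in m2.items():
                    c = meet(a, b)
                    if c is None:
                        conflict += pa * pb
                    else:
                        combined[c] = combined.get(c, 0.0) + pa * pb
            return {k: v / (1.0 - conflict) for k, v in combined.items()}

        print(combine_cf(0.6, 0.5))                             # 0.8
        print(combine_dempster({"stroke": 0.6, "theta": 0.4},
                               {"stroke": 0.5, "theta": 0.5}))  # stroke: 0.8, theta: 0.2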

  9. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    We consider here two basic groups of methods for analysis and assessment of the human factor in the NPP area and give some results from performed analyses as well. The human factor is the human interaction with the design equipment and with the working environment, taking into account human capabilities and limits. Within the group of qualitative methods for analysis of the human factor, we consider concepts and structural methods for classifying information connected with the human factor. Emphasis is given to the HPES method for human factor analysis in NPPs. Methods for quantitative assessment of human reliability are then considered. These methods allow assigning probabilities to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP), and of methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the occurrence of initiating events at the Kozloduy NPP are presented. (authors)

  10. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  11. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    International Nuclear Information System (INIS)

    Boring, Ronald Laurids

    2010-01-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  12. Logistic Regression and Path Analysis Method to Analyze Factors influencing Students’ Achievement

    Science.gov (United States)

    Noeryanti, N.; Suryowati, K.; Setyawan, Y.; Aulia, R. R.

    2018-04-01

    Students' academic achievement cannot be separated from the influence of two groups of factors, namely internal and external factors. The internal factors of the student consist of intelligence (X1), health (X2), interest (X3), and motivation (X4). The external factors consist of the family environment (X5), the school environment (X6), and the society environment (X7). The objects of this research are eighth-grade students of the school year 2016/2017 at SMPN 1 Jiwan Madiun, sampled by using simple random sampling. Primary data were obtained by distributing questionnaires. The method used in this study is binary logistic regression analysis, which aims to identify the internal and external factors that affect students' achievement and their trends. Path analysis was used to determine the factors that influence students' achievement directly, indirectly or totally. Based on the results of binary logistic regression, the variables that affect students' achievement are interest and motivation. Based on the results obtained by path analysis, the factors that have a direct impact on students' achievement are students' interest (59%) and students' motivation (27%), while the factors that have an indirect influence on students' achievement are the family environment (97%) and the school environment (37%).

  13. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still question of how these methods perform in within-subjects P-technique factor analysis. A…

  14. Study on Performance Shaping Factors (PSFs) Quantification Method in Human Reliability Analysis (HRA)

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, Inseok Jang; Seong, Poong Hyun; Park, Jinkyun; Kim, Jong Hyun

    2015-01-01

    The purpose of HRA implementation is 1) to achieve the human factors engineering (HFE) design goal of providing operator interfaces that will minimize personnel errors, and 2) to conduct an integrated activity to support probabilistic risk assessment (PRA). For these purposes, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), simplified plant analysis risk human reliability assessment (SPAR-H), the cognitive reliability and error analysis method (CREAM), and so on. In performing HRA, the conditions that influence human performance have been represented via several context factors called performance shaping factors (PSFs). PSFs are aspects of the human's individual characteristics, environment, organization, or task that specifically decrement or improve human performance, thus respectively increasing or decreasing the likelihood of human errors. Most HRA methods evaluate the weightings of PSFs by expert judgment, and explicit guidance for evaluating the weightings is not provided. It is widely known that the performance of the human operator is one of the critical factors determining the safe operation of NPPs. HRA methods have been developed to identify the possibility and mechanism of human errors. In performing HRA methods, the effect of PSFs which may increase or decrease human error should be investigated. However, the effect of PSFs has so far been estimated by expert judgment. Accordingly, in order to estimate the effect of PSFs objectively, a quantitative framework to estimate PSFs by using PSF profiles is introduced in this paper.

  15. Effect of abiotic and biotic stress factors analysis using machine learning methods in zebrafish.

    Science.gov (United States)

    Gutha, Rajasekar; Yarrappagaari, Suresh; Thopireddy, Lavanya; Reddy, Kesireddy Sathyavelu; Saddala, Rajeswara Reddy

    2018-03-01

    In order to understand the mechanisms underlying stress responses, a meta-analysis of transcriptomes was made to identify differentially expressed genes (DEGs) and their biological, molecular and cellular mechanisms in response to stressors. The present study is aimed at identifying the effects of abiotic and biotic stress factors, and it is found that several stress-responsive genes are common to both abiotic and biotic stress factors in zebrafish. The meta-analysis of microarray studies revealed that almost 4.7%, i.e., 108 common DEGs, are differentially regulated between abiotic and biotic stresses. This shows that there is a global coordination and fine-tuning of gene regulation in response to these two types of challenges. We also performed the dimension-reduction methods principal component analysis and partial least squares discriminant analysis, which are able to segregate abiotic and biotic stresses into separate entities. The supervised machine learning model, recursive support vector machine, could classify abiotic and biotic stresses with 100% accuracy using a subset of DEGs. Besides these methods, the random forest decision tree model classified five out of eight stress conditions with high accuracy. Finally, functional enrichment analysis revealed the different gene ontology terms, transcription factors, and miRNAs involved in the regulation of stress responses. Copyright © 2017 Elsevier Inc. All rights reserved.
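
    A schematic of the kind of pipeline described — PCA and a PLS-DA-style projection for unsupervised separation, plus an SVM with recursive feature elimination as the supervised classifier — on synthetic stand-in data (not the zebrafish expression matrix):

        # Schematic pipeline on synthetic data (stand-in for the expression matrix).
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression  # PLS-DA when y is class labels
        from sklearn.feature_selection import RFE
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(60, 200))             # 60 samples x 200 "genes"
        y = np.repeat([0, 1], 30)                  # 0 = abiotic, 1 = biotic
        X[y == 1, :20] += 1.5                      # 20 informative genes

        scores = PCA(n_components=2).fit_transform(X)      # unsupervised separation
        pls = PLSRegression(n_components=2).fit(X, y)      # PLS-DA style projection
        print(scores.shape, pls.transform(X).shape)

        # Recursive feature elimination with a linear SVM, then cross-validate.
        rfe = RFE(SVC(kernel="linear"), n_features_to_select=20).fit(X, y)
        acc = cross_val_score(SVC(kernel="linear"), X[:, rfe.support_], y, cv=5)
        print("CV accuracy with selected genes:", acc.mean().round(2))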

  16. An easy guide to factor analysis

    CERN Document Server

    Kline, Paul

    2014-01-01

    Factor analysis is a statistical technique widely used in psychology and the social sciences. With the advent of powerful computers, factor analysis and other multivariate methods are now available to many more people. An Easy Guide to Factor Analysis presents and explains factor analysis as clearly and simply as possible. The author, Paul Kline, carefully defines all statistical terms and demonstrates step-by-step how to work out a simple example of principal components analysis and rotation. He further explains other methods of factor analysis, including confirmatory and path analysis, a

  17. Comparing 3 dietary pattern methods (cluster analysis, factor analysis, and index analysis) with colorectal cancer risk: The NIH-AARP Diet and Health Study.

    Science.gov (United States)

    Reedy, Jill; Wirfält, Elisabet; Flood, Andrew; Mitrou, Panagiota N; Krebs-Smith, Susan M; Kipnis, Victor; Midthune, Douglas; Leitzmann, Michael; Hollenbeck, Albert; Schatzkin, Arthur; Subar, Amy F

    2010-02-15

    The authors compared dietary pattern methods-cluster analysis, factor analysis, and index analysis-with colorectal cancer risk in the National Institutes of Health (NIH)-AARP Diet and Health Study (n = 492,306). Data from a 124-item food frequency questionnaire (1995-1996) were used to identify 4 clusters for men (3 clusters for women), 3 factors, and 4 indexes. Comparisons were made with adjusted relative risks and 95% confidence intervals, distributions of individuals in clusters by quintile of factor and index scores, and health behavior characteristics. During 5 years of follow-up through 2000, 3,110 colorectal cancer cases were ascertained. In men, the vegetables and fruits cluster, the fruits and vegetables factor, the fat-reduced/diet foods factor, and all indexes were associated with reduced risk; the meat and potatoes factor was associated with increased risk. In women, reduced risk was found with the Healthy Eating Index-2005 and increased risk with the meat and potatoes factor. For men, beneficial health characteristics were seen with all fruit/vegetable patterns, diet foods patterns, and indexes, while poorer health characteristics were found with meat patterns. For women, findings were similar except that poorer health characteristics were seen with diet foods patterns. Similarities were found across methods, suggesting basic qualities of healthy diets. Nonetheless, findings vary because each method answers a different question.

  18. A comparison of confirmatory factor analysis methods : Oblique multiple group method versus confirmatory common factor method

    NARCIS (Netherlands)

    Stuive, Ilse

    2007-01-01

    Confirmatory factor analysis (CFA) is a frequently used method when researchers have a specific assumption about the assignment of items to one or more subtests and want to investigate whether this assignment is also supported by the collected research data. The most frequently used…

  19. Method for exploiting bias in factor analysis using constrained alternating least squares algorithms

    Science.gov (United States)

    Keenan, Michael R.

    2008-12-30

    Bias plays an important role in factor analysis and is often implicitly made use of, for example, to constrain solutions to factors that conform to physical reality. However, when components are collinear, a large range of solutions may exist that satisfy the basic constraints and fit the data equally well. In such cases, the introduction of mathematical bias through the application of constraints may select solutions that are less than optimal. The biased alternating least squares algorithm of the present invention can offset mathematical bias introduced by constraints in the standard alternating least squares analysis to achieve factor solutions that are most consistent with physical reality. In addition, these methods can be used to explicitly exploit bias to provide alternative views and provide additional insights into spectral data sets.
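
    A bare-bones illustration of alternating least squares with a non-negativity constraint, the kind of constrained factoring the patent builds on (this is the standard constrained variant, not the patent's biased algorithm):

        # Non-negativity-constrained ALS sketch: D (m x n) ~ C (m x k) @ S.T (k x n).
        import numpy as np

        rng = np.random.default_rng(2)
        C_true = rng.random((100, 3))
        S_true = rng.random((50, 3))
        D = C_true @ S_true.T + 0.01 * rng.normal(size=(100, 50))

        k = 3
        C = rng.random((100, k))
        for _ in range(200):
            # Solve for S with C fixed, then for C with S fixed; clipping to
            # enforce non-negativity is the constraint that introduces the
            # mathematical bias discussed in the abstract.
            S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0.0, None)
            C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0.0, None)

        print("relative residual:", np.linalg.norm(D - C @ S.T) / np.linalg.norm(D))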

  20. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    Science.gov (United States)

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.

  1. Prediction of quality attributes of chicken breast fillets by using Vis/NIR spectroscopy combined with factor analysis method

    Science.gov (United States)

    Visible/near-infrared (Vis/NIR) spectroscopy with wavelength range between 400 and 2500 nm combined with factor analysis method was tested to predict quality attributes of chicken breast fillets. Quality attributes, including color (L*, a*, b*), pH, and drip loss were analyzed using factor analysis ...

  2. Factors Analysis And Profit Achievement For Trading Company By Using Rough Set Method

    Directory of Open Access Journals (Sweden)

    Muhammad Ardiansyah Sembiring

    2017-06-01

    Full Text Available This research analyzes the financial reports of a trading company, which are intimately related to the factors that determine the company's profit. The result of this research is new knowledge in the form of rules. The analysis follows the data mining process and uses the rough set method; the performance of the result is then analyzed. This research will assist the manager of the company in drawing intact and objective conclusions. The rough set method defines the rule discovery process, starting from the formation of the decision system, then the equivalence classes, the discernibility matrix, the discernibility matrix modulo D, reduction, and finally the general rules. The rough set method is an effective model for performing this kind of analysis in a company. Keywords: Data Mining, General Rules, Profit, Rough Set.
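
    A toy illustration of the first two steps named above — building a decision system and partitioning it into equivalence (indiscernibility) classes — with made-up condition attributes, not the company's data:

        # Toy decision system and indiscernibility classes (made-up data).
        from collections import defaultdict

        # Each row: condition attributes (sales, costs) and a decision (profit).
        decision_system = [
            {"sales": "high", "costs": "low",  "profit": "yes"},
            {"sales": "high", "costs": "low",  "profit": "yes"},
            {"sales": "low",  "costs": "high", "profit": "no"},
            {"sales": "high", "costs": "high", "profit": "no"},
            {"sales": "low",  "costs": "low",  "profit": "yes"},
        ]

        def equivalence_classes(rows, attributes):
            """Group objects that are indiscernible on the given attributes."""
            classes = defaultdict(list)
            for i, row in enumerate(rows):
                classes[tuple(row[a] for a in attributes)].append(i)
            return dict(classes)

        print(equivalence_classes(decision_system, ["sales", "costs"]))
        # Objects 0 and 1 fall into one class; classes consistent in their decision
        # yield certain rules, e.g. (sales=high, costs=low) -> profit=yes.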

  3. Evaluation of Parallel Analysis Methods for Determining the Number of Factors

    Science.gov (United States)

    Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.

    2010-01-01

    Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…
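
    A compact sketch of Horn's parallel analysis in the PA-PCA variant the study examines: retain components whose observed eigenvalues exceed the mean (or 95th percentile) eigenvalue obtained from random data of the same size. Synthetic data stand in for a real dataset:

        # Horn's parallel analysis, PA-PCA variant (synthetic data).
        import numpy as np

        rng = np.random.default_rng(3)
        n, p, reps = 300, 10, 500
        X = rng.normal(size=(n, p))
        X[:, :4] += rng.normal(size=(n, 1))    # one common factor among 4 variables

        def eigvals_of_corr(data):
            return np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

        observed = eigvals_of_corr(X)
        random_eigs = np.array([eigvals_of_corr(rng.normal(size=(n, p)))
                                for _ in range(reps)])

        def n_retain(obs, crit):
            """Count leading eigenvalues exceeding the random-data criterion."""
            for k, (o, c) in enumerate(zip(obs, crit)):
                if o <= c:
                    return k
            return len(obs)

        print("retain (mean criterion):", n_retain(observed, random_eigs.mean(axis=0)))
        print("retain (95th percentile):", n_retain(observed, np.percentile(random_eigs, 95, axis=0)))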

  4. Source apportionment of PAH in Hamilton Harbour suspended sediments: comparison of two factor analysis methods.

    Science.gov (United States)

    Sofowote, Uwayemi M; McCarry, Brian E; Marvin, Christopher H

    2008-08-15

    A total of 26 suspended sediment samples collected over a 5-year period in Hamilton Harbour, Ontario, Canada and surrounding creeks were analyzed for a suite of polycyclic aromatic hydrocarbons and sulfur heterocycles. Hamilton Harbour sediments contain relatively high levels of polycyclic aromatic compounds and heavy metals due to emissions from industrial and mobile sources. Two receptor modeling methods using factor analyses were compared to determine the profiles and relative contributions of pollution sources to the harbor; these methods are principal component analyses (PCA) with multiple linear regression analysis (MLR) and positive matrix factorization (PMF). Both methods identified four factors and gave excellent correlation coefficients between predicted and measured levels of 25 aromatic compounds; both methods predicted similar contributions from coal tar/coal combustion sources to the harbor (19 and 26%, respectively). One PCA factor was identified as contributions from vehicular emissions (61%); PMF was able to differentiate vehicular emissions into two factors, one attributed to gasoline emissions sources (28%) and the other to diesel emissions sources (24%). Overall, PMF afforded better source identification than PCA with MLR. This work constitutes one of the few examples of the application of PMF to the source apportionment of sediments; the addition of sulfur heterocycles to the analyte list greatly aided in the source identification process.

  5. A probabilistic analysis method to evaluate the effect of human factors on plant safety

    International Nuclear Information System (INIS)

    Ujita, H.

    1987-01-01

    A method to evaluate the effect of human factors on probabilistic safety analysis (PSA) is developed. The main features of the method are as follows: 1. A time-dependent multibranch tree is constructed to treat the time dependency of human error probability. 2. A sensitivity analysis is done to determine the uncertainty in the PSA due to the branch time of human error occurrence, the human error data source, the extraneous act probability, and the human recovery probability. The method is applied to a large-break loss-of-coolant accident of a boiling water reactor-5. As a result, core melt probability and risk do not depend on the number of time branches, which means that a small number of branches is sufficient. These values do, however, depend on the first branch time and the human error probability.

  6. Analyzing the Impacts of Alternated Number of Iterations in Multiple Imputation Method on Explanatory Factor Analysis

    Directory of Open Access Journals (Sweden)

    Duygu KOÇAK

    2017-11-01

    Full Text Available The study aims to identify the effects of the number of iterations used in the multiple imputation method, one of the methods used to cope with missing values, on the results of factor analysis. With this aim, artificial datasets of different sample sizes were created. Values missing at random and missing completely at random were created in various ratios by deleting data. For the data with values missing at random, a second variable was generated at the ordinal scale level, and datasets with different ratios of missing values were obtained based on the levels of this variable. The data were generated using the "psych" package in R software, while the "dplyr" package was used to create code that would delete values according to the predetermined conditions of the missing value mechanism. Different datasets were generated by applying different iteration numbers. Exploratory factor analysis was conducted on the completed datasets, and the factors and total explained variances are presented. These values were first evaluated against the number of factors and total variance explained of the complete datasets. The results indicate that the multiple imputation method yields a better performance in cases of values missing at random compared to datasets with values missing completely at random. Also, it was found that increasing the number of iterations for both missing value mechanisms decreases the difference from the results obtained on complete datasets.

  7. Reliability Analysis of a Composite Wind Turbine Blade Section Using the Model Correction Factor Method: Numerical Study and Validation

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian

    2013-01-01

    by the composite failure criteria. Each failure mode has been considered in a separate component reliability analysis, followed by a system analysis which gives the total probability of failure of the structure. The Model Correction Factor method used in connection with FORM (First-Order Reliability Method) proved...

  8. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    Science.gov (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010), to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction from 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize…

  9. Factors and methods of analysis and estimation of furniture making enterprises competitiveness

    Directory of Open Access Journals (Sweden)

    Vitaliy Aleksandrovich Zhigarev

    2015-06-01

    Full Text Available Objective: to describe the author's methodology for estimating the competitiveness of furniture-making enterprises, with a view to carrying out an economic evaluation of the efficiency of furniture production, the evaluation of the internal component of furniture production efficiency, and the identification of factors influencing the efficiency of furniture-making companies and areas for improving it through improvements in the product range and the production and sales policy of the enterprise. The research subject is modern methods and principles of competitiveness management applicable in a rapidly changing market environment. Methods: in general, the research methodology consists of six stages differentiated by methods, objectives and required outcomes. The first stage of the research was to study the nature of demand within the target market of a furniture-making enterprise. The second stage was to study the expenditures of a furniture-making enterprise for implementing individual production and sales strategies. The third stage was to study competition in the market. The fourth stage was the analysis of the possibilities of a furniture-making enterprise in producing and selling furniture in terms of combinations of factor values. The fifth stage was the re-examination of the demand with a view to its distribution according to the factor space. The final, sixth stage was the processing of data obtained at the previous stages and carrying out the necessary calculations. Results: in general, the above methodology of economic evaluation of the efficiency of furniture production, based on the previously developed model, gives the managers of enterprises an algorithm for assessing both the market and the firm-level components of furniture production efficiency, allowing the subsequent identification and evaluation of the efficiency factors and the development of measures to improve furniture production and sale efficiency, as well as assortment rationalization and production and sales policy…

  10. The Infinitesimal Jackknife with Exploratory Factor Analysis

    Science.gov (United States)

    Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.

    2012-01-01

    The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…

  11. Factor analysis improves the selection of prescribing indicators

    DEFF Research Database (Denmark)

    Rasmussen, Hanne Marie Skyggedal; Søndergaard, Jens; Sokolowski, Ineta

    2006-01-01

    OBJECTIVE: To test a method for improving the selection of indicators of general practitioners' prescribing. METHODS: We conducted a prescription database study including all 180 general practices in the County of Funen, Denmark, approximately 472,000 inhabitants. Principal factor analysis was used … appropriate and inappropriate prescribing, as revealed by the correlation of the indicators in the first factor. CONCLUSION: Correlation and factor analysis is a feasible method that assists the selection of indicators and gives better insight into prescribing patterns.

  12. The Hull Method for Selecting the Number of Common Factors

    Science.gov (United States)

    Lorenzo-Seva, Urbano; Timmerman, Marieke E.; Kiers, Henk A. L.

    2011-01-01

    A common problem in exploratory factor analysis is how many factors need to be extracted from a particular data set. We propose a new method for selecting the number of major common factors: the Hull method, which aims to find a model with an optimal balance between model fit and number of parameters. We examine the performance of the method in an…
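
    The balance the Hull method strikes between fit and parsimony can be written compactly. If f_i denotes the goodness-of-fit of the solution with i factors and df_i its degrees of freedom, candidate solutions are placed in a fit-versus-df plot, the upper boundary of the convex hull is kept, and the elbow is located with a scree-like ratio (reconstructed from the published formulation as I recall it; readers should verify against the article itself):

        st_i \;=\; \frac{(f_i - f_{i-1}) \,/\, (df_i - df_{i-1})}{(f_{i+1} - f_i) \,/\, (df_{i+1} - df_i)}

    The number of factors maximizing st_i is retained.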

  13. A comparison study on detection of key geochemical variables and factors through three different types of factor analysis

    Science.gov (United States)

    Hoseinzade, Zohre; Mokhtari, Ahmad Reza

    2017-10-01

    Large numbers of variables have been measured to explain different phenomena. Factor analysis has widely been used in order to reduce the dimension of datasets. Additionally, the technique has been employed to highlight underlying factors hidden in a complex system. As geochemical studies benefit from multivariate assays, application of this method is widespread in geochemistry. However, the conventional protocols for implementing factor analysis have some drawbacks in spite of their advantages. In the present study, a geochemical dataset including 804 soil samples, collected from a mining area in central Iran in order to search for MVT-type Pb-Zn deposits, was considered, and the geochemical analysis was carried out through various factor analysis methods. Routine factor analysis, sequential factor analysis, and staged factor analysis were applied to the dataset after opening the data with the alr (additive logratio) transformation, to extract the mineralization factor in the dataset. A comparison between these methods indicated that sequential factor analysis most clearly revealed the MVT paragenesis elements in surface samples, with nearly 50% variation in F1. In addition, staged factor analysis gave acceptable results while being easy to apply. It could detect mineralization-related elements while assigning larger factor loadings to these elements, resulting in a more pronounced expression of the mineralization.

  14. Assessment of modern methods of human factor reliability analysis in PSA studies

    International Nuclear Information System (INIS)

    Holy, J.

    2001-12-01

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability used; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis (Basic characteristics of the methods; Assets and drawbacks of the use of each of HRA method; History and prospects of the use of the methods). (P.A.)

  15. First course in factor analysis

    CERN Document Server

    Comrey, Andrew L

    2013-01-01

    The goal of this book is to foster a basic understanding of factor analytic techniques so that readers can use them in their own research and critically evaluate their use by other researchers. Both the underlying theory and correct application are emphasized. The theory is presented through the mathematical basis of the most common factor analytic models and several methods used in factor analysis. On the application side, considerable attention is given to the extraction problem, the rotation problem, and the interpretation of factor analytic results. Hence, readers are given a background of

  16. SUPERPIXEL BASED FACTOR ANALYSIS AND TARGET TRANSFORMATION METHOD FOR MARTIAN MINERALS DETECTION

    Directory of Open Access Journals (Sweden)

    X. Wu

    2018-04-01

    Full Text Available Factor analysis and target transformation (FATT) is an effective method to test for the presence of a particular mineral on the Martian surface. It has been used with both thermal-infrared (Thermal Emission Spectrometer, TES) and near-infrared (Compact Reconnaissance Imaging Spectrometer for Mars, CRISM) hyperspectral data. FATT derives a set of orthogonal eigenvectors from a mixed system and typically selects the first 10 eigenvectors to least-squares fit the library mineral spectra. However, minerals present in only a limited number of pixels will be ignored because of their weak spectral features compared with full-image signatures. Here, we propose a superpixel-based FATT method to detect mineral distributions on Mars. The simple linear iterative clustering (SLIC) algorithm was used to partition the CRISM image into multiple connected, spectrally homogeneous image regions to enhance weak signatures by increasing their proportion in a mixed system. A least-squares fitting was used in the target transformation and performed for each region iteratively. Finally, the distribution of the specific minerals in the image was obtained, where a fitting residual less than a threshold represents presence and otherwise absence. We validate our method by identifying carbonates in a well-analysed CRISM image in Nili Fossae on Mars. Our experimental results indicate that the proposed method works well in both simulated and real data sets.
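
    The core target-transformation step is small enough to sketch: extract the leading eigenvectors of a (super)pixel-by-band matrix, least-squares fit a library spectrum in that eigenvector basis, and call the mineral present if the fit residual falls below a threshold. A toy version with synthetic spectra; dimensions and threshold are arbitrary:

        # Toy FATT step on synthetic spectra (dimensions and threshold are arbitrary).
        import numpy as np

        rng = np.random.default_rng(4)
        n_pixels, n_bands, n_factors = 500, 120, 10

        # Synthetic scene: mixtures of 3 endmember spectra plus noise.
        endmembers = rng.random((3, n_bands))
        abundances = rng.dirichlet(np.ones(3), size=n_pixels)
        scene = abundances @ endmembers + 0.005 * rng.normal(size=(n_pixels, n_bands))

        # "Factor analysis": leading right singular vectors span the scene's spectra.
        _, _, vt = np.linalg.svd(scene, full_matrices=False)
        basis = vt[:n_factors].T                   # (n_bands, 10)

        def target_transform(library_spectrum, threshold=0.05):
            """Least-squares fit of a library spectrum in the eigenvector basis;
            a low relative residual is read as 'mineral present'."""
            coeffs, *_ = np.linalg.lstsq(basis, library_spectrum, rcond=None)
            rel = (np.linalg.norm(library_spectrum - basis @ coeffs)
                   / np.linalg.norm(library_spectrum))
            return rel < threshold, rel

        print(target_transform(endmembers[0]))        # in the scene -> low residual
        print(target_transform(rng.random(n_bands)))  # random spectrum -> high residual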

  17. Multiple factor analysis by example using R

    CERN Document Server

    Pagès, Jérôme

    2014-01-01

    Multiple factor analysis (MFA) enables users to analyze tables of individuals and variables in which the variables are structured into quantitative, qualitative, or mixed groups. Written by the co-developer of this methodology, Multiple Factor Analysis by Example Using R brings together the theoretical and methodological aspects of MFA. It also includes examples of applications and details of how to implement MFA using an R package (FactoMineR).The first two chapters cover the basic factorial analysis methods of principal component analysis (PCA) and multiple correspondence analysis (MCA). The

  18. Influencing factors and kinetics analysis on the leaching of iron from boron carbide waste-scrap with ultrasound-assisted method.

    Science.gov (United States)

    Li, Xin; Xing, Pengfei; Du, Xinghong; Gao, Shuaibo; Chen, Chen

    2017-09-01

    In this paper, the ultrasound-assisted leaching of iron from boron carbide waste-scrap was investigated and the different influencing factors were optimized. The factors investigated were acid concentration, liquid-solid ratio, leaching temperature, and ultrasonic power and frequency. The leaching of iron with the conventional method at various temperatures was also performed. The results show maximum iron leaching ratios of 87.4% and 94.5% for 80 min of leaching with the conventional method and 50 min of leaching with ultrasound assistance, respectively. The leaching of waste-scrap with the conventional method fits the chemical reaction-controlled model. The leaching with ultrasound assistance fits the chemical reaction-controlled model for the first stage and the diffusion-controlled model for the second stage. The assistance of ultrasound can greatly improve the iron leaching ratio, accelerate the leaching rate, shorten the leaching time and lower the residual iron compared with the conventional method. The advantages of ultrasound-assisted leaching were also confirmed by SEM-EDS analysis and elemental analysis of the raw material and the leached solid samples. Copyright © 2017 Elsevier B.V. All rights reserved.
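
    For reference, the two rate laws named in the abstract are commonly written in shrinking-core form, with x the leached fraction at time t and k the apparent rate constant (these are standard textbook expressions; the paper may use an equivalent variant of the diffusion-controlled form):

        1 - (1-x)^{1/3} = k_r\,t \qquad \text{(chemical reaction controlled)}

        1 - \tfrac{2}{3}x - (1-x)^{2/3} = k_d\,t \qquad \text{(diffusion controlled)}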

  19. Prioritization of the Factors Affecting Bank Efficiency Using Combined Data Envelopment Analysis and Analytical Hierarchy Process Methods

    Directory of Open Access Journals (Sweden)

    Mehdi Fallah Jelodar

    2016-01-01

    Full Text Available Bank branches have a vital role in the economy of all countries. They collect assets from various sources and put them in the hands of those sectors that need liquidity. Given banks' limited financial and human resources and capital, the unlimited and evolving needs of customers, and the strong competition between banks and financial and credit institutions, the purpose of this study is to answer the question of which of the factors affecting performance, creating value, and increasing shareholder dividends are superior to others, and consequently which ones managers should pay more attention to. Therefore, in this study, the factors affecting performance (efficiency) in the areas of management, personnel, finance, and customers were segmented, and the obtained results were ranked using both the Data Envelopment Analysis and analytical hierarchy process methods. In both of these methods, the leadership style in the area of management; recruitment and resource allocation in the area of financing; the employees' satisfaction, dignity, and self-actualization in the area of employees; and meeting the new needs of customers received the greatest weights.

  20. Human factors analysis of incident/accident report

    International Nuclear Information System (INIS)

    Kuroda, Isao

    1992-01-01

    Human factors analysis of accidents and incidents involves difficulties of not only a technical but also a psychosocial nature. This report introduces some experiments with the 'variation diagram method', which can be extended to operational and managerial factors. (author)

  1. Using exploratory factor analysis in personality research: Best-practice recommendations

    Directory of Open Access Journals (Sweden)

    Sumaya Laher

    2010-11-01

    Research purpose: This article presents more objective methods to determine the number of factors, most notably parallel analysis and Velicer's minimum average partial (MAP). The benefits of rotation are also discussed. The article argues for more consistent use of Procrustes rotation and congruence coefficients in factor analytic studies. Motivation for the study: Exploratory factor analysis is often criticised for not being rigorous and objective enough in terms of the methods used to determine the number of factors, the rotations to be used and ultimately the validity of the factor structure. Research design, approach and method: The article adopts a theoretical stance to discuss the best-practice recommendations for factor analytic research in the field of psychology. Following this, an example located within personality assessment and using the NEO-PI-R specifically is presented. A total of 425 students at the University of the Witwatersrand completed the NEO-PI-R. These responses were subjected to a principal components analysis using varimax rotation. The rotated solution was subjected to a Procrustes rotation with Costa and McCrae's (1992) matrix as the target matrix. Congruence coefficients were also computed. Main findings: The example indicates the use of the methods recommended in the article and demonstrates an objective way of determining the number of factors. It also provides an example of Procrustes rotation with coefficients of agreement as an indication of how factor analytic results may be presented more rigorously in local research. Practical/managerial implications: It is hoped that the recommendations in this article will have best-practice implications for both researchers and practitioners in the field who employ factor analysis regularly. Contribution/value-add: This article will prove useful to all researchers employing factor analysis and has the potential to set the trend for better use of factor analysis in the South African context.
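
    The two tools the article recommends — Procrustes rotation to a target loading matrix and congruence coefficients to quantify agreement — are a few lines in Python. A minimal sketch with synthetic loadings in place of the NEO-PI-R matrices:

        # Procrustes rotation to a target plus Tucker congruence (synthetic loadings).
        import numpy as np
        from scipy.linalg import orthogonal_procrustes

        rng = np.random.default_rng(5)
        target = rng.normal(size=(30, 5))                 # e.g., a normative loading matrix
        Q = np.linalg.qr(rng.normal(size=(5, 5)))[0]      # arbitrary rotation
        loadings = target @ Q + 0.05 * rng.normal(size=(30, 5))

        R, _ = orthogonal_procrustes(loadings, target)    # rotation minimizing ||A R - B||
        rotated = loadings @ R

        def tucker_phi(a, b):
            """Column-wise Tucker congruence coefficients."""
            num = (a * b).sum(axis=0)
            den = np.sqrt((a ** 2).sum(axis=0) * (b ** 2).sum(axis=0))
            return num / den

        print(tucker_phi(rotated, target).round(3))       # values near 1 = high agreement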

  2. Quantification method analysis of the relationship between occupant injury and environmental factors in traffic accidents.

    Science.gov (United States)

    Ju, Yong Han; Sohn, So Young

    2011-01-01

    Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on a single one-dimensional injury variable at a time, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), in relation to various traffic accident factors. Such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between categorical injury phenomena, such as injury scale, injury position, and injury type, and various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, airbag deployment, and higher speed are associated with more severe than minor injury to the thigh, ankle, and leg in terms of dislocation, abrasion, or laceration. Copyright © 2010 Elsevier Ltd. All rights reserved.

  3. Factorization method of quadratic template

    Science.gov (United States)

    Kotyrba, Martin

    2017-07-01

    Multiplication of two numbers is a one-way function in mathematics. Any attempt to decompose the outcome back into its roots is called factorization. There are many methods, such as Fermat's factorization, Dixon's method, the quadratic sieve and GNFS, which use sophisticated techniques for fast factorization. All the above methods use the same basic formula, differing only in its use. This article discusses a newly designed factorization method. Effective implementation of this method in programs is not the focus here; the article only presents the method and clearly defines its properties.
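
    The "same basic formula" behind Fermat-style methods is the difference of squares: if N = a² − b², then N = (a − b)(a + b). A minimal sketch of classical Fermat factorization for odd N follows; the article's own quadratic-template variant is not specified in the abstract:

        # Classical Fermat factorization for an odd composite N: find N = a^2 - b^2.
        from math import isqrt

        def fermat_factor(n: int) -> tuple[int, int]:
            assert n % 2 == 1, "method applies to odd n"
            a = isqrt(n)
            if a * a < n:
                a += 1
            # Increase a until a^2 - n is a perfect square b^2; then n = (a-b)(a+b).
            while True:
                b2 = a * a - n
                b = isqrt(b2)
                if b * b == b2:
                    return a - b, a + b
                a += 1

        print(fermat_factor(5959))   # (59, 101)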

  4. Development of advanced MCR task analysis methods

    International Nuclear Information System (INIS)

    Na, J. C.; Park, J. H.; Lee, S. K.; Kim, J. K.; Kim, E. S.; Cho, S. B.; Kang, J. S.

    2008-07-01

    This report describes a task analysis methodology for advanced HSI designs. Task analyses were performed by using procedure-based hierarchical task analysis and task decomposition methods. The results from the task analysis were recorded in a database. Using the TA results, we developed a static prototype of the advanced HSI and human factors engineering verification and validation methods for an evaluation of the prototype. In addition to the procedure-based task analysis methods, workload estimation based on the analysis of task performance time, and analyses for the design of information structures and interaction structures, will be necessary.

  5. Missing in space: an evaluation of imputation methods for missing data in spatial analysis of risk factors for type II diabetes.

    Science.gov (United States)

    Baker, Jannah; White, Nicole; Mengersen, Kerrie

    2014-11-20

    Spatial analysis is increasingly important for identifying modifiable geographic risk factors for disease. However, spatial health data from surveys are often incomplete, ranging from missing data for only a few variables, to missing data for many variables. For spatial analyses of health outcomes, selection of an appropriate imputation method is critical in order to produce the most accurate inferences. We present a cross-validation approach to select between three imputation methods for health survey data with correlated lifestyle covariates, using as a case study, type II diabetes mellitus (DM II) risk across 71 Queensland Local Government Areas (LGAs). We compare the accuracy of mean imputation to imputation using multivariate normal and conditional autoregressive prior distributions. Choice of imputation method depends upon the application and is not necessarily the most complex method. Mean imputation was selected as the most accurate method in this application. Selecting an appropriate imputation method for health survey data, after accounting for spatial correlation and correlation between covariates, allows more complete analysis of geographic risk factors for disease with more confidence in the results to inform public policy decision-making.

  6. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  7. Unbiased proteomics analysis demonstrates significant variability in mucosal immune factor expression depending on the site and method of collection.

    Directory of Open Access Journals (Sweden)

    Kenzie M Birse

    Full Text Available Female genital tract secretions are commonly sampled by lavage of the ectocervix and vaginal vault or via a sponge inserted into the endocervix for evaluating inflammation status and immune factors critical for HIV microbicide and vaccine studies. This study uses a proteomics approach to comprehensively compare the efficacy of these methods, which sample from different compartments of the female genital tract, for the collection of immune factors. Matching sponge and lavage samples were collected from 10 healthy women and were analyzed by tandem mass spectrometry. Data were analyzed by a combination of differential protein expression analysis, hierarchical clustering and pathway analysis. Of the 385 proteins identified, endocervical sponge samples collected nearly twice as many unique proteins as cervicovaginal lavage (111 vs. 61), with 55% of proteins common to both (213). Each method/site identified 73 unique proteins that have roles in host immunity according to their gene ontology. Sponge samples enriched for specific inflammation pathways including acute phase response proteins (p = 3.37×10^-24) and LXR/RXR immune activation pathways (p = 8.82×10^-22), while the role of IL-17A in psoriasis pathway (p = 5.98×10^-4) and the complement system pathway (p = 3.91×10^-3) were enriched in lavage samples. Many host defense factors were differentially enriched (p<0.05) between sites, including known/potential antimicrobial factors (n = 21), S100 proteins (n = 9), and immune regulatory factors such as serpins (n = 7). Immunoglobulins (n = 6) were collected at comparable levels of abundance at each site, although 25% of those identified were unique to sponge samples. This study demonstrates significant differences in the types and quantities of immune factors and inflammation pathways collected by each sampling technique. Therefore, clinical studies that measure mucosal immune activation or factors assessing HIV transmission should utilize…

  8. Left ventricular wall motion abnormalities evaluated by factor analysis as compared with Fourier analysis

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Ikuno, Yoshiyasu; Nishikimi, Toshio

    1986-01-01

    Factor analysis was applied to multigated cardiac pool scintigraphy to evaluate its ability to detect left ventricular wall motion abnormalities in 35 patients with old myocardial infarction (MI) and in 12 control cases with normal left ventriculography. All cases were also evaluated by conventional Fourier analysis. In most cases with normal left ventriculography, the ventricular and atrial factors were extracted by factor analysis. In cases with MI, a third factor was obtained in the left ventricle, corresponding to the wall motion abnormality. Each case was scored according to the coincidence of the findings of ventriculography with those of factor analysis or Fourier analysis. Scores were recorded for three items: the existence, location, and degree of asynergy. In cases of MI, the detection rate of asynergy was 94% by factor analysis and 83% by Fourier analysis, and the agreement with respect to location was 71% and 66%, respectively. Factor analysis had higher scores than Fourier analysis, but the difference was not significant. The interobserver error of factor analysis was less than that of Fourier analysis. Factor analysis can display the locations and dynamic motion curves of asynergy, and it is regarded as a useful method for detecting and evaluating left ventricular wall motion abnormalities. (author)

  9. Towards automatic analysis of dynamic radionuclide studies using principal-components factor analysis

    International Nuclear Information System (INIS)

    Nigran, K.S.; Barber, D.C.

    1985-01-01

    A method is proposed for automatic analysis of dynamic radionuclide studies using the mathematical technique of principal-components factor analysis. This method is considered as a possible alternative to the conventional manual regions-of-interest method widely used. The method emphasises the importance of introducing a priori information into the analysis about the physiology of at least one of the functional structures in a study. Information is added by using suitable mathematical models to describe the underlying physiological processes. A single physiological factor is extracted representing the particular dynamic structure of interest. Two spaces, 'study space, S' and 'theory space, T', are defined in forming the concept of intersection of spaces. A one-dimensional intersection space is computed. An example from a dynamic 99mTc-DTPA kidney study is used to demonstrate the principle inherent in the proposed method. The method requires no correction for blood background activity, which is necessary when processing by the manual method. Careful isolation of the kidney by means of regions of interest is not required. The method is therefore less prone to operator influence and can be automated. (author)

  10. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. The approach is qualitative in nature and is used in combination with other PHA methods. The method has the advantage that it does not treat operator error as the sole contributor to human failure within a system, but rather as a combination of all the underlying factors

  11. HUMAN ERROR QUANTIFICATION USING PERFORMANCE SHAPING FACTORS IN THE SPAR-H METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Harold S. Blackman; David I. Gertman; Ronald L. Boring

    2008-09-01

    This paper describes a cognitively based human reliability analysis (HRA) quantification technique for estimating the human error probabilities (HEPs) associated with operator and crew actions at nuclear power plants. The method described here, the Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) method, was developed to aid in characterizing and quantifying human performance at nuclear power plants. The intent was to develop a defensible method that would consider all factors that may influence performance. In the SPAR-H approach, calculation of HEP rates is especially straightforward: it starts with pre-defined nominal error rates for cognitive vs. action-oriented tasks and applies performance shaping factor multipliers to those nominal error rates.
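
A minimal sketch of the quantification step just described, assuming the basic SPAR-H mechanics (a nominal HEP times the product of PSF multipliers, with an adjustment when several negative PSFs are present); the multiplier values in the example are invented.

```python
def spar_h_hep(nominal_hep, psf_multipliers):
    """SPAR-H style HEP quantification (illustrative sketch).

    nominal_hep: pre-defined nominal error rate (e.g. 0.01 for diagnosis
                 tasks, 0.001 for action tasks).
    psf_multipliers: performance shaping factor multipliers, where >1
                     degrades performance, <1 improves it, 1 is nominal.
    """
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    negative = sum(1 for m in psf_multipliers if m > 1)
    if negative >= 3:
        # adjustment used when several negative PSFs are present, so the
        # resulting HEP cannot exceed 1.0
        return nominal_hep * composite / (nominal_hep * (composite - 1) + 1)
    return min(nominal_hep * composite, 1.0)

# an action task under high stress (x2) and poor ergonomics (x10)
print(spar_h_hep(0.001, [2, 10]))  # 0.02
```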

  12. Review of human factors guidelines and methods

    International Nuclear Information System (INIS)

    Rhodes, W.; Szlapetis, I.; Hay, T.; Weihrer, S.

    1995-04-01

    The review examines the use of human factors guidelines and methods in high technology applications, with emphasis on application to the nuclear industry. An extensive literature review was carried out identifying over 250 applicable documents, with 30 more documents identified during interviews with experts in human factors. Surveys were sent to 15 experts, of which 11 responded. The survey results indicated guidelines used and why these were favoured. Thirty-three of the most applicable guideline documents were described in detailed annotated bibliographies. A bibliographic list containing over 280 references was prepared. Thirty guideline documents were rated for their completeness, validity, applicability and practicality. The experts survey indicated the use of specific techniques. Ten human factors methods of analysis were described in general summaries, including procedures, applications, and specific techniques. Detailed descriptions of the techniques were prepared and each technique rated for applicability and practicality. Recommendations for further study of areas of importance to human factors in the nuclear field in Canada are given. (author). 8 tabs., 2 figs

  13. Review of human factors guidelines and methods

    Energy Technology Data Exchange (ETDEWEB)

    Rhodes, W; Szlapetis, I; Hay, T; Weihrer, S [Rhodes and Associates Inc., Toronto, ON (Canada)

    1995-04-01

    The review examines the use of human factors guidelines and methods in high technology applications, with emphasis on application to the nuclear industry. An extensive literature review was carried out identifying over 250 applicable documents, with 30 more documents identified during interviews with experts in human factors. Surveys were sent to 15 experts, of which 11 responded. The survey results indicated guidelines used and why these were favoured. Thirty-three of the most applicable guideline documents were described in detailed annotated bibliographies. A bibliographic list containing over 280 references was prepared. Thirty guideline documents were rated for their completeness, validity, applicability and practicality. The experts survey indicated the use of specific techniques. Ten human factors methods of analysis were described in general summaries, including procedures, applications, and specific techniques. Detailed descriptions of the techniques were prepared and each technique rated for applicability and practicality. Recommendations for further study of areas of importance to human factors in the nuclear field in Canada are given. (author). 8 tabs., 2 figs.

  14. Comparison of elastic-plastic FE method and engineering method for RPV fracture mechanics analysis

    International Nuclear Information System (INIS)

    Sun Yingxue; Zheng Bin; Zhang Fenggang

    2009-01-01

    This paper describes the finite element (FE) analysis of elastic-plastic fracture mechanics for a crack in the RPV beltline region using the ABAQUS code. The stress intensity factor and J-integral of the crack under PTS transients were calculated and evaluated. The results were also compared with those obtained by an engineering analysis method. The comparison shows that the results of the engineering analysis method are slightly larger than those of the 3D elastic-plastic FE analysis; thus, the engineering analysis method is more conservative than the elastic-plastic fracture mechanics method. (authors)

  15. Factors in Agile Methods Adoption

    Directory of Open Access Journals (Sweden)

    Samia Abdalhamid

    2017-05-01

    Full Text Available There are many factors that can affect the process of adopting Agile methods during software development. This paper illustrates the critical factors in Agile methods adoption in software organizations. To present the success and failure factors, an exploratory study was carried out on the critical success and failure factors reported in existing studies. Dimensions and factors are introduced utilizing success and failure dimensions, and a mind map is used to clarify these factors.

  16. Factor Analysis for Clustered Observations.

    Science.gov (United States)

    Longford, N. T.; Muthen, B. O.

    1992-01-01

    A two-level model for factor analysis is defined, and formulas for a scoring algorithm for this model are derived. A simple noniterative method based on decomposition of total sums of the squares and cross-products is discussed and illustrated with simulated data and data from the Second International Mathematics Study. (SLD)
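
The noniterative method mentioned above rests on splitting the total sums of squares and cross-products into between-cluster and within-cluster parts. A minimal sketch of that decomposition, with simulated data standing in for real clustered observations:

```python
import numpy as np

def between_within_sscp(X, groups):
    """Decompose the total SSCP matrix into between-cluster and
    within-cluster parts (first step toward a two-level factor model)."""
    X = np.asarray(X, dtype=float)
    grand = X.mean(axis=0)
    p = X.shape[1]
    S_between, S_within = np.zeros((p, p)), np.zeros((p, p))
    for g in np.unique(groups):
        Xg = X[groups == g]
        d = (Xg.mean(axis=0) - grand)[:, None]
        S_between += len(Xg) * (d @ d.T)       # cluster means vs grand mean
        C = Xg - Xg.mean(axis=0)
        S_within += C.T @ C                    # deviations within clusters
    return S_between, S_within

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))
groups = np.repeat(np.arange(10), 10)          # 10 clusters of 10
SB, SW = between_within_sscp(X, groups)
Xc = X - X.mean(axis=0)
assert np.allclose(SB + SW, Xc.T @ Xc)         # the parts sum to the total
```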

  17. Foundations of factor analysis

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis; Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions; Composite Variables and Linear Transformations; Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi

  18. Seismic analysis response factors and design margins of piping systems

    International Nuclear Information System (INIS)

    Shieh, L.C.; Tsai, N.C.; Yang, M.S.; Wong, W.L.

    1985-01-01

    The objective of the simplified methods project of the Seismic Safety Margins Research Program is to develop a simplified seismic risk methodology for general use. The goal is to reduce seismic PRA costs to roughly 60 man-months over a 6 to 8 month period, without compromising the quality of the product. To achieve the goal, it is necessary to simplify the calculational procedure for the seismic response. The response factor approach serves this purpose. The response factor relates the median-level response to the design data. Through a literature survey, we identified the various seismic analysis methods adopted in the U.S. nuclear industry for piping systems. A series of seismic response calculations was performed. The response factors and their variabilities for each method of analysis were computed. A sensitivity study of the effects of piping damping, the in-structure response spectra envelope method, and the analysis method was conducted. In addition, design margins, which relate the best-estimate response to the design data, are also presented

  19. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact full-field inspection of the test object. However, its application has been limited to qualitative analysis; the current trend is toward developing the method through the introduction of quantitative analysis, which attempts to characterize the examined defect in detail. This is a design feature for a range of object sizes to be examined. The growing commercial demand for quantitative analysis for NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources that are a function of the interferometer. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the factor of divergent illumination and other geometrical factors. The difference between the measurement systems can be attributed to these error factors. (Author)

  20. Multiple timescale analysis and factor analysis of energy ecological footprint growth in China 1953-2006

    International Nuclear Information System (INIS)

    Chen Chengzhong; Lin Zhenshan

    2008-01-01

    Scientific analysis of energy consumption and its influencing factors is of great importance for energy strategy and policy planning. In this paper, the energy consumption in China 1953-2006 is estimated by applying the energy ecological footprint (EEF) method, and the fluctuation periods of the annual growth rate of China's per capita EEF (EEFcpc) are analyzed with the empirical mode decomposition (EMD) method. EEF intensity is analyzed to depict energy efficiency in China. The main timescales of the 37 factors that affect the annual growth rate of EEFcpc are also discussed based on the EMD and factor analysis methods. Results show three obvious undulation cycles of the annual growth rate of EEFcpc, i.e., 4.6, 14.4 and 34.2 years, over the last 53 years. The findings from the common synthesized factors of the IMF1, IMF2 and IMF3 timescales of the 37 factors suggest that China's energy policy-makers should attach more importance to stabilizing economic growth, optimizing industrial structure, regulating domestic petroleum exploitation and improving transportation efficiency

  1. EXPLORATORY FACTOR ANALYSIS (EFA) IN CONSUMER BEHAVIOR AND MARKETING RESEARCH

    Directory of Open Access Journals (Sweden)

    Marcos Pascual Soler

    2012-06-01

    Full Text Available Exploratory Factor Analysis (EFA) is one of the most widely used statistical procedures in social research. The main objective of this work is to describe the most common practices used by researchers in the consumer behavior and marketing area. Through a literature review methodology, the EFA practices in five consumer behavior and marketing journals (2000-2010) were analyzed. Then, the choices made by the researchers concerning factor model, retention criteria, rotation, factor interpretation and other issues relevant to factor analysis were analyzed. The results suggest that researchers routinely conduct analyses using questionable methods. Suggestions for improving the use of factor analysis and the reporting of results are presented, and a checklist (Exploratory Factor Analysis Checklist, EFAC) is provided to help editors, reviewers, and authors improve the reporting of exploratory factor analyses.
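
For readers who want a concrete counterpart to these recommendations, here is a minimal EFA run in Python; the two-factor structure, loadings and noise level are all invented, and scikit-learn's FactorAnalysis (with varimax rotation) merely stands in for whatever package a study actually uses.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
scores = rng.normal(size=(500, 2))                 # latent factor scores
loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.0, 0.9],
                     [0.1, 0.8], [0.6, 0.0], [0.0, 0.7]])
X = scores @ loadings.T + rng.normal(scale=0.4, size=(500, 6))

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
print(np.round(fa.components_.T, 2))               # items x factors
```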

  2. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. The analysis of the most important sources of variability of quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
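
A sketch of the ANOVA step on invented data: log-transformed counts under two crossed factors, so the resulting table apportions variability to each factor and to their interaction, as the abstract describes.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(2)
rows = []
for organism, effect in [("E. coli", 0.0), ("S. aureus", 0.3)]:
    for product in ["A", "B"]:
        for _ in range(5):  # five replicate plate counts per cell
            rows.append({"organism": organism, "product": product,
                         "log_count": 2.0 + effect + rng.normal(0, 0.1)})
df = pd.DataFrame(rows)

model = smf.ols("log_count ~ C(organism) * C(product)", data=df).fit()
print(anova_lm(model, typ=2))  # main effects and their interaction
```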

  3. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    Directory of Open Access Journals (Sweden)

    James Baglin

    2014-06-01

    Full Text Available Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many guidelines have been proposed with the aim of improving application. Unfortunately, implementing recommended EFA practices has been restricted by the range of options available in commercial statistical packages and, perhaps, by an absence of clear, practical 'how-to' demonstrations. Consequently, this article describes the application of methods recommended to get the most out of your EFA. The article focuses on dealing with the common situation of analysing ordinal data as derived from Likert-type scales. These methods are demonstrated using the free, stand-alone, easy-to-use and powerful EFA package FACTOR (http://psico.fcep.urv.es/utilitats/factor/; Lorenzo-Seva & Ferrando, 2006). The demonstration applies the recommended techniques using an accompanying dataset based on the Big 5 personality test. The outcomes obtained by the EFA using the recommended procedures through FACTOR are compared to the default techniques currently available in SPSS.

  4. Model Correction Factor Method

    DEFF Research Database (Denmark)

    Christensen, Claus; Randrup-Thomsen, Søren; Morsing Johannesen, Johannes

    1997-01-01

    The model correction factor method is proposed as an alternative to traditional polynomial based response surface techniques in structural reliability, considering a computationally time consuming limit state procedure as a 'black box'. The class of polynomial functions is replaced by a limit...... of the model correction factor method, is that in simpler form, not using gradient information on the original limit state function or only using this information once, a drastic reduction of the number of limit state evaluations is obtained together with good approximations of the reliability. Methods...

  5. Tensor-Dictionary Learning with Deep Kruskal-Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, Andrew J.; Pu, Yunchen; Sun, Yannan; Spell, Gregory; Carin, Lawrence

    2017-04-20

    We introduce new dictionary learning methods for tensor-variate data of any order. We represent each data item as a sum of Kruskal decomposed dictionary atoms within the framework of beta-process factor analysis (BPFA). Our model is nonparametric and can infer the tensor rank of each dictionary atom. This Kruskal-Factor Analysis (KFA) is a natural generalization of BPFA. We also extend KFA to a deep convolutional setting and develop online learning methods. We test our approach on image processing and classification tasks, achieving state-of-the-art results for 2D & 3D inpainting and Caltech 101. The experiments also show that atom rank impacts both overcompleteness and sparsity.

  6. A kernel version of spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2009-01-01

    . Schölkopf et al. introduce kernel PCA. Shawe-Taylor and Cristianini is an excellent reference for kernel methods in general. Bishop and Press et al. describe kernel methods among many other subjects. Nielsen and Canty use kernel PCA to detect change in univariate airborne digital camera images. The kernel...... version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply kernel versions of PCA, maximum autocorrelation factor (MAF) analysis...

  7. Adjusting for multiple prognostic factors in the analysis of randomised trials

    Science.gov (United States)

    2013-01-01

    Background When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata formed by all combinations of the prognostic factors (stratified analysis) when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) what the best method of adjustment is in terms of type I error rate and power, irrespective of the randomisation method. Methods We used simulation to (1) determine if a stratified analysis is necessary after stratified randomisation, and (2) compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model, adjusting for each stratum using either fixed or random effects, and Mantel-Haenszel or a stratified Cox model depending on outcome. Results A stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there is an approximately equal number of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs unstratified), indicating these conditions were not met in real datasets. Comparison of different analysis methods found that with small sample sizes and a binary or time-to-event outcome, most analysis methods lead to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power. Conclusions It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios. Therefore, the method of analysis (accounting for the strata, or adjusting only for the covariates) will not
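
To make the two adjustment strategies concrete, here is one simulated trial analysed both ways in Python: (a) adjusting for the two prognostic covariates as main effects, and (b) with a fixed effect for each stratum formed by their combinations. All effect sizes are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({
    "treat": rng.integers(0, 2, n),
    "age_hi": rng.integers(0, 2, n),   # two binary prognostic factors
    "severe": rng.integers(0, 2, n),
})
lp = -1 + 0.5 * df.treat + 0.8 * df.age_hi + 0.6 * df.severe
df["outcome"] = (rng.random(n) < 1 / (1 + np.exp(-lp))).astype(int)
df["stratum"] = df.age_hi.astype(str) + "/" + df.severe.astype(str)

main_fx = smf.logit("outcome ~ treat + age_hi + severe", df).fit(disp=0)
strata = smf.logit("outcome ~ treat + C(stratum)", df).fit(disp=0)
print(main_fx.params["treat"], strata.params["treat"])
```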

  8. Human factors analysis and design methods for nuclear waste retrieval systems. Human factors design methodology and integration plan

    Energy Technology Data Exchange (ETDEWEB)

    Casey, S.M.

    1980-06-01

    The purpose of this document is to provide an overview of the recommended activities and methods to be employed by a team of human factors engineers during the development of a nuclear waste retrieval system. This system, as it is presently conceptualized, is intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository. This document, and the others in this series, have been developed for the purpose of implementing human factors engineering principles during the design and construction of the retrieval system facilities and equipment. The methodology presented has been structured around a basic systems development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Within each of these phases, the recommended activities of the human engineering team have been stated, along with descriptions of the human factors engineering design techniques applicable to the specific design issues. Explicit examples of how the techniques might be used in the analysis of human tasks and equipment required in the removal of spent fuel canisters have been provided. Only those techniques having possible relevance to the design of the waste retrieval system have been reviewed. This document is intended to provide the framework for integrating human engineering with the rest of the system development effort. The activities and methodologies reviewed in this document have been discussed in the general order in which they will occur, although the time frame (the total duration of the development program in years and months) in which they should be performed has not been discussed.

  9. Human factors analysis and design methods for nuclear waste retrieval systems. Human factors design methodology and integration plan

    International Nuclear Information System (INIS)

    Casey, S.M.

    1980-06-01

    The purpose of this document is to provide an overview of the recommended activities and methods to be employed by a team of human factors engineers during the development of a nuclear waste retrieval system. This system, as it is presently conceptualized, is intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository. This document, and the others in this series, have been developed for the purpose of implementing human factors engineering principles during the design and construction of the retrieval system facilities and equipment. The methodology presented has been structured around a basic systems development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Within each of these phases, the recommended activities of the human engineering team have been stated, along with descriptions of the human factors engineering design techniques applicable to the specific design issues. Explicit examples of how the techniques might be used in the analysis of human tasks and equipment required in the removal of spent fuel canisters have been provided. Only those techniques having possible relevance to the design of the waste retrieval system have been reviewed. This document is intended to provide the framework for integrating human engineering with the rest of the system development effort. The activities and methodologies reviewed in this document have been discussed in the general order in which they will occur, although the time frame (the total duration of the development program in years and months) in which they should be performed has not been discussed

  10. The Empirical Verification of an Assignment of Items to Subtests : The Oblique Multiple Group Method Versus the Confirmatory Common Factor Method

    NARCIS (Netherlands)

    Stuive, Ilse; Kiers, Henk A.L.; Timmerman, Marieke E.; ten Berge, Jos M.F.

    2008-01-01

    This study compares two confirmatory factor analysis methods on their ability to verify whether correct assignments of items to subtests are supported by the data. The confirmatory common factor (CCF) method is used most often and defines nonzero loadings so that they correspond to the assignment of

  11. A multifactorial analysis of obesity as CVD risk factor: Use of neural network based methods in a nutrigenetics context

    Directory of Open Access Journals (Sweden)

    Valavanis Ioannis K

    2010-09-01

    Full Text Available Abstract Background Obesity is a multifactorial trait, which comprises an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology beneath obesity and identify genetic variations and/or factors related to nutrition that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject a total of 63 factors describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake in calories and cholesterol, were measured. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used towards the analysis of the available data. These corresponded to (i) a multi-layer feed-forward ANN combined with a parameter decreasing method (PDM-ANN), and (ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) which combines genetic algorithms and the popular back-propagation training algorithm. Results PDM-ANN and GA-ANN were comparatively assessed in terms of their ability to identify the most important factors among the initial 63 variables describing genetic variations, nutrition and gender, able to classify a subject into one of the BMI related classes: normal and overweight. The methods were designed and evaluated using appropriate training and testing sets provided by 3-fold Cross Validation (3-CV) resampling. Classification accuracy, sensitivity, specificity and area under receiver operating characteristics curve were utilized to evaluate the resulting predictive ANN models. The most parsimonious set of factors was obtained by the GA-ANN method and included gender, six genetic variations and 18 nutrition-related variables. The corresponding predictive model was characterized by a mean accuracy of 61.46% in the 3-CV testing sets

  12. A multifactorial analysis of obesity as CVD risk factor: use of neural network based methods in a nutrigenetics context.

    Science.gov (United States)

    Valavanis, Ioannis K; Mougiakakou, Stavroula G; Grimaldi, Keith A; Nikita, Konstantina S

    2010-09-08

    Obesity is a multifactorial trait, which comprises an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology beneath obesity and identify genetic variations and/or factors related to nutrition that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject a total of 63 factors describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake in calories and cholesterol, were measured. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used towards the analysis of the available data. These corresponded to i) a multi-layer feed-forward ANN combined with a parameter decreasing method (PDM-ANN), and ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) which combines genetic algorithms and the popular back-propagation training algorithm. PDM-ANN and GA-ANN were comparatively assessed in terms of their ability to identify the most important factors among the initial 63 variables describing genetic variations, nutrition and gender, able to classify a subject into one of the BMI related classes: normal and overweight. The methods were designed and evaluated using appropriate training and testing sets provided by 3-fold Cross Validation (3-CV) resampling. Classification accuracy, sensitivity, specificity and area under receiver operating characteristics curve were utilized to evaluate the resulting predictive ANN models. The most parsimonious set of factors was obtained by the GA-ANN method and included gender, six genetic variations and 18 nutrition-related variables. The corresponding predictive model was characterized by a mean accuracy equal to 61.46% in the 3-CV testing sets. The ANN based methods revealed factors
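
A rough analogue of the PDM-ANN idea, not the authors' exact algorithm: train a feed-forward ANN on the full factor set, rank the inputs, and retain a parsimonious subset. The dataset, network size and cutoff below are invented.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# 63 candidate factors, only a few of them truly informative
X, y = make_classification(n_samples=600, n_features=63, n_informative=8,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                    random_state=0).fit(Xtr, ytr)
imp = permutation_importance(net, Xte, yte, n_repeats=10, random_state=0)
keep = np.argsort(imp.importances_mean)[::-1][:25]   # e.g. top 25 factors
print("test accuracy:", net.score(Xte, yte))
print("retained factors:", sorted(keep.tolist()))
```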

  13. Limitations of systemic accident analysis methods

    Directory of Open Access Journals (Sweden)

    Casandra Venera BALAN

    2016-12-01

    Full Text Available In terms of system theory, the description of complex accidents is not limited to the analysis of the sequence of events/individual conditions, but highlights nonlinear functional characteristics and frames human or technical performance in relation to the normal functioning of the system under safety conditions. Thus, researching the system entities as a whole is no longer an abstraction of a concrete situation, but an exceeding of the theoretical limits set by analyses based on linear methods. Despite the issues outlined above, the hypothesis that there is no complete method for accident analysis is supported by the nonlinearity of the considered functions or restrictions, which imposes a broad vision of the elements introduced in the analysis, so that elements corresponding to nominal parameters or trigger factors can be identified.

  14. An Analysis of Construction Accident Factors Based on Bayesian Network

    OpenAIRE

    Yunsheng Zhao; Jinyong Pei

    2013-01-01

    In this study, we present an analysis of construction accident factors based on a Bayesian network. First, accident cases are analyzed to build a fault tree, which makes it possible to find all the factors causing the accidents; these factors are then analyzed qualitatively and quantitatively with the Bayesian network method, and finally a safety management program is determined to guide safety operations. The results of this study show that a bad geological environment has the largest posterio...
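
The Bayesian step reduces to Bayes' rule over the basic causes identified by the fault tree. A minimal sketch with invented probabilities:

```python
# prior probability of each basic cause, and likelihood of an accident
# given that cause (all numbers invented for illustration)
priors = {"geology": 0.10, "operation": 0.25, "equipment": 0.15}
likelihood = {"geology": 0.60, "operation": 0.20, "equipment": 0.25}

joint = {c: priors[c] * likelihood[c] for c in priors}
total = sum(joint.values())
posterior = {c: p / total for c, p in joint.items()}
print(posterior)  # here the geological cause gets the largest posterior
```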

  15. Importance of development factors in company dealing with cataphoresis coating method

    Directory of Open Access Journals (Sweden)

    Dorota Klimecka-Tatar

    2014-06-01

    Full Text Available The main aim of the results presented in this paper is an analysis of the most important factors in the company's activity. A questionnaire survey was carried out among persons employed by the company, whose main line of business is the cataphoresis method of anti-corrosion coating. In the paper, the validity of the elements of the Toyota roof model was also assessed. Based on the research, the quality factor was indicated as the most important factor of the company's mission.

  16. Text mining factor analysis (TFA) in green tea patent data

    Science.gov (United States)

    Rahmawati, Sela; Suprijadi, Jadi; Zulhanif

    2017-03-01

    Factor analysis has become one of the most widely used multivariate statistical procedures in applied research endeavors across a multitude of domains. There are two main types of analyses based on factor analysis: Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA). Both EFA and CFA aim to model observed relationships among a group of indicators with a latent variable, but they differ fundamentally in the a priori restrictions placed on the factor model. This method is applied to patent data in the green tea technology sector to determine the development of green tea technology in the world. Patent analysis is useful in identifying future technological trends in a specific field of technology. The patent database is obtained from the European Patent Organization (EPO). In this paper, the CFA model is applied to nominal data obtained from a presence-absence matrix; the CFA for nominal data is based on the tetrachoric matrix. Meanwhile, the EFA model is applied to titles from the dominant technology sector; the titles are first pre-processed using text mining analysis.

  17. On the factorization method in quantum mechanics

    OpenAIRE

    Rosas-Ortiz, J. Oscar

    1998-01-01

    New exactly solvable problems have already been studied by using a modification of the factorization method introduced by Mielnik. We review this method and its connection with the traditional factorization method. The survey includes the discussion on a generalization of the factorization energies used in the traditional Infeld and Hull method.

  18. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology which could be used to replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures which were expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test. The normalized elastic recovery factor was defined in terms of these deflections. Experiments have shown that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor is used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament, rather than just one section through the filament as in the metallographic method, it more accurately measures the degree of recrystallization. The method also requires considerably less time and cost compared to the conventional method

  19. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    Science.gov (United States)

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important parameter in the identification of human remains in forensic examinations. The present study aims to compare the reliability and accuracy of stature estimation, and to demonstrate the variability between estimated stature and actual stature, using the multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements (hand length, hand breadth, foot length and foot breadth), taken on the left side of each subject, were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for the estimation of stature from hand and foot dimensions. The derived multiplication factors and regression formulae were applied to the hand and foot measurements in the study sample. The stature estimated from the multiplication factors and from the regression analysis was compared with the actual stature to find the error in estimated stature. The results indicate that the range of error in stature estimation by the regression analysis method is less than that of the multiplication factor method, thus confirming that regression analysis is better than multiplication factor analysis for stature estimation. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
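
The two estimators compared in the study are easy to contrast on simulated data: the multiplication factor is the mean stature-to-dimension ratio, while regression fits an intercept and a slope. All measurements below are invented; the point is only that the regression residuals are typically smaller.

```python
import numpy as np

rng = np.random.default_rng(4)
hand = rng.normal(18.5, 1.0, 123)                  # hand length, cm
stature = 55 + 6.0 * hand + rng.normal(0, 3, 123)  # assumed true relation

mf = (stature / hand).mean()                       # multiplication factor
slope, intercept = np.polyfit(hand, stature, 1)    # linear regression

err_mf = stature - mf * hand
err_reg = stature - (intercept + slope * hand)
print("SD of error, multiplication factor:", err_mf.std(ddof=1))
print("SD of error, regression           :", err_reg.std(ddof=1))
```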

  20. THE LATENT INTERCONNECTION OF THE FACTORS OF ATHEROSCLEROSIS PROGRESSION WITH THE THICKNESS OF INTIMA-MEDIA BY USE OF MULTIDIMENSIONAL STATISTICAL ANALYSIS METHODS

    Directory of Open Access Journals (Sweden)

    A. P. Shavrin

    2011-01-01

    Full Text Available The aim – to study the latent relationships between indicators of intima-media thickness (IMT) and infectious, immune, inflammatory and metabolic factors in patients with varying degrees of severity of vascular changes, using multivariate methods of statistical analysis. Materials and methods. The study included 220 patients (mean age 43.9 ± 0.5 years) who were divided into 3 groups. Group 1 consisted of patients with no risk factors for cardiovascular disease (CVD), the 2nd of patients with such factors present, and the 3rd of patients with atherosclerotic plaques in the carotid artery. Every patient underwent a comprehensive examination, which included ultrasound of the vessels on an Aloka 5000 apparatus with measurement of IMT, a lipid panel, determination of C-reactive protein and cytokines (tumor necrosis factor-α, interferon-γ, interleukin-1, -8, -4), and antibodies (immunoglobulins) to cytomegalovirus (CMV), herpes simplex virus type 1, C. pneumoniae, H. pylori and β-hemolytic streptococcus group A. Immune system status was assessed by indicators of innate and acquired immunity. Results. According to cluster analysis, all groups of patients revealed close linear relationships between IMT and infectious, immune and metabolic markers, and in patients with atherosclerotic plaques in the vessels, links with indicators of inflammation were additionally found. Using factor analysis, latent variables were revealed consisting of IMT indices together with, in group 1, blood lipids and, in the 2nd, infectious factors (CMV, C. pneumoniae) and immune parameters. In the 3rd group the vascular wall was linked with infectious, immune and inflammatory indices, blood lipids, and systolic and diastolic blood pressure. Conclusion. The closest relationship between the vascular wall and the studied parameters was observed in patients with risk factors of cardiovascular disease, and in the

  1. Methods for Engineering Enterprise Management Based on the Inter-factor Productive-Economic Relations

    Directory of Open Access Journals (Sweden)

    O. A. Naydis

    2015-01-01

    Full Text Available The article analyzes the current state of engineering enterprises in the Russian Federation. It reviews and analyzes existing methods for business management that use indicators to characterize enterprise activities: by means of scalars; by functional dependencies of one factor value on another (functions of one variable, wherein one magnitude of a factor value corresponds to a single magnitude of the other, dependent factor value); and by means of data tables, for example the balance sheet and articulation statement used in accounting. The paper states the need to take into account the mutual influences and systemic interrelation of a diversity of factors, and the need for special methods for their identification. The article is aimed at the development of methods for business management of engineering enterprises that take into account a variety of factors and their interdependencies. The relevance of the issue stems from the fact that the analysis of existing methods of business management has shown that it is impossible to have the requested information about a considerable number of productive-economic factors in their system-based interrelation. It is proposed that management objects wherein multiple factors and their interactions are taken into consideration be called inter-factor productive-economic relations (IPER). The paper presents the IPER-based structure of the business management system. It describes a method to identify controlled productive-economic factors and provides the allocation and justification of those significant for IPER control. The described methods of business management are distinguished by a large amount of control information, and the data form rather complex structures; therefore, it is proposed to use them in automatic control systems. The paper describes principles of information support for business management through binding IPER to the organizational structures of the enterprise. It offers an

  2. A Beginner’s Guide to Factor Analysis: Focusing on Exploratory Factor Analysis

    Directory of Open Access Journals (Sweden)

    An Gie Yong

    2013-10-01

    Full Text Available The following paper discusses exploratory factor analysis and gives an overview of the statistical technique and how it is used in various research designs and applications. A basic outline of how the technique works and its criteria, including its main assumptions are discussed as well as when it should be used. Mathematical theories are explored to enlighten students on how exploratory factor analysis works, an example of how to run an exploratory factor analysis on SPSS is given, and finally a section on how to write up the results is provided. This will allow readers to develop a better understanding of when to employ factor analysis and how to interpret the tables and graphs in the output.

  3. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulas for calculating the fuzzy process reliability of the ordinal connection model, the series connection model and the mixed connection model. The quantitative analysis method is applied in analyzing the process reliability of a ship's shaft gear box installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.

  4. Franck-Condon Factors for Diatomics: Insights and Analysis Using the Fourier Grid Hamiltonian Method

    Science.gov (United States)

    Ghosh, Supriya; Dixit, Mayank Kumar; Bhattacharyya, S. P.; Tembe, B. L.

    2013-01-01

    Franck-Condon factors (FCFs) play a crucial role in determining the intensities of the vibrational bands in electronic transitions. In this article, a relatively simple method to calculate the FCFs is illustrated. An algorithm for the Fourier Grid Hamiltonian (FGH) method for computing the vibrational wave functions and the corresponding energy…
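
A compact FGH sketch in Python: diagonalize H = T + V on a uniform grid (the kinetic energy built via the discrete Fourier transform), then square the overlaps of vibrational wave functions from two displaced harmonic surfaces to get FCFs. Units, grid and surface parameters are arbitrary illustrative choices, not taken from the article.

```python
import numpy as np

def fgh_levels(x, V, m=1.0, hbar=1.0, nstates=3):
    """Fourier Grid Hamiltonian: lowest eigenstates of H = T + V on a grid."""
    N, dx = len(x), x[1] - x[0]
    k = 2 * np.pi * np.fft.fftfreq(N, d=dx)        # grid wavenumbers
    F = np.fft.fft(np.eye(N), axis=0)              # DFT matrix
    T = (F.conj().T @ np.diag(hbar**2 * k**2 / (2 * m)) @ F).real / N
    E, psi = np.linalg.eigh(T + np.diag(V))
    return E[:nstates], psi[:, :nstates] / np.sqrt(dx)  # unit-normalized

x = np.linspace(-8.0, 8.0, 256)
dx = x[1] - x[0]
E0, psi0 = fgh_levels(x, 0.5 * x**2)                 # lower surface
E1, psi1 = fgh_levels(x, 0.5 * (x - 1.0)**2 + 2.0)   # displaced upper surface

fcf = (psi0[:, 0] @ psi1 * dx) ** 2   # |<v''=0|v'>|^2 for the first levels
print(np.round(fcf, 3))               # ~ [0.607, 0.303, 0.076]
```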

  5. Testing all six person-oriented principles in dynamic factor analysis.

    Science.gov (United States)

    Molenaar, Peter C M

    2010-05-01

    All six person-oriented principles identified by Sterba and Bauer's Keynote Article can be tested by means of dynamic factor analysis in its current form. In particular, it is shown how complex interactions and interindividual differences/intraindividual change can be tested in this way. In addition, the necessity to use single-subject methods in the analysis of developmental processes is emphasized, and attention is drawn to the possibility to optimally treat developmental psychopathology by means of new computational techniques that can be integrated with dynamic factor analysis.

  6. A new detrended semipartial cross-correlation analysis: Assessing the important meteorological factors affecting API

    International Nuclear Information System (INIS)

    Shen, Chen-Hua

    2015-01-01

    To analyze the unique contribution of meteorological factors to the air pollution index (API), a new method, the detrended semipartial cross-correlation analysis (DSPCCA), is proposed. Based on both a detrended cross-correlation analysis and a DFA-based multivariate-linear-regression (DMLR), this method is improved by including a semipartial correlation technique, which is used to indicate the unique contribution of an explanatory variable to multiple correlation coefficients. The advantages of this method in handling nonstationary time series are illustrated by numerical tests. To further demonstrate the utility of this method in environmental systems, new evidence of the primary contribution of meteorological factors to API is provided through DMLR. Results show that the most important meteorological factors affecting API are wind speed and diurnal temperature range, and the explanatory ability of meteorological factors to API gradually strengthens with increasing time scales. The results suggest that DSPCCA is a useful method for addressing environmental systems. - Highlights: • A detrended multiple linear regression is shown. • A detrended semipartial cross correlation analysis is proposed. • The important meteorological factors affecting API are assessed. • The explanatory ability of meteorological factors to API gradually strengthens with increasing time scales.

  7. A new detrended semipartial cross-correlation analysis: Assessing the important meteorological factors affecting API

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Chen-Hua, E-mail: shenandchen01@163.com [College of Geographical Science, Nanjing Normal University, Nanjing 210046 (China); Jiangsu Center for Collaborative Innovation in Geographical Information Resource, Nanjing 210046 (China); Key Laboratory of Virtual Geographic Environment of Ministry of Education, Nanjing 210046 (China)

    2015-12-04

    To analyze the unique contribution of meteorological factors to the air pollution index (API), a new method, the detrended semipartial cross-correlation analysis (DSPCCA), is proposed. Based on both a detrended cross-correlation analysis and a DFA-based multivariate-linear-regression (DMLR), this method is improved by including a semipartial correlation technique, which is used to indicate the unique contribution of an explanatory variable to multiple correlation coefficients. The advantages of this method in handling nonstationary time series are illustrated by numerical tests. To further demonstrate the utility of this method in environmental systems, new evidence of the primary contribution of meteorological factors to API is provided through DMLR. Results show that the most important meteorological factors affecting API are wind speed and diurnal temperature range, and the explanatory ability of meteorological factors to API gradually strengthens with increasing time scales. The results suggest that DSPCCA is a useful method for addressing environmental systems. - Highlights: • A detrended multiple linear regression is shown. • A detrended semipartial cross correlation analysis is proposed. • The important meteorological factors affecting API are assessed. • The explanatory ability of meteorological factors to API gradually strengthens with increasing time scales.

  8. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain

    Czech Academy of Sciences Publication Activity Database

    Frolov, A.; Húsek, Dušan; Polyakov, P.Y.

    2016-01-01

    Roč. 27, č. 3 (2016), s. 538-550 ISSN 2162-237X R&D Projects: GA MŠk ED1.1.00/02.0070 Institutional support: RVO:67985807 Keywords : associative memory * bars problem (BP) * Boolean factor analysis (BFA) * data mining * dimension reduction * Hebbian learning rule * information gain * likelihood maximization (LM) * neural network application * recurrent neural network * statistics Subject RIV: IN - Informatics, Computer Science Impact factor: 6.108, year: 2016

  9. An analysis of main factors in electron beam flue gas purification

    International Nuclear Information System (INIS)

    Zhang Ming; Xu Guang

    2003-01-01

    The electron beam flue gas purification method has been developing very quickly in recent years. Based on the experimental setup for electron beam flue gas purification at the Institute of Nuclear Energy and Technology, Tsinghua University, this paper describes how the technical factors affect the desulphurization and denitrogenation ratios. Radiation dose (D), temperature (T), humidity (H), injected ammonia quantity (α) and the initial concentrations of SO₂ (C_SO₂) and NOₓ (C_NOₓ) are the main factors influencing flue gas purification. Using the methods of correlation analysis and regression analysis, the primary effect factors are identified and regression equations are established to optimize the system process, simplify the system structure and forecast the experimental results. (authors)

  10. Confirmatory factor analysis using Microsoft Excel.

    Science.gov (United States)

    Miles, Jeremy N V

    2005-11-01

    This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out simple CFA. It is possible, with few or no programming skills, to effectively program a CFA analysis and, thus, to gain insight into the workings of the procedure.
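
To show how little machinery is involved, here is the same maximum-likelihood discrepancy a CFA spreadsheet would minimize, written in Python with scipy standing in for Excel's Solver; the one-factor model and the data are simulated.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
true_loadings = np.array([0.8, 0.7, 0.6, 0.5])
X = rng.normal(size=(300, 1)) @ true_loadings[None, :] \
    + rng.normal(0, 0.6, (300, 4))
S = np.cov(X, rowvar=False)
p = S.shape[0]

def ml_discrepancy(theta):
    # one-factor model: Sigma = lambda lambda' + diag(psi)
    lam, psi = theta[:p], theta[p:] ** 2 + 1e-6    # keep uniquenesses > 0
    Sigma = np.outer(lam, lam) + np.diag(psi)
    return (np.linalg.slogdet(Sigma)[1]
            + np.trace(S @ np.linalg.inv(Sigma))
            - np.linalg.slogdet(S)[1] - p)

res = minimize(ml_discrepancy, x0=np.r_[np.full(p, 0.5), np.full(p, 0.5)])
print("estimated loadings:", np.round(res.x[:p], 2))
```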

  11. Kernel parameter dependence in spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2010-01-01

    kernel PCA. Shawe-Taylor and Cristianini [4] is an excellent reference for kernel methods in general. Bishop [5] and Press et al. [6] describe kernel methods among many other subjects. The kernel version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional...... feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply a kernel version of maximum autocorrelation factor (MAF) [7, 8] analysis to irregularly sampled stream sediment geochemistry data from South Greenland and illustrate the dependence...... of the kernel width. The 2,097 samples each covering on average 5 km2 are analyzed chemically for the content of 41 elements....

  12. ANALYSIS OF THE FACTORS AFFECTING THE AVERAGE

    Directory of Open Access Journals (Sweden)

    Carmen BOGHEAN

    2013-12-01

    Full Text Available Productivity in agriculture most relevantly and concisely expresses the economic efficiency of using the factors of production. Labour productivity is affected by a considerable number of variables (including the system of relationships and interdependence between factors), which differ in each economic sector and influence it, giving rise to a series of technical, economic and organizational idiosyncrasies. The purpose of this paper is to analyse the underlying factors of the average work productivity in agriculture, forestry and fishing. The analysis takes into account the data concerning the economically active population and the gross added value in agriculture, forestry and fishing in Romania during 2008-2011. The distribution of the average work productivity across the factors affecting it is conducted by means of the u-substitution method.

  13. Experimental analysis on removal factor of smear method in measurement of surface contamination

    International Nuclear Information System (INIS)

    Sugiura, Nobuyuki; Taira, Junichi; Takenaka, Keisuke; Yamanaka, Kazuo; Sugai, Kenji; Kosako, Toshiso

    2007-01-01

    The smear test is one of the important ways to measure surface contamination. Loose contamination under high background radiation, which is more significant when handling unsealed radioisotopes, can be evaluated by this method. The removal factor is defined as the ratio of the activity removed from the surface by one smear to the whole activity of the removable surface contamination. The removal factor changes greatly with the quality and condition of the surface material. In this study, the values of the removal factor for several typical surface conditions were evaluated experimentally and the practical application of those values was considered. The smear should be pressed with moderate pressure when wiping the surface; a pressure of 1.0 kg to 1.5 kg per filter paper is recommended. The removal factor showed lower values when wiping with a pressure below 1.0 kg. A value of 0.5 for the removal factor could be applied to smooth surfaces of linoleum, concrete coated with paint or epoxy resin, stainless steel and glass, with statistical allowance. (author)
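
The arithmetic implied by the definition is a one-liner; the sketch below applies it in the usual direction, inferring the total removable contamination from a measured smear under the recommended factor of 0.5. All activity values are invented.

```python
measured_on_smear = 120.0   # Bq wiped onto the filter paper (invented)
removal_factor = 0.5        # recommended value for smooth surfaces

# removal_factor = removed / total_removable, so invert it:
total_removable = measured_on_smear / removal_factor
print(total_removable)      # 240.0 Bq of loose surface contamination
```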

  14. An Effective Method to Accurately Calculate the Phase Space Factors for β⁻β⁻ Decay

    International Nuclear Information System (INIS)

    Horoi, Mihai; Neacsu, Andrei

    2016-01-01

    Accurate calculations of the electron phase space factors are necessary for reliable predictions of double-beta decay rates and for the analysis of the associated electron angular and energy distributions. We present an effective method to calculate these phase space factors that takes into account the distorted Coulomb field of the daughter nucleus, yet it allows one to easily calculate the phase space factors with good accuracy relative to the most exact methods available in the recent literature.

  15. A study of environmental polluting factors by neutron activation method

    International Nuclear Information System (INIS)

    Paunoiu, C.; Doca, C.

    2004-01-01

    The paper presents: a) some important factors of environmental pollution; b) the theoretical aspects of Neutron Activation Analysis (NAA) used in the study of environmental pollution; c) the NAA-specific hardware and software facilities existing at the Institute for Nuclear Research; d) a direct application of the NAA method to the study of environmental pollution in the city of Pitesti through the analysis of ground and vegetation samples; e) results and conclusions. (authors)

  16. Novel Method of Production Decline Analysis

    Science.gov (United States)

    Xie, Shan; Lan, Yifei; He, Lei; Jiao, Yang; Wu, Yong

    2018-02-01

    ARPS decline curve analysis is the most commonly used method in oil and gas fields due to its minimal data requirements and ease of application. Prediction of production decline based on ARPS analysis relies on a known decline type. However, when the decline exponents under different decline types are very close, it is difficult to directly recognize the decline trend of the matched curves. Because of these difficulties, a new dynamic decline prediction model is introduced, based on the simulation results of multi-factor response experiments and using multiple linear regression of the influencing factors. First of all, according to a study of the factors affecting production decline, interaction experimental schemes are designed. Based on the simulated results, the annual decline rate is predicted by the decline model. Moreover, the new method is applied in the A gas field of the Ordos Basin as an example to illustrate its reliability. The results show that the new model can directly predict the decline tendency without needing to recognize the decline style, and it also offers high accuracy. Finally, the new method improves the evaluation of gas well production decline in low permeability gas reservoirs, and provides technical support for further understanding of the development laws of tight gas fields.
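
A sketch of one way to sidestep decline-type recognition: fit the hyperbolic ARPS form directly, so the exponential (b -> 0) and harmonic (b = 1) cases appear as limits of the fitted b. The production history below is simulated; this is not the paper's regression model.

```python
import numpy as np
from scipy.optimize import curve_fit

def arps(t, qi, di, b):
    """Hyperbolic ARPS decline: q(t) = qi / (1 + b*Di*t)**(1/b)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

t = np.arange(1, 61)                                # months on production
rng = np.random.default_rng(6)
q = arps(t, 500.0, 0.08, 0.7) * rng.normal(1, 0.02, t.size)

(qi, di, b), _ = curve_fit(arps, t, q, p0=[400, 0.05, 0.5],
                           bounds=([1, 1e-4, 1e-3], [1e4, 1.0, 2.0]))
print(f"qi={qi:.0f}, Di={di:.3f}/month, b={b:.2f}")
```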

  17. Factor analysis

    CERN Document Server

    Gorsuch, Richard L

    2013-01-01

    Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.

  18. The factorization method and supersymmetry

    International Nuclear Information System (INIS)

    Alves, N.A.; Drigo Filho, E.

    1988-01-01

    Applying the factorization method, we generalize the harmonic-oscillator and Coulomb potentials, both in arbitrary dimensions. We also show that this method allows the determination of the superpotentials and the supersymmetric partners associated with each of those systems. (author) [pt
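
    The construction is easy to reproduce symbolically. The sketch below applies the standard factorization relations (in units where ħ = 2m = 1, with W = -ψ₀'/ψ₀ and partner potentials V∓ = W² ∓ W') to the one-dimensional harmonic oscillator; it illustrates the textbook method, not the generalized potentials of the paper.

        import sympy as sp

        x = sp.symbols('x', real=True)

        # Ground-state wavefunction of the 1-D harmonic oscillator
        psi0 = sp.exp(-x**2 / 2)

        # Superpotential from the ground state: W(x) = -psi0'(x)/psi0(x)
        W = sp.simplify(-sp.diff(psi0, x) / psi0)

        # Supersymmetric partner potentials: V_-/+ (x) = W(x)**2 -/+ W'(x)
        V_minus = sp.simplify(W**2 - sp.diff(W, x))   # original (shifted) potential
        V_plus = sp.simplify(W**2 + sp.diff(W, x))    # supersymmetric partner

        print(W)        # x
        print(V_minus)  # x**2 - 1
        print(V_plus)   # x**2 + 1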

  19. Factors Influencing Acceptance Of Contraceptive Methods

    Directory of Open Access Journals (Sweden)

    Anita Gupta

    1997-04-01

    Research Problem: What are the factors influencing acceptance of contraceptive methods? Objective: To study the determinants influencing contraceptive acceptance. Study design: Population-based cross-sectional study. Setting: Rural area of East Delhi. Participants: Married women in the reproductive age group. Sample: Stratified sampling technique was used to draw the sample. Sample Size: 328 married women of reproductive age group. Study Variables: Socio-economic status, type of contraceptive, family size, male child. Outcome Variables: Acceptance of contraceptives. Statistical Analysis: By proportions. Result: Prevalence of contraceptive use at the time of data collection was 40.5%. Tubectomy and vasectomy were the most commonly used methods (59.4%, n = 133). Educational status of the women positively influenced contraceptive acceptance, but income did not. Desire for more children was the single most important deterrent to accepting contraception. Recommendations: (i) Traditional methods of contraception should be given more attention. (ii) Couples should be brought into the contraceptive use net at an early stage of marriage.

  20. Exploratory Factor Analysis With Small Samples and Missing Data.

    Science.gov (United States)

    McNeish, Daniel

    2017-01-01

    Exploratory factor analysis (EFA) is an extremely popular method for determining the underlying factor structure for a set of variables. Due to its exploratory nature, EFA is notorious for being conducted with small sample sizes, and recent reviews of psychological research have reported that between 40% and 60% of applied studies have 200 or fewer observations. Recent methodological studies have addressed small size requirements for EFA models; however, these models have only considered complete data, which are the exception rather than the rule in psychology. Furthermore, the extant literature on missing data techniques with small samples is scant, and nearly all existing studies focus on topics that are not of primary interest to EFA models. Therefore, this article presents a simulation to assess the performance of various missing data techniques for EFA models with both small samples and missing data. Results show that deletion methods do not extract the proper number of factors and estimate the factor loadings with severe bias, even when data are missing completely at random. Predictive mean matching is the best method overall when considering extracting the correct number of factors and estimating factor loadings without bias, although 2-stage estimation was a close second.
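
    A minimal sketch of the best-performing combination reported above, predictive mean matching followed by factor extraction, is given below. It is a deliberately simplified single-variable, single-imputation PMM on synthetic data; the study's simulations would correspond to a full multiple-imputation implementation.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.decomposition import FactorAnalysis

        def pmm_impute(X, miss_col, n_donors=5, seed=0):
            """Single-variable predictive mean matching: fill missing entries of
            column `miss_col` with observed values borrowed from the nearest
            donors in predicted-mean space."""
            rng = np.random.default_rng(seed)
            obs = ~np.isnan(X[:, miss_col])
            predictors = np.delete(X, miss_col, axis=1)
            model = LinearRegression().fit(predictors[obs], X[obs, miss_col])
            pred = model.predict(predictors)
            X = X.copy()
            for i in np.where(~obs)[0]:
                dist = np.abs(pred[obs] - pred[i])
                donors = X[obs, miss_col][np.argsort(dist)[:n_donors]]
                X[i, miss_col] = rng.choice(donors)   # borrow an observed value
            return X

        # Hypothetical small-sample data: 60 cases, 6 indicators, 2 latent factors
        rng = np.random.default_rng(1)
        F = rng.normal(size=(60, 2))
        L = np.array([[.8, 0], [.7, 0], [.6, 0], [0, .8], [0, .7], [0, .6]])
        X = F @ L.T + rng.normal(scale=.5, size=(60, 6))
        X[rng.random(60) < .2, 2] = np.nan     # ~20% MCAR missingness in item 3

        X_imp = pmm_impute(X, miss_col=2)
        fa = FactorAnalysis(n_components=2).fit(X_imp)
        print(np.round(fa.components_.T, 2))   # estimated (unrotated) loadings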

  1. Exploratory Bi-factor Analysis: The Oblique Case

    OpenAIRE

    Jennrich, Robert L.; Bentler, Peter M.

    2011-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford (1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler (2011) introduced an exploratory form of bi-factor analysis that does not require one to provide an explicit bi-factor structure a priori. They use exploratory factor analysis and a bi-factor rotation criterion designed to produce a rotated loading matrix...

  2. Sparse multivariate factor analysis regression models and its applications to integrative genomics analysis.

    Science.gov (United States)

    Zhou, Yan; Wang, Pei; Wang, Xianlong; Zhu, Ji; Song, Peter X-K

    2017-01-01

    The multivariate regression model is a useful tool to explore complex associations between two kinds of molecular markers, which enables the understanding of the biological pathways underlying disease etiology. For a set of correlated response variables, accounting for such dependency can increase statistical power. Motivated by integrative genomic data analyses, we propose a new methodology, the sparse multivariate factor analysis regression model (smFARM), in which correlations of response variables are assumed to follow a factor analysis model with latent factors. The proposed method allows us not only to address the challenge that the number of association parameters is larger than the sample size, but also to adjust for unobserved genetic and/or nongenetic factors that potentially conceal the underlying response-predictor associations. The proposed smFARM is implemented by the EM algorithm and the blockwise coordinate descent algorithm. The methodology is evaluated and compared to existing methods through extensive simulation studies. Our results show that accounting for latent factors through smFARM can improve the sensitivity of signal detection and the accuracy of sparse association map estimation. We illustrate smFARM with two integrative genomics examples, a breast cancer dataset and an ovarian cancer dataset, assessing the relationship between DNA copy numbers and gene expression arrays to understand genetic regulatory patterns relevant to the disease. We identify two trans-hub regions: one in cytoband 17q12, whose amplification influences the RNA expression levels of important breast cancer genes, and the other in cytoband 9q21.32-33, which is associated with chemoresistance in ovarian cancer. © 2016 WILEY PERIODICALS, INC.

  3. Factors affecting construction performance: exploratory factor analysis

    Science.gov (United States)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data gave rise to 10 factors, with 57 items, affecting construction performance. The findings further reveal the items constituting the ten key performance factors, namely: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multidimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technological aspects. It is important to understand a multidimensional performance evaluation framework that includes all key factors affecting the construction performance of a company, so that management can plan an effective performance development programme matching the mission and vision of the company.

  4. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Developing a national brand is one of the most important issues in brand development. In this study, we present a factor analysis to detect the most important factors in building a national brand. The proposed study uses factor analysis to extract the most influential factors, with the sample drawn from two major automakers in Iran, Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is calculated as 0.84, well above the minimum desirable limit of 0.70. The implementation of factor analysis yields six factors, including “cultural image of customers”, “exciting characteristics”, “competitive pricing strategies”, “perception image” and “previous perceptions”.
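
    The reliability check used here is straightforward to reproduce. The sketch below computes Cronbach's alpha from its standard definition, alpha = k/(k-1) * (1 - sum of item variances / variance of total score); the item scores are made up for illustration.

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, n_items) array of Likert scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)     # variance of total score
            return k / (k - 1) * (1 - item_vars / total_var)

        # Hypothetical responses from 6 people to 4 Likert items
        scores = [[4, 5, 4, 4], [3, 3, 2, 3], [5, 5, 5, 4],
                  [2, 2, 3, 2], [4, 4, 4, 5], [3, 2, 3, 3]]
        print(round(cronbach_alpha(scores), 2))   # high alpha: items hang together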

  5. Exploring factors that influence work analysis data: A meta-analysis of design choices, purposes, and organizational context.

    Science.gov (United States)

    DuVernet, Amy M; Dierdorff, Erich C; Wilson, Mark A

    2015-09-01

    Work analysis is fundamental to designing effective human resource systems. The current investigation extends previous research by identifying the differential effects of common design decisions, purposes, and organizational contexts on the data generated by work analyses. The effects of 19 distinct factors that span choices of descriptor, collection method, rating scale, and data source, as well as project purpose and organizational features, are explored. Meta-analytic results cumulated from 205 articles indicate that many of these variables hold significant consequences for work analysis data. Factors pertaining to descriptor choice, collection method, rating scale, and the purpose for conducting the work analysis each showed strong associations with work analysis data. The source of the work analysis information and organizational context in which it was conducted displayed fewer relationships. Findings can be used to inform choices work analysts make about methodology and postcollection evaluations of work analysis information. (c) 2015 APA, all rights reserved.

  6. The Intersectoral Collaboration Document for Cancer Risk Factors Reduction: Method and Stakeholder Analysis

    Directory of Open Access Journals (Sweden)

    Ali-Asghar Kolahi

    2016-03-01

    Background and Objective: Cancers are one of the most important public health issues and the third leading cause of mortality, after cardiovascular diseases and injuries, in Iran. The most commonly reported cancers in recent years have been skin, stomach, breast, colon, bladder, leukemia, and esophagus, respectively. Control of cancer, as one of the three main health system priorities of Iran, needs a specific roadmap and clear task definitions for the organizations involved. This study provides a stakeholder analysis, determining the roles of the Ministry of Health and Medical Education, as the custodian of national health, and the duties of other beneficiary organizations in reducing cancer risk, with a scientific approach and systematic methodology. Materials and Methods: This health system research project was performed with the participation of the Social Determinants of Health Research Center of Shahid Beheshti University of Medical Sciences, the Office of Non-Communicable Diseases of the Ministry of Health and Medical Education, and other stakeholders in 2013. First, a strategic committee was established and the stakeholders were identified and analyzed. Quantitative data on the incidence, prevalence, and burden of all types of cancers were then collected by searching national databases. Finally, with a qualitative approach, a systematic review of studies, documents, and reports was conducted, together with a review of the national strategic plans of Iran and other countries and the experts' views on the management of cancer risk factors. In practice, the role and responsibilities of each stakeholder were analyzed. The risk factors were identified, effective evidence-based interventions were determined for each cancer, and finally the role of the Ministry of Health was set as either responsible or collaborating, with the role of the other organizations clarified separately in each

  7. Exchange factor method: an alternative zonal formulation for analysis of radiating enclosures containing participating media

    International Nuclear Information System (INIS)

    Larsen, M.E.

    1983-01-01

    The exchange factor method (EFM) is introduced and compared to the zone method (ZM). In both the EFM and ZM the region of interest is discretized into volume and surface elements, each considered to be isothermal, which are small enough to give the required resolution. A suitable set of state variables for the system is composed of the surface element radiosities and the gas element emissive powers. The EFM defines exchange factors as dimensionless total-exchange areas for radiant interchange between volume and surface elements by all possible absorption/re-emission paths, but excluding wall reflections. In the EFM, the exchange factors replace the direct-exchange areas of the ZM and are used to write energy balances for each area and volume element in the system. As in the ZM, the radiant energy balance equations result in a set of algebraic equations linear in the system state variables. The distinguishing feature of the EFM is that exchange factors may be measurable quantities. Relationships between the EFM exchange factors and the ZM direct-exchange areas are presented. EFM conservation and reciprocity laws, analogous to those of the ZM, are also included. Temperature and heat flux distributions, predicted using the EFM, for two- and three-dimensional enclosures containing absorbing/emitting, isotropically scattering, and conducting media are included. An application of the EFM is proposed which calls for the measurement of exchange factors in a scale model of the enclosure to be analyzed. The measurement of these factors in an enclosure containing an isotropically scattering medium is discussed. The effects of isotropic scattering and absorption/re-emission processes are shown to be indistinguishable in their contribution to exchange factor paths

  8. Stability Analysis of Anchored Soil Slope Based on Finite Element Limit Equilibrium Method

    Directory of Open Access Journals (Sweden)

    Rui Zhang

    2016-01-01

    Under the plane strain condition, the finite element limit equilibrium method is used to study some key problems in the stability analysis of anchored slopes. The definition of the safety factor in the slice method is generalized to the FEM. The “true” stress field in the whole structure is obtained by elastic-plastic finite element analysis, and an optimal search for the most dangerous sliding surface using the Hooke-Jeeves optimization method is introduced. Three stability analysis cases are conducted: a natural slope, an anchored slope with seepage, and an excavated anchored slope. The differences in safety factor magnitude, shape and location of the slip surface, and anchoring effect among the slice method, the finite element strength reduction method (SRM), and the finite element limit equilibrium method are comparatively analyzed. The results show that the safety factor given by the FEM is greater, and the critical slip surface deeper, than those given by the slice method. The finite element limit equilibrium method has high calculation accuracy; to some extent the slice method underestimates the anchoring effect, while the SRM overrates it.
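
    For contrast with the finite element formulations compared above, the slice-method definition of the safety factor that the paper generalizes can be written in a few lines. The sketch below uses the ordinary (Fellenius) method of slices with hypothetical slice data and no pore pressure.

        import numpy as np

        def fellenius_safety_factor(W, alpha_deg, dl, c, phi_deg):
            """Ordinary (Fellenius) method of slices, no pore pressure:
            FS = sum(c*dl + W*cos(alpha)*tan(phi)) / sum(W*sin(alpha))."""
            alpha = np.radians(np.asarray(alpha_deg, dtype=float))
            W = np.asarray(W, dtype=float)
            dl = np.asarray(dl, dtype=float)
            resisting = c * dl + W * np.cos(alpha) * np.tan(np.radians(phi_deg))
            driving = W * np.sin(alpha)
            return resisting.sum() / driving.sum()

        # Hypothetical 5-slice slip circle: weights (kN/m), base angles (deg),
        # base lengths (m); soil with c = 15 kPa, phi = 20 deg
        W = [120, 210, 260, 220, 110]
        alpha = [-5, 8, 20, 33, 48]
        dl = [2.1, 2.0, 2.1, 2.4, 3.0]
        print(round(fellenius_safety_factor(W, alpha, dl, c=15, phi_deg=20), 2))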

  9. Studies on the instrumental neutron activation analysis by cadmium ratio method and pair comparator method

    Energy Technology Data Exchange (ETDEWEB)

    Chao, H E; Lu, W D; Wu, S C

    1977-12-01

    The cadmium ratio method and the pair comparator method provide a solution for the effects on the effective activation factors resulting from the variation of the neutron spectrum at different irradiation positions, as usually encountered in the single comparator method. The relations between the activation factors and the neutron spectrum, in terms of the cadmium ratio of the comparator Au or of the activation factor of the Co-Au pair, have been determined for the elements Sc, Cr, Mn, Co, La, Ce, Sm, and Th. The activation factors of the elements at any irradiation position can then be obtained from the cadmium ratio of the comparator and/or the activation factor of the comparator pair. The relations determined should be applicable to different reactors and/or different positions in a reactor. It is shown that, for the isotopes ⁴⁶Sc, ⁵¹Cr, ⁵⁶Mn, ⁶⁰Co, ¹⁴⁰La, ¹⁴¹Ce, ¹⁵³Sm and ²³³Pa, the thermal neutron activation factors determined by these two methods were generally in agreement with theoretical values. Their I₀/σth values also appeared to agree with literature values. The methods were applied to determine the contents of the elements Sc, Cr, Mn, La, Ce, Sm, and Th in U.S.G.S. Standard Rock G-2, and the results were likewise in agreement with literature values. The cadmium ratio method and the pair comparator method improve on the single comparator method and are better suited to multi-element analysis of large numbers of samples.

  10. Factor analysis on hazards for safety assessment in decommissioning workplace of nuclear facilities using a semantic differential method

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan-Seong [Korea Atomic Energy Research Institute, 1045 Daedeok-daero, Yuseong-gu, Daejeon 305-353 (Korea, Republic of)], E-mail: ksjeongl@kaeri.re.kr; Lim, Hyeon-Kyo [Chungbuk National University, 410 Sungbong-ro, Heungduk-gu, Cheongju, Chungbuk 361-763 (Korea, Republic of)

    2009-10-15

    The decommissioning of nuclear facilities must be accomplished according to their structural conditions and radiological characteristics. An effective risk analysis requires basic knowledge about possible risks, the characteristics of potential hazards, and a comprehensive understanding of the associated cause-effect relationships within the decommissioning of nuclear facilities. The hazards associated with a decommissioning plan are important not only because they may be a direct cause of harm to workers but also because their occurrence may, indirectly, result in increased radiological and non-radiological hazards. Workers need to be protected by eliminating or reducing the radiological and non-radiological hazards that may arise during routine decommissioning activities as well as during accidents. Therefore, to prepare the safety assessment for the decommissioning of nuclear facilities, the radiological and non-radiological hazards should be systematically identified and classified. Using a semantic differential method with a screening factor and a risk perception factor, the radiological and non-radiological hazards are screened and identified.

  11. Analysis of IFR driver fuel hot channel factors

    International Nuclear Information System (INIS)

    Ku, J.Y.; Chang, L.K.; Mohr, D.

    1994-01-01

    Thermal-hydraulic uncertainty factors for Integral Fast Reactor (IFR) driver fuels have been determined based primarily on the database obtained from the predecessor fuels used in the IFR prototype, Experimental Breeder Reactor II. The uncertainty factors were applied in hot channel factor (HCF) analyses to obtain separate overall HCFs for fuel and cladding for steady-state analyses. A ''semistatistical horizontal method'' was used in the HCF analyses. The uncertainty in the fuel thermal conductivity dominates the effects considered in the HCF analysis; this uncertainty will be reduced as more data are obtained to expand the currently limited database for the IFR ternary metal fuel (U-20Pu-10Zr). A set of uncertainty factors to be used for transient analyses has also been derived
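
    As a rough illustration of how a semistatistical combination of subfactors works in general (the combination rule and subfactor values below are generic assumptions, not the IFR numbers): direct, systematic subfactors are multiplied together, while statistical 3-sigma subfactors are first combined in quadrature.

        import math

        def semistatistical_hcf(direct_factors, statistical_factors):
            """Combine hot channel subfactors (generic semistatistical rule):
            direct factors multiply; statistical factors combine as
            1 + sqrt(sum((f_i - 1)**2))."""
            direct = math.prod(direct_factors)
            statistical = 1.0 + math.sqrt(sum((f - 1.0) ** 2
                                              for f in statistical_factors))
            return direct * statistical

        # Hypothetical subfactors for a fuel-temperature HCF: one direct factor
        # (e.g. inlet flow maldistribution) and three 3-sigma statistical ones
        # (e.g. fuel conductivity, power measurement, fabrication tolerances)
        print(round(semistatistical_hcf([1.05], [1.20, 1.08, 1.06]), 3))  # ~1.285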

  12. Analysis of IFR driver fuel hot channel factors

    International Nuclear Information System (INIS)

    Ku, J.Y.; Chang, L.K.; Mohr, D.

    2004-01-01

    Thermal-hydraulic uncertainty factors for Integral Fast Reactor (IFR) driver fuels have been determined based primarily on the database obtained from the predecessor fuels used in the IFR prototype, Experimental Breeder Reactor II. The uncertainty factors were applied in hot channel factor (HCF) analyses to obtain separate overall HCFs for fuel and cladding for steady-state analyses. A 'semistatistical horizontal method' was used in the HCF analyses. The uncertainty in the fuel thermal conductivity dominates the effects considered in the HCF analysis; this uncertainty will be reduced as more data are obtained to expand the currently limited database for the IFR ternary metal fuel (U-20Pu-10Zr). A set of uncertainty factors to be used for transient analyses has also been derived. (author)

  13. Human factors methods in DOE nuclear facilities

    International Nuclear Information System (INIS)

    Bennett, C.T.; Banks, W.W.; Waters, R.J.

    1993-01-01

    The US Department of Energy (DOE) is in the process of developing a series of guidelines for the use of human factors standards, procedures, and methods to be used in nuclear facilities. This paper discusses the philosophy and process being used to develop a DOE human factors methods handbook to be used during the design cycle. The following sections will discuss: (1) basic justification for the project; (2) human factors design objectives and goals; and (3) role of human factors engineering (HFE) in the design cycle

  14. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As a widely used audit technique, in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly for a fair view of the financial statements and to satisfy the needs of all financial users. In order to be applied correctly, the method must be understood by all its users, and mainly by auditors; otherwise the risk of applying it incorrectly would cause loss of reputation and discredit, litigation and even prison. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is quite high. SWOT analysis is a technique that shows advantages, disadvantages, threats and opportunities. We applied SWOT analysis to the study of the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method lie in the difficulty of applying and understanding it. Being widely used as an audit method and being a factor in a correct audit opinion, the sampling method's advantages, disadvantages, threats and opportunities must be understood by auditors.

  15. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    A description of the most frequently used approaches to human reliability assessment is given. The relation between different human factor causes of human-induced events in Kozloduy NPP during the period 2000 - 2003 is discussed. A comparison between the contributions of the causal factors to event occurrences in Kozloduy NPP and in a Japanese NPP is presented. It can be concluded that for both NPPs the most important causal factors are: 1) written procedures and documents; 2) man-machine interface; 3) environmental working conditions; 4) working practice; 5) training and qualification; 6) supervising methods

  16. Factor analysis of multivariate data

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, A.A.; Mahadevan, R.

    A brief introduction to factor analysis is presented. A FORTRAN program, which can perform Q-mode and R-mode factor analysis and the singular value decomposition of a given data matrix, is presented in Appendix B. This computer program uses...

  17. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale

    Science.gov (United States)

    Sediyama, Cristina Y. N.; Moura, Ricardo; Garcia, Marina S.; da Silva, Antonio G.; Soraggi, Carolina; Neves, Fernando S.; Albuquerque, Maicon R.; Whiteside, Setephen P.; Malloy-Diniz, Leandro F.

    2017-01-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency, (b) lack of premeditation, (c) lack of perseverance, and (d) sensation seeking. In the present study, 384 participants (278 women and 106 men), recruited from schools, universities, leisure centers and workplaces, completed the UPPS scale. An exploratory factor analysis was performed using Varimax rotation and Kaiser normalization, and two confirmatory analyses were conducted to test the independence of the UPPS components found in the previous analysis. Results: Mean UPPS total scores decreased with age, with the youngest participants (below 30 years) scoring significantly higher than the groups over 30 years. No gender difference was found. Cronbach's alpha values were satisfactory for all subscales, with similarly high values across subscales, while confirmatory factor analysis indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results show that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS. PMID:28484414

  18. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale

    Directory of Open Access Journals (Sweden)

    Leandro F. Malloy-Diniz

    2017-04-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency, (b) lack of premeditation, (c) lack of perseverance, and (d) sensation seeking. In the present study, 384 participants (278 women and 106 men), recruited from schools, universities, leisure centers and workplaces, completed the UPPS scale. An exploratory factor analysis was performed using Varimax rotation and Kaiser normalization, and two confirmatory analyses were conducted to test the independence of the UPPS components found in the previous analysis. Results: Mean UPPS total scores decreased with age, with the youngest participants (below 30 years) scoring significantly higher than the groups over 30 years. No gender difference was found. Cronbach's alpha values were satisfactory for all subscales, with similarly high values across subscales, while confirmatory factor analysis indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results show that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS.

  20. Scale-Free Nonparametric Factor Analysis: A User-Friendly Introduction with Concrete Heuristic Examples.

    Science.gov (United States)

    Mittag, Kathleen Cage

    Most researchers using factor analysis extract factors from a matrix of Pearson product-moment correlation coefficients. A method is presented for extracting factors in a non-parametric way, by extracting factors from a matrix of Spearman rho (rank correlation) coefficients. It is possible to factor analyze a matrix of association such that…
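
    The idea is simple to demonstrate: compute a Spearman rank-correlation matrix and extract factors from it exactly as one would from a Pearson matrix. The sketch below uses synthetic 5-point ordinal data and a principal-axis style eigendecomposition; it illustrates the general technique rather than the author's worked examples.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)

        # Synthetic ordinal data: 200 respondents, 6 items driven by 2 latent
        # factors, discretized to a 5-point scale (where Pearson assumptions
        # are questionable)
        F = rng.normal(size=(200, 2))
        loadings = np.array([[.8, 0], [.7, 0], [.6, 0], [0, .8], [0, .7], [0, .6]])
        X = np.digitize(F @ loadings.T + rng.normal(scale=.5, size=(200, 6)),
                        bins=[-1.5, -0.5, 0.5, 1.5]) + 1

        rho, _ = spearmanr(X)                  # 6x6 rank-correlation matrix

        # Principal-axis style extraction: eigendecomposition of the rho matrix
        eigvals, eigvecs = np.linalg.eigh(rho)
        order = np.argsort(eigvals)[::-1]
        k = 2                                  # retain two factors
        L = eigvecs[:, order[:k]] * np.sqrt(eigvals[order[:k]])
        print(np.round(L, 2))                  # loadings on the rho-based factors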

  1. Factor analysis and scintigraphy

    International Nuclear Information System (INIS)

    Di Paola, R.; Penel, C.; Bazin, J.P.; Berche, C.

    1976-01-01

    The goal of factor analysis is usually to achieve reduction of a large set of data, extracting the essential features without prior hypothesis. With the development of computerized systems, the use of larger samplings, the possibility of sequential data acquisition and the increase in dynamic studies, the problem of data compression is now encountered routinely. Results obtained for the compression of scintigraphic images are therefore presented first. The possibilities offered by factor analysis for scan processing are then discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, is considered for compression and processing [fr

  2. Analysis of radiation and chemical factors which define the ecological situation of environment

    International Nuclear Information System (INIS)

    Trofimenko, A.P.

    1996-01-01

    A new method for the statistical analysis of large information sets is proposed. It makes it possible to define the main directions of work in a given field, worldwide or in a particular country, to find the most important problems under investigation and to evaluate the role of each of them quantitatively, as well as to study the dynamics of work development in time, the research methods used, the centres in which this research is most developed, the authors of publications, etc. Statistical analysis may be supplemented by subject analysis of selected publications. As an example of the method's use, the main factors which influence different environmental components and public health are presented, and the roles of radiation and chemical factors are evaluated. 18 refs., 6 tab

  3. Factor analysis of processes of corporate culture formation at industrial enterprises of Ukraine

    Directory of Open Access Journals (Sweden)

    Illiashenko Sergii

    2016-06-01

    The authors have analyzed and synthesized the features of the formation and development of corporate culture at industrial enterprises of Ukraine and, on this basis, developed recommendations for application in the management of strategic development. During the research the authors used the following general scientific methods: the logical generalization method for studying the patterns of interaction among national culture, corporate culture and the culture of the individual; factor analysis for determining the factors influencing corporate culture formation, by level of occurrence; and the comparative method for trend analysis of corporate culture development at the appropriate levels. The results of the analysis show that macro- and microfactors are external to an enterprise, while mesofactors (adaptability of business and corporate governance, corporate ethics, corporate social responsibility and personnel policies, corporate finance) are internal. The authors have identified areas for each of the factors, itemized the obstacles to the establishment and development of corporate culture at Ukrainian industrial enterprises, and proposed recommendations for managing these processes.

  4. A human factor analysis of a radiotherapy accident

    International Nuclear Information System (INIS)

    Thellier, S.

    2009-01-01

    Since September 2005, I.R.S.N. has studied radiotherapy treatment activities from the angle of human and organizational factors in order to improve the reliability of radiotherapy treatment. Drawing on its experience in analysing incidents in the nuclear industry, I.R.S.N. analysed and published in March 2008, for the first time in France, a detailed study of a radiotherapy accident from the angle of human and organizational factors. The method used for the analysis is based on interviews and on documents kept by the hospital. The analysis aimed at identifying the causes of the difference recorded between the dose prescribed by the radiotherapist and the dose effectively received by the patient. Neither verbal nor written communication (intra-service meetings and treatment protocols) allowed information to be transmitted correctly so that the radiographers could adjust the irradiation zones properly. The analysis highlighted the fact that, during the preparation and delivery of the treatment, various factors led to planned checks not being performed. Finally, it highlighted the fact that unresolved questions persist in the report on this accident, owing to a lack of traceability of a certain number of key actions. The article concludes that there must be improvement in three areas: cooperation between practitioners, checking of actions, and traceability of actions. (author)

  5. An automated Monte-Carlo based method for the calculation of cascade summing factors

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, M.J., E-mail: mark.j.jackson@awe.co.uk; Britton, R.; Davies, A.V.; McLarty, J.L.; Goodwin, M.

    2016-10-21

    A versatile method has been developed to calculate cascade summing factors for use in quantitative gamma-spectrometry analysis procedures. The proposed method is based solely on Evaluated Nuclear Structure Data File (ENSDF) nuclear data, an X-ray energy library, and accurate efficiency characterisations for single detector counting geometries. The algorithm, which accounts for γ–γ, γ–X, γ–511 and γ–e⁻ coincidences, can be applied to any design of gamma spectrometer and can be expanded to incorporate any number of nuclides. Efficiency characterisations can be derived from measured or mathematically modelled functions, and can accommodate both point and volumetric source types. The calculated results are shown to be consistent with an industry standard gamma-spectrometry software package. Additional benefits, including calculation of cascade summing factors for all gamma and X-ray emissions rather than just the major emission lines, are also highlighted. - Highlights: • Versatile method to calculate coincidence summing factors for gamma-spectrometry analysis. • Based solely on ENSDF format nuclear data and detector efficiency characterisations. • Enables generation of a CSF library for any detector, geometry and radionuclide. • Improves measurement accuracy and reduces acquisition times required to meet MDA.
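
    The core of any Monte-Carlo treatment of coincidence summing is easy to illustrate. For a two-step cascade, counts leave the full-energy peak of the first gamma whenever the second deposits any energy in the same event. The toy sketch below uses made-up efficiencies and a bare two-gamma cascade, ignoring the angular correlations, X-rays and nuclear data handling of the full method described above.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 1_000_000           # simulated decays
        eps_peak_g1 = 0.05      # full-energy peak efficiency at E(gamma1)
        eps_total_g2 = 0.20     # total efficiency at E(gamma2)

        # gamma1 stays in its full-energy peak only if gamma2 deposits nothing
        g1_full_energy = rng.random(n) < eps_peak_g1
        g2_detected = rng.random(n) < eps_total_g2
        peak_counts = np.sum(g1_full_energy & ~g2_detected)

        apparent_eff = peak_counts / n
        csf = eps_peak_g1 / apparent_eff      # cascade summing correction factor
        print(f"CSF = {csf:.3f}  (analytic: {1/(1-eps_total_g2):.3f})")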

  6. Application of activity theory to analysis of human-related accidents: Method and case studies

    International Nuclear Information System (INIS)

    Yoon, Young Sik; Ham, Dong-Han; Yoon, Wan Chul

    2016-01-01

    This study proposes a new approach to human-related accident analysis based on activity theory. Most of the existing methods seem to be insufficient for comprehensive analysis of human activity-related contextual aspects of accidents when investigating the causes of human errors. Additionally, they identify causal factors and their interrelationships with a weak theoretical basis. We argue that activity theory offers useful concepts and insights to supplement existing methods. The proposed approach gives holistic contextual backgrounds for understanding and diagnosing human-related accidents. It also helps identify and organise causal factors in a consistent, systematic way. Two case studies in Korean nuclear power plants are presented to demonstrate the applicability of the proposed method. Human Factors Analysis and Classification System (HFACS) was also applied to the case studies. The results of using HFACS were then compared with those of using the proposed method. These case studies showed that the proposed approach could produce a meaningful set of human activity-related contextual factors, which cannot easily be obtained by using existing methods. It can be especially effective when analysts think it is important to diagnose accident situations with human activity-related contextual factors derived from a theoretically sound model and to identify accident-related contextual factors systematically. - Highlights: • This study proposes a new method for analysing human-related accidents. • The method was developed based on activity theory. • The concept of activity system model and contradiction was used in the method. • Two case studies in nuclear power plants are presented. • The method is helpful to consider causal factors systematically and comprehensively.

  7. A continuous exchange factor method for radiative exchange in enclosures with participating media

    International Nuclear Information System (INIS)

    Naraghi, M.H.N.; Chung, B.T.F.; Litkouhi, B.

    1987-01-01

    A continuous exchange factor method for the analysis of radiative exchange in enclosures is developed. In this method two types of exchange functions are defined: the direct exchange function and the total exchange function. Certain integral equations relating total exchange functions to direct exchange functions are developed and solved using a Gaussian quadrature integration method. The results obtained with the present approach are found to be more accurate than those of the zonal method
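
    Solving a linear integral equation by Gaussian quadrature, as described above, is the classical Nystrom method. The sketch below applies it to a generic Fredholm equation of the second kind with an illustrative kernel and source term, not the radiative exchange kernel of the paper.

        import numpy as np

        # Nystrom method: solve x(s) = f(s) + lam * integral_0^1 K(s,t) x(t) dt
        # using Gauss-Legendre quadrature.
        def K(s, t):
            return np.exp(-np.abs(s - t))      # hypothetical smooth kernel

        def f(s):
            return np.ones_like(s)             # hypothetical source term

        lam = 0.5
        n = 16
        nodes, weights = np.polynomial.legendre.leggauss(n)
        t = 0.5 * (nodes + 1.0)                # map [-1, 1] -> [0, 1]
        w = 0.5 * weights

        S, T = np.meshgrid(t, t, indexing='ij')
        A = np.eye(n) - lam * K(S, T) * w      # (I - lam*K*W) x = f
        x = np.linalg.solve(A, f(t))
        print(np.round(x[:4], 4))              # solution at first quadrature nodes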

  8. Factor analysis with a priori knowledge - application in dynamic cardiac SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Sitek, A.; Di Bella, E.V.R.; Gullberg, G.T. [Medical Imaging Research Laboratory, Department of Radiology, University of Utah, CAMT, 729 Arapeen Drive, Salt Lake City, UT 84108-1218 (United States)

    2000-09-01

    Two factor analysis of dynamic structures (FADS) methods for the extraction of time-activity curves (TACs) from cardiac dynamic SPECT data sequences were investigated. One method was based on a least squares (LS) approach subject to positivity constraints. The other was the well-known apex-seeking (AS) method. A post-processing step utilizing a priori information was employed to correct for the non-uniqueness of the FADS solution. These methods were used to extract 99mTc-teboroxime TACs from computer simulations and from experimental canine and patient studies. In the computer simulations, the LS and AS methods, which are completely different algorithms, yielded very similar and accurate results after application of the correction for non-uniqueness. FADS-obtained blood curves correlated well with curves derived from region of interest (ROI) measurements in the experimental studies. The results indicate that the factor analysis techniques can be used for semi-automatic estimation of activity curves derived from cardiac dynamic SPECT images, and that they can be used for the separation of physiologically different regions in dynamic cardiac SPECT studies. (author)
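
    The least-squares-with-positivity idea can be sketched as an alternating nonnegative least squares factorization of the dynamic data matrix into factor curves and coefficient images. The example below uses synthetic two-factor data; it is a generic stand-in for the LS approach, and the published method additionally applies the a priori post-correction for non-uniqueness, which is omitted here.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)

        # Synthetic dynamic study: 2 physiological factors (blood, tissue TACs),
        # 30 time frames, 100 voxels with nonnegative mixing coefficients
        t = np.linspace(0, 10, 30)
        F_true = np.vstack([np.exp(-t / 2.0),                 # blood-like curve
                            1.0 - np.exp(-t / 3.0)])          # tissue-like curve
        C_true = rng.random((100, 2))
        D = C_true @ F_true + 0.01 * rng.random((100, 30))    # data matrix

        # Alternating NNLS: solve D ~= C @ F with C >= 0 and F >= 0
        F = rng.random((2, 30))
        for _ in range(50):
            C = np.array([nnls(F.T, d)[0] for d in D])        # voxel coefficients
            F = np.array([nnls(C, D[:, j])[0]
                          for j in range(30)]).T              # factor curves

        print(np.round(F[:, :5], 3))    # estimated factor curves (first frames)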

  9. Bayesian Sensitivity Analysis of a Nonlinear Dynamic Factor Analysis Model with Nonparametric Prior and Possible Nonignorable Missingness.

    Science.gov (United States)

    Tang, Niansheng; Chow, Sy-Miin; Ibrahim, Joseph G; Zhu, Hongtu

    2017-12-01

    Many psychological concepts are unobserved and usually represented as latent factors apprehended through multiple observed indicators. When multiple-subject multivariate time series data are available, dynamic factor analysis models with random effects offer one way of modeling patterns of within- and between-person variations by combining factor analysis and time series analysis at the factor level. Using the Dirichlet process (DP) as a nonparametric prior for individual-specific time series parameters further allows the distributional forms of these parameters to deviate from commonly imposed (e.g., normal or other symmetric) functional forms, arising as a result of these parameters' restricted ranges. Given the complexity of such models, a thorough sensitivity analysis is critical but computationally prohibitive. We propose a Bayesian local influence method that allows for simultaneous sensitivity analysis of multiple modeling components within a single fitting of the model of choice. Five illustrations and an empirical example are provided to demonstrate the utility of the proposed approach in facilitating the detection of outlying cases and common sources of misspecification in dynamic factor analysis models, as well as identification of modeling components that are sensitive to changes in the DP prior specification.

  10. Hybrid PV/diesel solar power system design using multi-level factor analysis optimization

    Science.gov (United States)

    Drake, Joshua P.

    Solar power systems represent a large area of interest across a spectrum of organizations at a global level. It was determined that a clear understanding of current state-of-the-art software and design methods, as well as optimization methods, could be used to improve the design methodology. The solar power design literature was researched for an in-depth understanding of solar power system design methods and algorithms. Multiple software packages for the design and optimization of solar power systems were analyzed for a critical understanding of their design workflow. In addition, several methods of optimization were studied, including brute force, Pareto analysis, Monte Carlo, linear and nonlinear programming, and multi-way factor analysis. Factor analysis was selected as the most efficient optimization method for engineering design as applied to solar power system design. The solar power design algorithms, software workflow analysis, and factor analysis optimization were combined to develop a solar power system design optimization software package called FireDrake. This software was used for the design of multiple solar power systems in conjunction with an energy audit case study performed in seven Tibetan refugee camps located in Mainpat, India. A report of solar system designs for the camps, as well as a proposed schedule for future installations, was generated. It was determined that there are several improvements that could be made to the state of the art in modern solar power system design, though the complexity of current applications is significant.

  11. Combining analysis of variance and three‐way factor analysis methods for studying additive and multiplicative effects in sensory panel data

    DEFF Research Database (Denmark)

    Romano, Rosaria; Næs, Tormod; Brockhoff, Per Bruun

    2015-01-01

    Data from descriptive sensory analysis are essentially three‐way data with assessors, samples and attributes as the three ways in the data set. Because of this, there are several ways that the data can be analysed. The paper focuses on the analysis of sensory characteristics of products while...... in the use of the scale with reference to the existing structure of relationships between sensory descriptors. The multivariate assessor model will be tested on a data set from milk. Relations between the proposed model and other multiplicative models like parallel factor analysis and analysis of variance...

  12. Human-factors methods for assessing and enhancing power-plant maintainability

    International Nuclear Information System (INIS)

    Seminara, J.L.

    1982-05-01

    EPRI Final Report NP-1567, dated February 1981, presented the results of a human factors review of plant maintainability at nine power plants (five nuclear and four fossil). This investigation revealed a wide range of plant and equipment design features that can potentially compromise the effectiveness, safety, and productivity of maintenance personnel. The present study is an extension of the earlier work. It provides those utilities that did not participate in the original study with the methodological tools to conduct a review of maintenance provisions, facilities, and practices. This report describes and provides a self-review checklist; a structured interview; a task analysis approach; methods for reviewing maintenance errors or accidents; and recommended survey techniques for evaluating such factors as noise, illumination, and communications. Application of the human factors methods described in this report should reveal avenues for enhancing existing power plants from the maintainability and availability standpoints. This document may also serve a useful purpose for designers or reviewers of new plant designs or near-operational plants presently being constructed

  13. Factor Analysis on the Factors Influencing Rural Environmental Pollution in the Hilly Area of Sichuan Province, China

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Using the factor analysis method and establishing an analysis indicator system covering four aspects (crop production, poultry farming, rural life, and township enterprises), the differences, features, and types of factors influencing rural environmental pollution in the hilly area of Sichuan Province, China, were analyzed. The results show that the major factor influencing rural environmental pollution in the study area is livestock and poultry breeding, followed by crop planting, rural life, and township enterprises. Hence, future pollution prevention and control should start with livestock and poultry breeding. Meanwhile, attention should be paid to the prevention and control of rural environmental pollution caused by rural life and township enterprise production.

  14. Task analysis methods applicable to control room design review (CDR)

    International Nuclear Information System (INIS)

    Moray, N.P.; Senders, J.W.; Rhodes, W.

    1985-06-01

    This report presents the results of a research study conducted in support of the human factors engineering program of the Atomic Energy Control Board in Canada. It contains five products which may be used by the Atomic Energy Control Board in relation to task analysis of jobs in CANDU nuclear power plants: 1. a detailed method for preparing for a task analysis; 2. a Task Data Form for recording task analysis data; 3. a detailed method for carrying out task analyses; 4. a guide to assessing alternative methods for performing task analyses, if such are proposed by utilities or consultants; and 5. an annotated bibliography on task analysis. In addition, a short explanation of the origins, nature and uses of task analysis is provided, with some examples of its cost effectiveness. 35 refs

  15. Analysis of operational events by ATHEANA framework for human factor modelling

    International Nuclear Information System (INIS)

    Bedreaga, Luminita; Constntinescu, Cristina; Doca, Cezar; Guzun, Basarab

    2007-01-01

    In the area of human reliability assessment, experts recognise that current methods do not correctly represent the role of humans in preventing, initiating and mitigating accidents in nuclear power plants. This deficiency arises because the current methods used in modelling the human factor do not take into account human performance and reliability as observed in operational events. ATHEANA - A Technique for Human Error ANAlysis - is a new methodology for human factor analysis that incorporates data specific to operational events as well as psychological models of human behaviour, including new elements such as unsafe actions and error mechanisms. In this paper we present the application of the ATHEANA framework to the analysis of operational events that occurred in different nuclear power plants during 1979-2002. The analysis of operational events consisted of: - identification of the unsafe actions; - assignment of each unsafe action to a category, omission or commission; - establishing the type of error corresponding to the unsafe action: slip, lapse, mistake or circumvention; - establishing the influence of performance shaping factors and some corrective actions. (authors)

  16. Risk Analysis Method Based on FMEA for Transmission Line in Lightning Hazards

    Directory of Open Access Journals (Sweden)

    You-Yuan WANG

    2014-05-01

    The failure rate of transmission lines and the reliability of the power system are significantly affected by lightning as a meteorological factor. In view of the complexity and variability of lightning meteorological factors, this paper presents a lightning trip-out rate model for transmission lines that considers the distribution of ground flash density and lightning-day hours. It also presents a failure rate model for transmission lines under different conditions, and a risk analysis method for transmission lines that considers multiple risk factors based on risk quantification. This method takes the lightning meteorological factor as the main evaluation criterion and establishes a risk degree evaluation system for transmission lines that includes five further evaluation criteria. Risk indicators are put forward by quantifying the risk factors based on in-service experience data for transmission lines. A comprehensive evaluation is conducted on the basis of the risk indexes, and an evaluation result closer to practice is achieved, providing a basis for transmission line risk warning and maintenance strategy. The effectiveness of the proposed method is validated through the risk analysis of a 220 kV transmission line in a power supply bureau.

  17. The Columbia Impairment Scale: Factor Analysis Using a Community Mental Health Sample

    Science.gov (United States)

    Singer, Jonathan B.; Eack, Shaun M.; Greeno, Catherine M.

    2011-01-01

    Objective: The objective of this study was to test the factor structure of the parent version of the Columbia Impairment Scale (CIS) in a sample of mothers who brought their children for community mental health (CMH) services (n = 280). Method: Confirmatory factor analysis (CFA) was used to test the fit of the hypothesized four-factor structure…

  18. A survey on critical factors influencing new advertisement methods

    Directory of Open Access Journals (Sweden)

    Naser Azad

    2013-02-01

    Soft drink beverages are an important part of many people's diets, and many prefer a soft drink to water at dinner. This sector has therefore lasted for many years, with little change in the products themselves. However, new methods of advertisement play an important role in increasing market share. In this paper, we study the impact of new methods of advertisement on product development. The proposed study designs a questionnaire for one Iranian soft drink producer, consisting of 274 Likert-scale questions, and uses factor analysis (FA) to analyze the results. The study selects 250 people who live in the city of Tehran, Iran, and Cronbach's alpha is calculated as 0.88, well above the minimum desirable limit. According to our results, there are six important factors impacting product development: modern advertisement techniques, emotional impact, market leadership strategy, pricing strategy, product life cycle and supply entity. The most important factor loadings in these six components include the impact of social values, persuading unaware and uninformed customers, the ability to monopolize production, improved pricing techniques, product life cycle, and the negative impact of excessive advertisement.

  19. Analysis of factors affecting the development of food crop varieties bred by mutation method in China

    International Nuclear Information System (INIS)

    Wang Zhidong; Hu Ruifa

    2002-01-01

    The research developed a production function for crop varieties bred by the mutation method in order to explore the factors affecting the development of new varieties. It was found that research investment, human capital and radiation facilities were the most important factors affecting the development and cultivated area of new varieties bred by the mutation method. It is concluded that not all institutions involved in breeding activities using the mutation method need to have radiation facilities, and that the national government needs to invest only in those key research institutes with strong research capacities. The research budgets saved can be used to commission the institutes with stronger research capacities to irradiate breeding materials developed by the institutes with weaker research capacities, thereby creating more opportunities to breed better varieties

  20. A method to identify dependencies between organizational factors using statistical independence test

    International Nuclear Information System (INIS)

    Kim, Y.; Chung, C.H.; Kim, C.; Jae, M.; Jung, J.H.

    2004-01-01

    A considerable number of studies on organizational factors in nuclear power plants have been made, especially in recent years, most of which have assumed the organizational factors to be independent. However, since organizational factors characterize the organization in terms of safety, efficiency, etc., some factors are likely to be closely related to one another. Therefore, if we want to identify the characteristics of an organization accurately, these dependence relationships should be taken into account. In this study the organization of a reference nuclear power plant in Korea was analyzed for the trip cases of that plant using the 20 organizational factors that Jacobs and Haber had suggested: 1) coordination of work, 2) formalization, 3) organizational knowledge, 4) roles and responsibilities, 5) external communication, 6) inter-departmental communications, 7) intra-departmental communications, 8) organizational culture, 9) ownership, 10) safety culture, 11) time urgency, 12) centralization, 13) goal prioritization, 14) organizational learning, 15) problem identification, 16) resource allocation, 17) performance evaluation, 18) personnel selection, 19) technical knowledge, and 20) training. Utilizing the results of this analysis, a method to identify the dependence relationships between organizational factors is presented, in which a statistical independence test applied to the analysis results of the trip cases reveals the dependencies. The method is geared to the need to utilize the many kinds of data that accumulate as the operating years of nuclear power plants increase, and more reliable dependence relations may be obtained by using these abundant data
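
    The statistical independence test at the heart of the method can be illustrated with a chi-square test on a contingency table. The table below is hypothetical: trip events cross-classified by the rated state of two of the organizational factors listed above.

        import numpy as np
        from scipy.stats import chi2_contingency

        # Hypothetical contingency table: trip events cross-classified by rated
        # state (poor/adequate/good) of two organizational factors, e.g.
        # "coordination of work" (rows) vs "roles and responsibilities" (columns)
        table = np.array([[14,  6,  3],
                          [ 5, 18,  7],
                          [ 2,  8, 17]])

        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")
        if p < 0.05:
            print("reject independence: the two factors appear dependent")
        else:
            print("no evidence against independence at the 5% level")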

  1. Decision making model design for antivirus software selection using Factor Analysis and Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Nurhayati Ai

    2018-01-01

    Full Text Available Virus spread increased significantly through the internet in 2017. One protection method is antivirus software, but the wide variety of antivirus products on the market tends to create confusion among consumers, and selecting the right antivirus for their needs has become difficult. This is the motivation for our research: we formulate a decision-making model for antivirus software consumers, constructed using factor analysis and the AHP method. First we distributed questionnaires to consumers, from which we identified 16 variables that need to be considered when selecting antivirus software. These 16 variables were then grouped into 5 factors using the factor analysis method in the SPSS software: security, performance, internal, time and capacity. To rank these factors we distributed questionnaires to 6 IT experts and analyzed the data using the AHP method. The result is that the performance factor ranked highest among all factors. Consumers can therefore select antivirus software by judging the variables in the performance factor: software loading speed, user friendliness, no excessive memory use, thorough scanning, and fast, accurate virus scanning.
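
    For reference, AHP derives factor weights from a pairwise comparison matrix via its principal eigenvector, with a consistency ratio to screen out incoherent judgments. A minimal sketch follows; the comparison matrix is invented, not the one elicited from the study's six experts.

        import numpy as np

        def ahp_weights(pairwise):
            """Principal-eigenvector weights and consistency ratio of an AHP matrix."""
            vals, vecs = np.linalg.eig(pairwise)
            i = np.argmax(vals.real)
            w = np.abs(vecs[:, i].real)
            w /= w.sum()                                  # normalized priority vector
            n = pairwise.shape[0]
            ci = (vals[i].real - n) / (n - 1)             # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random index
            return w, ci / ri                             # CR < 0.1 is acceptable

        # Hypothetical pairwise comparisons of the five extracted factors
        # (security, performance, internal, time, capacity)
        A = np.array([[1,   1/3, 2,   3,   2],
                      [3,   1,   4,   5,   4],
                      [1/2, 1/4, 1,   2,   1],
                      [1/3, 1/5, 1/2, 1,   1/2],
                      [1/2, 1/4, 1,   2,   1]])
        w, cr = ahp_weights(A)
        print(np.round(w, 3), f"CR = {cr:.3f}")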

  2. Analysis of related risk factors for pancreatic fistula after pancreaticoduodenectomy

    Directory of Open Access Journals (Sweden)

    Qi-Song Yu

    2016-08-01

    Full Text Available Objective: To explore the risk factors for pancreatic fistula after pancreaticoduodenectomy in order to provide theoretical evidence for effectively preventing its occurrence. Methods: A total of 100 patients who were admitted to our hospital from January 2012 to January 2015 and underwent pancreaticoduodenectomy were included in the study. The potential risk factors for developing pancreatic fistula were collected for single-factor and logistic multi-factor analysis. Results: Among the included patients, 16 developed pancreatic fistula, for a total occurrence rate of 16% (16/100). The single-factor analysis showed that upper abdominal operation history, preoperative bilirubin, pancreatic texture, pancreatic duct diameter, intraoperative amount of bleeding, postoperative hemoglobin, and application of somatostatin after operation were risk factors for developing pancreatic fistula (P<0.05). The multi-factor analysis showed that upper abdominal operation history, soft pancreatic texture, small pancreatic duct diameter, and low postoperative hemoglobin were independent risk factors for developing pancreatic fistula (OR=4.162, 6.104, 5.613, 4.034, P<0.05). Conclusions: The occurrence of pancreatic fistula after pancreaticoduodenectomy is closely associated with upper abdominal operation history, soft pancreatic texture, small pancreatic duct diameter, and low postoperative hemoglobin; therefore, effective measures should be taken to reduce its occurrence according to each patient's own conditions.
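
    The multi-factor analysis reported here is a standard logistic regression whose exponentiated coefficients are the odds ratios. A minimal sketch with statsmodels on simulated patients (not the study's data) shows the mechanics:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 100
        # Hypothetical predictors: prior upper-abdominal surgery (0/1), soft
        # pancreas (0/1), pancreatic duct diameter (mm), postoperative Hb (g/L)
        X = np.column_stack([rng.integers(0, 2, n),
                             rng.integers(0, 2, n),
                             rng.normal(3.0, 1.0, n),
                             rng.normal(110, 15, n)])
        lin = -2.0 + 1.4*X[:, 0] + 1.8*X[:, 1] - 0.9*(X[:, 2] - 3) - 0.03*(X[:, 3] - 110)
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(float)

        res = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
        print(np.exp(res.params[1:]))   # odds ratios, one per risk factor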

  3. Environmental Performance in Countries Worldwide: Determinant Factors and Multivariate Analysis

    Directory of Open Access Journals (Sweden)

    Isabel Gallego-Alvarez

    2014-11-01

    Full Text Available The aim of this study is to analyze the environmental performance of countries and the variables that can influence it. We also performed a multivariate analysis using the HJ-biplot, an exploratory method that looks for hidden patterns in the data, obtained from the usual singular value decomposition (SVD) of the data matrix, to contextualize the countries grouped by geographical areas and the variables relating to the environmental indicators included in the environmental performance index. The sample comprises 149 countries from different geographic areas. The findings of the empirical analysis emphasize that socioeconomic factors, such as economic wealth and education, as well as institutional factors represented by the style of public administration, in particular control of corruption, are determinant factors of environmental performance in the countries analyzed. In contrast, no effect on environmental performance was found for factors relating to the internal characteristics of a country or political factors.
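
    The HJ-biplot referred to above is built directly from the SVD: in the standard construction, both row (country) and column (indicator) markers are scaled by the singular values, so both are represented with the same quality. A minimal numpy sketch under that assumption, with random data standing in for the 149-country matrix:

        import numpy as np

        def hj_biplot(X):
            """HJ-biplot markers: rows and columns both scaled by singular values."""
            Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)  # standardize indicators
            U, s, Vt = np.linalg.svd(Z, full_matrices=False)
            row_markers = U[:, :2] * s[:2]   # countries in the first two dimensions
            col_markers = Vt[:2].T * s[:2]   # environmental/socioeconomic indicators
            return row_markers, col_markers

        # Hypothetical 149 countries x 6 indicators
        rng = np.random.default_rng(2)
        rows, cols = hj_biplot(rng.normal(size=(149, 6)))
        print(rows.shape, cols.shape)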

  4. Speeding Fermat's factoring method

    Science.gov (United States)

    McKee, James

    A factoring method is presented which, heuristically, splits composite n in O(n^{1/4+epsilon}) steps. There are two ideas: an integer approximation to sqrt(q/p) provides an O(n^{1/2+epsilon}) algorithm in which n is represented as the difference of two rational squares; observing that if a prime m divides a square, then m^2 divides that square, a heuristic speed-up to O(n^{1/4+epsilon}) steps is achieved. The method is well-suited for use with small computers: the storage required is negligible, and one never needs to work with numbers larger than n itself.
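
    As a point of reference, the sketch below implements only the classical integer form of Fermat's method, which searches for n = a^2 - b^2 = (a-b)(a+b); McKee's rational-square representation and the heuristic O(n^{1/4+epsilon}) speed-up described above are not implemented here.

        from math import isqrt

        def fermat_factor(n):
            """Classical Fermat method: find a, b with n = a^2 - b^2 (n odd, composite)."""
            a = isqrt(n)
            if a * a < n:
                a += 1
            while True:
                b2 = a * a - n
                b = isqrt(b2)
                if b * b == b2:            # a^2 - n is a perfect square
                    return a - b, a + b    # the two factors of n
                a += 1

        print(fermat_factor(5959))  # (59, 101)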

  5. Development Instruments Through Confirmatory Factor Analysis (CFA) in Appropriate Intensity Assessment

    Directory of Open Access Journals (Sweden)

    Ari Saptono

    2017-06-01

    Full Text Available The research aims to develop valid and reliable instruments for measuring entrepreneurship intention in vocational secondary school students. Multi-stage random sampling was used to determine the sample (300 respondents). The research method was research and development with confirmatory factor analysis (CFA). Second-order CFA with the robust maximum likelihood method shows a valid and reliable instrument, with all factor loadings above 0.5 and t-values above 1.96. Reliability testing gives a combined construct reliability (CR) of 0.97 and a variance extracted (VE) of 0.52, exceeding the acceptance limits CR ≥ 0.70 and VE ≥ 0.50. In conclusion, the measurement instrument of entrepreneurship intention, with three dimensions and 31 items, meets the standards of validity and reliability required by the instrument development process.
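
    The CR and VE figures quoted above follow the usual composite reliability and variance extracted formulas, CR = (sum lambda)^2 / ((sum lambda)^2 + sum(1 - lambda^2)) and VE = mean(lambda^2) for standardized loadings lambda. A minimal sketch with invented loadings, chosen only to land near the reported values:

        import numpy as np

        def cr_ve(loadings):
            """Composite reliability and variance extracted from standardized loadings."""
            lam = np.asarray(loadings, dtype=float)
            theta = 1.0 - lam**2                          # item error variances
            cr = lam.sum()**2 / (lam.sum()**2 + theta.sum())
            ve = (lam**2).mean()
            return cr, ve

        lam = np.full(31, 0.72)       # hypothetical loadings for the 31 items
        cr, ve = cr_ve(lam)
        print(f"CR = {cr:.2f}, VE = {ve:.2f}")   # compare against CR >= 0.70, VE >= 0.50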

  6. Factors that Affect Poverty Areas in North Sumatera Using Discriminant Analysis

    Science.gov (United States)

    Nasution, D. H.; Bangun, P.; Sitepu, H. R.

    2018-04-01

    In Indonesia, and especially North Sumatera, poverty is one of the fundamental problems on which both central and local government focus. Although the poverty rate has decreased, many people remain poor. Poverty covers several aspects, such as education, health and demographics, as well as structural and cultural dimensions. This research discusses several factors, such as population density, unemployment rate, GDP per capita (ADHK), GDP per capita (ADHB), economic growth and life expectancy, that affect poverty in Indonesia. To determine the factors that most influence and differentiate the poverty level of the regencies/cities of North Sumatra, the discriminant analysis method was used. Discriminant analysis is a multivariate analysis technique used to classify data into groups based on a dependent variable and independent variables. Using discriminant analysis, it is evident that the factor most affecting poverty is the unemployment rate.
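
    A minimal sketch of the classification step, using scikit-learn's linear discriminant analysis on invented regency-level data; the variable list mirrors the one above, but the numbers are random:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(3)
        # Hypothetical rows: regencies/cities; columns: population density,
        # unemployment rate, GDP per capita (ADHK), GDP per capita (ADHB),
        # economic growth, life expectancy (standardized)
        X = rng.normal(size=(33, 6))
        y = rng.integers(0, 2, size=33)      # 1 = high-poverty area, 0 = otherwise

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print(lda.coef_)                     # larger |weight| -> more discriminating
        print(lda.score(X, y))               # in-sample classification accuracy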

  7. Exploratory Bi-Factor Analysis: The Oblique Case

    Science.gov (United States)

    Jennrich, Robert I.; Bentler, Peter M.

    2012-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford ("Psychometrika" 47:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler ("Psychometrika" 76:537-549, 2011) introduced an exploratory form of bi-factor…

  8. Human failure event analysis and precautionary methods and their application to reactor system

    International Nuclear Information System (INIS)

    Zhang Li; Huang Shudong; Wang Yiqun; Gao Wenyu; Zhang Jin

    2003-01-01

    Making use of human factors engineering, control science and safety science, and adopting a method of systematic data collection and fact-based research, the authors analyze the problems and trends of human factor science: the classification system, error formation, quantitative appraisal, data collection and data banks, the effect and influence of organization and management, root cause analysis technology, human error failure mode, effect and criticality analysis, and the method and strategy of defense-in-depth for preventing human-initiated accidents. A theory and mechanism of human factor accidents is constructed. All of the above was successfully applied to the Daya Bay and Lingao Nuclear Power Stations. (authors)

  9. Functional Parallel Factor Analysis for Functions of One- and Two-dimensional Arguments

    NARCIS (Netherlands)

    Choi, Ji Yeh; Hwang, Heungsun; Timmerman, Marieke

    Parallel factor analysis (PARAFAC) is a useful multivariate method for decomposing three-way data that consist of three different types of entities simultaneously. This method estimates trilinear components, each of which is a low-dimensional representation of a set of entities, often called a mode,

  10. Hierarchic Analysis Method to Evaluate Rock Burst Risk

    Directory of Open Access Journals (Sweden)

    Ming Ji

    2015-01-01

    Full Text Available In order to reasonably evaluate the risk of rock bursts in mines, the factors influencing rock bursts and the existing grading criteria for rock burst risk were studied. By building a hierarchic analysis model, the natural, technological and management factors that influence rock bursts were analyzed, determining the degree of each factor's influence (i.e., its weight) and a comprehensive index, from which the grade of rock burst risk was assessed. The results showed that the assessment level generated by the model accurately reflected the actual degree of rock burst risk in mines. The model improves the maneuverability and practicability of existing evaluation criteria and also enhances the accuracy and scientific rigor of rock burst risk assessment.

  11. Analysis of factors controlling soil phosphorus loss with surface runoff in Huihe National Nature Reserve by principal component and path analysis methods.

    Science.gov (United States)

    He, Jing; Su, Derong; Lv, Shihai; Diao, Zhaoyan; Bu, He; Wo, Qiang

    2018-01-01

    Phosphorus (P) loss with surface runoff contributes P input to freshwater and accelerates its eutrophication. Many studies have focused on the factors affecting P loss with surface runoff from soils, but rarely on the relationships among these factors. In the present study, a rainfall simulation of P loss with surface runoff was conducted in Huihe National Nature Reserve, in Hulunbeier grassland, China, and the relationships between P loss with surface runoff, soil properties, and rainfall conditions were examined. Principal component analysis and path analysis were used to analyze the direct and indirect effects on P loss with surface runoff. The results showed that P loss with surface runoff was closely correlated with soil electrical conductivity, soil pH, soil Olsen P, soil total nitrogen (TN), soil total phosphorus (TP), and soil organic carbon (SOC). The main driving factors influencing P loss with surface runoff were soil TN, soil pH, soil Olsen P, and soil water content. Path analysis and determination coefficient analysis indicated that the standard multiple regression equation for P loss with surface runoff and each main factor was Y = 7.429 - 0.439 soil TN - 6.834 soil pH + 1.721 soil Olsen-P + 0.183 soil water content (r = 0.487, p < 0.05). The effect of the physical and chemical properties of undisturbed soils on P loss with surface runoff was discussed; soil water content and soil Olsen P were strong positive influences on P loss with surface runoff.
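
    The direct effects in a path analysis of this kind are the standardized regression coefficients. A minimal numpy sketch on simulated data; the four predictors mirror the main factors named above, the numbers do not:

        import numpy as np

        def path_coefficients(X, y):
            """Standardized (path) coefficients of y on the columns of X."""
            Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
            z = (y - y.mean()) / y.std(ddof=1)
            beta, *_ = np.linalg.lstsq(Z, z, rcond=None)
            return beta      # direct effects; indirect effects flow via r_ij * beta_j

        # Hypothetical data: soil TN, soil pH, soil Olsen P, soil water content -> P loss
        rng = np.random.default_rng(4)
        X = rng.normal(size=(60, 4))
        y = -0.4*X[:, 0] - 0.6*X[:, 1] + 0.5*X[:, 2] + 0.2*X[:, 3] + rng.normal(0, 0.5, 60)
        print(np.round(path_coefficients(X, y), 3))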

  12. The application of factor analysis for whole body gamma spectra work up

    Energy Technology Data Exchange (ETDEWEB)

    Ragan, P; Fueloep, M [Inst. of Preventive and Clinical Medicine, 83301 Bratislava (Slovakia). Dept. of Radiation Hygiene; Krnac, S [Slovak Technical Univ., 81219 Bratislava (Slovakia). Dept. of Nuclear Physics and Technology

    1996-12-31

    The results of whole-body (WB) counting with a small high-purity germanium detector are presented. The scaling confirmation factor analysis (SCFA) method, based on factorization of the response operator, is very sensitive and is a suitable way to decrease detection limits in this application. The minimal detectable activity (MDA, for a counting time of 7200 s per person, 58600 s background and 99% confidence level) of the detector usually used in our laboratory for WB counting (relative efficiency 61.8%) is 18.5 Bq, and the MDA of the SCFA method with the small detector, 17.9 Bq, is very close to it. The SCFA method improves the sensitivity (MDA) by a factor of 4.1, making the small detector comparable in sensitivity with the larger one. (J.K). 4 tabs., 5 figs., 3 refs.

  13. Development of three-dimensional ENRICHED FREE MESH METHOD and its application to crack analysis

    International Nuclear Information System (INIS)

    Suzuki, Hayato; Matsubara, Hitoshi; Ezawa, Yoshitaka; Yagawa, Genki

    2010-01-01

    In this paper, we describe a method for highly accurate three-dimensional analysis of a crack in a large-scale structure. The Enriched Free Mesh Method (EFMM) improves the accuracy of the Free Mesh Method (FMM), which is a kind of meshless method. First, we developed an algorithm for the three-dimensional EFMM. An elastic problem was analyzed using the EFMM; its accuracy compares favorably with the FMM, and the number of CG iterations is smaller. Next, we developed a method for calculating the stress intensity factor with the EFMM. A structure with a crack was analyzed using the EFMM, and the stress intensity factor was calculated by the developed method. The analysis results agreed very well with the reference solution. The proposed method is thus very effective in the analysis of cracks in large-scale structures. (author)

  14. Risk factors for radiation-induced hypothyroidism: A Literature-Based Meta-Analysis

    DEFF Research Database (Denmark)

    Vogelius, Ivan R; Bentzen, Søren; Maraldo, Maja V

    2011-01-01

    BACKGROUND: A systematic overview and meta-analysis of studies reporting data on hypothyroidism (HT) after radiation therapy was conducted to identify risk factors for development of HT. METHODS: Published studies were identified from the PubMed and Embase databases and by hand-searching published...... reviews. Studies allowing the extraction of odds ratios (OR) for HT in 1 or more of several candidate clinical risk groups were included. A meta-analysis of the OR for development of HT with or without each of the candidate risk factors was performed. Furthermore, studies allowing the extraction......% risk of HT at a dose of 45 Gy but with considerable variation in the dose response between studies. Chemotherapy and age were not associated with risk of HT in this analysis. CONCLUSIONS: Several clinical risk factors for HT were identified. The risk of HT increases with increasing radiation dose...

  15. Bayesian methods for data analysis

    CERN Document Server

    Carlin, Bradley P.

    2009-01-01

    Approaches for statistical inference: Introduction; Motivating Vignettes; Defining the Approaches; The Bayes-Frequentist Controversy; Some Basic Bayesian Models. The Bayes approach: Introduction; Prior Distributions; Bayesian Inference; Hierarchical Modeling; Model Assessment; Nonparametric Methods. Bayesian computation: Introduction; Asymptotic Methods; Noniterative Monte Carlo Methods; Markov Chain Monte Carlo Methods. Model criticism and selection: Bayesian Modeling; Bayesian Robustness; Model Assessment; Bayes Factors via Marginal Density Estimation; Bayes Factors

  16. A new cyber security risk evaluation method for oil and gas SCADA based on factor state space

    International Nuclear Information System (INIS)

    Yang, Li; Cao, Xiedong; Li, Jie

    2016-01-01

    Based on a comprehensive analysis of the structure and potential safety problems of oil and gas SCADA (supervisory control and data acquisition) networks, and to address the shortcomings of traditional evaluation methods, a new network security risk evaluation method for oil and gas SCADA combining factor state space and fuzzy comprehensive evaluation is proposed. First, a formal description of factor state space and its complete mathematical definition are presented; second, the steps of factor fuzzy evaluation are discussed; then, using the analytic hierarchy method, an evaluation index system for the oil and gas SCADA system is established, the index weights of all factors are determined by pairwise comparisons, and the structural design of the three-layer reasoning machine is completed. Experiments and tests show that the proposed method is accurate, reliable and practical. The research results provide a template and a new method for other industries.

  17. Dancoff factors with partial absorption in cluster geometry by the direct method

    International Nuclear Information System (INIS)

    Rodrigues, Leticia Jenisch; Leite, Sergio de Queiroz Bogado; Vilhena, Marco Tullio de; Bodmann, Bardo Ernest Josef

    2007-01-01

    Accurate analysis of resonance absorption in heterogeneous systems is essential in problems like criticality, breeding ratio and fuel depletion calculations. In compact arrays of fuel rods, resonance absorption is strongly affected by the Dancoff factor, defined in this study as the probability that a neutron emitted from the surface of a fuel element enters another fuel element without any collision in the moderator or cladding. In the original WIMS code, Black Dancoff factors were computed in cluster geometry by the collision probability method for each of the symmetrically distinct fuel pin positions in the cell. Recent improvements to the code include a new routine (PIJM) incorporating a more efficient scheme for computing the collision matrices. In that routine, each system region is considered individually, minimizing convergence problems and reducing the number of neutron track lines required in the in-plane integrations of the Bickley functions for any given accuracy. In the present work, PIJM is extended to compute Grey Dancoff factors for two-dimensional cylindrical cells in cluster geometry. The effectiveness of the method is assessed by comparing Grey Dancoff factors calculated by PIJM with those available in the literature from the Monte Carlo method, for the irregular geometry of the Canadian CANDU37 assembly. Dancoff factors at five symmetrically distinct fuel pin positions are found to be in very good agreement with the literature results. (author)

  18. Estimation of physiological parameters using knowledge-based factor analysis of dynamic nuclear medicine image sequences

    International Nuclear Information System (INIS)

    Yap, J.T.; Chen, C.T.; Cooper, M.

    1995-01-01

    The authors have previously developed a knowledge-based method of factor analysis to analyze dynamic nuclear medicine image sequences. In this paper, the authors analyze dynamic PET cerebral glucose metabolism and neuroreceptor binding studies. These methods have shown the ability to reduce the dimensionality of the data, enhance the image quality of the sequence, and generate meaningful functional images and their corresponding physiological time functions. The new information produced by the factor analysis has now been used to improve the estimation of various physiological parameters. A principal component analysis (PCA) is first performed to identify statistically significant temporal variations and remove the uncorrelated variations (noise) due to Poisson counting statistics. The statistically significant principal components are then used to reconstruct a noise-reduced image sequence as well as to provide an initial solution for the factor analysis. Prior knowledge, such as compartmental models or the requirements of positivity and simple structure, can be used to constrain the analysis. These constraints are used to rotate the factors to the most physically and physiologically realistic solution. The final result is a small number of time functions (factors) representing the underlying physiological processes and their associated weighting images representing the spatial localization of these functions. Estimation of physiological parameters can then be performed using the noise-reduced image sequence generated from the statistically significant PCs and/or the final factor images and time functions. These results are compared to parameter estimation using standard methods on the original raw image sequences. Graphical analysis was performed at the pixel level to generate comparable parametric images of the slope and intercept (influx constant and distribution volume).
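
    The PCA-based noise reduction described above amounts to reconstructing the frame sequence from its leading components only. A minimal numpy sketch on a synthetic one-component dynamic study (not the authors' PET data):

        import numpy as np

        def pca_denoise(frames, n_components):
            """Rebuild an (n_frames, n_pixels) sequence from leading components."""
            mean = frames.mean(axis=0)
            U, s, Vt = np.linalg.svd(frames - mean, full_matrices=False)
            low_rank = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
            return low_rank + mean

        # Hypothetical study: 60 frames of a 64x64 slice with one kinetic component
        rng = np.random.default_rng(5)
        t = np.linspace(0.0, 1.0, 60)[:, None]
        clean = np.exp(-3.0 * t) @ rng.random((1, 64 * 64))
        noisy = rng.poisson(50.0 * clean + 5.0).astype(float)   # Poisson counting noise
        denoised = pca_denoise(noisy, n_components=2)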

  19. Confirmatory factor analysis applied to the Force Concept Inventory

    Science.gov (United States)

    Eaton, Philip; Willoughby, Shannon D.

    2018-06-01

    In 1995, Huffman and Heller used exploratory factor analysis to call into question the factors of the Force Concept Inventory (FCI). Since then, several papers have examined the factors of the FCI on larger sets of student responses, and understandable factors were extracted as a result. However, none of these proposed factor models had been verified, on independent sets of data, not to be unique to their original samples. This paper seeks to confirm the factor models proposed by Scott et al. in 2012 and Hestenes et al. in 1992, as well as another expert model proposed within this study, through confirmatory factor analysis (CFA) on a sample of 20 822 post-instruction student responses to the FCI. Upon application of CFA using the full sample, all three models were found to fit the data with acceptable global fit statistics. However, when CFA was performed using these models on smaller sample sizes, the models proposed by Scott et al. and Eaton and Willoughby were found to be far more stable than the model proposed by Hestenes et al. The goodness of fit of these models suggests that the FCI can be scored on factors that are not unique to a single class. These scores could then be used to examine how instruction methods affect student performance along a single factor, and more in-depth analyses of curriculum changes may be possible as a result.

  20. Necessary steps in factor analysis : Enhancing validation studies of educational instruments. The PHEEM applied to clerks as an example

    NARCIS (Netherlands)

    Schonrock-Adema, Johanna; Heijne-Penninga, Marjolein; van Hell, Elisabeth A.; Cohen-Schotanus, Janke

    2009-01-01

    Background: The validation of educational instruments, in particular the employment of factor analysis, can be improved in many instances. Aims: To demonstrate the superiority of a sophisticated method of factor analysis, implying an integration of recommendations described in the factor analysis

  1. Risk factors for the undermined coal bed mining method

    Energy Technology Data Exchange (ETDEWEB)

    Arad, V. [Petrosani Univ., Petrosani (Romania). Dept. of Mining Engineering; Arad, S. [Petrosani Univ., Petrosani (Romania). Dept of Electrical Engineering

    2009-07-01

    The Romanian mining industry has been in serious decline and is undergoing ample restructuring. Analyses of reliability and risk are most important during the early stages of a project, guiding the decision as to whether or not to proceed and helping to establish design criteria. A technical accident occurred in 2008 at the Petrila coal mine, involving an explosion during the exploitation of a coal seam. Over time, a series of technical accidents, such as explosions and ignitions of methane gas, roof blowing phenomena, self-ignition of coal and hazardous combustion, have occurred. This paper presents an analysis of the factors that led to this accident, as well as of factors related to the mining method, specifically the geomechanical characteristics of the rocks and coal; the geodynamic phenomenon at working face 431; the spontaneous combustion phenomenon; gas accumulation; and the pressure and height of the undermined coal bed. It was concluded that, for the specific conditions encountered in Petrila colliery, the undermined bed height should be between 5 and 7 metres, depending on the geomechanical characteristics of the coal and surrounding rocks. 8 refs., 1 tab., 3 figs.

  2. Fluvial facies reservoir productivity prediction method based on principal component analysis and artificial neural network

    Directory of Open Access Journals (Sweden)

    Pengyu Gao

    2016-03-01

    Full Text Available It is difficult to forecast well productivity because of the complexity of vertical and horizontal development in fluvial facies reservoirs. This paper proposes a method based on principal component analysis and an artificial neural network to predict the well productivity of fluvial facies reservoirs. The method collects the statistical reservoir factors and engineering factors that affect well productivity, extracts information by applying principal component analysis, and exploits the arbitrary-function approximation ability of the neural network to realize an accurate and efficient prediction of fluvial facies reservoir well productivity. The method provides an effective way to forecast the productivity of fluvial facies reservoirs, which is affected by multiple factors and complex mechanisms. The study results show that the method is a practical, effective and accurate indirect productivity forecasting method suitable for field application.
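
    A minimal scikit-learn sketch of the PCA-plus-neural-network pipeline described above, on invented well data; the factor count, network size and 90% variance threshold are illustrative choices, not the paper's settings:

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(6)
        # Hypothetical wells: 12 reservoir/engineering factors -> productivity
        X = rng.normal(size=(80, 12))
        y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=80)

        model = make_pipeline(StandardScaler(),
                              PCA(n_components=0.90),        # keep 90% of the variance
                              MLPRegressor(hidden_layer_sizes=(16,),
                                           max_iter=5000, random_state=0))
        model.fit(X, y)
        print(f"in-sample R^2 = {model.score(X, y):.3f}")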

  3. Analysis of risk assessment methods for goods trucking

    Directory of Open Access Journals (Sweden)

    Yunyazova A.O.

    2018-04-01

    Full Text Available The article considers risk assessment models that can be applied to cargo transportation for forecasting possible damage, in the form of financial and material costs, in order to reduce the probability of its occurrence. A risk analysis by the method "Criterion. Event. Rule" is presented. This method is based on collecting information by various means, assigning an assessment to the identified risks, ranking them and formulating an analysis report. It can be carried out as a fully manual process of information collection and calculation, or it can be automated from data collection to the delivery of finished results (though in that case some nuances that could significantly influence the outcome of the analysis may be ignored). The expert method is of particular importance, since it relies directly on human experience, so the human factor plays a special role: the collection of information and the assessments assigned to risk groups depend on the extent to which the experts agree on the issue. The smaller the fluctuations in the values of the experts' estimates, the more accurate and optimal the results will be.

  4. Scintigraphic measurement of the contractile activity of the gastric antrum using factor analysis

    International Nuclear Information System (INIS)

    Bergmann, H.; Hoebart, J.; Kugi, A.; Stacher, G.; Granser, G.V.

    1990-01-01

    The motor activity of the gastric antrum is difficult to record by manometric means, and scintigraphic methods have so far proved unsatisfactory in that no consistent relationship between antral contractile activity and gastric emptying rate could be detected. Using data recorded in 16 healthy human subjects after the ingestion of a semisolid standard meal, we investigated whether a newly developed method employing factor analysis would yield more meaningful and reproducible results. Factor analysis was applied to sequential scintigraphic images (3-s frame time) of the gastric antrum. The computed factor images and the respective factor curves represent distinct dynamic structures of the antrum. From the more or less sinusoidal excursions of the factor curves, which exhibited the 3 cycles per minute frequency characteristic of the stomach, the amplitude, frequency and propagation velocity of antral contractions can be calculated. The amplitudes of the factor curves were used to calculate a contraction index, which was found to be significantly negatively correlated with the gastric half-emptying time of the ingested meal. The factor analytical approach thus seems a promising tool for further investigating the role of antral contractility in the process of gastric emptying. (Authors)

  5. Assessment on the leakage hazard of landfill leachate using three-dimensional excitation-emission fluorescence and parallel factor analysis method.

    Science.gov (United States)

    Pan, Hongwei; Lei, Hongjun; Liu, Xin; Wei, Huaibin; Liu, Shufang

    2017-09-01

    A large number of simple and informal landfills exist in developing countries, posing tremendous soil and groundwater pollution threats; early warning and monitoring of landfill leachate pollution status is therefore of great importance. However, there is a shortage of affordable and effective tools and methods. In this study, a soil column experiment was performed to simulate the pollution status of leachate using three-dimensional excitation-emission fluorescence (3D-EEMF) and parallel factor analysis (PARAFAC) models. The sum of squared residuals (SSR) and principal component analysis (PCA) were used to determine the optimal number of components for PARAFAC. A one-way analysis of variance showed that the component scores of the soil column leachate were significantly influenced by landfill leachate (p < 0.05), and comparing the component scores of landfill-affected soil with those of natural soil could be used to evaluate the leakage status of landfill leachate. Furthermore, a hazard index (HI) and a hazard evaluation standard were established. A case study of the Kaifeng landfill indicated a low hazard (level 5) by use of the HI. In summary, the HI is presented as a tool to evaluate landfill pollution status and to guide municipal solid waste management. Copyright © 2017 Elsevier Ltd. All rights reserved.
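
    A minimal sketch of the PARAFAC step, assuming the tensorly package is available; the EEM tensor here is random, standing in for the sample x excitation x emission fluorescence stack:

        import numpy as np
        import tensorly as tl
        from tensorly.decomposition import parafac

        # Hypothetical EEM stack: 30 samples x 40 excitation x 60 emission wavelengths
        rng = np.random.default_rng(7)
        eem = rng.random((30, 40, 60))

        cp = parafac(tl.tensor(eem), rank=4, n_iter_max=500, tol=1e-8)
        scores, excitation, emission = cp.factors   # one factor matrix per mode
        print(scores.shape, excitation.shape, emission.shape)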

  6. Convergence Improvement of Response Matrix Method with Large Discontinuity Factors

    International Nuclear Information System (INIS)

    Yamamoto, Akio

    2003-01-01

    In the response matrix method, a numerical divergence problem has been reported when extremely small or large discontinuity factors are used in the calculations. In this paper, an alternative response matrix formulation to solve the divergence problem is discussed, and the properties of the iteration matrices are investigated through eigenvalue analyses. In the conventional response matrix formulation, partial currents between adjacent nodes are assumed to be discontinuous, and outgoing partial currents are converted into incoming partial currents by the discontinuity factor matrix; that is, the partial currents of the homogeneous system (i.e., homogeneous partial currents) are treated. In this approach, the spectral radius of the iteration matrix for the partial currents may exceed unity when an extremely small or large discontinuity factor is used. In contrast, an alternative response matrix formulation using heterogeneous partial currents is discussed in this paper. In the latter approach, partial currents are assumed to be continuous between adjacent nodes, and discontinuity factors are considered directly in the coefficients of the response matrix. From the eigenvalue analysis of the iteration matrix for a one-group, one-dimensional problem, the spectral radius of the heterogeneous partial current formulation does not exceed unity even if an extremely small or large discontinuity factor is used in the calculation; the numerical stability of the alternative formulation is superior to that of the conventional one. This numerical stability is also confirmed by a two-dimensional light water reactor core analysis. Since the heterogeneous partial current formulation does not require any approximation, the converged solution exactly reproduces the reference solution when the discontinuity factors are derived directly from the reference calculation.
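
    The eigenvalue analysis mentioned above reduces to checking the spectral radius of the iteration matrix: the iteration converges only if the radius stays below one. A toy numpy sketch (the 2x2 matrix structure is invented for illustration, not the paper's response matrix) shows the radius growing as the discontinuity factor f becomes extreme:

        import numpy as np

        def spectral_radius(T):
            """Largest eigenvalue magnitude of an iteration matrix T."""
            return np.abs(np.linalg.eigvals(T)).max()

        # Toy partial-current iteration matrix parameterized by a discontinuity
        # factor f (structure invented for illustration)
        def iteration_matrix(f, r=0.2):
            return np.array([[r * f,   1 - r],
                             [1 - r,   r / f]])

        for f in (0.5, 1.0, 50.0):
            print(f"f = {f:>5}: rho = {spectral_radius(iteration_matrix(f)):.3f}")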

  7. Influencing Factors of Catering and Food Service Industry Based on Principal Component Analysis

    OpenAIRE

    Zi Tang

    2014-01-01

    Scientific analysis of influencing factors is of great importance for the healthy development of catering and food service industry. This study attempts to present a set of critical indicators for evaluating the contribution of influencing factors to catering and food service industry in the particular context of Harbin City, Northeast China. Ten indicators that correlate closely with catering and food service industry were identified and performed by the principal component analysis method u...

  8. Quantitative Evaluation of gamma-Spectrum Analysis Methods using IAEA Test Spectra

    DEFF Research Database (Denmark)

    Nielsen, Sven Poul

    1982-01-01

    A description is given of a γ-spectrum analysis method based on nonlinear least-squares fitting. The quality of the method is investigated by using statistical tests on the results from analyses of IAEA test spectra. By applying an empirical correction factor of 0.75 to the calculated peak-area u...

  9. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  10. Absorption correction factor in X-ray fluorescent quantitative analysis

    International Nuclear Information System (INIS)

    Pimjun, S.

    1994-01-01

    An experiment on the absorption correction factor in X-ray fluorescence quantitative analysis was carried out. Standard samples were prepared from mixtures of Fe2O3 and tapioca flour at various concentrations of Fe2O3 ranging from 5% to 25%. Unknown samples were kaolin containing 3.5% to 50% Fe2O3; the kaolin samples were diluted with tapioca flour in order to reduce the absorption of FeKα and make them easier to prepare. Pressed samples of 0.150 g/cm2 and 2.76 cm in diameter were used in the experiment. The absorption correction factor is related to the total mass absorption coefficient (χ), which varies with sample composition. In a known sample, χ can be conveniently calculated by formula; in an unknown sample, χ can be determined by the emission-transmission method. It was found that the relationship between the corrected FeKα intensity and the Fe2O3 content of these samples was linear. This result indicates that the correction factor can be used to adjust the accuracy of the X-ray intensity, and it is therefore essential in the quantitative analysis of the elements in any sample by the X-ray fluorescence technique.

  11. Observer variation factor on advanced method for accurate, robust, and efficient spectral fitting of java based magnetic resonance user interface for MRS data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Suk Jun [Dept. of Biomedical Laboratory Science, College of Health Science, Cheongju University, Cheongju (Korea, Republic of); Yu, Seung Man [Dept. of Radiological Science, College of Health Science, Gimcheon University, Gimcheon (Korea, Republic of)

    2016-06-15

    The purpose of this study was to examine the observer-dependent measurement error of the AMARES method in jMRUI for quantitative magnetic resonance spectroscopy (MRS) analysis, comparing skilled and unskilled observers, and to identify the reasons for inter-observer differences. The point-resolved spectroscopy sequence was used to acquire magnetic resonance spectroscopy data from the liver of 10-week-old male Sprague-Dawley rats. The ratio of the methylene protons ((-CH2-)n) at 1.3 ppm to the water proton (H2O) at 4.7 ppm was calculated with the LCModel software to serve as reference data. Seven unskilled observers calculated total lipid (methylene/water) using the jMRUI AMARES technique twice, one week apart, and interclass correlation coefficient (ICC) statistical analysis was conducted with the SPSS software. The inter-observer reliability (ICC), expressed as Cronbach's alpha, was less than 0.1. The average total lipid value of the seven observers (0.096±0.038) was 50% higher than the LCModel reference value. The jMRUI AMARES analysis method needs to minimize the presence of residual metabolites, by identifying the metabolite MRS profile, in order to obtain the same results as LCModel.

  12. Supercompressibility factor program. A new calculation method for real gas factors developed by the American Gas Association/Gas Research Institute

    Energy Technology Data Exchange (ETDEWEB)

    Luebbe, D.

    1987-07-01

    The innovative US calculation method for natural gas real gas factors is applicable over wide pressure and temperature ranges and places no restrictions on natural gas quality. The results obtained for natural gas from Northern Germany and for imported natural gas agree well with actual measurements. The model can therefore be adopted as the computation rule in a new technical recommendation and used to determine real gas factors whenever they are relevant to trading. The calculations must be preceded by a complete analysis characterizing the quality of the gases. However, the new method also allows real gas factors to be calculated from a small number of easily measurable quantities (for example H0, d, CO2). This capability is all the more attractive as it allows parameter sets to be updated automatically as gas quality changes, which for the first time can be managed without expensive online gas chromatography or density translators.

  13. Application of Gray Relational Analysis Method in Comprehensive Evaluation on the Customer Satisfaction of Automobile 4S Enterprises

    Science.gov (United States)

    Cenglin, Yao

    Car sales enterprises can continuously boost sales and expand their customer base; an important method is to enhance customer satisfaction. The customer satisfaction of car sales enterprises (4S enterprises) depends on many factors. By using the grey relational analysis method, the various factors bearing on customer satisfaction can be combined into a single evaluation. Through vertical comparison, car sales enterprises can find the specific factors that will improve customer satisfaction and thereby increase sales volume and benefits. Grey relational analysis has thus become a good method and means to analyze and evaluate such enterprises.
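
    A minimal sketch of grey relational analysis: each alternative's deviation from an ideal reference sequence is turned into relational coefficients and averaged into a grade. The dealership scores below are invented.

        import numpy as np

        def grey_relational_grades(X, ideal, rho=0.5):
            """Grey relational grade of each alternative against an ideal sequence.

            X: (n_alternatives, n_factors) matrix normalized to [0, 1].
            rho: distinguishing coefficient, conventionally 0.5.
            """
            delta = np.abs(X - ideal)                           # deviation sequences
            dmin, dmax = delta.min(), delta.max()
            coeff = (dmin + rho * dmax) / (delta + rho * dmax)  # relational coefficients
            return coeff.mean(axis=1)                           # equal-weight grades

        # Hypothetical satisfaction scores of 4 dealerships on 5 factors, in [0, 1]
        X = np.array([[0.8, 0.6, 0.9, 0.7, 0.5],
                      [0.6, 0.9, 0.7, 0.8, 0.9],
                      [0.9, 0.7, 0.6, 0.6, 0.7],
                      [0.5, 0.5, 0.8, 0.9, 0.6]])
        print(grey_relational_grades(X, ideal=X.max(axis=0)))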

  14. The mathematical pathogenetic factors analysis of acute inflammatory diseases development of bronchopulmonary system among infants

    Directory of Open Access Journals (Sweden)

    G. O. Lezhenko

    2017-10-01

    Full Text Available The purpose: To study the factor structure and to establish the associative interaction of the pathogenetic links in the development of acute diseases of the bronchopulmonary system in infants. Materials and methods: The examination group consisted of 59 infants (average age 13.8 ± 1.4 months) with acute inflammatory bronchopulmonary diseases. We also tested the serum levels of 25-hydroxyvitamin D (25(OH)D), vitamin D-binding protein, hBPI, cathelicidin LL-37, β1-defensins and lactoferrin by enzyme immunoassay. The selection of prognostically important pathogenetic factors of acute bronchopulmonary disease in infants was conducted using ROC analysis, and the objects were classified using hierarchical cluster analysis by the centroid-based clustering method. Results: Based on the results of the ROC analysis, 15 potential predictors of the development of acute inflammatory diseases of the bronchopulmonary system in infants were selected. The factor analysis made it possible to determine the 6 main components. The biggest influence on the development of the disease was exerted by "the anemia factor", "the factor of inflammation", "the maternal factor", "the vitamin D supply factor", "the immune factor" and "the phosphorus-calcium exchange factor", each with a factor load of more than 0.6. The hierarchical cluster analysis confirmed the initial role of the immuno-inflammatory components. Conclusions: The highlighted factors define a group of parameters that must be influenced to achieve the maximum effect of preventive and therapeutic measures. First of all, it is necessary to influence "the anemia factor", "the calcium exchange factor" and "the vitamin D supply factor"; in other words, to correct vitamin D deficiency and carry out measures aimed at preventing the development of anemia. The prevention and treatment of the pathological course of

  15. Novel method for on-road emission factor measurements using a plume capture trailer.

    Science.gov (United States)

    Morawska, L; Ristovski, Z D; Johnson, G R; Jayaratne, E R; Mengersen, K

    2007-01-15

    The method outlined provides for emission factor measurements to be made for unmodified vehicles driving under real world conditions at minimal cost. The method consists of a plume capture trailer towed behind a test vehicle. The trailer collects a sample of the naturally diluted plume in a 200 L conductive bag and this is delivered immediately to a mobile laboratory for subsequent analysis of particulate and gaseous emissions. The method offers low test turnaround times with the potential to complete much larger numbers of emission factor measurements than have been possible using dynamometer testing. Samples can be collected at distances up to 3 m from the exhaust pipe allowing investigation of early dilution processes. Particle size distribution measurements, as well as particle number and mass emission factor measurements, based on naturally diluted plumes are presented. A dilution profile relating the plume dilution ratio to distance from the vehicle tail pipe for a diesel passenger vehicle is also presented. Such profiles are an essential input for new mechanistic roadway air quality models.

  16. Automated processing of first-pass radionuclide angiocardiography by factor analysis of dynamic structures

    Energy Technology Data Exchange (ETDEWEB)

    Cavailloles, F.; Valette, H.; Hebert, J.-L.; Bazin, J.-P.; Di Paola, R.; Capderou, A.

    1987-05-01

    A method for the automatic processing of cardiac first-pass radionuclide studies is presented. This technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies. A description of the factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative information derived from the factors (cardiac output and mean transit time) was compared to that obtained by the region-of-interest method. Using FADS, a higher correlation with cardiac catheterization was found for cardiac output calculation. Compared to the ROI method, FADS thus presents obvious advantages: a good separation of overlapping cardiac chambers is obtained, and this operator-independent method provides more objective and reproducible results.

  17. Using the method of judgement analysis to address variations in diagnostic decision making

    OpenAIRE

    Hancock, Helen C; Mason, James M; Murphy, Jerry J

    2012-01-01

    Background: Heart failure is not a clear-cut diagnosis but a complex clinical syndrome with consequent diagnostic uncertainty. Judgment analysis is a method to help clinical teams understand how they make complex decisions. The method of judgment analysis was used to determine the factors that influence clinicians' diagnostic decisions about heart failure. Methods: Three consultants, three middle grade doctors, and two junior doctors each evaluated 45 patient scenarios. The main out...

  18. Influence factors analysis of water environmental quality of main rivers in Tianjin

    Science.gov (United States)

    Li, Ran; Bao, Jingling; Zou, Di; Shi, Fang

    2018-01-01

    Based on the evaluation results for the water environmental quality of the main rivers in Tianjin from 1986 to 2015, this paper retrospectively analyzes the current state of the water environmental quality of those rivers and establishes an index system and a multiple-factor analysis by selecting factors that influence water environmental quality from the economic, industrial and natural aspects, using a combination of principal component analysis and linear regression. The results showed that water consumption, sewage discharge and water resources were the main factors influencing the pollution of the main rivers. Therefore, optimizing the utilization of water resources, improving utilization efficiency and reducing effluent discharge are important measures to reduce the pollution of the surface water environment.

  19. Status of photonuclear method of analysis among other nuclear analytical methods and main fields of its application

    International Nuclear Information System (INIS)

    Burmistenko, Yu.N.

    1986-01-01

    Technical, organizational and economic aspects of the application of photonuclear methods for the analysis of substance composition are considered. On the technical side, the most important factors are the nuclear-physical characteristics of the elements under determination and of the elements composing the sample matrix. On the organizational side, the governing factor in a number of cases is the availability of an irradiation device in the close vicinity of the analytical laboratory. By studying the technical and organizational aspects while choosing the proper method, one can obtain the main source data needed for feasibility studies of a nuclear analytical complex with a given activation source. The economic aspect is therefore decisive for the choice of the method.

  1. Analysis of algae growth mechanism and water bloom prediction under the effect of multi-affecting factor.

    Science.gov (United States)

    Wang, Li; Wang, Xiaoyi; Jin, Xuebo; Xu, Jiping; Zhang, Huiyan; Yu, Jiabin; Sun, Qian; Gao, Chong; Wang, Lingbin

    2017-03-01

    Current methods describe the formation process of algae inaccurately and predict water blooms with low precision. In this paper, the chemical mechanism of algae growth is analyzed, and a correlation analysis of chlorophyll-a and algal density is conducted by chemical measurement. Taking into account the influence of multiple factors on algae growth and water blooms, a comprehensive prediction method combining multivariate time series analysis with intelligent models is put forward. First, through the process of photosynthesis, the main factors that affect the reproduction of the algae are analyzed. A compensation prediction method for multivariate time series analysis based on a neural network and a Support Vector Machine is put forward, combined with Kernel Principal Component Analysis to reduce the dimensionality of the factors influencing blooms. Then, a Genetic Algorithm is applied to improve the generalization ability of the BP network and the Least Squares Support Vector Machine. Experimental results show that this method can better compensate the multivariate time series prediction model and is an effective way to improve the description accuracy of algae growth and the prediction precision of water blooms.

  2. Determining The Factors Affecting Fruit Hardness of Different Peach Types with Meta Analysis

    Directory of Open Access Journals (Sweden)

    Hande Küçükönder

    2014-09-01

    Full Text Available The aim of this study is to determine the factors effective in determining the hardness of Caterina, Suidring, Royal Glory and Tirrenia peach types using meta-analysis. In the study, the impact force (Fi) and the contact time (tc) were detected, and the impulse values (I), expressed as the area under the force curve, were calculated from measurements performed using a low-mass lateral impactor technique applied to the peach. Using the theory of elasticity, the independent variables were determined as Fmax (maximum impact force), contact time (tmax), Fmax/tmax, 1/tmax, 1/tmax^2.5, Fmax/tmax^1.25 and Fmax^2.5. The correlation coefficients describing the relationship between these parameters and the dependent variable, the Magness-Taylor force (MT), were calculated and combined by meta-analysis using the Hunter-Schmidt and Fisher's Z methods. Cohen's classification criterion was used to evaluate the resulting mean effect size (combined correlation) and to determine its direction. As a result of the meta-analysis, the mean effect size according to the Hunter-Schmidt method was 0.436 (0.371-0.497), positively directed, within the 95% confidence interval, while it was 0.468 (0.390-0.545) according to Fisher's Z method. The effect sizes from both methods were classified as "mid-level" according to Cohen's classification. When the significance of the studies was analyzed with the Z test, all of those taken into the meta-analysis were found to be statistically significant. In this study evaluating the relationship of peach type to fruit hardness, the mean effect size was found to reach a "strong level". Consequently, "maximum shock acceleration" was found to be a more effective factor than the others in determining fruit hardness according to the results of the meta-analysis applied in both methods.
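
    For reference, the Fisher's Z pooling used above transforms each study correlation to z = arctanh(r), combines them with inverse-variance weights (n - 3), and back-transforms. A minimal sketch on invented study data:

        import numpy as np
        from scipy.stats import norm

        def fisher_z_pooled(r, n):
            """Fixed-effect pooled correlation and 95% CI via Fisher's z transform."""
            z = np.arctanh(r)                   # Fisher z of each study correlation
            w = np.asarray(n) - 3               # inverse-variance weights
            z_bar = (w * z).sum() / w.sum()
            se = 1.0 / np.sqrt(w.sum())
            half = norm.ppf(0.975) * se
            return np.tanh([z_bar, z_bar - half, z_bar + half])

        # Hypothetical correlations between an impact parameter and MT force
        r = np.array([0.52, 0.41, 0.47, 0.55])
        n = np.array([40, 55, 35, 60])
        print(np.round(fisher_z_pooled(r, n), 3))   # pooled r, lower, upper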

  3. Optimization of cooling tower performance analysis using Taguchi method

    Directory of Open Access Journals (Sweden)

    Ramkumar Ramakrishnan

    2013-01-01

    Full Text Available This study discusses the application of the Taguchi method in assessing maximum cooling tower effectiveness for a counter-flow cooling tower using expanded wire mesh packing. The experiments were planned based on Taguchi's L27 orthogonal array. The trials were performed under different inlet conditions of water flow rate, air flow rate and water temperature. Signal-to-noise ratio (S/N) analysis, analysis of variance (ANOVA) and regression were carried out in order to determine the effects of the process parameters on cooling tower effectiveness and to identify the optimal factor settings. Finally, confirmation tests verified the reliability of the Taguchi method for optimizing counter-flow cooling tower performance with sufficient accuracy.
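
    The S/N analysis above uses Taguchi's signal-to-noise ratio; for a larger-is-better response such as effectiveness it is -10*log10(mean(1/y^2)). A minimal sketch with invented replicate measurements for three of the 27 runs:

        import numpy as np

        def sn_larger_is_better(y):
            """Taguchi S/N ratio (dB) for a larger-is-better response."""
            y = np.asarray(y, dtype=float)
            return -10.0 * np.log10(np.mean(1.0 / y**2))

        # Hypothetical replicated effectiveness values for three L27 runs
        runs = [[0.62, 0.65, 0.60],
                [0.71, 0.69, 0.73],
                [0.55, 0.58, 0.54]]
        for i, y in enumerate(runs, start=1):
            print(f"run {i}: S/N = {sn_larger_is_better(y):.2f} dB")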

  4. PATH ANALYSIS OF RECORDING SYSTEM INNOVATION FACTORS AFFECTING ADOPTION OF GOAT FARMERS

    Directory of Open Access Journals (Sweden)

    S. Okkyla

    2014-09-01

    Full Text Available The objective of this study was to evaluate, by path analysis, the recording system innovation factors affecting adoption by goat farmers. The study was conducted from January to February 2014 in Pringapus District, Semarang Regency, using a survey method. Purposive sampling was used to select the location, and the number of respondents was determined by quota sampling; in total, 146 farmers were randomly chosen as respondents. The data were analyzed descriptively and quantitatively using the path analysis of the Statistical Package for the Social Sciences (SPSS 16). The independent variables in this study were internal factors, motivation, innovation characteristics and information sources, and the dependent variable was adoption. Linear regression analysis showed no significant effect of internal factors on adoption, so the trimming method was applied in the path analysis. The path analysis showed that the influences of motivation, innovation characteristics and information sources on adoption were 0.168, 0.720 and 0.09, respectively; innovation characteristics had the greatest effect on adoption. In conclusion, improving the innovation characteristics perceived by respondents, through motivation and information sources, may significantly increase the adoption of recording systems among goat farmers.

  5. Lithuanian Population Aging Factors Analysis

    Directory of Open Access Journals (Sweden)

    Agnė Garlauskaitė

    2015-05-01

    Full Text Available The aim of this article is to identify the factors that determine the aging of Lithuania's population and to assess their influence. The article presents an analysis of Lithuanian population aging factors consisting of two main parts: the first describes population aging and its characteristics in theoretical terms; the second is dedicated to assessing the trends and demographic factors that influence population aging and to analyzing the determinants of the aging of the Lithuanian population. The article concludes that the decline in the birth rate and the excess of emigrants over immigrants have the greatest impact on population aging, so in order to slow the aging of the population, much attention should be paid to the management of these demographic processes.

  6. HUMAN RELIABILITY ANALYSIS USING THE COGNITIVE RELIABILITY AND ERROR ANALYSIS METHOD (CREAM) APPROACH

    Directory of Open Access Journals (Sweden)

    Zahirah Alifia Maulida

    2015-01-01

    CREAM is one of the human reliability analysis methods, purposely employed to obtain a Cognitive Failure Probability (CFP) value, which can be computed with either the basic or the extended method. An application of the basic method yields a general value of failure probability, whereas a more specific CFP value for every task results when the extended method is utilized. This study showed that the factors that shall be applied to mitigate error in the grinding and welding sector are: adequacy of organization, adequacy of the Man-Machine Interface (MMI) and operational support, availability of procedures/plans, and adequacy of training and preparation. The study shows that planning has the highest cognitive error value for the grinding task (CFP value of 0.3). Furthermore, a CFP value of 0.18 is found for the cognitive aspect of execution in the welding task. To summarize, this study suggests several ways to reduce the cognitive error value in grinding and welding work: conducting periodic training, applying more detailed work instructions and educating workers in operating the equipment. Keywords: CREAM (cognitive reliability and error analysis method), HRA (human reliability analysis), cognitive error

  7. An integrated factor analysis model for product eco-design based on full life cycle assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Z.; Xiao, T.; Li, D.

    2016-07-01

    Among the methods of comprehensive analysis for a product or an enterprise, traditional standard cost analyses and life cycle assessment methods have defects and deficiencies; for example, some methods emphasize only one dimension (such as economic or environmental factors) while neglecting other relevant dimensions. This paper builds a factor analysis model of resource value flow, based on full life cycle assessment and eco-design theory, in order to expose the internal logic connecting these two factors. The model takes the efficient multiplication of resources, economic efficiency, and environmental efficiency as its core objectives. It studies the status of resource value flow during the entire life cycle of a product, and gives an in-depth analysis of the mutual logical relationships among product performance, value, resource consumption, and environmental load to reveal the symptoms and potentials in different dimensions. This provides comprehensive, accurate and timely decision-making information for enterprise managers regarding product eco-design, as well as production and management activities. Finally, the applicability of this evaluation and analysis model is verified using a Chinese SUV manufacturer as an example. (Author)

  8. A Confirmatory Factor Analysis of the Structure of Abbreviated Math Anxiety Scale

    Directory of Open Access Journals (Sweden)

    Farahman Farrokhi

    2011-06-01

    Full Text Available "nObjective: The aim of this study is to explore the confirmatory factor analysis results of the Persian adaptation of Abbreviated Math Anxiety Scale (AMAS, proposed by Hopko, Mahadevan, Bare & Hunt. "nMethod: The validity and reliability assessments of the scale were performed on 298 college students chosen randomly from Tabriz University in Iran. The confirmatory factor analysis (CFA was carried out to determine the factor structures of the Persian version of AMAS. "nResults: As expected, the two-factor solution provided a better fit to the data than a single factor. Moreover, multi-group analyses showed that this two-factor structure was invariant across sex. Hence, AMAS provides an equally valid measure for use among college students. "nConclusions:  Brief AMAS demonstrates adequate reliability and validity. The AMAS scores can be used to compare symptoms of math anxiety between male and female students. The study both expands and adds support to the existing body of math anxiety literature.

  9. Search Strategy of Detector Position For Neutron Source Multiplication Method by Using Detected-Neutron Multiplication Factor

    International Nuclear Information System (INIS)

    Endo, Tomohiro

    2011-01-01

    In this paper, an alternative definition of a neutron multiplication factor, the detected-neutron multiplication factor kdet, is introduced for the neutron source multiplication method (NSM). Using kdet, a strategy for finding an appropriate detector position for NSM is also proposed. The NSM is one of the practical subcritical measurement techniques: it does not require any special equipment other than a stationary external neutron source and an ordinary neutron detector. Additionally, the NSM is based on steady-state analysis, so the technique is very suitable for quasi-real-time measurement. It is noted that correction factors play an important role in accurately estimating subcriticality from the measured neutron count rates. The present paper aims to clarify how to correct the subcriticality measured by the NSM, the physical meaning of the correction factors, and how to reduce the impact of the correction factors by setting the neutron detector at an appropriate position
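
    The point-model relation behind NSM (ignoring the spatial correction factors that are the paper's real subject) may help fix ideas: the steady-state count rate scales as C ∝ εS/(1 − k), so with a known reference state the unknown multiplication factor follows from the count-rate ratio. A minimal sketch:

```python
def nsm_multiplication_factor(count_rate: float, ref_count_rate: float, k_ref: float) -> float:
    """Point-model NSM: C ~ eps*S/(1-k), with source and detector unchanged."""
    return 1.0 - (ref_count_rate / count_rate) * (1.0 - k_ref)

# e.g. doubling the count rate from a k = 0.95 reference state implies k = 0.975
print(nsm_multiplication_factor(2.0, 1.0, 0.95))
```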

  10. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well established classical...... into the projection pursuit is presented. Examples from remote sensing are given. The ACE algorithm for computing non-linear transformations that maximize correlation is extended and applied to obtain a non-linear transformation that maximizes autocorrelation or 'signal' in a multivariate image....... This is a generalization of the minimum/maximum autocorrelation factors (MAFs), which is a linear method. The non-linear method is compared to the linear method when analyzing a multivariate TM image from Greenland. The ACE method is shown to give a more detailed decomposition of the image than the MAF transformation...
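
    For reference, the linear MAF transformation mentioned above reduces to a generalized symmetric eigenproblem between the band covariance Σ and the covariance Σ_Δ of spatial differences. A minimal numpy/scipy sketch, using a single vertical shift (practical implementations typically pool several shift directions):

```python
import numpy as np
from scipy.linalg import eigh

def maf_transform(image: np.ndarray) -> np.ndarray:
    """image: (rows, cols, bands) -> MAF components, most autocorrelated first."""
    rows, cols, bands = image.shape
    X = image.reshape(-1, bands)
    D = (image[1:, :, :] - image[:-1, :, :]).reshape(-1, bands)  # shift differences
    S, Sd = np.cov(X, rowvar=False), np.cov(D, rowvar=False)
    _, V = eigh(Sd, S)                 # ascending eigenvalues of Sd v = lambda S v
    return (X - X.mean(axis=0)) @ V    # small lambda = high autocorrelation
```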

  11. Effects of measurement errors on psychometric measurements in ergonomics studies: Implications for correlations, ANOVA, linear regression, factor analysis, and linear discriminant analysis.

    Science.gov (United States)

    Liu, Yan; Salvendy, Gavriel

    2009-05-01

    This paper aims to demonstrate the effects of measurement errors on psychometric measurements in ergonomics studies. A variety of sources can cause random measurement errors in ergonomics studies, and these errors can distort virtually every statistic computed and lead investigators to erroneous conclusions. The effects of measurement errors on the five most widely used statistical analysis tools are discussed and illustrated: correlation, ANOVA, linear regression, factor analysis, and linear discriminant analysis. It is shown that measurement errors can greatly attenuate correlations between variables, reduce the statistical power of ANOVA, distort (overestimate, underestimate or even change the sign of) regression coefficients, underrate the explanatory contributions of the most important factors in factor analysis, and depreciate the significance of the discriminant function and the discrimination abilities of individual variables in discriminant analysis. The discussion is restricted to subjective scales and survey methods and their reliability estimates. Other methods applied in ergonomics research, such as physical and electrophysiological measurements and chemical and biomedical analysis methods, also have issues of measurement error, but they are beyond the scope of this paper. As there has been increasing interest in the development and testing of theories in ergonomics research, it has become very important for ergonomics researchers to understand the effects of measurement errors on their experimental results, which the authors believe is critical to research progress in theory development and cumulative knowledge in the ergonomics field.
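
    The attenuation effect on correlations can be made concrete with the classical Spearman formula: with test reliabilities r_xx and r_yy, the observed correlation shrinks to r_obs = r_true·sqrt(r_xx·r_yy). A one-function sketch of the correction, with illustrative numbers only:

```python
from math import sqrt

def disattenuate(r_obs: float, rxx: float, ryy: float) -> float:
    """Spearman's correction: estimate the error-free correlation."""
    return r_obs / sqrt(rxx * ryy)

# e.g. an observed r of 0.42 with reliabilities 0.7 and 0.8 implies r_true ~ 0.56
print(round(disattenuate(0.42, 0.7, 0.8), 2))
```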

  12. Application of instrumental neutron activation analysis and multivariate statistical methods to archaeological Syrian ceramics

    International Nuclear Information System (INIS)

    Bakraji, E. H.; Othman, I.; Sarhil, A.; Al-Somel, N.

    2002-01-01

    Instrumental neutron activation analysis (INAA) was utilized in the analysis of thirty-seven archaeological ceramic fragment samples collected from the Tal Al-Wardiate site, Missiaf town, Hamma city, Syria. 36 chemical elements were determined. The elemental concentrations were processed using two multivariate statistical methods, cluster analysis and factor analysis, in order to determine similarities and correlations between the various samples. Factor analysis confirmed that the samples were correctly classified by cluster analysis. The results showed that the samples can be considered to have been manufactured from three different sources of raw material. (author)
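
    A sketch of the cluster-analysis step on such data. The record does not state the linkage used, so Ward linkage on standardized concentrations is assumed here, cut into three groups to mirror the three raw-material sources reported:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_ceramics(concentrations: np.ndarray, n_groups: int = 3) -> np.ndarray:
    """concentrations: (samples, elements) -> one group label per sample."""
    z = (concentrations - concentrations.mean(axis=0)) / concentrations.std(axis=0)
    return fcluster(linkage(z, method="ward"), t=n_groups, criterion="maxclust")
```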

  13. Dynamic factor analysis in the frequency domain: causal modeling of multivariate psychophysiological time series

    NARCIS (Netherlands)

    Molenaar, P.C.M.

    1987-01-01

    Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic

  14. Factor Economic Analysis at Forestry Enterprises

    Directory of Open Access Journals (Sweden)

    M.Yu. Chik

    2018-03-01

    Full Text Available The article substantiates the importance of economic analysis, drawing on the scientific works of domestic and foreign scientists. The influence of factors on the change in the cost of harvested timber products is calculated by cost item, and the influence of factors on the change in costs per 1 UAH of sold products is determined using the full cost of sold products. Variable and fixed costs are separated, and their distribution, which affects the calculation of the impact of factors on changes in costs per 1 UAH of sold products, is allocated. The paper summarizes the overall results of calculating the influence of factors on changes in costs per 1 UAH of sold products. Based on the results of the analysis, a list of reserves for reducing the cost of production at forest enterprises is proposed. The main sources of reserves for reducing the prime cost of forest products at forest enterprises are investigated on the basis of the conducted factor analysis.
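
    The abstract does not name the computational technique; in this literature the influence of individual factors on an indicator is commonly computed by chain substitution, sketched here under that assumption: each factor is switched from its base to its reporting value in turn, and the resulting increment is attributed to that factor.

```python
# Sketch of the chain-substitution method of deterministic factor analysis.
from typing import Callable, Sequence

def chain_substitution(f: Callable[..., float],
                       base: Sequence[float],
                       report: Sequence[float]) -> list[float]:
    """Return each factor's contribution to f(report) - f(base)."""
    current = list(base)
    contributions = []
    prev = f(*current)
    for i, new_value in enumerate(report):
        current[i] = new_value          # substitute one factor at a time
        nxt = f(*current)
        contributions.append(nxt - prev)
        prev = nxt
    return contributions

# e.g. cost per 1 UAH of sold products = total cost / revenue:
print(chain_substitution(lambda cost, revenue: cost / revenue,
                         base=(80.0, 100.0), report=(90.0, 120.0)))
# contributions sum to the total change: 0.75 - 0.80 = -0.05
```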

  15. Application of the AHP method to analyze the significance of the factors affecting road traffic safety

    Directory of Open Access Journals (Sweden)

    Justyna SORDYL

    2015-06-01

    Full Text Available Over the past twenty years, the number of vehicles registered in Poland has grown rapidly, while the length of the road network has increased only slightly. The limited capacity of the available infrastructure leads to significant congestion and to an increased probability of road accidents. The overall level of road safety depends on many factors - the behavior of road users, infrastructure solutions and the development of automotive technology - so a detailed assessment of the importance of the individual elements determining road safety is difficult. The starting point is to organize the factors by grouping them into categories corresponding to the components of the DVE (driver - vehicle - environment) system. In this work, the analytic hierarchy process (AHP) is proposed for analyzing the importance of individual factors affecting road safety. It is a multi-criteria method that allows a hierarchical analysis of the decision process based on experts' opinions. Using the AHP method enabled us to evaluate and rank the factors affecting road safety. This work attempts to link statistical data and surveys in a significance analysis of the elements determining road safety.
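
    The AHP weighting step can be sketched compactly: factor weights are the normalized principal eigenvector of a pairwise-comparison matrix, and the consistency ratio (CR) checks the coherence of the expert judgements. The 3×3 matrix below (e.g. driver vs. vehicle vs. environment) is purely illustrative.

```python
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random-index table

def ahp_weights(A: np.ndarray) -> tuple[np.ndarray, float]:
    """Return (weights, consistency ratio) for a pairwise-comparison matrix A."""
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                        # principal eigenpair
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    ci = (vals[k].real - n) / (n - 1)               # consistency index
    return w, ci / RI[n]

A = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]], dtype=float)
weights, cr = ahp_weights(A)
print(weights, cr)   # CR < 0.1 is conventionally considered acceptable
```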

  16. Inference algorithms and learning theory for Bayesian sparse factor analysis

    International Nuclear Information System (INIS)

    Rattray, Magnus; Sharp, Kevin; Stegle, Oliver; Winn, John

    2009-01-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.

  17. Inference algorithms and learning theory for Bayesian sparse factor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rattray, Magnus; Sharp, Kevin [School of Computer Science, University of Manchester, Manchester M13 9PL (United Kingdom); Stegle, Oliver [Max-Planck-Institute for Biological Cybernetics, Tuebingen (Germany); Winn, John, E-mail: magnus.rattray@manchester.ac.u [Microsoft Research Cambridge, Roger Needham Building, Cambridge, CB3 0FB (United Kingdom)

    2009-12-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.

  18. Analysis of factors controlling sediment phosphorus flux potential of wetlands in Hulun Buir grassland by principal component and path analysis method.

    Science.gov (United States)

    He, Jing; Su, Derong; Lv, Shihai; Diao, Zhaoyan; Ye, Shengxing; Zheng, Zhirong

    2017-11-08

    Phosphorus (P) flux potential can predict the trend of phosphorus release from wetland sediments to water and provide scientific parameters for further monitoring and management of phosphorus flux from wetland sediments to overlying water. Many studies have focused on the factors affecting sediment P flux potential at the sediment-water interface, but rarely on the relationships among these factors. In the present study, an experiment on sediment P flux potential at the sediment-water interface was conducted in six wetlands in Hulun Buir grassland, China, and the relationships among sediment P flux potential, sediment physical properties, and sediment chemical characteristics were examined. Principal component analysis and path analysis were used to examine these data in terms of correlation coefficients and direct and indirect effects on sediment P flux potential. Results indicated that the major factors affecting sediment P flux potential were the amount of organophosphate-degrading bacteria in the sediment, Ca-P content, and total phosphorus concentration. The factors directly influencing sediment P flux potential were sediment Ca-P content, Olsen-P content, SOC content, and Al-P content; the factors indirectly influencing it were sediment Olsen-P content, SOC content, Ca-P content, and Al-P content. The standard multiple regression describing the relationship between sediment P flux potential and its major effect factors was Y = 5.849 - 1.025X1 - 1.995X2 + 0.188X3 - 0.282X4 (r = 0.9298, p < 0.01, n = 96), where Y is sediment P flux potential at the sediment-water interface, X1 is sediment Ca-P content, X2 is sediment Olsen-P content, X3 is sediment SOC content, and X4 is sediment Al-P content. Therefore, future research will focus on these sediment properties to analyze the
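
    Restated as a function, with the coefficients taken directly from the regression reported above (units follow the original study):

```python
def p_flux_potential(ca_p: float, olsen_p: float, soc: float, al_p: float) -> float:
    """Sediment P flux potential Y from Ca-P (X1), Olsen-P (X2), SOC (X3), Al-P (X4)."""
    return 5.849 - 1.025 * ca_p - 1.995 * olsen_p + 0.188 * soc - 0.282 * al_p
```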

  19. Towards factor analysis exploration applied to positron emission tomography functional imaging for breast cancer characterization

    International Nuclear Information System (INIS)

    Rekik, W.; Ketata, I.; Sellami, L.; Ben slima, M.; Ben Hamida, A.; Chtourou, K.; Ruan, S.

    2011-01-01

    This paper aims to explore factor analysis as applied to dynamic sequences of medical images obtained with a nuclear imaging modality, Positron Emission Tomography (PET). This modality provides information on physiological phenomena through the examination of radiotracer evolution over time. Factor analysis of dynamic medical image sequences (FADMIS) estimates the underlying fundamental spatial distributions by factor images and the associated fundamental functions (describing the signal variations) by factors. The method is based on an orthogonal analysis followed by an oblique analysis. The results of FADMIS are physiological curves showing the evolution over time of the radiotracer within homogeneous tissue distributions. This functional analysis of dynamic nuclear medical images is considered to be very efficient for cancer diagnostics; indeed, it could be applied to cancer characterization, vascularization, and possibly the evaluation of response to therapy.

  20. Survival analysis and classification methods for forest fire size.

    Science.gov (United States)

    Tremblay, Pier-Olivier; Duchesne, Thierry; Cumming, Steven G

    2018-01-01

    Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at "being held" (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at "being held" exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances.
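
    A sketch of fitting the preferred model, assuming the lifelines package; the data frame layout and column names are hypothetical placeholders for the fire-growth data described above.

```python
import pandas as pd
from lifelines import CoxPHFitter

def fit_fire_growth_model(df: pd.DataFrame) -> CoxPHFitter:
    """df: one row per growing fire, with a size-growth duration, an event flag,
    and the fire weather index, fuel type and initial-attack method as covariates."""
    cph = CoxPHFitter()
    cph.fit(df, duration_col="size_growth", event_col="observed",
            formula="fwi + fuel_type + attack_method")
    return cph
```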

  1. Fast sweeping method for the factored eikonal equation

    Science.gov (United States)

    Fomel, Sergey; Luo, Songting; Zhao, Hongkai

    2009-09-01

    We develop a fast sweeping method for the factored eikonal equation. The solution of a general eikonal equation is decomposed as the product of two factors: the first factor is the solution to a simple eikonal equation (such as distance) or a previously computed solution to an approximate eikonal equation; the second factor is a necessary modification/correction. An appropriate discretization and a fast sweeping strategy are designed for the equation of the correction part. The key idea is to enforce the causality of the original eikonal equation during the Gauss-Seidel iterations. Using extensive numerical examples we demonstrate that (1) the convergence behavior of the fast sweeping method for the factored eikonal equation is the same as for the original eikonal equation, i.e., the number of Gauss-Seidel iterations is independent of the mesh size, and (2) the numerical solution from the factored eikonal equation is more accurate than the numerical solution computed directly from the original eikonal equation, especially for point sources.
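
    In symbols, the factorization described above can be written as follows (a standard statement of the factored eikonal equation, with s the slowness field; the sweeping updates are then applied to the correction factor τ rather than to T itself, which is why the source singularity absorbed by T_0 no longer degrades accuracy):

```latex
% Factored eikonal equation: substitute T = T_0 \tau into |\nabla T| = s.
T(\mathbf{x}) = T_0(\mathbf{x})\,\tau(\mathbf{x}),
\qquad
\bigl| \tau(\mathbf{x})\,\nabla T_0(\mathbf{x}) + T_0(\mathbf{x})\,\nabla \tau(\mathbf{x}) \bigr| = s(\mathbf{x}).
```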

  2. Analysis of Factors That Affects the Investors in Conducting Business in Indonesia

    Directory of Open Access Journals (Sweden)

    Rini Kurnia Sari

    2015-10-01

    Full Text Available Investment is needed in the development of the economy. With decentralization, investment is expected to develop as a whole in every province in Indonesia. Local governments need to improve economic quality (GDP per capita), social quality (HDI) and infrastructure to attract domestic and foreign investors. The test results showed that the factors affecting investors conducting business in Indonesia are still GDP per capita, HDI and infrastructure, rather than natural resources. This study uses descriptive analysis and correlation analysis to examine the correlation of the factors that affect investors doing business in Indonesia.

  3. Human Modeling for Ground Processing Human Factors Engineering Analysis

    Science.gov (United States)

    Stambolian, Damon B.; Lawrence, Brad A.; Stelges, Katrine S.; Steady, Marie-Jeanne O.; Ridgwell, Lora C.; Mills, Robert E.; Henderson, Gena; Tran, Donald; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis for design of spacecraft. The key methods used for this are motion capture and computer generated human models. The focus of this paper is to explain the human modeling currently used at Kennedy Space Center (KSC), and to explain the future plans for human modeling for future spacecraft designs

  4. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis

    Science.gov (United States)

    Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...

  5. Spatial epidemiology of cancer: a review of data sources, methods and risk factors

    Directory of Open Access Journals (Sweden)

    Rita Roquette

    2017-05-01

    Full Text Available Cancer is a major concern among chronic diseases today. Spatial epidemiology plays a relevant role in this matter, and we present here a review of the subject, including a discussion of the literature in terms of the level of geographic data aggregation, risk factors and methods used to analyse the spatial distribution of patterns and spatial clusters. For this purpose, we performed a web search in the PubMed and Web of Science databases including studies published between 1979 and 2015. We found 180 papers from 63 journals and noted that the spatial epidemiology of cancer has been addressed with more emphasis during the last decade, with research based on data mostly extracted from cancer registries and official mortality statistics. In general, the research questions present in the reviewed papers can be classified into three different sets: (i) analysis of the spatial distribution of cancer and/or its temporal evolution; (ii) risk factors; (iii) development of data analysis methods and/or evaluation of results obtained from the application of existing methods. This review is expected to help promote research in this area through the identification of relevant knowledge gaps. The spatial epidemiology of cancer represents an important concern, mainly for the design of public health policies aimed at minimising the impact of chronic disease in specific populations.

  6. Meta-Analysis of Comparing Personal and Environmental Factors Effective in Addiction Relapse (Iran, 2004 -2012

    Directory of Open Access Journals (Sweden)

    s Safari

    2014-12-01

    Full Text Available Objective: As a meta-analysis, this study aimed to integrate different studies and investigate the impact of individual and environmental factors on addiction relapse in people who had quit. Method: This study is a meta-analysis using the Hunter and Schmidt approach. For this purpose, 28 out of 42 studies with acceptable methodologies were selected, upon which the meta-analysis was conducted. A meta-analysis checklist was the research instrument. Using the summaries of the study results, the researchers manually calculated effect sizes and interpreted them based on the meta-analysis approach and Cohen's table. Findings: Results revealed that the effect size of environmental factors on addiction relapse was 0.64, while it was 0.41 for individual factors. Conclusion: According to Cohen's table, the effect sizes are evaluated as moderate for individual factors and high for environmental factors on addiction relapse.
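
    The core aggregation step of the Hunter and Schmidt approach named above is a sample-size-weighted mean of study effect sizes (before any artifact corrections); a minimal sketch with illustrative inputs:

```python
import numpy as np

def hunter_schmidt_mean(effects: np.ndarray, n: np.ndarray) -> tuple[float, float]:
    """Return the N-weighted mean effect size and the N-weighted variance."""
    mean = np.average(effects, weights=n)
    return mean, np.average((effects - mean) ** 2, weights=n)

# e.g. three studies of unequal size:
print(hunter_schmidt_mean(np.array([0.5, 0.7, 0.6]), np.array([50, 200, 120])))
```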

  7. Slope stability analysis using limit equilibrium method in nonlinear criterion.

    Science.gov (United States)

    Lin, Hang; Zhong, Wenwen; Xiong, Wei; Tang, Wenyu

    2014-01-01

    In slope stability analysis, the limit equilibrium method is usually used to calculate the safety factor of a slope based on the Mohr-Coulomb criterion. However, the Mohr-Coulomb criterion is restricted in its description of rock mass. To overcome its shortcomings, this paper combines the Hoek-Brown criterion with the limit equilibrium method and proposes an equation for calculating the safety factor of a slope with the limit equilibrium method under the Hoek-Brown criterion, through the equivalent cohesive strength and friction angle. Moreover, the paper investigates the impact of the Hoek-Brown parameters on the safety factor of the slope, which reveals a linear relation between the equivalent cohesive strength and the weakening factor D, but nonlinear relations between the equivalent cohesive strength and the Geological Strength Index (GSI), the uniaxial compressive strength of intact rock σci, and the intact rock parameter mi. The relation between the friction angle and all Hoek-Brown parameters is nonlinear. With the increase of D, the safety factor of the slope F decreases linearly; with the increase of GSI, F increases nonlinearly; when σci is relatively small, the relation between F and σci is nonlinear, but when σci is relatively large, the relation is linear; with the increase of mi, F decreases first and then increases.
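
    For reference, the generalized Hoek-Brown criterion the paper builds on can be written down directly. The parameter formulas below follow the widely cited 2002 edition and are quoted from memory, so they should be checked against the original source before use.

```python
# Sketch of the generalized Hoek-Brown criterion:
# sigma1 = sigma3 + sigma_ci * (mb*sigma3/sigma_ci + s)**a
from math import exp

def hoek_brown_sigma1(sigma3: float, sigma_ci: float, gsi: float,
                      mi: float, d: float) -> float:
    mb = mi * exp((gsi - 100.0) / (28.0 - 14.0 * d))            # reduced m parameter
    s = exp((gsi - 100.0) / (9.0 - 3.0 * d))                    # rock-mass constant
    a = 0.5 + (exp(-gsi / 15.0) - exp(-20.0 / 3.0)) / 6.0       # curvature exponent
    return sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s) ** a
```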

  8. Factors Affecting Green Residential Building Development: Social Network Analysis

    Directory of Open Access Journals (Sweden)

    Xiaodong Yang

    2018-05-01

    Full Text Available Green residential buildings (GRBs) are one of the effective practices for energy saving and emission reduction in the construction industry. However, many real estate developers in China are reluctant to develop GRBs because of the factors affecting green residential building development (GRBD). In order to promote the sustainable development of GRBs in China, this paper, from the perspective of real estate developers, identifies the influential and critical factors affecting GRBD using the method of social network analysis (SNA). First, 14 factors affecting GRBD are selected from 64 preliminary factors across three main elements, and the framework is established. Second, the relationships between the 14 factors are analyzed by SNA. Finally, four critical factors for GRBD are identified by the social network centrality test: the local economic development level; development strategy and innovation orientation; the developer's acknowledgement and positioning of GRBD; and experience and ability in GRBD. The findings illustrate the key issues that affect the development of GRBs, and provide references for policy making by the government and strategy formulation by real estate developers.
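
    A sketch of the centrality idea behind the SNA step, assuming the networkx package; nodes stand for influencing factors, and the edge list is a hypothetical placeholder for the relationships identified in the paper.

```python
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([("economy_level", "strategy"), ("strategy", "experience"),
                  ("acknowledgement", "strategy"), ("experience", "acknowledgement")])

# Degree centrality ranks factors by how many relationships involve them.
for factor, score in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
    print(f"{factor}: {score:.2f}")
```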

  9. Determinants of investment behaviour. Methods and applications of meta-analysis

    International Nuclear Information System (INIS)

    Koetse, M.J.

    2006-01-01

    Meta-analysis is gradually gaining ground in economics as a research method to objectively and quantitatively summarise a body of existing empirical evidence. This dissertation studies the performance of well-known meta-analytic models and presents two meta-analysis applications. Despite its many attractive features, meta-analysis faces several methodical difficulties, especially when applied in economic research. We investigate two specific methodical problems that any meta-analysis in economics will have to deal with, viz., systematic effect-size variation due to primary-study misspecifications, and random effect-size heterogeneity. Using Monte-Carlo analysis we investigate the effects of these methodical problems on the results of a meta-analysis, and study the small-sample properties of several well-known and often applied meta-estimators. The focus of the meta-analysis applications is on two topics that are relevant for understanding investment behaviour, viz., the impact of uncertainty on investment spending, and the potential for substitution of capital for energy in production processes. In the first application we aim to shed light on the direction of the relationship between investment and uncertainty, and to uncover which factors are empirically relevant for explaining the wide variety in study outcomes. In the second application our goal is to analyse the direction and magnitude of capital-energy substitution potential, and to analyse the empirical relevance of suggested sources of variation in elasticity estimates

  10. Determinants of investment behaviour. Methods and applications of meta-analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koetse, M.J.

    2006-03-14

    Meta-analysis is gradually gaining ground in economics as a research method to objectively and quantitatively summarise a body of existing empirical evidence. This dissertation studies the performance of well-known meta-analytic models and presents two meta-analysis applications. Despite its many attractive features, meta-analysis faces several methodical difficulties, especially when applied in economic research. We investigate two specific methodical problems that any meta-analysis in economics will have to deal with, viz., systematic effect-size variation due to primary-study misspecifications, and random effect-size heterogeneity. Using Monte-Carlo analysis we investigate the effects of these methodical problems on the results of a meta-analysis, and study the small-sample properties of several well-known and often applied meta-estimators. The focus of the meta-analysis applications is on two topics that are relevant for understanding investment behaviour, viz., the impact of uncertainty on investment spending, and the potential for substitution of capital for energy in production processes. In the first application we aim to shed light on the direction of the relationship between investment and uncertainty, and to uncover which factors are empirically relevant for explaining the wide variety in study outcomes. In the second application our goal is to analyse the direction and magnitude of capital-energy substitution potential, and to analyse the empirical relevance of suggested sources of variation in elasticity estimates.

  11. Automated processing of first-pass radionuclide angiocardiography by factor analysis of dynamic structures.

    Science.gov (United States)

    Cavailloles, F; Bazin, J P; Capderou, A; Valette, H; Herbert, J L; Di Paola, R

    1987-05-01

    A method for the automatic processing of cardiac first-pass radionuclide studies is presented. The technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies, and a description of the factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative information derived from the factors (cardiac output and mean transit time) was compared to that obtained by the region-of-interest method; using FADS, a higher correlation with cardiac catheterization was found for the cardiac output calculation. Compared to the ROI method, FADS thus presents obvious advantages: a good separation of overlapping cardiac chambers is obtained, and this operator-independent method provides more objective and reproducible results. A number of parameters of cardio-pulmonary function can be assessed by first-pass radionuclide angiocardiography (RNA) [1,2]. Usually, they are calculated using time-activity curves (TAC) from regions of interest (ROI) drawn on the cardiac chambers and the lungs. This method has two main drawbacks: (1) the lack of inter- and intra-observer reproducibility; (2) the problem of crosstalk, which affects the evaluation of cardio-pulmonary performance. The crosstalk in planar imaging is due to the anatomical superimposition of the cardiac chambers and lungs. The activity measured in any ROI is the sum of the activity in several organs, and 'decontamination' of the TAC cannot easily be performed using the ROI method [3]. Factor analysis of dynamic structures (FADS) [4,5] can solve the two problems mentioned above. It provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. The resulting factors are estimates of the time evolution of the activity in each

  12. Analysis of stress intensity factors for a new mechanical corrosion specimen

    International Nuclear Information System (INIS)

    Rassineux, B.; Crouzet, D.; Le Hong, S.

    1996-03-01

    Electricite de France is conducting a research program to determine corrosion cracking rates in the Alloy 600 steam generator tubes of the primary system. The objective is to correlate the cracking rates with the specimen stress intensity factor KI. One of the samples selected for the purpose of this study is the longitudinally notched specimen TEL (''Tubulaire a Entailles Longitudinales''). This paper presents the analysis of the stress intensity factor and its experimental validation. The stress intensity factor has been evaluated for different loads using 3D finite element calculations with the Hellen-Parks and G(θ) methods, considering both crack initiation and propagation. As an assessment of the method, the numerical simulations are in good agreement with the fatigue crack growth rates measured experimentally for TEL and compact tension (CT) specimens. (authors). 8 refs., 6 figs., 2 tabs

  13. Multivariate factor analysis of Girgentana goat milk composition

    Directory of Open Access Journals (Sweden)

    Pietro Giaccone

    2010-01-01

    Full Text Available The interpretation of the several variables that contribute to defining milk quality is difficult due to the high degree of correlation among them. In this case, one of the best methods of statistical processing is factor analysis, which belongs to the multivariate group; this particular statistical approach was employed for our study. A total of 1485 individual goat milk samples from 117 Girgentana goats were collected fortnightly from January to July and analysed for physical and chemical composition and clotting properties. Milk pH and titratable acidity were within the normal range for fresh goat milk. Morning milk yield was 704 ± 323 g, with fat and protein percentages of 3.93 ± 1.23% and 3.48 ± 0.38%, respectively. The milk urea content was 43.70 ± 8.28 mg/dl. The clotting ability of Girgentana milk was quite good, with a renneting time of 16.96 ± 3.08 minutes, a rate of curd formation of 2.01 ± 1.63 minutes and a curd firmness of 25.08 ± 7.67 millimetres. Factor analysis was performed by applying orthogonal axis rotation (rotation type VARIMAX); the analysis grouped the milk components into three latent or common factors. The first, which explained 51.2% of the total covariance, was defined as “slow milks”, because it was linked to r and pH. The second latent factor, which explained 36.2% of the total covariance, was defined as “milk yield”, because it is positively correlated to the morning milk yield and the urea content, and negatively correlated to the fat percentage. The third latent factor, which explained 12.6% of the total covariance, was defined as “curd firmness”, because it is linked to the protein percentage, a30 and titratable acidity. With the aim of evaluating the influence of environmental effects (stage of kidding, parity and type of kidding), factor scores were analysed with the mixed linear model. Results showed significant effects of the season of

  14. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Wreathall, J.; Thompson, C.M., Drouin, M.; Bley, D.C.

    1996-01-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, ''A Technique for Human Error Analysis'' (ATHEANA). Since the application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and the associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst

  15. Factors of Selection of the Stock Allocation Method

    Directory of Open Access Journals (Sweden)

    Rohov Heorhii K.

    2014-03-01

    Full Text Available The article presents the results of the author's study of the factors behind strategic decisions on the choice of stock allocation method by public joint stock companies in Ukraine. The author used the Random Forest mathematical apparatus for building classification trees, as well as informal methods. The article analyses the reasons that restrain the public allocation of stock. It shows the significant influence on the choice of stock allocation method of such factors as capital concentration, the book value of corporate rights, the sector of the economy, and significant participation of collective investment institutions or the state in the authorised capital. The hierarchical model built to classify the factors of the issuing policy of joint stock companies finds logical justification in specific features of the institutional environment, although it does not fit into the framework of the classical concept of the market economy. The model could be used both for setting the goals of corporate financial strategies and for improving the state regulation of securities issuers. A prospect for further studies in this direction is to identify how the factors determining the choice of stock allocation method are transformed under conditions of a revival of the stock market.
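
    A sketch of the classification-tree step, assuming scikit-learn's Random Forest implementation as a stand-in for the article's apparatus; the feature matrix and label column are hypothetical placeholders for the article's variables.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

def rank_factors(X: pd.DataFrame, y: pd.Series) -> pd.Series:
    """y: chosen allocation method; X: candidate explanatory factors."""
    rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
    return pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False)
```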

  16. Human factors estimation methods using physiological informations

    International Nuclear Information System (INIS)

    Takano, Ken-ichi; Yoshino, Kenji; Nakasa, Hiroyasu

    1984-01-01

    To enhance operational safety in nuclear power plants, it is necessary to reduce abnormal events caused by human error, and it is essential to understand the basic human behaviors of plant maintenance workers, inspectors, and operators in their work environment. From this standpoint, this paper presents the results of a literature survey on the present status of human factors engineering technology applicable to nuclear power plants, and discusses the following items: (1) application fields where ergonomic evaluation is needed for worker safety; (2) basic methodology for investigating human performance; (3) features of physiological information analysis among the various ergonomic techniques; (4) necessary conditions for the application of in-situ physiological measurement to nuclear power plants; (5) availability of physiological information analysis; (6) effectiveness of human factors engineering methodology, especially physiological information analysis, when applied to nuclear power plants. The discussion demonstrates the high applicability of physiological information analysis to nuclear power plants for improving work performance. (author)

  17. Benefit Evaluation of Wind Turbine Generators in Wind Farms Using Capacity-Factor Analysis and Economic-Cost Methods

    DEFF Research Database (Denmark)

    Chen, Zhe; Wang, L.; Yeh, T-H.

    2009-01-01

    Due to the recent price spike of international oil and the concern over global warming, the development and deployment of renewable energy have become one of the most important energy policies around the globe. Currently, there are different capacities and hub heights for commercial wind turbine generators (WTGs). To fully capture wind energy, different wind farms (WFs) should select an adequate capacity of WTGs to effectively harvest wind energy and maximize their economic benefit. To establish a selection criterion, this paper first derives the equations for capacity factor (CF) and pairing performance ... height for WTGs that have been installed in Taiwan. Important outcomes affecting wind cost of energy, in comparison with economic results using the proposed economic-analysis methods for different WFs, are also presented.
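
    The basic definition behind the CF equations the paper derives: capacity factor is the energy actually produced over a period divided by the energy at continuous rated output. A one-line sketch with illustrative numbers:

```python
def capacity_factor(energy_mwh: float, rated_mw: float, hours: float) -> float:
    return energy_mwh / (rated_mw * hours)

# e.g. 5,000 MWh from a 2 MW WTG over a year (8760 h) gives CF ~ 0.285
print(round(capacity_factor(5000.0, 2.0, 8760.0), 3))
```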

  18. Assessing Heterogeneity for Factor Analysis Model with Continuous and Ordinal Outcomes

    Directory of Open Access Journals (Sweden)

    Ye-Mao Xia

    2016-01-01

    Full Text Available Factor analysis models with continuous and ordinal responses are a useful tool for assessing relations between latent variables and mixed observed responses. These models have been successfully applied to many different fields, including the behavioral, educational, and social-psychological sciences. However, within the Bayesian analysis framework, most developments are constrained within parametric families, in which particular distributions are specified for the parameters of interest. This leads to difficulty in dealing with outliers and/or distribution deviations. In this paper, we propose Bayesian semiparametric modeling for a factor analysis model with continuous and ordinal variables. A truncated stick-breaking prior is used to model the distributions of the intercept and/or covariance structural parameters. Bayesian posterior analysis is carried out through a simulation-based method, with a blocked Gibbs sampler implemented to draw observations from the complicated posterior. For model selection, the logarithm of the pseudomarginal likelihood is used to compare competing models. Empirical results are presented to illustrate the application of the methodology.
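
    For readers unfamiliar with the prior, the truncated stick-breaking construction can be sketched in a few lines: K beta draws are turned into mixture weights w_k = v_k · Π_{j<k}(1 − v_j), with the last break set to one so the weights sum exactly to one.

```python
import numpy as np

def truncated_stick_breaking(alpha: float, K: int, rng=np.random.default_rng()):
    v = rng.beta(1.0, alpha, size=K)
    v[-1] = 1.0                                  # truncation: close the stick
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return w

print(truncated_stick_breaking(2.0, K=10).sum())   # -> 1.0
```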

  19. A retrospective analysis to identify the factors affecting infection in patients undergoing chemotherapy.

    Science.gov (United States)

    Park, Ji Hyun; Kim, Hyeon-Young; Lee, Hanna; Yun, Eun Kyoung

    2015-12-01

    This study compares the performance of logistic regression and decision tree analysis for assessing the risk factors for infection in cancer patients undergoing chemotherapy. The subjects were 732 cancer patients receiving chemotherapy at K university hospital in Seoul, Korea. The data were collected between March 2011 and February 2013 and were processed for descriptive analysis, logistic regression and decision tree analysis using the IBM SPSS Statistics 19 and Modeler 15.1 programs. The most common risk factors for infection in cancer patients receiving chemotherapy were identified as alkylating agents, vinca alkaloids and underlying diabetes mellitus. The logistic regression achieved a sensitivity of 66.7% and a specificity of 88.9%, while the decision tree analysis achieved a sensitivity of 55.0% and a specificity of 89.0%. As for overall classification accuracy, the logistic regression achieved 88.0% and the decision tree analysis 87.2%. Since the logistic regression analysis showed the higher sensitivity and classification accuracy, it is concluded to be the more effective and useful method for establishing an infection prediction model for patients undergoing chemotherapy.

  20. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available The article covers the basic statistical methods used in the genetic analysis of human traits: segregation analysis, linkage analysis and allelic association studies. Software supporting the implementation of these methods has been developed.

  1. Fatigue Analysis of Tubesheet/Shell Juncture Applying the Mitigation Factor for Over-conservatism

    International Nuclear Information System (INIS)

    Kang, Deog Ji; Kim, Kyu Hyoung; Lee, Jae Gon

    2009-01-01

    If the environmental fatigue requirements are applied to the primary components of a nuclear power plant, to which the present ASME Code fatigue curves apply, some locations with a high cumulative usage factor (CUF) are anticipated not to meet the code criteria. The application of environmental fatigue damage is still particularly controversial for plants with 60-year design lives. It is therefore necessary to develop a detailed fatigue analysis procedure that identifies the conservatisms in the procedure and lowers the cumulative usage factor. Several factors are considered to mitigate the conservatism, such as three-dimensional finite element modeling. In the present analysis, actual pressure transient data were applied as one of the mitigation factors, instead of conservative maximum and minimum pressure data. Unlike in the general method, individual transient events were considered instead of grouped transient events. The tubesheet/shell juncture in the steam generator assembly is one of the weak locations and was therefore selected as the target for evaluating the mitigation factors in the present analysis
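
    The CUF computation at the center of this analysis is Miner's rule: cycle counts n_i are summed over the allowable cycles N_i from the design fatigue curve, with CUF ≤ 1.0 as the code criterion; an environmental correction would multiply each term by its Fen factor. A minimal sketch with illustrative cycle pairs:

```python
def cumulative_usage_factor(cycles: list[tuple[float, float]]) -> float:
    """cycles: (n_i, N_i) pairs, one per transient pair/event."""
    return sum(n / N for n, N in cycles)

print(cumulative_usage_factor([(200, 1e4), (50, 2e3)]))   # -> 0.045
```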

  2. The surface analysis methods

    International Nuclear Information System (INIS)

    Deville, J.P.

    1998-01-01

    Nowadays there are many surface analysis methods, each having its specificity, its qualities, its constraints (for instance, vacuum) and its limits. Costly in time and investment, these methods have to be used deliberately. This article is intended for non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use, or the need to answer a precise question. After recalling the fundamental principles which govern these analysis methods, based on the interaction between radiation (ultraviolet, X-ray) or particles (ions, electrons) and matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easiest to use, and probably the most productive for the analysis of surfaces of industrial materials or samples submitted to treatments in aggressive media. (O.M.)

  3. STOCHASTIC METHODS IN RISK ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vladimíra OSADSKÁ

    2017-06-01

    Full Text Available In this paper, we review basic stochastic methods which can be used to extend state-of-the-art deterministic analytical methods for risk analysis. We conclude that the standard deterministic analytical methods depend highly on the practical experience and knowledge of the evaluator, and therefore stochastic methods should be introduced. New risk analysis methods should consider the uncertainties in input values. We show how large the impact of these uncertainties on the results of the analysis can be by solving a practical example of FMECA with uncertainties modelled using Monte Carlo sampling.
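
    A sketch of the idea: instead of fixed expert scores, the FMECA risk priority number RPN = S·O·D is recomputed under sampled inputs, so the analysis reports a distribution rather than a point value. The triangular ranges below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000
severity   = rng.triangular(6, 7, 9, size=N)   # (low, mode, high) expert ranges
occurrence = rng.triangular(2, 4, 6, size=N)
detection  = rng.triangular(3, 5, 8, size=N)
rpn = severity * occurrence * detection

print(f"median RPN = {np.median(rpn):.0f}, "
      f"90% interval = [{np.percentile(rpn, 5):.0f}, {np.percentile(rpn, 95):.0f}]")
```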

  4. Research on Human-Error Factors of Civil Aircraft Pilots Based On Grey Relational Analysis

    Directory of Open Access Journals (Sweden)

    Guo Yundong

    2018-01-01

    Full Text Available Given that civil aviation accidents involve many human-error factors and show the features of typical grey systems, an index system of civil aviation accident human-error factors is built using the Human Factors Analysis and Classification System (HFACS) model. With data on accidents that happened worldwide between 2008 and 2011, the correlations between human-error factors can be analyzed quantitatively using grey relational analysis. The research results show that the order of the main factors affecting pilot human error is: preconditions for unsafe acts, unsafe supervision, organization, and unsafe acts. The factor related most closely with the second-level indexes and pilot human-error factors is the physical/mental limitations of pilots, followed by supervisory violations. The relevancy between the first-level indexes and the corresponding second-level indexes, and the relevancy between second-level indexes, can also be analyzed quantitatively.
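
    The grey relational coefficient underlying such an analysis is ξ = (Δmin + ρ·Δmax)/(Δ + ρ·Δmax), with distinguishing coefficient ρ (conventionally 0.5); averaging the coefficients over the sequence gives the relational grade used for ranking. A sketch assuming pre-normalized series:

```python
import numpy as np

def grey_relational_grades(reference: np.ndarray, comparisons: np.ndarray,
                           rho: float = 0.5) -> np.ndarray:
    """reference: (T,); comparisons: (k, T), already normalized.
    Returns one relational grade per comparison series."""
    delta = np.abs(comparisons - reference)          # absolute differences
    d_min, d_max = delta.min(), delta.max()
    xi = (d_min + rho * d_max) / (delta + rho * d_max)
    return xi.mean(axis=1)                           # grade = mean coefficient
```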

  5. Evaluation of the reliability concerning the identification of human factors as contributing factors by a computer supported event analysis (CEA)

    International Nuclear Information System (INIS)

    Wilpert, B.; Maimer, H.; Loroff, C.

    2000-01-01

    The project's objective was to evaluate the reliability of identifying human factors as contributing factors with a computer-supported event analysis (CEA). CEA is a computer version of SOL (Safety through Organizational Learning). The first step comprised interviews with experts from the nuclear power industry and an evaluation of existing computer-supported event analysis methods. This information was combined into a requirement profile for the CEA software. The next step was the implementation of the software in an iterative process of evaluation, and the project was completed by testing the CEA software. The testing demonstrated that contributing factors can be identified validly with CEA. In addition, CEA received very positive feedback from the experts. (orig.)

  6. Estimation of Peaking Factor Uncertainty due to Manufacturing Tolerance using Statistical Sampling Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Hoon; Park, Ho Jin; Lee, Chung Chan; Cho, Jin Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    The purpose of this paper is to study the effect of input uncertainties, such as manufacturing deviations from nominal values for material composition and geometric dimensions, on the output parameters of the lattice physics calculation. In nuclear design and analysis, lattice physics calculations are usually employed to generate lattice parameters for nodal core simulation and pin power reconstruction. These lattice parameters, which consist of homogenized few-group cross-sections, assembly discontinuity factors, and form-functions, can be affected by input uncertainties arising from three different sources: 1) multi-group cross-section uncertainties, 2) uncertainties associated with the methods and modeling approximations utilized in lattice physics codes, and 3) fuel/assembly manufacturing uncertainties. In this paper, data provided by the light water reactor (LWR) uncertainty analysis in modeling (UAM) benchmark have been used as the manufacturing uncertainties. First, the effect of each input parameter was investigated through sensitivity calculations at the fuel assembly level. Then, the uncertainty in the prediction of the peaking factor due to the most sensitive input parameter was estimated using the statistical sampling method, often called the brute force method. For this analysis, the two-dimensional transport lattice code DeCART2D and its ENDF/B-VII.1-based 47-group library were used to perform the lattice physics calculations. Sensitivity calculations were performed in order to study the influence of manufacturing tolerances on the lattice parameters. The manufacturing tolerance that has the largest influence on the k-inf is the fuel density; the second most sensitive parameter is the outer clad diameter.

  7. Determination of Important Topographic Factors for Landslide Mapping Analysis Using MLP Network

    Directory of Open Access Journals (Sweden)

    Mutasem Sh. Alkhasawneh

    2013-01-01

    Full Text Available Landslides are among the natural disasters that occur in Malaysia. Topographic factors such as elevation, slope angle, slope aspect, general curvature, plan curvature, and profile curvature are considered the main causes of landslides. In order to determine the dominant topographic factors in landslide mapping analysis, a study was conducted and is presented in this paper. There are three main stages involved in this study. The first stage is the extraction of additional topographic factors. Previous landslide studies have identified mainly six topographic factors; seven new additional factors are proposed in this study: longitude curvature, tangential curvature, cross-section curvature, surface area, diagonal line length, surface roughness, and rugosity. The second stage is the specification of the weight of each factor using two methods: multilayer perceptron (MLP) network classification accuracy and Zhou's algorithm. In the third stage, the factors with higher weights were used to improve MLP performance. Out of the thirteen factors, eight were considered important: surface area, longitude curvature, diagonal length, slope angle, elevation, slope aspect, rugosity, and profile curvature. The classification accuracy of the multilayer perceptron neural network increased by 3% after the elimination of the five less important factors.

  8. Housing price forecastability: A factor analysis

    DEFF Research Database (Denmark)

    Bork, Lasse; Møller, Stig Vinther

    of the model stays high at longer horizons. The estimated factors are strongly statistically signi…cant according to a bootstrap resampling method which takes into account that the factors are estimated regressors. The simple three-factor model also contains substantial out-of-sample predictive power...

  9. Impact of reconstruction methods and pathological factors on survival after pancreaticoduodenectomy

    Directory of Open Access Journals (Sweden)

    Salah Binziad

    2013-01-01

    Full Text Available Background: Surgery remains the mainstay of therapy for pancreatic head (PH) and periampullary carcinoma (PC) and provides the only chance of cure. Improvements in surgical technique, increased surgical experience and advances in anesthesia, intensive care and parenteral nutrition have substantially decreased surgical complications and increased survival. We evaluate the effects of reconstruction type, complications and pathological factors on survival and quality of life. Materials and Methods: This is a prospective study to evaluate the impact of various reconstruction methods of the pancreatic remnant after pancreaticoduodenectomy, and of the pathological characteristics of PC patients, over 3.5 years. Patient characteristics and descriptive analysis in the three variable methods, either with or without stent, were compared with the Chi-square test. Multivariate analysis was performed with logistic regression and multinomial logistic regression tests, and survival rate was analyzed with the Kaplan-Meier test. Results: Forty-one consecutive patients with PC were enrolled. There were 23 men (56.1%) and 18 women (43.9%), with a median age of 56 years (16 to 70 years). There were 24 cases of PH cancer, eight cases of PC, four cases of distal CBD cancer and five cases of duodenal carcinoma. Nine patients underwent duct-to-mucosa pancreaticojejunostomy (PJ), 17 patients underwent telescoping PJ, and 15 patients pancreaticogastrostomy (PG). The pancreatic duct was stented in 30 patients, while in 11 patients the duct was not stented. Duct-to-mucosa PJ caused significantly less leakage but longer operative and reconstruction times, while telescoping PJ was associated with the shortest hospital stay. There were 5 postoperative mortalities, and postoperative morbidities included pancreatic fistula (6 patients), delayed gastric emptying (11), GI fistula (3), wound infection (12), burst abdomen (6) and pulmonary infection (2). Factors

  10. A systematic review and appraisal of methods of developing and validating lifestyle cardiovascular disease risk factors questionnaires.

    Science.gov (United States)

    Nse, Odunaiya; Quinette, Louw; Okechukwu, Ogah

    2015-09-01

    A well-developed and validated lifestyle cardiovascular disease (CVD) risk factors questionnaire is the key to obtaining accurate information to enable the planning of CVD prevention programs, a necessity in developing countries. We conducted this review to assess the methods and processes used for the development and content validation of lifestyle CVD risk factors questionnaires, and possibly to develop an evidence-based guideline for their development and content validation. Relevant databases at the Stellenbosch University library were searched for studies conducted between 2008 and 2012, in the English language and among humans, using the following databases: PubMed, CINAHL, PsycINFO and ProQuest. The search terms used were CVD risk factors, questionnaires, smoking, alcohol, physical activity and diet. The methods identified for the development of lifestyle CVD risk factors questionnaires were: review of the literature, either systematic or traditional; involvement of experts and/or the target population using focus group discussions/interviews; clinical experience of the authors; and deductive reasoning of the authors. For validation, the methods used were: the involvement of an expert panel, the use of the target population, and factor analysis. A combination of methods produces questionnaires with good content validity and sound psychometric properties.

  11. Radionuclide analysis and scaling factors verification for LLRW of Taipower Reactor

    International Nuclear Information System (INIS)

    King, J.-Y.; Liu, K.-T.; Chen, S.-C.; Chang, T.-M.; Pung, T.-C.; Men, L.-C.; Wang, S.-J.

    2004-01-01

    The final disposal policy of the Atomic Energy Council of the Republic of China (CAEC) for low-level radwaste (LLRW) was to be carried out in 1996. The Institute of Nuclear Energy Research was contracted to develop the radionuclide analysis methods and to establish the scaling factors for the LLRW of Taipower reactors. The radionuclides analyzed included the γ-emitting nuclides Co-60, Cs-137 and Ce-144, and the α-, β- and low-energy-γ nuclides H-3, C-14, Fe-55, Ni-59, Ni-63, Sr-90, Nb-94, Tc-99, I-129, Pu-238, Pu-239/240, Pu-241, Am-241, Cm-242 and Cm-244. A total of 120 samples taken from 21 waste streams were analyzed, and the database was compiled within 2 years. The scaling factors for the different kinds of waste streams were computed with a weighted log-mean average method. In 1993, the scaling factors for each waste stream were verified against actual station samples. (author)
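
    As an aside on the computation, a weighted log-mean scaling factor for one difficult-to-measure nuclide against a key γ emitter can be sketched as below; this is a minimal reading of the abstract's "weighted log-mean average method", with the weights, nuclide pair and activity values all assumed for illustration.

```python
# Hedged sketch: weighted log-mean (geometric-mean) scaling factor for one
# waste stream; a_dtm and a_key are per-sample activities of the
# difficult-to-measure nuclide and the key gamma nuclide (illustrative data).
import numpy as np

def scaling_factor(a_dtm, a_key, weights=None):
    ratio = np.asarray(a_dtm, float) / np.asarray(a_key, float)
    w = np.ones_like(ratio) if weights is None else np.asarray(weights, float)
    return float(np.exp(np.average(np.log(ratio), weights=w)))

# e.g. a hypothetical Ni-63 vs Co-60 activity ratio over five stream samples:
print(scaling_factor([1.2, 0.8, 1.5, 1.1, 0.9], [0.4, 0.3, 0.5, 0.45, 0.35]))
```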

  12. Survival analysis and classification methods for forest fire size

    Science.gov (United States)

    2018-01-01

    Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first firefighters to arrive at the scene) and the size at “being held” (a state when no further increase in size is expected). We developed a statistical classifier to predict cases where fire size grows (i.e., the size at “being held” exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather on the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances. PMID:29320497
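
    A minimal sketch of the abstract's modelling step, assuming the lifelines package and hypothetical column names (size_growth, held, fuel_type, attack_method); the actual covariate coding and data are not given in the record.

```python
# Hedged sketch: Cox proportional hazards fit with lifelines.
# "size_growth" plays the role of the duration scale and "held" the event
# indicator; the column names and CSV file are assumptions for illustration.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("fires.csv")                      # hypothetical fire records
df = pd.get_dummies(df, columns=["fuel_type", "attack_method"])  # numeric covariates

cph = CoxPHFitter()
cph.fit(df, duration_col="size_growth", event_col="held")
cph.print_summary()                                # hazard ratios per covariate
```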

  13. Personality disorders in substance abusers: Validation of the DIP-Q through principal components factor analysis and canonical correlation analysis

    Directory of Open Access Journals (Sweden)

    Hesse Morten

    2005-05-01

    Full Text Available Abstract Background Personality disorders are common in substance abusers. Self-report questionnaires that can aid in the assessment of personality disorders are commonly used in assessment, but are rarely validated. Methods The Danish DIP-Q as a measure of co-morbid personality disorders in substance abusers was validated through principal components factor analysis and canonical correlation analysis. A 4-component structure was constructed based on 238 protocols, representing antagonism, neuroticism, introversion and conscientiousness. The structure was compared with (a) a 4-factor solution from the DIP-Q in a sample of Swedish drug and alcohol abusers (N = 133), and (b) a consensus 4-component solution based on a meta-analysis of published correlation matrices of dimensional personality disorder scales. Results It was found that the 4-factor model of personality was congruent across the Danish and Swedish samples, and showed good congruence with the consensus model. A canonical correlation analysis was conducted on a subset of the Danish sample with staff ratings of pathology. Three factors that correlated highly between the two variable sets were found. These were highly similar to the first three factors from the principal components analysis: antagonism, neuroticism and introversion. Conclusion The findings support the validity of the DIP-Q as a measure of DSM-IV personality disorders in substance abusers.
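
    The two analysis steps can be sketched as follows, assuming scikit-learn and illustrative array shapes; note the paper used principal components factor analysis, while sklearn's plain PCA (without rotation) stands in for it here.

```python
# Hedged sketch: principal components on questionnaire items, then canonical
# correlation against staff ratings; file names and shapes are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import CCA

X = np.load("dipq_items.npy")      # (n_patients, n_items)   self-report
Y = np.load("staff_ratings.npy")   # (n_patients, n_scales)  clinician ratings

components = PCA(n_components=4).fit_transform(X)   # 4-component structure
cca = CCA(n_components=3).fit(components, Y)        # three canonical pairs
U, V = cca.transform(components, Y)
canonical_corrs = [np.corrcoef(U[:, i], V[:, i])[0, 1] for i in range(3)]
```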

  14. Development of spectral history methods for pin-by-pin core analysis method using three-dimensional direct response matrix

    International Nuclear Information System (INIS)

    Mitsuyasu, T.; Ishii, K.; Hino, T.; Aoyama, M.

    2009-01-01

    Spectral history methods for a pin-by-pin core analysis method using the three-dimensional direct response matrix have been developed. The direct response matrix is formalized by four sub-response matrices in order to respond to the core eigenvalue k and thus can be recomposed at each outer iteration in the core analysis. For core analysis, it is necessary to take into account the burn-up effect related to spectral history. One of the methods is to evaluate the nodal burn-up spectrum obtained using the out-going neutron current. The other is to correct the fuel rod neutron production rates using a pin-by-pin correction. These spectral history methods were tested in a heterogeneous system. The test results show that the neutron multiplication factor error can be reduced by half during burn-up, and the nodal neutron production rate errors can be reduced by 30% or more. The root-mean-square differences between the relative fuel rod neutron production rate distributions can be reduced to within 1.1%. This means that these methods can accurately reflect the effects of intra- and inter-assembly heterogeneities during burn-up and can be used for core analysis. Core analysis with the DRM method was carried out for an ABWR quarter core and it was found that both thermal power and coolant-flow distributions converged smoothly. (authors)

  15. Analysis of mineral phases in coal utilizing factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, P.K.

    1982-01-01

    The mineral phase inclusions of coal are discussed. Their contribution to a coal sample is determined utilizing several techniques. Neutron activation analysis in conjunction with coal washability studies has produced some information on the general trends of elemental variation in the mineral phases. These results have been enhanced by the use of various statistical techniques. Target transformation factor analysis (TTFA) is specifically discussed and shown to be able to produce elemental profiles of the mineral phases in coal. A data set consisting of physically fractionated coal samples was generated. These samples were analyzed by neutron activation analysis and their elemental concentrations were then examined using TTFA. Information concerning the mineral phases in coal can thus be acquired from factor analysis even with limited data. Additional data may permit the resolution of additional mineral phases as well as refinement of those already identified.

  16. Methods in carbon K-edge NEXAFS: Experiment and analysis

    International Nuclear Information System (INIS)

    Watts, B.; Thomsen, L.; Dastoor, P.C.

    2006-01-01

    Near-edge X-ray absorption spectroscopy (NEXAFS) is widely used to probe the chemistry and structure of surface layers. Moreover, using ultra-high brilliance polarised synchrotron light sources, it is possible to determine the molecular alignment of ultra-thin surface films. However, the quantitative analysis of NEXAFS data is complicated by many experimental factors and, historically, the essential methods of calibration, normalisation and artefact removal are presented in the literature in a somewhat fragmented manner, thus hindering their integrated implementation as well as their further development. This paper outlines a unified, systematic approach to the collection and quantitative analysis of NEXAFS data with a particular focus upon carbon K-edge spectra. As a consequence, we show that current methods neglect several important aspects of the data analysis process, which we address with a combination of novel and adapted techniques. We discuss multiple approaches in solving the issues commonly encountered in the analysis of NEXAFS data, revealing the inherent assumptions of each approach and providing guidelines for assessing their appropriateness in a broad range of experimental situations

  17. Analysis of radiation-natural convection interactions in 1-G and low-G environments using the discrete exchange factor method

    International Nuclear Information System (INIS)

    Kassemi, M.

    1990-01-01

    In this paper a new numerical method is presented for the analysis of combined natural convection and radiation heat transfer which has application in many engineering situations such as materials processing, combustion and fire research. Because of the recent interest in the performance of these engineering processes in the low-gravity environment of space, attention is devoted to both 1-g and low-g applications. The numerical study is based on a two-dimensional mathematical model represented by a set of coupled nonlinear partial differential equations for conservation of mass, momentum, and energy and the integro-differential equations which describe radiative heat transfer. Radiative exchange is formulated using the discrete exchange factor method (DEF). This method considers point to point exchange and provides accurate results over a wide range of radiation parameters. The desirable features of DEF are briefly described. Our numerical results show that radiation significantly influences the flow and heat transfer in the enclosure. In both low-g and 1-g applications, radiation modifies the temperature profiles and enhances the convective heat transfer at the cold wall. In a low-g environment, convection is weak, and radiation can easily become the dominant heat transfer mode. It is also shown that in the top-heated enclosure, volumetric heating by radiation gives rise to an intricate cell pattern in the cavity

  18. The factorization method for inverse acoustic scattering in a layered medium

    International Nuclear Information System (INIS)

    Bondarenko, Oleksandr; Kirsch, Andreas; Liu, Xiaodong

    2013-01-01

    In this paper, we consider a problem of inverse acoustic scattering by an impenetrable obstacle embedded in a layered medium. We will show that the factorization method can be applied to recover the embedded obstacle; that is, the equation F̃g = φ_z is solvable if and only if the sampling point z is in the interior of the unknown obstacle. Here, F̃ is a self-adjoint operator related to the far field operator and φ_z is the far field pattern of the Green function with respect to the problem of scattering by the background medium for point z. The validity of the factorization method is proven with the help of a mixed reciprocity principle and an application of the scattering operator. Due to the established mixed reciprocity principle, knowledge of the Green function for the background medium is no longer required, which makes the method attractive from the computational point of view. The paper is only concerned with sound-soft obstacles, but the analysis can be easily extended to sound-hard obstacles, or obstacles with separated sound-soft and sound-hard parts. Finally, we provide an explicit example for a radially symmetric case and present some numerical examples. (paper)

  19. An alternative method for centrifugal compressor loading factor modelling

    Science.gov (United States)

    Galerkin, Y.; Drozdov, A.; Rekstin, A.; Soldatova, K.

    2017-08-01

    The loading factor at the design point is calculated by one or another empirical formula in classical design methods; performance modelling as a whole is out of consideration. Test data of compressor stages demonstrate that the loading factor versus flow coefficient at the impeller exit has a linear character independent of compressibility. The known Universal Modelling Method exploits this fact. Two points define the function: the loading factor at the design point and at zero flow rate. The proper formulae include empirical coefficients. A good modelling result is possible if the choice of coefficients is based on experience and close analogs. Earlier, Y. Galerkin and K. Soldatova had proposed to define the loading factor performance by the angle of its inclination to the ordinate axis and by the loading factor at zero flow rate. Simple and definite equations with four geometry parameters were proposed for the loading factor performance calculated for inviscid flow. The authors of this publication have studied the test performance of thirteen stages of different types. Equations with universal empirical coefficients are proposed. The calculation error lies in the range of ±1.5%. The alternative loading factor performance model is included in new versions of the Universal Modelling Method.

  20. Cross-Cultural Adaptation and Validation of the MPAM-R to Brazilian Portuguese and Proposal of a New Method to Calculate Factor Scores

    Science.gov (United States)

    Albuquerque, Maicon R.; Lopes, Mariana C.; de Paula, Jonas J.; Faria, Larissa O.; Pereira, Eveline T.; da Costa, Varley T.

    2017-01-01

    In order to understand the reasons that lead individuals to practice physical activity, researchers developed the Motives for Physical Activity Measure-Revised (MPAM-R) scale. In 2010, a translation of the MPAM-R to Portuguese and its validation were performed; however, the psychometric measures were not acceptable. In addition, factor scores in some sports psychology scales are calculated as the mean of the scores of the items of the factor. Nevertheless, it seems appropriate that items with higher factor loadings, extracted by Factor Analysis, have greater weight in the factor score, as items with lower factor loadings have less weight in the factor score. The aims of the present study are to translate and validate a Portuguese version of the MPAM-R and to investigate the agreement between two methods used to calculate factor scores. Three hundred volunteers who had been involved in physical activity programs for at least 6 months were recruited. Confirmatory Factor Analysis of the 30 items indicated that the version did not fit the model. After excluding four items, the final model with 26 items showed acceptable model fit measures by Exploratory Factor Analysis and conceptually supported the five factors of the original proposal. When the two methods of calculating factor scores were compared, only the “Enjoyment” and “Appearance” factors showed agreement between methods. So, the Portuguese version of the MPAM-R can be used in a Brazilian context, and the new proposal for the calculation of the factor score seems promising. PMID:28293203
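
    The two scoring rules being compared can be written down directly; the sketch below assumes illustrative loadings and item scores, with the loading-weighted mean giving higher-loading items proportionally more weight, as the abstract proposes.

```python
# Hedged sketch: plain item-mean factor score vs. loading-weighted score.
import numpy as np

def mean_score(items):
    return np.mean(items, axis=1)                 # conventional rule

def weighted_score(items, loadings):
    w = np.asarray(loadings, float)
    return items @ w / w.sum()                    # proposed rule (normalised)

items = np.array([[5, 6, 4], [7, 7, 6]])          # two respondents, three items
loadings = [0.82, 0.64, 0.55]                     # illustrative EFA loadings
print(mean_score(items), weighted_score(items, loadings))
```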

  1. An automated Monte-Carlo based method for the calculation of cascade summing factors

    Science.gov (United States)

    Jackson, M. J.; Britton, R.; Davies, A. V.; McLarty, J. L.; Goodwin, M.

    2016-10-01

    A versatile method has been developed to calculate cascade summing factors for use in quantitative gamma-spectrometry analysis procedures. The proposed method is based solely on Evaluated Nuclear Structure Data File (ENSDF) nuclear data, an X-ray energy library, and accurate efficiency characterisations for single detector counting geometries. The algorithm, which accounts for γ-γ, γ-X, γ-511 and γ-e- coincidences, can be applied to any design of gamma spectrometer and can be expanded to incorporate any number of nuclides. Efficiency characterisations can be derived from measured or mathematically modelled functions, and can accommodate both point and volumetric source types. The calculated results are shown to be consistent with an industry standard gamma-spectrometry software package. Additional benefits including calculation of cascade summing factors for all gamma and X-ray emissions, not just the major emission lines, are also highlighted.

  2. The use of human factors methods to identify and mitigate safety issues in radiation therapy

    International Nuclear Information System (INIS)

    Chan, Alvita J.; Islam, Mohammad K.; Rosewall, Tara; Jaffray, David A.; Easty, Anthony C.; Cafazzo, Joseph A.

    2010-01-01

    Background and purpose: New radiation therapy technologies can enhance the quality of treatment and reduce error. However, the treatment process has become more complex, and radiation dose is not always delivered as intended. Using human factors methods, a radiotherapy treatment delivery process was evaluated, and a redesign was undertaken to determine the effect on system safety. Material and methods: An ethnographic field study and workflow analysis were conducted to identify human factors issues in the treatment delivery process. To address specific issues, components of the user interface were redesigned through a user-centered approach. Sixteen radiation therapy students then experimentally evaluated the redesigned system through a usability test to determine its effectiveness in mitigating use errors. Results: According to findings from the usability test, the redesigned system successfully reduced the rates of two common errors (p < .04 and p < .01). It also improved the mean task completion time by 5.5% (p < .02) and achieved a higher level of user satisfaction. Conclusions: These findings demonstrate the importance and benefits of applying human factors methods in the design of radiation therapy systems. Many other opportunities still exist to improve patient safety in this area using human factors methods.

  3. Factorial analysis of the cost of preparing oil

    Energy Technology Data Exchange (ETDEWEB)

    Avdeyeva, L A; Kudoyarov, G Sh; Shmatova, M F

    1979-01-01

    Mathematical statistics methods (basically correlational and regression analysis) are used to study the factors which form the level of the cost of preparing oil, with consideration of the mutual influence of the factors. A group of five a priori justified factors was selected for inclusion in the mathematical model: the water content of the oil being extracted (%); the specific expenditure of demulsifiers; the volume of oil preparation; the quality of oil preparation (the salt content); and the level of use of the installations' capacities (%). To construct an economic and mathematical model of the cost of the technical preparation (SPP) of the oil, all the unions which make up the Ministry of the Oil Industry were divided into two comparable totalities. The first group included unions in which the oil SPP was lower than the branch average, and the second, unions in which the SPP was higher than the branch-wide cost. Using the regression coefficients, special elasticity coefficients and the fluctuation indicators, the basic factors which have the greatest influence on the formation of the oil SPP level were finally identified, separately for the first and second groups of unions.

  4. Analysis of the financial factors governing the profitability of lunar helium-3

    Science.gov (United States)

    Kulcinski, G. L.; Thompson, H.; Ott, S.

    1989-01-01

    Financial factors influencing the profitability of the mining and utilization of lunar helium-3 are examined. The analysis addressed the following questions: (1) which financial factors have the greatest leverage on the profitability of He-3; (2) over what range can these factors be varied to keep the He-3 option profitable; and (3) what ultimate effect could this energy source have on the price of electricity for U.S. consumers. Two complementary methods of analysis were used in the assessment: the rate of return on the incremental investment required and the reduction in revenue requirements (total cost to customers) achieved. Some of the factors addressed include energy demand, power generation costs with and without fusion, profitability for D-He(3) fusion, annual capital and operating costs, launch mass and costs, He-3 price, and government funding. Specific conclusions are made with respect to each of the companies considered: utilities, a lunar mining company, and an integrated energy company.

  5. Evaluation of hierarchical agglomerative cluster analysis methods for discrimination of primary biological aerosol

    Directory of Open Access Journals (Sweden)

    I. Crawford

    2015-11-01

    Full Text Available In this paper we present improved methods for discriminating and quantifying primary biological aerosol particles (PBAPs) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet-light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1 × 10^6 points on a desktop computer, allowing each fluorescent particle in a data set to be explicitly clustered. This reduces the potential for misattribution found in the subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient data set. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), where the optical size, asymmetry factor and fluorescence measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98 and 98.1 % of the data points respectively. The best-performing methods were applied to the BEACHON-RoMBAS (Bio–hydro–atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics and Nitrogen–Rocky Mountain Biogenic Aerosol Study) ambient data set, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in the overestimation of the fungal spore concentration by a factor of 1.5 and the
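
    The best-performing combination reported (Ward linkage on z-scored inputs) can be sketched with SciPy as follows; the input file, feature set and cluster count are illustrative assumptions, not the study's analysis package.

```python
# Hedged sketch: Ward-linkage hierarchical clustering on z-scored
# WIBS-style features (optical size, asymmetry factor, fluorescence channels).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

X = np.load("wibs_particles.npy")                # (n_particles, n_features)
Z = linkage(zscore(X, axis=0), method="ward")    # z-score normalisation + Ward
labels = fcluster(Z, t=5, criterion="maxclust")  # e.g. five meta-classes
```

    Note that explicitly clustering more than 10^6 particles, as in the study, would need a memory-aware implementation; SciPy's dense linkage is shown only for the method's logic.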

  6. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xueqin [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); School of Social Development and Public Policy, Beijing Normal University, Beijing 100875 (China); Li, Ning [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); Yuan, Shuai, E-mail: syuan@nmemc.org.cn [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); Xu, Ning; Shi, Wenqin; Chen, Weibin [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China)

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism, and the comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the multi-factor analysis and calculation. Based on the importance and deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in the paper. The main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and the underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors and that the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails of the hazard factors. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest that the multivariate analysis method may be adopted in the forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters. - Highlights: • A method to estimate the multidimensional joint return periods is presented. • The 2D function allows better fitting results at the lower tail of hazard factors. • The three-dimensional simulation has obvious advantages in extreme value fitting. • Joint return periods are closer to the reality

  7. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors

    International Nuclear Information System (INIS)

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-01-01

    As a random event, a natural disaster has a complex occurrence mechanism, and the comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the multi-factor analysis and calculation. Based on the importance and deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in the paper. The main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and the underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors and that the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails of the hazard factors. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest that the multivariate analysis method may be adopted in the forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters. - Highlights: • A method to estimate the multidimensional joint return periods is presented. • The 2D function allows better fitting results at the lower tail of hazard factors. • The three-dimensional simulation has obvious advantages in extreme value fitting. • Joint return periods are closer to the reality
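
    For the bivariate case, the Frank copula and the usual "OR"/"AND" joint return periods can be sketched as below; the copula parameter, marginal probabilities and mean interarrival time are illustrative, and the paper's three-dimensional extension is omitted.

```python
# Hedged sketch: bivariate Frank copula C(u, v) and joint return periods,
# with mu the mean interarrival time between events (e.g. 24 yr / 91 events).
import numpy as np

def frank_copula(u, v, theta):
    num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
    return -np.log(1.0 + num / (np.exp(-theta) - 1.0)) / theta

def joint_return_periods(u, v, theta, mu):
    c = frank_copula(u, v, theta)
    t_or = mu / (1.0 - c)            # either hazard factor exceeded
    t_and = mu / (1.0 - u - v + c)   # both hazard factors exceeded
    return t_or, t_and

print(joint_return_periods(u=0.90, v=0.95, theta=5.0, mu=24.0 / 91.0))
```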

  8. Recurrent-neural-network-based Boolean factor analysis and its application to word clustering.

    Science.gov (United States)

    Frolov, Alexander A; Husek, Dusan; Polyakov, Pavel Yu

    2009-07-01

    The objective of this paper is to introduce a neural-network-based algorithm for word clustering as an extension of the neural-network-based Boolean factor analysis algorithm (Frolov, 2007). It is shown that this extended algorithm supports an even more complex model of signals that are supposed to be related to textual documents. It is hypothesized that every topic in textual data is characterized by a set of words which coherently appear in documents dedicated to a given topic. The appearance of each word in a document is coded by the activity of a particular neuron. In accordance with the Hebbian learning rule implemented in the network, sets of coherently appearing words (treated as factors) create tightly connected groups of neurons, hence revealing them as attractors of the network dynamics. The found factors are eliminated from the network memory by a Hebbian unlearning rule, facilitating the search for other factors. Topics related to the found sets of words can be identified based on the words' semantics. To make the method complete, a special technique based on a Bayesian procedure has been developed for two purposes: first, to provide a complete description of factors in terms of component probability, and second, to enhance the accuracy of classifying whether a signal contains a factor. Since it is assumed that every word may contribute to several topics, the proposed method might be related to the method of fuzzy clustering. In this paper, we show that the results of Boolean factor analysis and fuzzy clustering are not contradictory, but complementary. To demonstrate its capabilities, the method is applied to two types of textual data on neural networks in two different languages. The obtained topics and corresponding words are at a good level of agreement despite the fact that identical topics in Russian and English conferences contain different sets of keywords.

  9. Strength Analysis on Ship Ladder Using Finite Element Method

    Science.gov (United States)

    Budianto; Wahyudi, M. T.; Dinata, U.; Ruddianto; Eko P., M. M.

    2018-01-01

    The design of a ship’s structure should follow the rules of the applicable classification standards. In this case, the design of a ladder (staircase) on a ferry ship must be reviewed against the loads during ship operations, both while sailing and during port operations. The classification rules for ship design prescribe the calculation of the structural components, which can be analysed using the Finite Element Method. The classification regulations used in the design of the ferry ship were those of BKI (Bureau of Classification Indonesia), so the material composition and mechanical properties should follow the classification rules for the vessel. The structural analysis used program packages based on the Finite Element Method. The structural analysis of the ladder showed a structure that can withstand a load of 140 kg under static, dynamic and impact conditions. The resulting safety factors indicate that the structure is safe without being excessively strong.

  10. Exploratory Factor Analysis of SCL90-R Symptoms Relevant to Psychosis

    Directory of Open Access Journals (Sweden)

    Javad Amini

    2011-10-01

    Full Text Available "nObjective: Inconsistent results have been reported regarding the symptom dimensions relevant to psychosis in symptoms check list revised (SCL90-R, i.e., "psychoticism" and "paranoid ideation". Therefore, some studies have suggested different factor structures for questions of these two dimensions, and proposed two newly defined dimensions of "schizotypal signs" and "schizophrenia nuclear symptoms". We conducted an exploratory factor analysis on the items of these two dimensions in a general population sample in Iran. "nMethod: A total of 2158 subjects residing in Southern Tehran (capital of Iran were interviewed using the psychoticism and paranoid ideation questions in SCL90-R to assess severity of these symptom dimensions. Factor analysis was done through SAS 9.1.3 PROC FACTOR using Promax rotation (power=3 on the matrix of "polychoric correlations among variables" as the input data. "nResults: Two factors were retained by the proportion criterion. Considering loadings >= 0.5 as minimum criteria for factor loadings, 7 out of 10 questions  from psychoticism ,and 3 out of 6 questions from paranoid ideation were retained, and others were eliminated. The factor labels proposed by the questionnaire suited the extracted factors and were retained. Internal consistency for each of the dimensions was acceptable (Cronbach's alpha 0.7 and 0.74 for paranoid ideation and psychoticism respectively. Composite scores showed a half-normal distribution for both dimensions which is predictable for instruments that detect psychotic symptoms. "nConclusion: Results were in contrast with similar studies, and questioned them by suggesting a different factor structure obtained from a statistically large population. The population in a developing nation (Iran in this study and the socio-cultural differences in developed settings are the potential sources for discrepancies between this analysis and previous reports.

  11. Comparison of the Effects of the Different Methods for Computing the Slope Length Factor at a Watershed Scale

    Directory of Open Access Journals (Sweden)

    Fu Suhua

    2013-09-01

    Full Text Available The slope length factor is one of the parameters of the Universal Soil Loss Equation (USLE) and the Revised Universal Soil Loss Equation (RUSLE) and is sometimes calculated from a digital elevation model (DEM). The methods for calculating the slope length factor are important because the values obtained may depend on the method used. The purpose of this study was to compare the difference in the spatial distribution of the slope length factor between different methods at a watershed scale. One method used the uniform slope length factor equation (USLFE), in which the effects of slope irregularities (such as changes in slope gradient) on soil erosion by water are not considered. The other method used the segmented slope length factor equation (SSLFE), which considers the effects of slope irregularities on soil erosion by water. The Arc Macro Language (AML) program of RUSLE Version 4, which uses the USLFE, was chosen to calculate the slope length factor. In a parallel analysis, the AML code of RUSLE Version 4 was modified according to the SSLFE to calculate the slope length factor. Two watersheds with different slopes and gully densities were chosen. The results show that the slope length factor and soil loss obtained using the USLFE method were lower than those using the SSLFE method, especially on downslopes in the watershed with more frequent steep slopes and higher gully densities. In addition, the slope length factor and soil loss calculated by the USLFE showed less spatial variation.
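
    For reference, the uniform (USLFE-style) slope length factor is commonly computed with the RUSLE form L = (λ / 22.13)^m with a variable exponent m; the sketch below uses the McCool et al. exponent as a textbook stand-in, not the paper's AML code.

```python
# Hedged sketch: uniform RUSLE slope length factor for one DEM cell,
# L = (slope_length / 22.13) ** m, m = beta / (1 + beta) (McCool et al. form).
import numpy as np

def slope_length_factor(slope_len_m, slope_deg):
    theta = np.radians(slope_deg)
    beta = (np.sin(theta) / 0.0896) / (3.0 * np.sin(theta) ** 0.8 + 0.56)
    m = beta / (1.0 + beta)
    return (slope_len_m / 22.13) ** m

print(slope_length_factor(60.0, 9.0))  # e.g. a 60 m slope at 9 degrees
```

    The segmented (SSLFE) variant evaluates this cumulatively along flow-path segments so that changes in gradient along the slope are reflected, which is consistent with the higher values it yields on irregular downslopes.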

  12. 252Cf-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor k_eff has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank which is typical of a fuel processing or reprocessing plant, the k_eff values were satisfactorily determined for values between 0.92 and 0.5 using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. The further development of the method will require experiments and the development of theoretical methods to predict the experimental observables

  13. 252Cf-source-driven neutron noise analysis method

    International Nuclear Information System (INIS)

    Mihalczo, J.T.; King, W.T.; Blakeman, E.D.

    1985-01-01

    The 252Cf-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor k_eff has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank which is typical of a fuel processing or reprocessing plant, the k_eff values were satisfactorily determined for values between 0.92 and 0.5 using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. The further development of the method will require experiments oriented toward particular applications including dynamic experiments and the development of theoretical methods to predict the experimental observables

  14. Factor analysis for imperfect maintenance planning at nuclear power plants by cognitive task analysis

    International Nuclear Information System (INIS)

    Takagawa, Kenichi; Iida, Hiroyasu

    2011-01-01

    Imperfect maintenance planning has frequently been identified at domestic nuclear power plants. To prevent such events, we analyzed the causal factors in the maintenance planning stages and show the directionality of countermeasures in this study. There is a pragmatic limit to finding the causal factors from items based on report descriptions. Therefore, the idea of the systemic accident model, which is used to monitor performance variability in normal circumstances, is taken as a new concept instead of investigating negative factors. As an actual method for analyzing usual activities, cognitive task analysis (CTA) was applied. Persons who had experienced various maintenance activities at one electric power company were interviewed about the sources related to decision making during maintenance planning, and the usual factors affecting planning were then extracted as performance variability factors. The tendency of domestic events was analyzed using the classification items of those factors, and the directionality of countermeasures was shown. The following are critical for preventing imperfect maintenance planning: the persons in charge should fully understand the situation of the equipment for which they are responsible in the work planning and maintenance evaluation stages, and they should definitely understand, for example, the maintenance bases of that equipment. (author)

  15. Multivariate sensitivity analysis to measure global contribution of input factors in dynamic models

    International Nuclear Information System (INIS)

    Lamboni, Matieyendou; Monod, Herve; Makowski, David

    2011-01-01

    Many dynamic models are used for risk assessment and decision support in ecology and crop science. Such models generate time-dependent model predictions, with time either discretised or continuous. Their global sensitivity analysis is usually applied separately on each time output, but Campbell et al. (2006) advocated global sensitivity analyses on the expansion of the dynamics in a well-chosen functional basis. This paper focuses on the particular case when principal components analysis is combined with analysis of variance. In addition to the indices associated with the principal components, generalised sensitivity indices are proposed to synthesize the influence of each parameter on the whole time series output. Index definitions are given when the uncertainty on the input factors is either discrete or continuous and when the dynamic model is either discrete or functional. A general estimation algorithm is proposed, based on classical methods of global sensitivity analysis. The method is applied to a dynamic wheat crop model with 13 uncertain parameters. Three methods of global sensitivity analysis are compared: the Sobol'-Saltelli method, the extended FAST method, and the fractional factorial design of resolution 6.

  16. Multivariate sensitivity analysis to measure global contribution of input factors in dynamic models

    Energy Technology Data Exchange (ETDEWEB)

    Lamboni, Matieyendou [INRA, Unite MIA (UR341), F78352 Jouy en Josas Cedex (France); Monod, Herve, E-mail: herve.monod@jouy.inra.f [INRA, Unite MIA (UR341), F78352 Jouy en Josas Cedex (France); Makowski, David [INRA, UMR Agronomie INRA/AgroParisTech (UMR 211), BP 01, F78850 Thiverval-Grignon (France)

    2011-04-15

    Many dynamic models are used for risk assessment and decision support in ecology and crop science. Such models generate time-dependent model predictions, with time either discretised or continuous. Their global sensitivity analysis is usually applied separately on each time output, but Campbell et al. (2006) advocated global sensitivity analyses on the expansion of the dynamics in a well-chosen functional basis. This paper focuses on the particular case when principal components analysis is combined with analysis of variance. In addition to the indices associated with the principal components, generalised sensitivity indices are proposed to synthesize the influence of each parameter on the whole time series output. Index definitions are given when the uncertainty on the input factors is either discrete or continuous and when the dynamic model is either discrete or functional. A general estimation algorithm is proposed, based on classical methods of global sensitivity analysis. The method is applied to a dynamic wheat crop model with 13 uncertain parameters. Three methods of global sensitivity analysis are compared: the Sobol'-Saltelli method, the extended FAST method, and the fractional factorial design of resolution 6.
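
    A sketch of the generalised indices under stated assumptions: Sobol' first-order indices (via SALib) are computed for each principal component score of the time-series output and then averaged with weights given by explained variance; the model, bounds and component count are placeholders.

```python
# Hedged sketch: PCA of dynamic outputs + Sobol' analysis per component,
# aggregated into one generalised first-order index per input factor.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol
from sklearn.decomposition import PCA

problem = {"num_vars": 3,
           "names": ["p1", "p2", "p3"],
           "bounds": [[0.0, 1.0]] * 3}           # placeholder factors

X = saltelli.sample(problem, 1024)
Y = np.array([my_dynamic_model(x) for x in X])   # (n_runs, n_timesteps); the
                                                 # dynamic model is user-supplied

pca = PCA(n_components=4).fit(Y)
scores = pca.transform(Y)
S1 = np.array([sobol.analyze(problem, scores[:, k])["S1"] for k in range(4)])
generalised = pca.explained_variance_ratio_ @ S1  # one index per factor
```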

  17. Chemometrics Methods for Specificity, Authenticity and Traceability Analysis of Olive Oils: Principles, Classifications and Applications

    Directory of Open Access Journals (Sweden)

    Habib Messai

    2016-11-01

    Full Text Available Background. Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic types. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blends’ preparation, leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. Methods. These quality control and management aims require the use of several multivariate statistical tools: specificity highlighting requires ordination methods; authentication checking calls for classification and pattern recognition methods; traceability analysis implies the use of network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. Results. This chapter presents a review of different chemometrics methods applied for the control of OO variability from metabolic and physical-chemical measured characteristics. The different chemometrics methods are illustrated by different study cases on monovarietal and blended OOs originating from different countries. Conclusion. Chemometrics tools offer multiple ways for quantitative evaluations and qualitative control of the complex chemical variability of OO in relation to several intrinsic and extrinsic factors.

  18. Chemometrics Methods for Specificity, Authenticity and Traceability Analysis of Olive Oils: Principles, Classifications and Applications

    Science.gov (United States)

    Messai, Habib; Farman, Muhammad; Sarraj-Laabidi, Abir; Hammami-Semmar, Asma; Semmar, Nabil

    2016-01-01

    Background. Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic types. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blends’ preparation, leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. Methods. These quality control and management aims require the use of several multivariate statistical tools: specificity highlighting requires ordination methods; authentication checking calls for classification and pattern recognition methods; traceability analysis implies the use of network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. Results. This chapter presents a review of different chemometrics methods applied for the control of OO variability from metabolic and physical-chemical measured characteristics. The different chemometrics methods are illustrated by different study cases on monovarietal and blended OOs originating from different countries. Conclusion. Chemometrics tools offer multiple ways for quantitative evaluations and qualitative control of the complex chemical variability of OO in relation to several intrinsic and extrinsic factors. PMID:28231172

  19. Project-Method Fit: Exploring Factors That Influence Agile Method Use

    Science.gov (United States)

    Young, Diana K.

    2013-01-01

    While the productivity and quality implications of agile software development methods (SDMs) have been demonstrated, research concerning the project contexts where their use is most appropriate has yielded less definitive results. Most experts agree that agile SDMs are not suited for all project contexts. Several project and team factors have been…

  20. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 2, Analysis Methods

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets (CP-nets). The second volume contains a detailed presentation of the analysis methods for CP-nets. They allow the modeller to investigate dynamic properties of CP-nets. The main ideas behind the analysis methods are described, as well as the mathematics on which they are based and how the methods are supported by computer tools. Some parts of the volume are theoretical while others are application oriented. The purpose of the volume is to teach the reader how to use the formal analysis methods, which does not require a deep understanding of the underlying mathematical theory.

  1. Research on the relationship between the elements and pharmacological activities in velvet antler using factor analysis and cluster analysis

    Science.gov (United States)

    Zhou, Libing

    2017-04-01

    Velvet antler has certain effects on improving the body's immune cells and regulating immune system function and the nervous system, and on stress resistance, anti-aging and osteoporosis. It has medicinal applications in a wide range of conditions such as tissue wound healing, tumors and cardiovascular disease. Therefore, research on the relationship between the pharmacological activities and the elements in velvet antler is of great significance. The objective of this study was to comprehensively evaluate 15 kinds of elements in different varieties of velvet antler and to study the relationship between the elements and the traditional Chinese medicine efficacy for humans. Factor analysis and factor cluster analysis methods were used to analyze the element data of sika velvet antler, cervus elaphus linnaeus, flower horse hybrid velvet antler, apiti (elk) velvet antler and male reindeer velvet antler, and to find the relationships among 15 elements: Ca, P, Mg, Na, K, Fe, Cu, Mn, Al, Ba, Co, Sr, Cr, Zn and Ni. Using MATLAB 2010 and SPSS software, chemometrics methods were applied to the relationship between the elements in velvet antler and the pharmacological activities. The first common factor F1 had greater loadings on Ca, P, Mg, Co, Sr and Ni; the second common factor F2 had greater loadings on K, Mn, Zn and Cr; the third common factor F3 had greater loadings on Na, Cu and Ba; and the fourth common factor F4 had greater loadings on Fe and Al. The overall element contents of the velvet antlers ranked as elk velvet antler > flower horse hybrid velvet antler > cervus elaphus linnaeus > sika velvet antler > male reindeer velvet antler. Based on the factor analysis and the factor cluster analysis, a model for evaluating traditional Chinese medicine quality was constructed. These studies provide the scientific base and theoretical foundation for the future large-scale rational

  2. Clinical usefulness of physiological components obtained by factor analysis

    International Nuclear Information System (INIS)

    Ohtake, Eiji; Murata, Hajime; Matsuda, Hirofumi; Yokoyama, Masao; Toyama, Hinako; Satoh, Tomohiko.

    1989-01-01

    The clinical usefulness of physiological components obtained by factor analysis was assessed in 99mTc-DTPA renography. Using definite physiological components, other dynamic data could be analyzed. In this paper, dynamic renal function after ESWL (extracorporeal shock wave lithotripsy) treatment was examined using the physiological components of the kidney before ESWL and/or of a normal kidney. We could easily evaluate the change in renal function with this method. The usefulness of the new analysis using physiological components is summarized as follows: 1) The change of a dynamic function could be assessed quantitatively as a change of the contribution ratio. 2) The change of a disease condition could be evaluated morphologically as a change of the functional image. (author)

  3. A Cross-Section Adjustment Method for Double Heterogeneity Problem in VHTGR Analysis

    International Nuclear Information System (INIS)

    Yun, Sung Hwan; Cho, Nam Zin

    2011-01-01

    Very High Temperature Gas-Cooled Reactors (VHTGRs) draw strong interest as candidates for a Gen-IV reactor concept, in which TRISO (tristructural-isotropic) fuel is employed to enhance fuel performance. However, the TRISO fuel particles randomly dispersed in a graphite matrix induce the so-called double heterogeneity problem. For the design and analysis of such reactors, the Monte Carlo method is widely used due to its complex-geometry and continuous-energy capabilities. However, its huge computational burden, even with modern high computing power, still makes whole-core analysis problematic in the reactor design procedure. To address the double heterogeneity problem using conventional lattice codes, the RPT (Reactivity-equivalent Physical Transformation) method considers a homogenized fuel region that is geometrically transformed to provide an equivalent self-shielding effect. Another method is the coupled Monte Carlo/Collision Probability method, in which the absorption and nu-fission resonance cross-section libraries in the deterministic CPM3 lattice code are modified group-wise by double heterogeneity factors determined from Monte Carlo results. In this paper, a new two-step Monte Carlo homogenization method is described as an alternative to the methods above. In the new method, a single cross-section adjustment factor is introduced to provide a self-shielding effect equivalent to the self-shielding in the heterogeneous geometry for a unit cell of compact fuel. Then, the homogenized fuel compact material with the equivalent cross-section adjustment factor is used in continuous-energy Monte Carlo calculations for various types of fuel blocks (or assemblies). The procedure of cross-section adjustment is implemented in the MCNP5 code.

  4. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  5. Analysis of soybean production and import trends and its import factors in Indonesia

    Science.gov (United States)

    Ningrum, I. H.; Irianto, H.; Riptanti, E. W.

    2018-03-01

    This study aims to analyze the factors affecting soybean imports in Indonesia and to determine the trend and projection of Indonesian soybean production and imports in 2016-2020. The basic method used in this research is the descriptive analysis method. The data used are secondary data in the form of time series data from 1979-2015. Data were analyzed using a simultaneous equations model with the 2SLS (Two-Stage Least Squares) method and trend analysis. The results showed that the factors affecting soybean imports in Indonesia are consumption and production: consumption has a positive effect while production has a negative effect. The percentage change in soybean imports is greater than the percentage change in the consumption and production of soybeans. Consumption is positively influenced by imports and production, while production is influenced positively by consumption and negatively by imports. The production trend in 2016-2020 has a tendency to increase, by 11.18% per year on average. Production in 2016 is projected at 1,110,537 tons, increasing to 1,721,350 tons in 2020. The import trend in 2016-2020 has a tendency to increase, by an average of 4.13% per year. Imports in 2016 are projected at 2,224,188 tons, increasing to 2,611,270 tons in 2020.
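
    A minimal two-stage least squares estimator for one equation of such a system can be written in plain NumPy; the variable roles below (imports regressed on consumption and production, with production instrumented) are an illustration, not the paper's exact specification.

```python
# Hedged sketch: 2SLS for a single equation of a simultaneous system.
# Stage 1: regress the endogenous regressor(s) on instruments + exogenous vars;
# Stage 2: OLS of y on the fitted endogenous values + exogenous vars.
import numpy as np

def two_stage_ls(y, X_endog, X_exog, Z):
    n = len(y)
    W = np.column_stack([Z, X_exog, np.ones(n)])            # stage-1 design
    X_hat = W @ np.linalg.lstsq(W, X_endog, rcond=None)[0]  # fitted endogenous
    X2 = np.column_stack([X_hat, X_exog, np.ones(n)])       # stage-2 design
    beta = np.linalg.lstsq(X2, y, rcond=None)[0]
    return beta  # coefficients for [endogenous, exogenous, intercept]
```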

  6. Multivariate analysis: models and method

    International Nuclear Information System (INIS)

    Sanz Perucha, J.

    1990-01-01

    Data treatment techniques are increasingly used as computer methods become more widely accessible. Multivariate analysis consists of a group of statistical methods that are applied to study objects or samples characterized by multiple values. The final goal is decision making. The paper describes the models and methods of multivariate analysis.

  7. Areva fatigue concept. Fast fatigue evaluation, a new method for fatigue analysis

    International Nuclear Information System (INIS)

    Heinz, Benedikt; Bergholz, Steffen; Rudolph, Juergen

    2011-01-01

    Within the discussions on the long term operation (LTO) of nuclear power plants, ageing management is a central focus of analysis. Knowledge of the operational thermal cyclic load data on plant components and their evaluation in fatigue analysis is a central concern. Recently discussed changes in fatigue requirements (e.g. the consideration of environmentally assisted fatigue, EAF) and LTO efforts are a strong motivation for identifying margins in the existing fatigue analysis approaches. These margins should be considered within new approaches in order to obtain realistic (or more accurate) analysis results. Of course, these new analysis approaches have to be manageable and efficient. The Areva Fatigue Concept (AFC) offers a comprehensive conceptual basis for the consideration of fatigue at different levels and depths. The combination of data logging and automated fatigue evaluation forms an important module of the AFC. Besides the established simplified stress-based fatigue estimation, Areva is developing a further automated fatigue analysis method called Fast Fatigue Evaluation (FFE). This method comprises highly automated stress analyses at the fatigue-relevant locations of the component. A component-specific stress history as a function of time is determined based on FAMOS or similar temperature measurement systems. The subsequent application of the rainflow cycle counting algorithm allows the usage factor to be determined following the rules of the design code. The new FFE approach thus constitutes a cycle counting method based on the real stresses in the component and yields, as a result, a code-conforming cumulative usage factor. (orig.)

  8. Impact of the Choice of Normalization Method on Molecular Cancer Class Discovery Using Nonnegative Matrix Factorization.

    Science.gov (United States)

    Yang, Haixuan; Seoighe, Cathal

    2016-01-01

    Nonnegative Matrix Factorization (NMF) has proved to be an effective method for unsupervised clustering analysis of gene expression data. Owing to the nonnegativity constraint, NMF provides a decomposition of the data matrix into two matrices that have been used for clustering analysis. However, the decomposition is not unique, which allows different clustering results to be obtained and leads to different interpretations of the decomposition. To alleviate this problem, some existing methods directly enforce uniqueness to some extent by adding regularization terms to the NMF objective function. Alternatively, various normalization methods have been applied to the factor matrices; however, the effects of the choice of normalization have not been carefully investigated. Here we investigate the performance of NMF for the task of cancer class discovery under a wide range of normalization choices. After extensive evaluations, we observe that the maximum norm showed the best performance, although the maximum norm had not previously been used for NMF. Matlab codes are freely available from: http://maths.nuigalway.ie/~haixuanyang/pNMF/pNMF.htm.
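
    The paper's reference implementation is in Matlab (link above); the Python sketch below only illustrates the general idea of NMF-based class discovery with maximum-norm normalization of a factor matrix. The cluster-assignment convention (each sample goes to its dominant metagene) is an assumption, not necessarily the paper's exact pipeline.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    def nmf_clusters(expression, k, random_state=0):
        """expression: nonnegative (genes x samples) matrix; returns a label per sample."""
        model = NMF(n_components=k, init="nndsvd", max_iter=1000, random_state=random_state)
        W = model.fit_transform(expression)  # genes x k metagenes
        H = model.components_                # k x samples coefficients
        # Maximum-norm normalization: scale each metagene's coefficient row
        # so that its largest entry equals one.
        H_norm = H / H.max(axis=1, keepdims=True)
        # Assign each sample to the metagene with the largest normalized coefficient.
        return H_norm.argmax(axis=0)

    rng = np.random.default_rng(1)
    data = np.abs(rng.normal(size=(200, 30)))  # toy gene-expression matrix
    print(nmf_clusters(data, k=3))
    ```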

  9. The recovery factors analysis of the human errors for research reactors

    International Nuclear Information System (INIS)

    Farcasiu, M.; Nitoi, M.; Apostol, M.; Turcu, I.; Florescu, Ghe.

    2006-01-01

    The results of many Probabilistic Safety Assessment (PSA) studies show a very significant contribution of human errors to the unavailability of systems in nuclear installations. The treatment of human interactions is considered one of the major limitations in the context of PSA. Human Reliability Analysis (HRA) must be applied to identify those human actions that can affect system reliability or availability. The analysis of recovery factors for human actions is an important step in HRA. This paper presents how human error probabilities (HEP) can be reduced using those elements that have the capacity to recover a human error. Recovery factor modeling aims to identify error-likely situations or situations that lead to the development of an accident. The analysis is performed with the THERP method. The necessary information was obtained from the operating experience of the TRIGA research reactor of INR Pitesti. The required data were obtained from generic databases. (authors)

  10. COMPETITIVE INTELLIGENCE ANALYSIS - SCENARIOS METHOD

    Directory of Open Access Journals (Sweden)

    Ivan Valeriu

    2014-07-01

    Keeping a company among the top performers in the relevant market depends not only on its ability to develop continually, sustainably and in a balanced manner, to the standards set by customers and competition, but also on its ability to protect its strategic information and to know the strategic information of the competition in advance. In addition, given that economic markets, regardless of their profile, enable interconnection not only among domestic companies but also between domestic and foreign companies, the issue of economic competition moves from national economies to the field of interest of regional and international economic organizations. The stake for each economic player is to keep ahead of the competition and to be always prepared to face market challenges. It therefore needs to know, as early as possible, how to react to others' strategies in terms of research, production and sales. If a competitor is planning to produce more and more cheaply, then the company must be prepared to counteract this move quickly. Competitive intelligence helps to evaluate the capabilities of competitors in the market, legally and ethically, and to develop response strategies. One of the main goals of competitive intelligence is early warning and the prevention of surprises that could have a major impact on a company's market share, reputation, turnover and profitability in the medium and long term. This paper presents some aspects of competitive intelligence, mainly in terms of information analysis and intelligence generation. The presentation is theoretical and addresses a structured method of information analysis - the scenarios method - in a version that combines several types of analysis in order to reveal some interconnecting aspects of the factors governing the activity of a company.

  11. Analysis apparatus and method of analysis

    International Nuclear Information System (INIS)

    1976-01-01

    A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed, whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique

  12. Investigation of evaluation methods for human factors education effectiveness

    International Nuclear Information System (INIS)

    Yoshimura, Seiichi; Fujimoto, Junzo; Sasou Kunihide; Hasegawa, Naoko

    2004-01-01

    Education effectiveness commensurate with investment is required in the stream of electric power deregulation. Therefore, evaluation methods for the effectiveness of human factors education, which can track the process by which a human factors culture pervades an organization, were investigated through research activities on education effectiveness in universities and actual in-house education in industrial companies. As a result, considering the purpose of human factors education, the content of the evaluation was found to be the change in attitude toward human factors and improvement proposals in workplaces. A questionnaire was found to be a suitable form of evaluation. In addition, evaluation is desirable both just after education and after some period back in the workplace. Hereafter, data will be collected using these two kinds of questionnaires in human factors education courses at CRIEPI and some education courses at utilities. Thus, an education effectiveness evaluation method suitable for human factors will be established. (author)

  13. A feasibility study on age-related factors of wrist pulse using principal component analysis.

    Science.gov (United States)

    Jang-Han Bae; Young Ju Jeon; Sanghun Lee; Jaeuk U Kim

    2016-08-01

    Various analysis methods for examining wrist pulse characteristics are needed for accurate pulse diagnosis. In this feasibility study, principal component analysis (PCA) was performed to observe age-related factors of the wrist pulse across various analysis parameters. Forty subjects in their 20s and 40s participated, and their wrist pulse signal and respiration signal were acquired with a pulse tonometric device. After pre-processing of the signals, twenty analysis parameters that have been regarded as values reflecting pulse characteristics were calculated, and PCA was performed. As a result, the complex parameters could be reduced to a lower dimension, and age-related factors of the wrist pulse were observed in new combined analysis parameters derived from PCA. These results demonstrate that PCA can be a useful tool for analyzing the wrist pulse signal.
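
    A minimal sketch of the PCA step described above, with a synthetic matrix standing in for the forty subjects' twenty pulse parameters:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    params = rng.normal(size=(40, 20))  # 40 subjects x 20 pulse analysis parameters

    # Standardize, then project onto the leading principal components.
    scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(params))
    # Each subject is now described by 3 combined parameters; comparing the score
    # distributions of the 20s and 40s groups would expose age-related components.
    print(scores.shape)  # (40, 3)
    ```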

  14. Monte Carlo methods for the reliability analysis of Markov systems

    International Nuclear Information System (INIS)

    Buslik, A.J.

    1985-01-01

    This paper presents Monte Carlo methods for the reliability analysis of Markov systems. Markov models are useful in treating dependencies between components. The present paper shows how the adjoint Monte Carlo method for the continuous time Markov process can be derived from the method for the discrete-time Markov process by a limiting process. The straightforward extensions to the treatment of mean unavailability (over a time interval) are given. System unavailabilities can also be estimated; this is done by making the system failed states absorbing, and not permitting repair from them. A forward Monte Carlo method is presented in which the weighting functions are related to the adjoint function. In particular, if the exact adjoint function is known then weighting factors can be constructed such that the exact answer can be obtained with a single Monte Carlo trial. Of course, if the exact adjoint function is known, there is no need to perform the Monte Carlo calculation. However, the formulation is useful since it gives insight into choices of the weight factors which will reduce the variance of the estimator
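
    As a concrete illustration of the forward simulation underlying such methods, the sketch below estimates mean unavailability over a time interval for a single repairable component modeled as a continuous-time Markov process; the rates are invented, and the paper's adjoint-based variance reduction is not reproduced.

    ```python
    import random

    def mean_unavailability(lam, mu, T, trials=20000, seed=42):
        """Time-averaged unavailability over [0, T] for failure rate lam, repair rate mu."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(trials):
            t, up, down_time = 0.0, True, 0.0
            while t < T:
                dwell = rng.expovariate(lam if up else mu)  # time to next transition
                dwell = min(dwell, T - t)                   # truncate at the horizon
                if not up:
                    down_time += dwell
                t += dwell
                up = not up
            total += down_time / T
        return total / trials

    # For large T this approaches the steady-state value lam/(lam+mu) = 0.0099...
    print(mean_unavailability(lam=0.01, mu=1.0, T=1000.0))
    ```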

  15. Dirac equation in low dimensions: The factorization method

    Energy Technology Data Exchange (ETDEWEB)

    Sánchez-Monroy, J.A., E-mail: antosan@if.usp.br [Instituto de Física, Universidade de São Paulo, 05508-090, São Paulo, SP (Brazil); Quimbay, C.J., E-mail: cjquimbayh@unal.edu.co [Departamento de Física, Universidad Nacional de Colombia, Bogotá, D. C. (Colombia); CIF, Bogotá (Colombia)

    2014-11-15

    We present a general approach to solve the (1+1) and (2+1)-dimensional Dirac equations in the presence of static scalar, pseudoscalar and gauge potentials, for the case in which the potentials have the same functional form and thus the factorization method can be applied. We show that the presence of electric potentials in the Dirac equation leads to two Klein–Gordon equations including an energy-dependent potential. We then generalize the factorization method for the case of energy-dependent Hamiltonians. Additionally, the shape invariance is generalized for a specific class of energy-dependent Hamiltonians. We also present a condition for the absence of the Klein paradox (stability of the Dirac sea), showing how Dirac particles in low dimensions can be confined for a wide family of potentials. - Highlights: • The low-dimensional Dirac equation in the presence of static potentials is solved. • The factorization method is generalized for energy-dependent Hamiltonians. • The shape invariance is generalized for energy-dependent Hamiltonians. • The stability of the Dirac sea is related to the existence of supersymmetric partner Hamiltonians.
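
    For orientation, the factorization idea the abstract builds on can be written schematically as follows (generic supersymmetric-quantum-mechanics notation, not the paper's):

    ```latex
    % Factorization of a 1D Hamiltonian into ladder operators with
    % superpotential W(x) and factorization energy \epsilon.
    \begin{aligned}
    A &= \frac{d}{dx} + W(x), \qquad A^{\dagger} = -\frac{d}{dx} + W(x),\\
    H &= A^{\dagger}A + \epsilon = -\frac{d^{2}}{dx^{2}} + W^{2}(x) - W'(x) + \epsilon,\\
    \tilde{H} &= A A^{\dagger} + \epsilon = -\frac{d^{2}}{dx^{2}} + W^{2}(x) + W'(x) + \epsilon.
    \end{aligned}
    ```

    The partner Hamiltonians H and H-tilde share their spectrum up to possibly the ground state, which is what shape invariance exploits; the paper's contribution is generalizing this construction to the energy-dependent Hamiltonians that arise from electric potentials in the Dirac equation.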

  16. Some new results on correlation-preserving factor scores prediction methods

    NARCIS (Netherlands)

    Ten Berge, J.M.F.; Krijnen, W.P.; Wansbeek, T.J.; Shapiro, A.

    1999-01-01

    Anderson and Rubin and McDonald have proposed a correlation-preserving method of factor scores prediction which minimizes the trace of a residual covariance matrix for variables. Green has proposed a correlation-preserving method which minimizes the trace of a residual covariance matrix for factors.

  17. Calculation method for residual stress analysis of filament-wound spherical pressure vessels

    International Nuclear Information System (INIS)

    Knight, C.E. Jr.

    1976-01-01

    Filament-wound spherical pressure vessels may be produced with very high performance factors, where the performance factor is calculated as contained pressure times enclosed volume divided by structure weight. A number of parameters are important in determining the level of performance achieved. One of these is the residual stress state in the fabricated unit: a significant unfavorable residual stress state could seriously impair the performance of the vessel. Residual stresses are of more concern for vessels with relatively thick walls and/or vessels constructed with the highly anisotropic graphite or aramid fibers. A method is established for measuring these stresses. A theoretical model of the composite structure is required. Data collection procedures and techniques are developed, and the data are reduced by means of the model to yield the residual stress analysis. The analysis method can be used in process parameter studies to establish the best fabrication procedures

  18. Multivariate analysis methods in physics

    International Nuclear Information System (INIS)

    Wolter, M.

    2007-01-01

    A review of multivariate methods based on statistical training is given. Several multivariate methods useful in high-energy physics analysis are discussed. Selected examples from current research in particle physics are presented, both from on-line trigger selection and from off-line analysis. Statistical training methods are also described and some new applications are suggested

  19. Pediatric nurses' perception of factors associated with caring self-efficacy: A qualitative content analysis.

    Science.gov (United States)

    Alavi, Azam; Bahrami, Masoud; Zargham-Boroujeni, Ali; Yousefy, Alireza

    2015-01-01

    Nurses, who form the largest group of professional healthcare providers, face the challenge of maintaining, promoting, and providing quality nursing care, and of preparing themselves to function confidently and to care effectively. Among the factors affecting nursing performance, self-efficacy has been expected to have the greatest influence. However, the concept of caring self-efficacy has not been considered, and no research had been done in this field in Iran. This study was conducted to explore and identify the factors described by pediatric nurses as related to caring self-efficacy. It is a qualitative study conducted through content analysis in 2013 in Iran. Twenty-four participants were selected through a purposive sampling method from pediatric nurses and educators. Data were collected through semi-structured interviews and analyzed using the conventional content analysis method. The analysis of the interviews led to the development of four main themes as factors influencing pediatric nurses' perception of caring self-efficacy: (1) professional knowledge of caring for children, (2) experience, (3) caring motivation, and (4) an efficient educational system. This article presents the factors associated with the perception of caring self-efficacy from pediatric nurses' perspective. The findings can be used by nursing administrators and instructors, especially in the area of pediatric care, to enhance nursing professional practice and the quality of pediatric caring.

  20. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  1. Supervised Cross-Modal Factor Analysis for Multiple Modal Data Classification

    KAUST Repository

    Wang, Jingbin

    2015-10-09

    In this paper we study the problem of learning from multi-modal data for the purpose of document classification. In this problem, each document is composed of two different modalities of data, i.e., an image and a text. Cross-modal factor analysis (CFA) has been proposed to project the two modalities of data to a shared data space, so that the classification of an image or a text can be performed directly in this space. A disadvantage of CFA is that it ignores the supervision information. In this paper, we improve CFA by incorporating the supervision information to represent and classify both the image and text modalities of documents. We project both image and text data to a shared data space by factor analysis, and then train a class label predictor in the shared space to use the class label information. The factor analysis parameter and the predictor parameter are learned jointly by solving one single objective function. With this objective function, we minimize the distance between the projections of the image and text of the same document, and the classification error of the projection measured by the hinge loss function. The objective function is optimized by an alternate optimization strategy in an iterative algorithm. Experiments on two different multi-modal document data sets show the advantage of the proposed algorithm over other CFA methods.

  2. The use of graph theory in the sensitivity analysis of the model output: a second order screening method

    International Nuclear Information System (INIS)

    Campolongo, Francesca; Braddock, Roger

    1999-01-01

    Sensitivity analysis screening methods aim to isolate the most important factors in experiments involving a large number of significant factors and interactions. This paper extends the one-factor-at-a-time screening method proposed by Morris. The new method, in addition to the 'overall' sensitivity measures already provided by the traditional Morris method, offers estimates of the two-factor interaction effects. The number of model evaluations required is O(k^2), where k is the number of model input factors. The efficient sampling strategy in the parameter space is based on concepts of graph theory and on the solution of the 'handcuffed prisoner problem'
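
    The sketch below illustrates the baseline elementary-effects idea that the paper extends, here in a radial one-factor-at-a-time variant rather than Morris's original trajectories; the test function and all settings are illustrative.

    ```python
    import numpy as np

    def elementary_effects(model, k, bases=20, delta=0.1, seed=0):
        """Screen k factors on [0,1]^k: returns (mu_star, sigma) per factor."""
        rng = np.random.default_rng(seed)
        effects = [[] for _ in range(k)]
        for _ in range(bases):
            x = rng.uniform(0, 1 - delta, size=k)  # random base point
            fx = model(x)
            for i in range(k):                     # one-factor-at-a-time steps
                x_step = x.copy()
                x_step[i] += delta
                effects[i].append((model(x_step) - fx) / delta)
        mu_star = [np.mean(np.abs(e)) for e in effects]  # overall importance
        sigma = [np.std(e) for e in effects]  # flags nonlinearity/interactions
        return mu_star, sigma

    def test_model(x):  # x0 and x1 interact; x2 is nearly inert
        return x[0] + 2 * x[1] + 5 * x[0] * x[1] + 0.01 * x[2]

    print(elementary_effects(test_model, k=3))
    ```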

  3. A new modification of summary-based analysis method for large software system testing

    Directory of Open Access Journals (Sweden)

    A. V. Sidorin

    2015-01-01

    As automated testing tools become common practice, thorough computer-aided testing of large software systems, including system inter-component interfaces, is required. To achieve good coverage, one must overcome the scalability problems of different methods of analysis, which arise from the impossibility of analyzing all execution paths. The objective of this research is to build a method for inter-procedural analysis efficient enough to analyze large software systems (such as the Android OS codebase) as a whole in a reasonable time (no more than 4 hours). This article reviews existing methods of software analysis for detecting potential defects. It focuses on the symbolic execution method, since it is widely used both in static analysis of source code and in hybrid analysis of object files and intermediate representations (concolic testing). The method of symbolic execution involves separating the set of input data values into equivalence classes while choosing an execution path. The paper also considers the advantages of this method and its shortcomings. One of the main scalability problems is related to inter-procedural analysis: analysis time grows rapidly if an inlining method is used. This work therefore proposes a summary-based analysis method to solve the scalability problems. Clang Static Analyzer, an open-source static analyzer (part of the LLVM project), has been chosen as the target system. It allows us to compare the performance of inlining and summary-based inter-procedural analysis. A mathematical model for preliminary estimations is described in order to identify possible factors of performance improvement.

  4. The human factors and job task analysis in nuclear power plant operation

    International Nuclear Information System (INIS)

    Stefanescu, Petre; Mihailescu, Nicolae; Dragusin, Octavian

    1999-01-01

    For a long period during the development of NPP technology, plant hardware was considered the main factor in safe, reliable and economic operation; the industry is now moving toward an adequate division of responsibility between plant hardware and operation. Since human factors have not been treated methodically so far, there is still a lack of refined classification systems for human errors, as well as a lack of methods for the systematic design of the operator's working system, for instance by using job task analysis (J.T.A.). The J.T.A. appears to be an adequate method for studying the human factor in nuclear power plant operation, enabling easy conversion into operational improvements. While the results of the analysis of human errors tell 'what' is to be improved, the J.T.A. shows 'how' to improve, increasing the quality of the work and the safety of the operator's working system. The paper analyses the issue of setting the task and presents four criteria used to select aspects of NPP operation that require special consideration, such as personnel training, design of the control room, content and layout of the procedure manual, and organization of the operating personnel. The results are given in three tables: 1 - evaluation of deficiencies in the working system; 2 - evaluation of deficiencies in the operator's disposition; 3 - evaluation of the mental structure of operation

  5. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    Science.gov (United States)

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra-Bentler's mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra-Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic, in order to improve its robustness in small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at very small sample sizes offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas, either open or closed book, are used to illustrate the real-world performance of this statistic.

  6. Logistic regression analysis of risk factors for postoperative recurrence of spinal tumors and analysis of prognostic factors.

    Science.gov (United States)

    Zhang, Shanyong; Yang, Lili; Peng, Chuangang; Wu, Minfei

    2018-02-01

    The aim of the present study was to investigate the risk factors for postoperative recurrence of spinal tumors by logistic regression analysis and to analyze prognostic factors. In total, 77 male and 48 female patients with spinal tumors were selected in our hospital from January 2010 to December 2015 and divided into the benign (n=76) and malignant (n=49) groups. All the patients underwent microsurgical resection of spinal tumors and were reviewed regularly 3 months after the operation. The McCormick grading system was used to evaluate postoperative spinal cord function. Data were subjected to statistical analysis. Of the 125 cases, 63 cases showed improvement after the operation, 50 cases were stable, and deterioration was found in 12 cases. The improvement rate of patients with cervical spine tumors, which reached 56.3%, was the highest. Fifty-two cases of sensory disturbance, 34 cases of pain, 30 cases of inability to exercise, 26 cases of ataxia, and 12 cases of sphincter disorders were found after the operation. Seventy-two cases (57.6%) underwent total resection, 18 cases (14.4%) received subtotal resection, 23 cases (18.4%) received partial resection, and 12 cases (9.6%) were only treated with biopsy/decompression. Postoperative recurrence was found in 57 cases (45.6%). The mean recurrence time of patients in the malignant group was 27.49±6.09 months, and the mean recurrence time of patients in the benign group was 40.62±4.34 months; the difference was statistically significant. Logistic regression analysis of total resection-related factors showed that total resection should be the preferred treatment for patients with benign tumors, thoracic and lumbosacral tumors, and lower McCormick grade, as well as patients without syringomyelia and intramedullary tumors. Logistic regression analysis of recurrence-related factors revealed that the recurrence rate was relatively higher in patients with malignant, cervical, thoracic and lumbosacral, and intramedullary tumors, and higher McCormick grade.
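
    A minimal sketch of this style of analysis in Python, with synthetic data and illustrative factor names; exponentiated logistic coefficients give the odds ratios usually reported in such studies.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 125
    df = pd.DataFrame({
        "malignant": rng.integers(0, 2, n),
        "intramedullary": rng.integers(0, 2, n),
        "total_resection": rng.integers(0, 2, n),
    })
    # Synthetic outcome: recurrence more likely for malignant/intramedullary
    # tumors, less likely after total resection.
    logit = 0.9 * df["malignant"] + 0.7 * df["intramedullary"] - 1.2 * df["total_resection"]
    df["recurrence"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(df[["malignant", "intramedullary", "total_resection"]])
    fit = sm.Logit(df["recurrence"], X).fit(disp=0)
    print(np.exp(fit.params))  # odds ratios per risk factor
    ```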

  7. Data collection on the unit control room simulator as a method of operator reliability analysis

    International Nuclear Information System (INIS)

    Holy, J.

    1998-01-01

    The report consists of the following chapters: (1) Probabilistic assessment of nuclear power plant operation safety and human factor reliability analysis; (2) Simulators and simulations as human reliability analysis tools; (3) DOE project for using the collection and analysis of data from the unit control room simulator in human factor reliability analysis at the Paks nuclear power plant; (4) General requirements for the organization of the simulator data collection project; (5) Full-scale simulator at the Nuclear Power Plants Research Institute in Trnava, Slovakia, used as a training means for operators of the Dukovany NPP; (6) Assessment of the feasibility of quantification of important human actions modelled within a PSA study by employing simulator data analysis; (7) Assessment of the feasibility of using the various exercise topics for the quantification of the PSA model; (8) Assessment of the feasibility of employing the simulator in the analysis of the individual factors affecting the operator's activity; and (9) Examples of application of statistical methods in the analysis of the human reliability factor. (P.A.)

  8. [Various methods of dynamic functional analysis in human sciences and economics].

    Science.gov (United States)

    Schiltz, J

    2006-01-01

    Including the temporal and developmental dimension in the measurement of human conduct is a fundamental concern for those who do research in natural surroundings. Observing an individual day after day may give a more complete vision of how behavior works than measuring a group of individuals at a single time and analyzing the differences found among them. Unfortunately, most of the tools for analyzing individual time series call for large numbers of repeated observations. Thus, practicable longitudinal research designs often involve neither enough repeated measurements for traditional time series analyses nor enough individuals for traditional large-sample analyses. Dynamic factor analysis is a rationale and procedure for both pooling relatively short time series information across limited numbers of participants and analyzing the pooled information for its dynamic, process-relevant elements. It is a merging of two important analytical tools - multivariate time series and the common factor model - from which it distinguishes itself mainly by the fact that, in dynamic factor analysis, the values of the common factors can influence the values of the observed variables both concurrently and in delayed fashion. Dynamic factor analysis is thus a method that allows detecting structures in the time series as well as the relations between the series and explanatory variables. We illustrate the different models used in psychology and the social sciences, as well as in econometrics and economics.

  9. Multinomial Response Models, for Modeling and Determining Important Factors in Different Contraceptive Methods in Women

    Directory of Open Access Journals (Sweden)

    E Haji Nejad

    2001-06-01

    Different aspects of multinomial statistical modeling and its classifications have been studied so far. In this type of problem, Y is a qualitative random variable with T possible states, which are considered as classifications. The goal is prediction of Y based on a random vector X ∈ ℝ^m. Many methods for analyzing these problems have been considered. One modern and general method of classification is Classification and Regression Trees (CART). Another is recursive partitioning techniques, which have a strong relationship with nonparametric regression. Classical discriminant analysis is a standard method for analyzing this type of data. Flexible discriminant analysis is a combination of nonparametric regression and discriminant analysis, and classification using splines includes least squares regression and additive cubic splines. Neural networks are an advanced statistical method for analyzing these types of data. In this paper, the properties of multinomial logistic regression were investigated, and this method was used for modeling the factors affecting the selection of contraceptive methods by married women aged 15-49 in Ghom province. The response variable has a tetranomial distribution, with levels: nothing, pills, traditional methods, and a collection of other contraceptive methods. The significant independent variables were: place, age of women, education, history of pregnancy and family size. Menstruation age and age at marriage were not statistically significant.
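
    A minimal sketch of a multinomial logistic regression with a four-level response, mirroring the setup above; the predictors and data are synthetic, and scikit-learn (whose lbfgs solver fits the multinomial model by default) stands in for whatever software the study used.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 500
    # Illustrative predictors: age, years of education, family size
    X = np.column_stack([
        rng.uniform(15, 49, n),
        rng.integers(0, 16, n),
        rng.integers(1, 8, n),
    ])
    # Response levels: 0 = nothing, 1 = pills, 2 = traditional, 3 = other methods
    y = rng.integers(0, 4, n)

    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(model.coef_.shape)           # (4 classes, 3 predictors)
    print(model.predict_proba(X[:3]))  # predicted probability of each method
    ```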

  10. Analysis of risk factors in the development of retinopathy of prematurity

    Directory of Open Access Journals (Sweden)

    Knežević Sanja

    2011-01-01

    Introduction. Retinopathy of prematurity (ROP) is a multifactorial disease that occurs most frequently in very small and very sick preterm infants, and it has been identified as the major cause of childhood blindness. Objective. The aim of this study was to evaluate ROP incidence and the risk factors associated with varying degrees of illness. Methods. The study was conducted at the Centre for Neonatology, Paediatric Clinic of the Clinical Centre Kragujevac, Serbia, from June 2006 to December 2008. Ophthalmologic screening was performed in all children with body weight lower than 2000 g or gestational age lower than 36 weeks. We analyzed eighteen postnatal and six perinatal risk factors and the group correlations for each of the risk factors. Results. Out of 317 children screened, 56 (17.7%) developed a mild form of ROP, while 68 (21.5%) developed a severe form. Univariate analysis revealed a large number of statistically significant risk factors for the development of ROP, especially the severe form. Multivariate logistic analysis further separated two independent risk factors: small birth weight (p=0.001) and damage of the central nervous system (p=0.01). Independent risk factors for transition from the mild to the severe form of ROP were identified as small birth weight (p=0.05) and perinatal risk factors (p=0.02). Conclusion. Small birth weight and central nervous system damage were risk factors for the development of ROP; perinatal risk factors were identified as significant for transition from the mild to the severe form of ROP.

  11. A method and application study on holistic decision tree for human reliability analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Sun Feng; Zhong Shan; Wu Zhiyu

    2008-01-01

    The paper introduces the Holistic Decision Tree (HDT) method, a human reliability analysis method mainly used in nuclear power plant safety assessment, and how to apply it. The focus is primarily on providing the basic framework and background of the HDT method and the steps to perform it. Influence factors and quality descriptors were formed from interviews with operators at the Qinshan Nuclear Power Plant, and HDT analyses were performed for SGTR and SLOCA based on this information. The HDT model uses a graphic tree structure to express the error rate as a function of the influence factors. The HDT method is capable of dealing with the uncertainty in HRA, and it is reliable and practical. (authors)

  12. Physics Metacognition Inventory Part II: Confirmatory factor analysis and Rasch analysis

    Science.gov (United States)

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-11-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition, planning, monitoring, evaluation, debugging, and information management. The college students' scores on the inventory were found to be reliable and related to students' physics motivation and physics grade. However, the results of the exploratory factor analysis indicated that the questionnaire could be revised to improve its construct validity. The goal of this study was to revise the questionnaire and establish its construct validity through a confirmatory factor analysis. In addition, a Rasch analysis was applied to the data to better understand the psychometric properties of the inventory and to further evaluate the construct validity. Results indicated that the final, revised inventory is a valid, reliable, and efficient tool for assessing student metacognition for physics problem solving.

  13. An overview on applied methods in the FRG to investigate human factors in control rooms of nuclear power plants

    International Nuclear Information System (INIS)

    Thomas, D.B.

    1985-01-01

    In the first half of 1984 a feasibility study was carried out with respect to the CSNI (OECD/NEA) inventory of methods for the analysis and evaluation of human factors in the control rooms of nuclear power plants. In order to enable an analysis of the methods to be made, an elementary categorization of the methods into field studies, laboratory studies and theoretical studies was performed. A further differentiation of these categories was used as the basis for a critical analysis and interpretation of the methods employed in the research plan. In the following sections, an explanation is given of the method categories used and the plans included in the investigation. A short representation is given of the breakdown of the applied methods into categories, and an analysis is made of the results. Implications for research programs are discussed. (orig./GL)

  14. Calculation of mixed mode stress intensity factors using an alternating method

    International Nuclear Information System (INIS)

    Sakai, Takayuki

    1999-01-01

    In this study, mixed mode stress intensity factors (K_I and K_II) of a square plate with a notch were calculated using a finite element alternating method. The obtained results were compared with those from a finite element method, and it was shown that the finite element alternating method can accurately estimate mixed mode stress intensity factors. Then, using this finite element alternating method, mixed mode stress intensity factors were calculated while changing the size and position of the notch, and simplified equations for them were proposed. (author)

  15. Long-term adherence to a local guideline on postoperative body temperature measurement: mixed methods analysis

    NARCIS (Netherlands)

    Storm-Versloot, Marja N.; Knops, Anouk M.; Ubbink, Dirk T.; Goossens, Astrid; Legemate, Dink A.; Vermeulen, Hester

    2012-01-01

    Aim To find out whether a successful multifaceted implementation approach of a local evidence-based guideline on postoperative body temperature measurements (BTM) was persistent over time, and which factors influenced long-term adherence. Methods Mixed methods analysis. Patient records were

  16. Binocular optical axis parallelism detection precision analysis based on Monte Carlo method

    Science.gov (United States)

    Ying, Jiaju; Liu, Bingqi

    2018-02-01

    Starting from the working principle of the digital calibration instrument for the optical axis parallelism of binocular photoelectric instruments, the various factors affecting system precision are analyzed for all components of the instrument, and a precision analysis model is established. Based on the error distributions, the Monte Carlo method is used to analyze the relationship between the comprehensive error and the change of the center coordinate of the circular target image. The method can further guide the allocation of errors, optimally control the factors that have greater influence on the comprehensive error, and improve the measurement accuracy of the optical axis parallelism digital calibration instrument.
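
    A minimal sketch of this style of Monte Carlo precision analysis: draw each component error from an assumed distribution, push the draws through a model of the target-image center coordinate, and read the comprehensive error off the resulting spread. The error model and every tolerance below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000

    # Hypothetical component error sources (radians), each with its own tolerance
    collimator_err = rng.normal(0.0, 2e-5, N)
    detector_err = rng.uniform(-3e-5, 3e-5, N)
    mounting_err = rng.normal(0.0, 1e-5, N)

    f = 0.5  # assumed focal length of the collimating optics, metres
    # Simple linear model for the shift of the target-image center coordinate
    center_shift = f * (collimator_err + detector_err + mounting_err)

    print("std of comprehensive error:", center_shift.std())
    print("99% bound:", np.percentile(np.abs(center_shift), 99))
    ```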

  17. Reliability Analysis of Offshore Jacket Structures with Wave Load on Deck using the Model Correction Factor Method

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Friis-Hansen, P.; Nielsen, J.S.

    2006-01-01

    failure/collapse of jacket type platforms with wave in deck loads using the so-called Model Correction Factor Method (MCFM). A simple representative model for the RSR measure is developed and used in the MCFM technique. A realistic example is evaluated and it is seen that it is possible to perform...

  18. Human factors estimation methods in nuclear power plant

    International Nuclear Information System (INIS)

    Takano, Kenichi; Yoshino, Kenji; Nagasaka, Akihiko

    1986-01-01

    The definitions and models of mental workload are investigated; the simplest and most reasonable one is the single-channel model, in which the channel has limited capacity. The capacity depends on the time required for brain information processing, such as recognition by eyes or ears, judgment based on memory or experience, and action. In this paper the mental workload is defined as the time needed for such information processing relative to the total capacity. Based on the above definition, a model experiment was carried out: the main task was a simple addition task on two digits displayed on a CRT, with the presentation speed varied from 10 cycles/min to 60 cycles/min. Four techniques to measure the mental workload - (1) the task time analysis method, (2) the physiological method, (3) the secondary task method, and (4) the subjective method - are examined with respect to sensitivity and validity. The values obtained by the physiological method, the secondary task method and the subjective method are compared to the task time analysis results, because the task time analysis method is most faithful to the definition. (author)

  19. Scale factor measure method without turntable for angular rate gyroscope

    Science.gov (United States)

    Qi, Fangyi; Han, Xuefei; Yao, Yanqing; Xiong, Yuting; Huang, Yuqiong; Wang, Hua

    2018-03-01

    In this paper, a scale factor test method that requires no turntable is designed for the angular rate gyroscope. A test system consisting of a test device, a data acquisition circuit and data processing software based on the Labview platform is designed. Taking advantage of the gyroscope's sensitivity to angular rate, a gyroscope with a known scale factor serves as a standard gyroscope. The standard gyroscope is installed on the test device together with the gyroscope under test. By shaking the test device around the edge that is parallel to the input axes of the gyroscopes, the scale factor of the gyroscope under test can be obtained in real time by the data processing software. This test method is fast, and it keeps the test system miniaturized and easy to carry or move. Measuring a quartz MEMS gyroscope's scale factor multiple times by this method, the difference is less than 0.2%. Compared with turntable testing, the scale factor difference is less than 1%. The accuracy and repeatability of the test system appear good.
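
    A minimal sketch of the underlying comparison: with both gyroscopes sensing the same shaking motion, the scale factor of the gyroscope under test is the least-squares slope of its output against the reference rate recovered from the standard gyroscope. The signals below are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 2000)
    true_rate = 20 * np.sin(2 * np.pi * 1.5 * t)  # deg/s, shared by both gyros

    k_standard = 1.00  # known scale factor of the standard gyroscope
    standard_out = k_standard * true_rate + rng.normal(0, 0.05, t.size)
    k_true = 0.87      # unknown scale factor to be recovered
    measured_out = k_true * true_rate + rng.normal(0, 0.05, t.size)

    # Reference rate from the standard gyro, then least-squares slope
    rate_ref = standard_out / k_standard
    slope = np.dot(measured_out, rate_ref) / np.dot(rate_ref, rate_ref)
    print(f"estimated scale factor: {slope:.4f}")  # ~0.87
    ```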

  20. Human factors review for Severe Accident Sequence Analysis (SASA)

    International Nuclear Information System (INIS)

    Krois, P.A.; Haas, P.M.; Manning, J.J.; Bovell, C.R.

    1984-01-01

    The paper discusses work conducted during this human factors review, including: (1) support of the Severe Accident Sequence Analysis (SASA) Program based on an assessment of operator actions, and (2) development of a descriptive model of operator severe accident management. Research by SASA analysts on the Browns Ferry Unit One (BF1) anticipated transient without scram (ATWS) was supported through a concurrent assessment of operator performance to demonstrate the contributions of human factors data and methods to SASA analyses. A descriptive model was developed, called the Function Oriented Accident Management (FOAM) model, which serves as a structure for bridging human factors, operations, and engineering expertise and is useful for identifying needs and deficiencies in the area of accident management. The assessment of human factors issues related to ATWS required extensive coordination with SASA analysts. The analysis was consolidated primarily around six operator actions identified in the Emergency Procedure Guidelines (EPGs) as being the most critical to the accident sequence. These actions were assessed through simulator exercises, qualitative reviews, and quantitative human reliability analyses. The FOAM descriptive model takes as a starting point that multiple operator/system failures exceed the scope of procedures and necessitate a knowledge-based emergency response by the operators. The FOAM model provides a functionally oriented structure for assembling human factors, operations, and engineering data and expertise into operator guidance for unconventional emergency responses to mitigate severe accident progression and avoid or minimize core degradation. Operators must also respond to potential radiological releases beyond plant protective barriers. Research needs in accident management and potential uses of the FOAM model are described. 11 references, 1 figure

  1. Methods for Analysis of Urban Energy Systems: A New York City Case Study

    Science.gov (United States)

    Howard, Bianca

    This dissertation describes methods developed for analysis of the New York City energy system. The analysis specifically considers the built environment and its impacts on greenhouse gas (GHG) emissions. Several contributions to the urban energy systems literature were made. First, estimates of the annual energy intensities of the New York building stock were derived using a statistical analysis that leveraged energy consumption and tax assessor data collected by the Office of the Mayor. These estimates provided the basis for an assessment of the spatial distribution of building energy consumption. The energy consumption estimates were then leveraged to estimate the potential for combined heat and power (CHP) systems in New York City at both the building and microgrid scales. In aggregate, given the 2009 non-baseload GHG emissions factors for electricity production, these systems could reduce citywide GHG emissions by 10%. The operational characteristics of CHP systems were explored further, considering different prime movers, climates, and GHG emissions factors. A combination of mixed integer linear programming and controlled random search algorithms was used to determine the optimal capacity and operating strategies for the CHP systems under the various scenarios. Lastly, a multi-regional unit commitment model of electricity and GHG emissions production for New York State was developed using data collected from several publicly available sources. The model was used to estimate average and marginal GHG emissions factors for New York State and New York City. The analysis found that marginal GHG emissions factors could fall by 30%, to 370 g CO2e/kWh, over the next 10 years.
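
    As an illustration of the mixed integer linear programming component mentioned above, the sketch below dispatches a single CHP unit against grid purchases over a few hours to minimize cost. All capacities, prices and loads are toy numbers, and PuLP stands in for whatever solver framework the dissertation used.

    ```python
    from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, value

    hours = range(4)
    load = [300, 450, 500, 350]        # kW electric demand per hour
    chp_cap = 400                      # kW CHP capacity
    chp_cost, grid_cost = 0.08, 0.15   # $/kWh
    commit_cost = 20.0                 # $ per committed hour (start-up logic omitted)

    on = LpVariable.dicts("on", hours, cat=LpBinary)
    chp = LpVariable.dicts("chp", hours, lowBound=0)
    grid = LpVariable.dicts("grid", hours, lowBound=0)

    prob = LpProblem("chp_dispatch", LpMinimize)
    prob += lpSum(chp_cost * chp[h] + grid_cost * grid[h] + commit_cost * on[h]
                  for h in hours)
    for h in hours:
        prob += chp[h] + grid[h] == load[h]      # meet demand
        prob += chp[h] <= chp_cap * on[h]        # capacity only when committed
        prob += chp[h] >= 0.3 * chp_cap * on[h]  # minimum stable output

    prob.solve()
    print([(value(chp[h]), value(grid[h])) for h in hours])
    ```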

  2. Genome-wide identification of the regulatory targets of a transcription factor using biochemical characterization and computational genomic analysis

    Directory of Open Access Journals (Sweden)

    Jolly Emmitt R

    2005-11-01

    Background: A major challenge in computational genomics is the development of methodologies that allow accurate genome-wide prediction of the regulatory targets of a transcription factor. We present a method for target identification that combines experimental characterization of binding requirements with computational genomic analysis. Results: Our method identified potential target genes of the transcription factor Ndt80, a key transcriptional regulator involved in yeast sporulation, using the combined information of binding affinity, positional distribution, and conservation of the binding sites across multiple species. We have also developed a mathematical approach to compute the false positive rate and the total number of targets in the genome based on the multiple selection criteria. Conclusion: We have shown that combining biochemical characterization and computational genomic analysis leads to accurate identification of the genome-wide targets of a transcription factor. The method can be extended to other transcription factors and can complement other genomic approaches to transcriptional regulation.

  3. Session-RPE Method for Training Load Monitoring: Validity, Ecological Usefulness, and Influencing Factors

    Directory of Open Access Journals (Sweden)

    Monoem Haddad

    2017-11-01

    Purpose: The aim of this review is to (1) retrieve all data validating the session rating of perceived exertion (session-RPE) method using various criteria, (2) highlight the rationale of this method and its ecological usefulness, and (3) describe factors that can alter RPE and that users of this method should take into consideration. Method: Search engines such as SPORTDiscus, PubMed, and Google Scholar were consulted for English-language studies on the validity and usefulness of the session-RPE method published between 2001 and 2016. Studies were considered for further analysis when they used the session-RPE method proposed by Foster et al. in 2001. Participants were athletes of any gender, age, or level of competition. Studies in languages other than English were excluded from the analysis of the validity and reliability of the session-RPE method. Other studies were examined to explain the rationale of the session-RPE method and the origin of RPE. Results: A total of 950 studies cited the Foster et al. study that proposed the session-RPE method. Thirty-six studies have examined the validity and reliability of this method using the modified CR-10 scale. Conclusion: These studies confirmed the validity, good reliability and internal consistency of the session-RPE method in several sports and physical activities, with men and women of different age categories (children, adolescents, and adults) and various expertise levels. This method can be used as a stand-alone method for training load (TL) monitoring purposes, though some recommend combining it with other physiological parameters such as heart rate.
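
    The calculation at the heart of the method fits in a few lines: training load in arbitrary units is the CR-10 RPE, collected about 30 minutes after the session, multiplied by the session duration in minutes. The sketch below also computes the weekly monotony and strain indices commonly derived from the daily loads in this literature; the example values are invented.

    ```python
    from statistics import mean, pstdev

    def session_load(rpe_cr10: float, duration_min: float) -> float:
        """Session-RPE training load in arbitrary units (AU)."""
        return rpe_cr10 * duration_min

    # One week of (RPE, minutes) pairs, including two rest days
    week = [session_load(rpe, mins)
            for rpe, mins in [(6, 60), (4, 45), (8, 90), (3, 30), (7, 75), (0, 0), (0, 0)]]

    weekly_load = sum(week)
    monotony = mean(week) / pstdev(week)  # low day-to-day variation -> high monotony
    strain = weekly_load * monotony
    print(weekly_load, round(monotony, 2), round(strain))
    ```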

  4. Structural studies of formic acid using partial form-factor analysis

    International Nuclear Information System (INIS)

    Swan, G.; Dore, J.C.; Bellissent-Funel, M.C.

    1993-01-01

    Neutron diffraction measurements have been made of liquid formic acid using H/D isotopic substitution. Data are recorded for samples of DCOOD, HCOOD and a (H/D)COOD mixture (α_D = 0.36). A first-order difference method is used to determine the intra-molecular contribution through the introduction of a partial form-factor analysis technique incorporating a hydrogen-bond term. The method improves the sensitivity of the parameters defining the molecular geometry and avoids some of the ambiguities arising from terms involving spatial overlap of inter- and intra-molecular features. The possible application to other systems is briefly reviewed. (authors). 8 figs., 2 tabs., 8 refs

  5. Analysis on risk factors for post-stroke emotional incontinence

    Directory of Open Access Journals (Sweden)

    Xiao-chun ZHANG

    2018-01-01

    Objective: To investigate the occurrence rate of and related risk factors for post-stroke emotional incontinence (PSEI). Methods: The clinical data [sex, age, body mass index (BMI), education, marital status, medical history (hypertension, heart disease, diabetes, hyperlipemia, smoking and drinking) and family history of stroke] of 162 stroke patients were recorded. The serum homocysteine (Hcy) level was examined. Head CT and/or MRI were used to determine stroke subtype, site of lesion and number of lesions. The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5, Chinese version) and the Hamilton Depression Rating Scale-17 Items (HAMD-17) were used to evaluate the degree of depression. The House diagnostic standard was used to diagnose PSEI. Univariate and multivariate backward logistic regression analysis was used to screen related risk factors for PSEI. Spearman rank correlation analysis was used to examine the correlation between PSEI and post-stroke depression (PSD). Results: Among 162 stroke patients, 12 cases were diagnosed as PSEI (7.41%). The ratio of age < 60 years in the PSEI group was significantly higher than in the non-PSEI group (P = 0.045). The ratio of smoking in the PSEI group was significantly lower than in the non-PSEI group (P = 0.036). Univariate and multivariate backward logistic regression analysis showed that age < 60 years was an independent risk factor for PSEI (OR = 4.000, 95% CI: 1.149-13.924; P = 0.029). Ten of the 12 PSEI patients also had PSD, giving a co-morbidity rate of PSEI and PSD of 83.33%. Spearman rank correlation analysis showed PSEI was positively related to PSD (rs = 0.305, P = 0.000). Conclusions: PSEI is a common affective disorder in stroke patients, which happens readily in patients under 60 years of age. DOI: 10.3969/j.issn.1672-6731.2017.12.010

  6. Analysis of factors affecting the effect of stope leaching

    International Nuclear Information System (INIS)

    Xie Wangnan; Dong Chunming

    2014-01-01

    An industrial test and industrial trial production of stope leaching were carried out at the Taoshan orefield of the Dabu deposit. The results of the test and the trial production showed obvious differences in leaching rate and leaching time: the leaching rate of the industrial test was higher, and its leaching time was shorter. According to the analysis, the blasting method and the liquid distribution were the main factors affecting the leaching rate and leaching time. We therefore put forward the following suggestions: the technique of deep-hole slicing tight-face blasting should be used to reduce the yield of lump ores, effective liquid distribution methods should be adopted so that the lixiviant infiltrates throughout the whole ore heap, and bacterial leaching should be introduced. (authors)

  7. Factors affecting volume calculation with single photon emission tomography (SPECT) method

    International Nuclear Information System (INIS)

    Liu, T.H.; Lee, K.H.; Chen, D.C.P.; Ballard, S.; Siegel, M.E.

    1985-01-01

    Several factors may influence the calculation of absolute volumes (VL) from SPECT images, and their effects must be established to optimize the technique. The authors investigated the effects of the following on VL calculations: percentage of background (BG) subtraction, reconstruction filters, sample activity, angular sampling and edge detection methods. Transaxial images of a liver-trunk phantom filled with Tc-99m at 1 to 3 μCi/cc were obtained in a 64x64 matrix with a Siemens Rota Camera and an MDS computer. Different reconstruction filters, including Hanning 20, 32 and 64 and Butterworth 20 and 32, were used. Angular sampling was performed in 3- and 6-degree increments. ROIs were drawn manually and with an automatic edge detection program around the image after BG subtraction. VLs were calculated by multiplying the number of pixels within the ROI by the slice thickness and the x- and y-calibrations of each pixel. One or two pixels per slice thickness were applied in the calculation. An inverse correlation was found between the calculated VL and the percentage of BG subtraction (r=0.99 for 1, 2 and 3 μCi/cc activity). Based on the authors' linear regression analysis, the correct liver VL was measured with about 53% BG subtraction. The reconstruction filters, slice thickness and angular sampling had only minor effects on the calculated phantom volumes. Detection of the ROI automatically by the computer was not as accurate as the manual method. The authors conclude that the percentage of BG subtraction appears to be the most important factor affecting the VL calculation. With good quality control and appropriate reconstruction factors, correct VL calculations can be achieved with SPECT
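
    A minimal sketch of the volume calculation described above: threshold each reconstructed slice at a chosen background-subtraction level, count the ROI pixels, and scale by the pixel calibrations and slice thickness. The thresholding rule and all numbers are illustrative.

    ```python
    import numpy as np

    def slice_volume(image, bg_fraction, pixel_x_cm, pixel_y_cm, thickness_cm):
        """Volume (ml) contributed by one transaxial slice after BG subtraction."""
        threshold = bg_fraction * image.max()  # e.g. 0.53 for 53% BG subtraction
        roi_pixels = np.count_nonzero(image > threshold)
        return roi_pixels * pixel_x_cm * pixel_y_cm * thickness_cm

    rng = np.random.default_rng(0)
    phantom_slice = rng.uniform(0, 100, size=(64, 64))  # toy 64x64 reconstruction
    # Total volume is the sum of slice_volume over all transaxial slices.
    print(slice_volume(phantom_slice, bg_fraction=0.53,
                       pixel_x_cm=0.6, pixel_y_cm=0.6, thickness_cm=1.2))
    ```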

  8. Identification of dietary patterns using factor analysis in an epidemiological study in São Paulo

    Directory of Open Access Journals (Sweden)

    Dirce Maria Lobo Marchioni

    Full Text Available CONTEXT AND OBJECTIVE: Diet and nutrition are environmental factors in health/disease relationships. From the epidemiological viewpoint, diet represents a complex set of highly correlated exposures. Our objective was to identify patterns of food intake in a group of individuals living in São Paulo, and to develop objective dietary measurements for epidemiological purposes. DESIGN AND SETTING: Exploratory factor analysis of data in a case-control study in seven teaching hospitals in São Paulo. METHODS: The participants were 517 patients (260 oral cancer cases and 257 controls) admitted to the study hospitals between November 1998 and March 2001. The weekly intake frequencies for dairy products, cereals, meat, processed meat, vegetables, pulses, fruits and sweets were assessed by means of a semi-quantitative food frequency questionnaire. Dietary patterns were identified by factor analysis, based on the intake of the eight food groups, using principal component analysis as the extraction method followed by varimax rotation. RESULTS: Factor analysis identified three patterns that accounted for 55% of the total variability within the sample. The first pattern ("prudent") was characterized by vegetable, fruit and meat intake; the second ("traditional") by cereals (mainly rice) and pulses (mainly beans); and the third ("snacks") by dairy products and processed meat. CONCLUSION: This study identified food intake patterns through an a posteriori approach. Such analysis may be useful for nutritional intervention programs and, after computing scores for each individual according to the patterns identified, for establishing a relationship between diet and other epidemiological measurements of interest.
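    The extraction-plus-rotation recipe used here (principal components followed by varimax) can be reproduced compactly. A sketch with NumPy on random placeholder data for the eight food groups follows; the varimax routine is the textbook algorithm, not the authors' software.

```python
# Sketch: PCA extraction followed by varimax rotation, as described above.
# The intake data are random placeholders for the eight food groups.
import numpy as np

def varimax(loadings, tol=1e-6, max_iter=100):
    """Classic varimax rotation of a p x k loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        L = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - L @ np.diag((L ** 2).sum(axis=0)) / p))
        rotation = u @ vt
        if s.sum() - var_old < tol:
            break
        var_old = s.sum()
    return loadings @ rotation

groups = ["dairy", "cereals", "meat", "proc_meat",
          "vegetables", "pulses", "fruits", "sweets"]
rng = np.random.default_rng(2)
X = rng.normal(size=(517, 8))        # standardized weekly intake frequencies
X -= X.mean(axis=0)

# PCA extraction: loadings = eigenvectors scaled by sqrt(eigenvalues).
eigval, eigvec = np.linalg.eigh(np.cov(X, rowvar=False))
order = np.argsort(eigval)[::-1][:3]          # keep three components
loadings = eigvec[:, order] * np.sqrt(eigval[order])

rotated = varimax(loadings)
for g, row in zip(groups, rotated):
    print(f"{g:10s}", np.round(row, 2))
```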

  9. Factor analysis of Wechsler Adult Intelligence Scale-Revised in developmentally disabled persons.

    Science.gov (United States)

    Di Nuovo, Santo F; Buono, Serafino

    2006-12-01

    The results of previous studies on the factorial structure of the Wechsler Intelligence Scales are somewhat inconsistent across normal and pathological samples. To study specific clinical groups, such as developmentally disabled persons, it is useful to examine the factor structure in appropriate samples. A factor analysis was carried out using the principal component method and Varimax orthogonal rotation on the Wechsler Adult Intelligence Scale (WAIS-R) in a sample of 203 developmentally disabled persons with a mean age of 25 years 4 months; developmental disability ranged from mild to moderate. Partially contrasting with previous studies on normal samples, the results supported a two-factor solution: Wechsler's traditional Verbal and Performance scales seem more appropriate for this sample than the alternative three-factor solution.

  10. Classification analysis of organization factors related to system safety

    International Nuclear Information System (INIS)

    Liu Huizhen; Zhang Li; Zhang Yuling; Guan Shihua

    2009-01-01

    This paper analyzes the different types of organization factors that influence system safety. Organization factors can be divided into interior and exterior factors. The latter include political, economic, technical, legal, socio-cultural and geographical factors, and the relationships among different interest groups. The former include organization culture, communication, decision making, training, process, supervision and management, and organization structure. This paper focuses on the description of these organization factors. Classification analysis of the organization factors is the preparatory work for quantitative analysis. (authors)

  11. Methods for determining radionuclide retardation factors: status report

    International Nuclear Information System (INIS)

    Relyea, J.F.; Serne, R.J.; Rai, D.

    1980-04-01

    This report identifies a number of mechanisms that retard radionuclide migration, and describes the static and dynamic methods that are used to study such retardation phenomena. Both static and dynamic methods are needed for reliable safety assessments of underground nuclear-waste repositories. This report also evaluates the extent to which the two methods may be used to diagnose radionuclide migration through various types of geologic media, among them unconsolidated, crushed, intact, and fractured rocks. Adsorption is one mechanism that can control radionuclide concentrations in solution and therefore impede radionuclide migration. Other mechanisms that control a solution's radionuclide concentration and radionuclide migration are precipitation of hydroxides and oxides, oxidation-reduction reactions, and the formation of minerals that might include the radionuclide as a structural element. The retardation mechanisms mentioned above are controlled by such factors as surface area, cation exchange capacity, solution pH, chemical composition of the rock and of the solution, oxidation-reduction potential, and radionuclide concentration. Rocks and ground waters used in determining retardation factors should represent the expected equilibrium conditions in the geologic system under investigation. Static test methods can be used to rapidly screen the effects of the factors mentioned above. Dynamic (or column) testing is needed to assess the effects of hydrodynamics and the interaction of hydrodynamics with the other important parameters. This paper proposes both a standard method for conducting batch Kd determinations and a standard format for organizing and reporting data. Dynamic testing methods are not presently developed to the point that a standard methodology can be proposed. Normal procedures are outlined for column experimentation, and the data needed to analyze a column experiment are identified
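    For the batch tests mentioned here, the standard relations are Kd from a static sorption experiment and the retardation factor R = 1 + (rho_b / theta) * Kd for saturated porous flow. A small worked example with illustrative numbers (not values from the report):

```python
# Sketch: batch Kd from a static sorption test and the corresponding
# retardation factor R = 1 + (rho_b / theta) * Kd for saturated flow.
# Input values are illustrative, not from the report.

def batch_kd(c_initial, c_final, volume_ml, mass_g):
    """Kd (mL/g) = activity sorbed per gram / final solution concentration."""
    sorbed_per_g = (c_initial - c_final) * volume_ml / mass_g
    return sorbed_per_g / c_final

def retardation_factor(kd_ml_per_g, bulk_density_g_cm3, porosity):
    return 1.0 + bulk_density_g_cm3 * kd_ml_per_g / porosity

kd = batch_kd(c_initial=1000.0, c_final=50.0, volume_ml=30.0, mass_g=1.0)
R = retardation_factor(kd, bulk_density_g_cm3=1.6, porosity=0.35)
print(f"Kd = {kd:.0f} mL/g, R = {R:.0f}")
# The nuclide front then moves roughly R times slower than the groundwater.
```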

  12. Basic methods of isotope analysis

    International Nuclear Information System (INIS)

    Ochkin, A.V.; Rozenkevich, M.B.

    2000-01-01

    The bases of the most widely applied methods of isotope analysis are briefly presented. The possibilities and analytical characteristics of mass-spectrometric, spectral, radiochemical and special methods of isotope analysis, including the application of magnetic resonance, chromatography and refractometry, are considered

  13. DPI-ELISA: a fast and versatile method to specify the binding of plant transcription factors to DNA in vitro

    Directory of Open Access Journals (Sweden)

    Chaban Christina

    2010-11-01

    Full Text Available Background: About 10% of all genes in eukaryote genomes are predicted to encode transcription factors. The specific binding of transcription factors to short DNA motifs influences the expression of neighbouring genes. However, little is known about the DNA-protein interaction itself. To date there are only a few suitable methods to characterise DNA-protein interactions, among which EMSA is the method most frequently used in laboratories. Besides EMSA, several protocols describe the effective use of an ELISA-based transcription factor binding assay, e.g. for the analysis of human NFκB binding to specific DNA sequences. Results: We provide a unified protocol for this type of ELISA analysis, termed DNA-Protein-Interaction (DPI)-ELISA. Qualitative analyses with His-epitope-tagged plant transcription factors expressed in E. coli revealed that EMSA and DPI-ELISA result in comparable and reproducible data. The binding of AtbZIP63 to the C-box and AtWRKY11 to the W2-box could be reproduced and validated by both methods. We next examined the physical binding of the C-terminal DNA-binding domains of AtWRKY33, AtWRKY50 and AtWRKY75 to the W2-box. Although the DNA-binding domain is highly conserved among the WRKY proteins tested, the DPI-ELISA discloses differences in W2-box binding properties between these proteins. In addition to these well-studied transcription factor families, we applied our protocol to AtBPC2, a member of the so far uncharacterised plant-specific Basic Pentacysteine transcription factor family, and could demonstrate binding to GA/TC-dinucleotide repeat motifs. Different buffers and reaction conditions were examined. Conclusions: We successfully applied our DPI-ELISA protocol to investigate the DNA-binding specificities of three different classes of transcription factors from Arabidopsis thaliana. However, the analysis of the binding affinity of any DNA-binding protein to any given DNA

  14. A meta-analysis of factors affecting trust in human-robot interaction.

    Science.gov (United States)

    Hancock, Peter A; Billings, Deborah R; Schaefer, Kristin E; Chen, Jessie Y C; de Visser, Ewart J; Parasuraman, Raja

    2011-10-01

    We evaluate and quantify the effects of human, robot, and environmental factors on perceived trust in human-robot interaction (HRI). To date, reviews of trust in HRI have been qualitative or descriptive. Our quantitative review provides a fundamental empirical foundation to advance both theory and practice. Meta-analytic methods were applied to the available literature on trust and HRI. A total of 29 empirical studies were collected, of which 10 met the selection criteria for correlational analysis and 11 for experimental analysis. These studies provided 69 correlational and 47 experimental effect sizes. The overall correlational effect size for trust was r = +0.26, with an experimental effect size of d = +0.71. The effects of human, robot, and environmental characteristics were examined, with a special evaluation of the robot dimensions of performance and attribute-based factors. The robot performance and attributes were the largest contributors to the development of trust in HRI. Environmental factors played only a moderate role. Factors related to the robot itself, specifically its performance, had the greatest current association with trust, and environmental factors were moderately associated. There was little evidence for effects of human-related factors. The findings provide quantitative estimates of human, robot, and environmental factors influencing HRI trust. Specifically, the current summary provides effect size estimates that are useful in establishing design and training guidelines with reference to robot-related factors of HRI trust. Furthermore, results indicate that improper trust calibration may be mitigated by the manipulation of robot design. However, many future research needs are identified.
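    Summary values like r = +0.26 are typically produced by inverse-variance pooling on Fisher's z scale, with Cohen's d obtainable from r for comparison against experimental effect sizes. A sketch with invented per-study correlations and sample sizes (not the 29 studies analyzed here):

```python
# Sketch: fixed-effect pooling of correlational effect sizes via Fisher's z,
# the usual machinery behind summary values such as r = +0.26. The per-study
# r's and sample sizes below are made up for illustration.
import numpy as np

r = np.array([0.10, 0.35, 0.22, 0.40, 0.18])   # per-study correlations
n = np.array([40, 55, 32, 60, 25])              # per-study sample sizes

z = np.arctanh(r)              # Fisher z transform
w = n - 3                      # inverse of var(z) = 1 / (n - 3)
z_bar = np.sum(w * z) / np.sum(w)
se = 1 / np.sqrt(np.sum(w))
r_bar = np.tanh(z_bar)
ci = np.tanh([z_bar - 1.96 * se, z_bar + 1.96 * se])
print(f"pooled r = {r_bar:+.2f}, 95% CI [{ci[0]:+.2f}, {ci[1]:+.2f}]")

# Cohen's d equivalent of the pooled correlation:
d = 2 * r_bar / np.sqrt(1 - r_bar ** 2)
print(f"equivalent d = {d:+.2f}")
```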

  15. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  16. [Analysis on influencing factor of the complications of percutaneous dilational tracheotomy].

    Science.gov (United States)

    Zhai, Xiang; Zhang, Jinling; Hang, Wei; Wang, Ming; Shi, Zhan; Mi, Yue; Hu, Yunlei; Liu, Gang

    2015-01-01

    To analyze the factors influencing the complications of percutaneous dilational tracheotomy. Between August 2008 and February 2014, 3 450 patients with indications for tracheotomy underwent percutaneous dilational tracheostomy, mainly using the percutaneous dilational and percutaneous guide-wire forceps techniques. Statistical analysis of postoperative complications was performed with SPSS 19.0 software; the possible influence factors included age, gender, etiology, preoperative hypoxia, obesity, preoperative pulmonary infection, state of consciousness, operation method, operating doctor and presence of tracheal intubation. Among the 3 450 patients, there were 164 cases with intraoperative or postoperative complications, including postoperative bleeding in 74 cases (2.14%), subcutaneous emphysema in 54 cases (1.57%), wound infection in 16 cases (0.46%), pneumothorax in 6 cases (0.17%), mediastinal emphysema in 5 cases (0.14%), failed operation converted to conventional incision in 4 cases (0.12%), tracheoesophageal fistula in 2 cases (0.06%) and death in 3 cases (0.09%). Obesity, etiology, preoperative hypoxia, preoperative pulmonary infection, state of consciousness and operation method were the main influence factors, with statistically significant differences (P = 0.010, 0.000, 0.002, 0.000, 0.000 and 0.000, respectively). Gender, age, operating doctor and presence of endotracheal intubation were not main influence factors, with no statistically significant difference (P > 0.05). Although percutaneous dilational tracheostomy is safe, complications can still occur. To reduce complications, attention must be paid to obesity, etiology, preoperative hypoxia, preoperative pulmonary infection, state of consciousness and operation method.
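    The factor-by-factor significance testing reported here is a contingency-table analysis. A minimal sketch of one such test with scipy; the 2x2 counts are invented, not the study's data.

```python
# Sketch: chi-square test of one candidate influence factor (e.g. obesity)
# against complication occurrence, as in the analysis above.
# The 2x2 counts are invented for illustration.
from scipy.stats import chi2_contingency

#                complication   no complication
table = [[40,  300],    # factor present (e.g. obese)
         [124, 2986]]   # factor absent

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, P = {p:.4f}")
if p < 0.05:
    print("factor is associated with complications at the 5% level")
```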

  17. Learning from environmental data: Methods for analysis of forest nutrition time series

    Energy Technology Data Exchange (ETDEWEB)

    Sulkava, M. (Helsinki Univ. of Technology, Espoo (Finland). Computer and Information Science)

    2008-07-01

    Data analysis methods play an important role in increasing our knowledge of the environment as the amount of data measured from the environment increases. This thesis falls within the scope of environmental informatics and environmental statistics, fields in which data analysis methods are developed and applied for the analysis of environmental data. The environmental data studied in this thesis are time series of nutrient concentration measurements of pine and spruce needles. In addition, there are data on laboratory quality and on related environmental factors, such as the weather and atmospheric depositions. The most important methods used for the analysis of the data are based on the self-organizing map and linear regression models. First, a new clustering algorithm for the self-organizing map is proposed; it is found to provide better results than two other methods for clustering the self-organizing map. The algorithm is used to divide the nutrient concentration data into clusters, and the result is evaluated by environmental scientists. Based on the clustering, the temporal development of forest nutrition is modeled and the effect of nitrogen and sulfur deposition on the foliar mineral composition is assessed. Second, regression models are used for studying how much environmental factors and properties of the needles affect the changes in the nutrient concentrations of the needles between their first and second year of existence. The aim is to build understandable models with good prediction capabilities; sparse regression models are found to outperform more traditional regression models in this task. Third, fusion of laboratory quality data from different sources is performed to estimate the precision of the analytical methods. Weighted regression models are used to quantify how much the precision of observations can affect the time needed to detect a trend in environmental time series. The results of power analysis show that improving the

  18. An Inverse Function Least Square Fitting Approach of the Buildup Factor for Radiation Shielding Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Je [Sejong Univ., Seoul (Korea, Republic of); Alkhatee, Sari; Roh, Gyuhong; Lee, Byungchul [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    Dose absorption and energy absorption buildup factors are widely used in shielding analysis. The dose rate of the medium is the main concern for the dose buildup factor, whereas energy absorption is the important parameter in the energy absorption buildup factor. The ANSI/ANS-6.4.3-1991 standard data are widely used, based on interpolation and extrapolation by means of an approximation method. Recently, Yoshida's geometric progression (GP) formulae have also become popular and are already implemented in the QAD code, where the two buildup factors are denoted DOSE for the standard air exposure response and ENG for the response of the energy absorbed in the material itself. In this paper, a new least-squares fitting method is suggested to obtain reliable buildup factors from the data proposed since 1991. In total, four datasets of air exposure buildup factors are used for evaluation, including the ANSI/ANS-6.4.3-1991, Taylor, Berger, and GP data, and the standard deviation of the fitted data is analyzed. An inverse least-squares fitting method is proposed in this study in order to reduce the fitting uncertainties: it adapts an inverse function rather than the original function, according to the distribution slope of the dataset. Some quantitative comparisons are provided for concrete and lead. This study is focused on the least-squares fitting of existing buildup factors to be utilized in point-kernel codes for radiation shielding analysis. The inverse least-squares fitting method gives more reliable results for concave-shaped datasets such as that of concrete; in the concrete case, the variance and residual decrease significantly. The convex-shaped case of lead, however, can be handled with the usual least-squares fitting method. In the future, more datasets will be tested using least-squares fitting, and the fitted data could be implemented in existing point-kernel codes.
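    The core idea, fitting the inverse relation x(B) instead of B(x) when the dataset is concave, can be illustrated in toy form. The model form, data points and bounds below are placeholders, not the authors' formulation or the ANSI/ANS-6.4.3 values.

```python
# Toy illustration of the idea above: instead of least-squares fitting the
# buildup factor B(x) directly, fit the inverse relation x(B) and compare
# residuals. Model and data are placeholders, not the authors' formulation.
import numpy as np
from scipy.optimize import curve_fit

mfp = np.array([1, 2, 5, 10, 15, 20, 30, 40], dtype=float)   # depth, mean free paths
B = np.array([2.1, 3.4, 7.9, 16.0, 25.0, 35.0, 58.0, 84.0])  # buildup factor

def direct(x, a, b):          # direct model: B = 1 + a * x**b
    return 1 + a * x ** b

def inverse(Bv, a, b):        # inverse model: x = ((B - 1) / a)**(1/b)
    return ((Bv - 1) / a) ** (1 / b)

p_dir, _ = curve_fit(direct, mfp, B, p0=(1.0, 1.0), bounds=(1e-3, 10))
p_inv, _ = curve_fit(inverse, B, mfp, p0=(1.0, 1.0), bounds=(1e-3, 10))

# Compare both parameter sets on the same residual, B - direct(x).
for name, p in (("direct fit", p_dir), ("inverse fit", p_inv)):
    resid = B - direct(mfp, *p)
    print(f"{name}: a={p[0]:.3f} b={p[1]:.3f} residual SS={np.sum(resid**2):.2f}")
```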

  19. Gravimetric and titrimetric methods of analysis

    International Nuclear Information System (INIS)

    Rives, R.D.; Bruks, R.R.

    1983-01-01

    Gravimetric and titrimetric methods of analysis are considered. Methods of complexometric titration are described, as well as methods of increasing sensitivity in titrimetry. Gravimetry and titrimetry are applied in the trace analysis of geological materials

  20. Comparative analysis among several methods used to solve the point kinetic equations

    International Nuclear Information System (INIS)

    Nunes, Anderson L.; Goncalves, Alessandro da C.; Martinez, Aquilino S.; Silva, Fernando Carvalho da

    2007-01-01

    The main objective of this work is to develop a methodology for comparing several methods for solving the point kinetics equations. The evaluated methods are: the finite differences method, the stiffness confinement method, the improved stiffness confinement method and the piecewise constant approximations method. These methods were implemented and compared through a systematic analysis that consists basically of checking which of the methods consumes the least computational time with the highest precision. A relative performance factor, whose function is to combine both criteria, was calculated in order to reach this goal. Through analysis of this performance factor it is possible to choose the best method for the solution of the point kinetics equations. (author)

  1. Comparative analysis among several methods used to solve the point kinetic equations

    Energy Technology Data Exchange (ETDEWEB)

    Nunes, Anderson L.; Goncalves, Alessandro da C.; Martinez, Aquilino S.; Silva, Fernando Carvalho da [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear; E-mails: alupo@if.ufrj.br; agoncalves@con.ufrj.br; aquilino@lmp.ufrj.br; fernando@con.ufrj.br

    2007-07-01

    The main objective of this work is to develop a methodology for comparing several methods for solving the point kinetics equations. The evaluated methods are: the finite differences method, the stiffness confinement method, the improved stiffness confinement method and the piecewise constant approximations method. These methods were implemented and compared through a systematic analysis that consists basically of checking which of the methods consumes the least computational time with the highest precision. A relative performance factor, whose function is to combine both criteria, was calculated in order to reach this goal. Through analysis of this performance factor it is possible to choose the best method for the solution of the point kinetics equations. (author)
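    Of the methods compared in this record (listed twice above from two sources), the finite-difference approach is the simplest to sketch. Below is an implicit-Euler solution of the point kinetics equations with one delayed neutron group; the implicit step is used because the system is stiff, and all parameters are illustrative.

```python
# Sketch: implicit-Euler finite-difference solution of the point kinetics
# equations with one delayed neutron group,
#   dn/dt = ((rho - beta)/Lambda) * n + lam * C,
#   dC/dt = (beta/Lambda) * n - lam * C.
# Parameters and the step reactivity are illustrative.
import numpy as np

beta, lam, Lam = 0.0065, 0.08, 1e-4   # delayed fraction, decay const., generation time
rho = 0.5 * beta                       # +0.5 $ step reactivity insertion
dt, t_end = 1e-3, 10.0

A = np.array([[(rho - beta) / Lam, lam],
              [beta / Lam,        -lam]])
M = np.linalg.inv(np.eye(2) - dt * A)  # implicit Euler propagator

y = np.array([1.0, beta / (lam * Lam)])  # equilibrium n = 1 and precursor C
t = 0.0
while t < t_end:
    y = M @ y
    t += dt
print(f"n(t = {t_end:.0f} s) = {y[0]:.3f}")   # prompt jump, then stable period
```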

  2. Developing strategies to reduce the risk of hazardous materials transportation in iran using the method of fuzzy SWOT analysis

    Directory of Open Access Journals (Sweden)

    A. S. Kheirkhah

    2009-12-01

    Full Text Available An increase in hazardous materials transportation in Iran, along with industrial development and an increase in the resulting deadly accidents, necessitates the development and implementation of strategies to reduce these incidents. SWOT analysis is an efficient method for developing strategies; however, it has structural problems, including a lack of prioritization of internal and external factors and an inability to consider two-sided factors, which reduce its performance in situations where the number of internal and external factors affecting the risk of hazardous materials is relatively high and some factors are two-sided in nature. These problems are presented in the article. Fuzzy SWOT analysis is a method that helps to solve them, and employing it effectively is the subject of this work. The article also compares the strategies resulting from the fuzzy method with the strategies developed following conventional SWOT, in order to show the relative advantage of the new method.
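    The fuzzy machinery that distinguishes fuzzy SWOT from the classical method can be reduced to rating, aggregation and defuzzification of triangular fuzzy numbers. A minimal sketch with invented expert ratings, not the article's factor set:

```python
# Minimal sketch of the fuzzy step behind fuzzy SWOT: each expert rates a
# factor's importance as a triangular fuzzy number (l, m, u); ratings are
# averaged and defuzzified by the centroid, giving the crisp priority that
# classical SWOT lacks. The ratings are illustrative.
import numpy as np

# Three experts rating one internal factor, e.g. "driver training coverage".
ratings = np.array([
    (0.4, 0.6, 0.8),
    (0.5, 0.7, 0.9),
    (0.2, 0.4, 0.6),
])

l, m, u = ratings.mean(axis=0)          # fuzzy average (triangular again)
crisp = (l + m + u) / 3.0               # centroid defuzzification
print(f"aggregated TFN = ({l:.2f}, {m:.2f}, {u:.2f}); crisp weight = {crisp:.2f}")
```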

  3. Moyer's method of mixed dentition analysis: a meta-analysis ...

    African Journals Online (AJOL)

    The applicability of tables derived from the data Moyer used to other ethnic groups has ... This implies that Moyer's method of prediction may have population variations. ... Key Words: meta-analysis, mixed dentition analysis, Moyer's method

  4. In-depth analysis of the causal factors of incidents reported in the Greek petrochemical industry

    Energy Technology Data Exchange (ETDEWEB)

    Konstandinidou, Myrto [Institute of Nuclear Technology-Radiation Protection, National Center for Scientific Research ' Demokritos' , Aghia Paraskevi 15310 (Greece); Nivolianitou, Zoe, E-mail: zoe@ipta.demokritos.gr [Institute of Nuclear Technology-Radiation Protection, National Center for Scientific Research ' Demokritos' , Aghia Paraskevi 15310 (Greece); Kefalogianni, Eirini; Caroni, Chrys [School of Applied Mathematical and Physical Sciences, National Technical University of Athens, 9 Iroon Polytexneiou Str., Zografou Campus, 157 80 Athens (Greece)

    2011-11-15

    This paper presents a statistical analysis of all reported incidents in the Greek petrochemical industry from 1997 to 2003. A comprehensive database has been developed that includes industrial accidents (fires, explosions and substance releases), occupational accidents, incidents without significant consequences and near misses. The study concentrates on identifying and analyzing the causal factors related to different consequences of incidents, in particular injury, absence from work and material damage. Methods of analysis include logistic regression with one of these consequences as the dependent variable. The causal factors considered cover four major categories related to organizational issues, equipment malfunctions, human errors (of commission or omission) and external causes. Further analyses aim to confirm the value of recording near misses by comparing their causal factors with those of more serious incidents. The statistical analysis highlights the connection between the human factor and the underlying causes of accidents or incidents. - Highlights: > The research work is original, based on field data collected directly from the petrochemical industry. > It deals with in-depth statistical analysis of accident data on human-organizational causes. > It researches the underlying causes of accidents and the parameters affecting them. > The causal factors considered cover four broad taxonomies. > Near misses are worth recording for comparing their causal factors with those of more serious incidents.

  5. Methods and analysis of factors impact on the efficiency of the photovoltaic generation

    International Nuclear Information System (INIS)

    Li Tianze; Zhang Xia; Jiang Chuan; Hou Luan

    2011-01-01

    The thesis first describes two important breakthroughs that occurred in the field of solar energy application in the 1950s. In the 21st century, the development of solar photovoltaic power generation will have the following characteristics: continued high growth of the industry, significantly reduced solar cell costs, large-scale high-tech development of photovoltaic industries, breakthroughs in thin-film cell technology, and rapid development of building-integrated and grid-connected solar PV. The paper gives a theoretical analysis of the principles of solar cells. On this basis, we study the conversion efficiency of solar cells, identify the factors that affect the efficiency of photovoltaic generation, address the technical problems of solar cell conversion efficiency through the development of new technology, and open up new ways to improve conversion efficiency. Finally, combining theory with practice, the paper proposes policies and legislation to encourage the use of renewable energy, a development strategy, basic applied research, etc.

  6. Methods and analysis of factors impact on the efficiency of the photovoltaic generation

    Science.gov (United States)

    Tianze, Li; Xia, Zhang; Chuan, Jiang; Luan, Hou

    2011-02-01

    The thesis first describes two important breakthroughs that occurred in the field of solar energy application in the 1950s. In the 21st century, the development of solar photovoltaic power generation will have the following characteristics: continued high growth of the industry, significantly reduced solar cell costs, large-scale high-tech development of photovoltaic industries, breakthroughs in thin-film cell technology, and rapid development of building-integrated and grid-connected solar PV. The paper gives a theoretical analysis of the principles of solar cells. On this basis, we study the conversion efficiency of solar cells, identify the factors that affect the efficiency of photovoltaic generation, address the technical problems of solar cell conversion efficiency through the development of new technology, and open up new ways to improve conversion efficiency. Finally, combining theory with practice, the paper proposes policies and legislation to encourage the use of renewable energy, a development strategy, basic applied research, etc.

  7. Using BMDP and SPSS for a Q factor analysis.

    Science.gov (United States)

    Tanner, B A; Koning, S M

    1980-12-01

    While Euclidean distances and Q factor analysis may sometimes be preferred to correlation coefficients and cluster analysis for developing a typology, commercially available software does not always facilitate their use. Commands are provided for using BMDP and SPSS in a Q factor analysis with Euclidean distances.
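    The contrast drawn here, Euclidean distances plus Q factor analysis versus correlations plus clustering, hinges on factoring persons rather than variables. A sketch in NumPy on random data follows; the BMDP/SPSS command files themselves are not reproduced.

```python
# Sketch of a Q-type analysis, as contrasted above with the usual R-type
# analysis: factor the persons (via the transposed data matrix) alongside
# the person-by-person Euclidean distance matrix. Data are placeholders.
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(20, 8))        # 20 persons x 8 test scores

# Person-by-person Euclidean distances (the similarity basis discussed above).
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)

# Q factor analysis: principal components of persons, i.e. of X transposed.
Xt = X.T - X.T.mean(axis=0)              # center each person's profile
cov_persons = np.cov(Xt, rowvar=False)   # 20 x 20 person covariance
eigval, eigvec = np.linalg.eigh(cov_persons)
q_loadings = eigvec[:, ::-1][:, :2]      # two largest Q factors

print("distance matrix shape:", D.shape)
print("person loadings on Q factors:\n", np.round(q_loadings, 2))
```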

  8. A pragmatic approach to estimate alpha factors for common cause failure analysis

    International Nuclear Information System (INIS)

    Hassija, Varun; Senthil Kumar, C.; Velusamy, K.

    2014-01-01

    Highlights: • Estimation of coefficients in the alpha factor model for common cause analysis. • A derivation of plant-specific alpha factors is demonstrated. • We examine the sensitivity of the common cause contribution to total system failure. • We compare the beta factor and alpha factor models for various redundant configurations. • The use of alpha factors is preferable, especially for large redundant systems. - Abstract: Most modern technological systems are deployed with high redundancy, yet they still fail, mainly on account of common cause failures (CCF). Various models exist for estimating the risk from common cause failures, such as the Beta Factor, Multiple Greek Letter, Binomial Failure Rate and Alpha Factor models. Among these, the alpha factor model is considered most suitable for highly redundant systems, as it arrives at common cause failure probabilities from a set of ratios of failures and the total component failure probability Q_T. In the present study, the alpha factor model is applied to the assessment of CCF of safety systems deployed at two nuclear power plants. A method to overcome the difficulties in estimating the coefficients (viz., the alpha factors in the model) is presented, and the importance of deriving plant-specific alpha factors and the sensitivity of the common cause contribution to the total system failure probability with respect to the hazard imposed by various CCF events are highlighted. An approach described in NUREG/CR-5500 is extended in this study to provide more explicit guidance for a statistical approach to derive plant-specific coefficients for CCF analysis, especially for highly redundant systems. The procedure is expected to aid regulators in independent safety assessment
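    The alpha-factor bookkeeping follows standard expressions (as in the NUREG common-cause guidance for non-staggered testing). A sketch for a four-train system is given below; the alpha values and Q_T are illustrative, not plant-specific estimates.

```python
# Sketch of the standard alpha-factor expressions (non-staggered testing):
#   alpha_t = sum_k k * alpha_k
#   Q_k = k / C(m-1, k-1) * (alpha_k / alpha_t) * Q_T
# for a system of m redundant components. Alpha values and Q_T are
# illustrative, not plant-specific estimates.
from math import comb

m = 4                                            # redundancy level
alpha = {1: 0.95, 2: 0.03, 3: 0.015, 4: 0.005}   # sums to 1
Q_T = 1e-3                                       # total component failure probability

alpha_t = sum(k * a for k, a in alpha.items())
for k in range(1, m + 1):
    Q_k = k / comb(m - 1, k - 1) * alpha[k] / alpha_t * Q_T
    print(f"Q_{k} = {Q_k:.2e}")
# Q_4 is the probability that all four trains fail from one shared cause,
# typically the dominant cut set for highly redundant systems.
```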

  9. Exploring Technostress: Results of a Large Sample Factor Analysis

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2016-06-01

    Full Text Available With reference to the results of a large-sample factor analysis, the article aims to propose a frame for examining technostress in a population. The survey and principal component analysis of a sample consisting of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combine 68 questions and explain 59.13 per cent of the dispersion of the answers. Based on the factor analysis, the questionnaire was reframed and prepared to analyze the respondents' answers in a statistically validated pattern, revealing technostress causes and consequences as well as technostress prevalence in the population. The key elements of technostress identified by the factor analysis can serve for the construction of technostress measurement scales in further research.

  10. Exploring Technostress: Results of a Large Sample Factor Analysis

    OpenAIRE

    Jonušauskas, Steponas; Raišienė, Agota Giedrė

    2016-01-01

    With reference to the results of a large sample factor analysis, the article aims to propose the frame examining technostress in a population. The survey and principal component analysis of the sample consisting of 1013 individuals who use ICT in their everyday work was implemented in the research. 13 factors combine 68 questions and explain 59.13 per cent of the answers dispersion. Based on the factor analysis, questionnaire was reframed and prepared to reasonably analyze the respondents’ an...

  11. Application of classification algorithms for analysis of road safety risk factor dependencies.

    Science.gov (United States)

    Kwon, Oh Hoon; Rhee, Wonjong; Yoon, Yoonjin

    2015-02-01

    Transportation continues to be an integral part of modern life, and the importance of road traffic safety cannot be overstated. Consequently, recent road traffic safety studies have focused on analysis of risk factors that impact fatality and injury level (severity) of traffic accidents. While some of the risk factors, such as drug use and drinking, are widely known to affect severity, an accurate modeling of their influences is still an open research topic. Furthermore, there are innumerable risk factors that are waiting to be discovered or analyzed. A promising approach is to investigate historical traffic accident data that have been collected in the past decades. This study inspects traffic accident reports that have been accumulated by the California Highway Patrol (CHP) since 1973 for which each accident report contains around 100 data fields. Among them, we investigate 25 fields between 2004 and 2010 that are most relevant to car accidents. Using two classification methods, the Naive Bayes classifier and the decision tree classifier, the relative importance of the data fields, i.e., risk factors, is revealed with respect to the resulting severity level. Performances of the classifiers are compared to each other and a binary logistic regression model is used as the basis for the comparisons. Some of the high-ranking risk factors are found to be strongly dependent on each other, and their incremental gains on estimating or modeling severity level are evaluated quantitatively. The analysis shows that only a handful of the risk factors in the data dominate the severity level and that dependency among the top risk factors is an imperative trait to consider for an accurate analysis. Copyright © 2014 Elsevier Ltd. All rights reserved.
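    The classifier comparison described here is straightforward to reproduce in outline. A sketch with scikit-learn on synthetic stand-ins for the CHP report fields; the accuracies and effects are artifacts of the fake data.

```python
# Sketch: Naive Bayes vs. decision tree for predicting accident severity from
# categorical risk factors, with logistic regression as the baseline, as in
# the study above. Data are synthetic stand-ins for the CHP report fields.
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 5000
X = rng.integers(0, 2, size=(n, 6))   # e.g. alcohol, drugs, night, wet road...
logit = -2.0 + 1.5 * X[:, 0] + 1.0 * X[:, 1] + 0.5 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # 1 = severe

for name, clf in [("naive Bayes", BernoulliNB()),
                  ("decision tree", DecisionTreeClassifier(max_depth=5)),
                  ("logistic (baseline)", LogisticRegression())]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name:20s} accuracy = {acc:.3f}")
```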

  12. Efficiency limit factor analysis for the Francis-99 hydraulic turbine

    Science.gov (United States)

    Zeng, Y.; Zhang, L. X.; Guo, J. P.; Guo, Y. K.; Pan, Q. L.; Qian, J.

    2017-01-01

    The energy loss in a hydraulic turbine is the most direct factor affecting its efficiency. Based on the theory of inner energy loss analysis of hydraulic turbines, and combining it with the measurement data of the Francis-99, this paper calculates the characteristic parameters of inner energy loss of the hydraulic turbine and establishes a calculation model of the hydraulic turbine power. Taking the start-up test conditions given by Francis-99 as a case, the characteristics and transformation law of the inner energy losses of the hydraulic turbine during transients are studied. Further, by analyzing mechanical friction in the hydraulic turbine, we conclude that the main component of mechanical friction loss is the rotational friction loss between the rotating runner and the water body, defined here as the inner mechanical friction loss, and an approximate calculation method for it is given. Our purpose is to explore methods of increasing the conversion efficiency of the water flow by analyzing the energy losses in the hydraulic turbine.

  13. A structured elicitation method to identify key direct risk factors for the management of natural resources

    Directory of Open Access Journals (Sweden)

    Michael Smith

    2015-11-01

    Full Text Available The high level of uncertainty inherent in natural resource management requires planners to apply comprehensive risk analyses, often in situations where there are few resources. In this paper, we demonstrate a broadly applicable, novel and structured elicitation approach to identify important direct risk factors. This new approach combines expert calibration and fuzzy-based mathematics to capture and aggregate subjective expert estimates of the likelihood that a set of direct risk factors will cause management failure. A specific case study is used to demonstrate the approach; however, the described methods are widely applicable in risk analysis. For the case study, the management target was to retain all species that characterise a set of natural biological elements. The analysis was bounded by the spatial distribution of the biological elements under consideration and a 20-year time frame. Fourteen biological elements were expected to be at risk. Eleven important direct risk factors were identified that related to surrounding land use practices, climate change, problem species (e.g., feral predators), fire and hydrological change. In terms of their overall influence, the two most important risk factors were salinisation and a lack of water, which together pose a considerable threat to the survival of nine biological elements. The described approach successfully overcame two concerns arising from previous risk analysis work: (1) the lack of an intuitive, yet comprehensive scoring method enabling the detection and clarification of expert agreement and associated levels of uncertainty; and (2) the ease with which results can be interpreted and communicated while preserving a rich level of detail essential for informed decision making.

  14. Derivation of weighting factors for cost and radiological impact for use in comparison of waste management methods

    International Nuclear Information System (INIS)

    Allen, P.T.; Lee, T.R.

    1991-01-01

    Nuclear waste management decisions are complex and must include considerations of cost and social factors in addition to dose limitation. Decision-aiding techniques, such as multi-attribute analysis, can assist in structuring the problem and can incorporate as many factors, or attributes, as required. However, the relative weights of such attributes need to be established. Methods were devised which could be compared with one another. These were questionnaire-based but, in order to examine the possible influence of the measurement procedures on the results, two of the methods were combined in an experimental design. The two direct methods for obtaining weights (the conventional rating scales and the direct rating task) showed good agreement and yielded different values for separate social groups, such as industrial employees and lay public. The main conclusion is that the elicitation of weighting factors from the public is possible and that the resulting weights are meaningful and could have significant effects on the choice of waste management options

  15. Pathway-based factor analysis of gene expression data produces highly heritable phenotypes that associate with age.

    Science.gov (United States)

    Anand Brown, Andrew; Ding, Zhihao; Viñuela, Ana; Glass, Dan; Parts, Leopold; Spector, Tim; Winn, John; Durbin, Richard

    2015-03-09

    Statistical factor analysis methods have previously been used to remove noise components from high-dimensional data prior to genetic association mapping and, in a guided fashion, to summarize biologically relevant sources of variation. Here, we show how the derived factors summarizing pathway expression can be used to analyze the relationships between expression, heritability, and aging. We used skin gene expression data from 647 twins from the MuTHER Consortium and applied factor analysis to concisely summarize patterns of gene expression to remove broad confounding influences and to produce concise pathway-level phenotypes. We derived 930 "pathway phenotypes" that summarized patterns of variation across 186 KEGG pathways (five phenotypes per pathway). We identified 69 significant associations of age with phenotype from 57 distinct KEGG pathways at a stringent Bonferroni threshold ([Formula: see text]). These phenotypes are more heritable ([Formula: see text]) than gene expression levels. On average, expression levels of 16% of genes within these pathways are associated with age. Several significant pathways relate to metabolizing sugars and fatty acids; others relate to insulin signaling. We have demonstrated that factor analysis methods combined with biological knowledge can produce more reliable phenotypes with less stochastic noise than the individual gene expression levels, which increases our power to discover biologically relevant associations. These phenotypes could also be applied to discover associations with other environmental factors. Copyright © 2015 Brown et al.

  16. Statistical Analysis Of The Conditioning Factors Of Urban Electric Consumption

    International Nuclear Information System (INIS)

    Segura D'Rouville, Juan Joel; Suárez Carreño, Franyelit María

    2017-01-01

    This research work presents an analysis of the most important factors conditioning urban residential electricity consumption, and shows the quantitative parameters conditioning that consumption. This sector was chosen for analysis because disaggregated information is available on the main social and technological factors that determine its behavior and growth, with the objective of elaborating policies for managing electricity consumption. Electrical demand, considered as the sum of the power ratings of all equipment in use at each instant of a full day, is related to electricity consumption, which is simply the power demanded by a given consumer multiplied by the time during which that demand is maintained. In this report we propose the design of a probabilistic model for predicting electricity consumption, taking into account the main influential social and technological factors. The statistical processing of this database is done with StatGraphics software version 4.1, chosen for its extensive didactic support for the required calculations and associated methods. Finally, correlation analysis of the variables was performed to classify the determinants and thus characterize the consumption of the dwellings. (author)

  17. Quantifying human and organizational factors in accident management using decision trees: the HORAAM method

    International Nuclear Information System (INIS)

    Baumont, G.; Menage, F.; Schneiter, J.R.; Spurgin, A.; Vogel, A.

    2000-01-01

    In the framework of the level 2 Probabilistic Safety Study (PSA 2) project, the Institute for Nuclear Safety and Protection (IPSN) has developed a method for taking into account Human and Organizational Reliability Aspects during accident management. Actions are taken during very degraded installation operations by teams of experts in the French framework of Crisis Organization (ONC). After describing the background of the framework of the Level 2 PSA, the French specific Crisis Organization and the characteristics of human actions in the Accident Progression Event Tree, this paper describes the method developed to introduce in PSA the Human and Organizational Reliability Analysis in Accident Management (HORAAM). This method is based on the Decision Tree method and has gone through a number of steps in its development. The first one was the observation of crisis center exercises, in order to identify the main influence factors (IFs) which affect human and organizational reliability. These IFs were used as headings in the Decision Tree method. Expert judgment was used in order to verify the IFs, to rank them, and to estimate the value of the aggregated factors to simplify the quantification of the tree. A tool based on Mathematica was developed to increase the flexibility and the efficiency of the study

  18. Homotopy perturbation method for free vibration analysis of beams on elastic foundation

    International Nuclear Information System (INIS)

    Ozturk, Baki; Coskun, Safa Bozkurt; Koc, Mehmet Zahid; Atay, Mehmet Tarik

    2010-01-01

    In this study, the homotopy perturbation method (HPM) is applied to free vibration analysis of a beam on an elastic foundation. This numerical method is applied to a previously available case study. Analytical solutions and frequency factors are evaluated for different ratios of the axial load N acting on the beam to the Euler buckling load N_r. The application of HPM to the particular problem in this study gives results in excellent agreement with the analytical solutions, with the variational iteration method (VIM) solutions for the case considered, and with the differential transform method (DTM) results available in the literature. A closed-form frequency check for the simply supported case is sketched below.
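    For the simply supported limiting case there is a closed-form check of how the frequency falls as N approaches the buckling load: omega_n^2 = (E*I*(n*pi/L)^4 + k - N*(n*pi/L)^2) / (rho*A) for an Euler-Bernoulli beam on a Winkler foundation. A sketch with illustrative section and foundation properties (not the case study's values):

```python
# Closed-form check for a simply supported Euler-Bernoulli beam on a Winkler
# foundation (modulus k) under axial compression N:
#   omega_n^2 = (E*I*(n*pi/L)**4 + k - N*(n*pi/L)**2) / (rho*A).
# This reproduces the trend discussed above: frequency drops as N approaches
# the Euler buckling load N_r. Parameters are illustrative.
import numpy as np

E, I = 210e9, 2.0e-5        # Pa, m^4
rho, A = 7850.0, 0.01       # kg/m^3, m^2
L, k = 6.0, 5.0e5           # m, N/m^2 (foundation modulus)

N_r = E * I * (np.pi / L) ** 2          # Euler buckling load, first mode
mu = np.pi / L                           # first-mode wavenumber
for ratio in (0.0, 0.25, 0.5, 0.75):
    N = ratio * N_r
    omega = np.sqrt((E * I * mu**4 + k - N * mu**2) / (rho * A))
    print(f"N/N_r = {ratio:.2f} -> f1 = {omega / (2 * np.pi):.2f} Hz")
```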

  19. Selection of the Bank Investment Strategy on the Basis of the Hierarchy Analysis Method

    Directory of Open Access Journals (Sweden)

    Zhytar Maksym O.

    2014-02-01

    Full Text Available The goal of the article is to identify a methodical approach to selecting the investment strategy of banks on the basis of the factors of its formation, using the hierarchy analysis method. Factors in the formation of the bank's investment strategy were identified as a result of the study. The article demonstrates that selection of the bank's investment strategy can be efficiently realised on the basis of the hierarchy analysis method, which is the most popular method for multi-criteria assessment when searching for an optimal solution to the task at hand. The article offers a hierarchical decision-making structure that can serve as the basis for selecting the bank's investment strategy with consideration of institutional flexibility. A prospect for further study in this direction is the development of an optimisation model of the bank's investment portfolio that considers not only institutional but also market flexibility of decision making.
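    The hierarchy analysis method step, deriving priority weights from a pairwise comparison matrix via its principal eigenvector plus a consistency check, can be sketched directly. The factors and Saaty-scale judgments below are invented for illustration.

```python
# Sketch of the hierarchy analysis (AHP) step used above: priority weights of
# strategy-formation factors from a pairwise comparison matrix via its
# principal eigenvector, with a consistency check. Judgments are illustrative.
import numpy as np

factors = ["institutional flexibility", "market risk", "expected return"]
# Saaty-scale pairwise judgments (reciprocal matrix).
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigval, eigvec = np.linalg.eig(P)
i = np.argmax(eigval.real)
w = np.abs(eigvec[:, i].real)
w /= w.sum()                              # priority vector

n = P.shape[0]
CI = (eigval[i].real - n) / (n - 1)       # consistency index
CR = CI / 0.58                            # random index RI = 0.58 for n = 3
for f, wi in zip(factors, w):
    print(f"{f:26s} weight = {wi:.3f}")
print(f"consistency ratio = {CR:.3f} (acceptable if < 0.10)")
```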

  20. Using Module Analysis for Multiple Choice Responses: A New Method Applied to Force Concept Inventory Data

    Science.gov (United States)

    Brewe, Eric; Bruun, Jesper; Bearden, Ian G.

    2016-01-01

    We describe "Module Analysis for Multiple Choice Responses" (MAMCR), a new methodology for carrying out network analysis on responses to multiple choice assessments. This method is used to identify modules of non-normative responses which can then be interpreted as an alternative to factor analysis. MAMCR allows us to identify conceptual…

  1. Analysis of multi lobe journal bearings with surface roughness using finite difference method

    Science.gov (United States)

    PhaniRaja Kumar, K.; Bhaskar, SUdaya; Manzoor Hussain, M.

    2018-04-01

    Multi-lobe journal bearings are used for high operating speeds and high loads in machines. In this paper, symmetrical multi-lobe journal bearings are analyzed to find the effect of surface roughness during nonlinear loading. Using the fourth-order Runge-Kutta method, time-transient analysis was performed to calculate and plot the journal centre trajectories. The flow factor method is used to evaluate the roughness, and the finite difference method (FDM) is used to predict the pressure distribution over the bearing surface. The transient analysis is performed on the multi-lobe journal bearings for three different surface roughness orientations. Longitudinal surface roughness is more effective when compared with isotropic and transverse surface roughness.
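    The time-transient integration named here is a standard fourth-order Runge-Kutta march of the journal centre equations of motion. In the sketch below the film force is a placeholder spring-damper; in the actual analysis it would come from the FDM solution of the pressure field at every step.

```python
# Sketch: fourth-order Runge-Kutta integration of a journal centre trajectory,
# x'' = F(x, x') / m. The bearing force here is a placeholder linear
# spring-damper; in the analysis above it would come from the finite
# difference solution of the Reynolds equation at each step.
import numpy as np

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

m, kxx, cxx = 5.0, 2.0e6, 1.0e3      # rotor mass, stiffness, damping (toy)

def rhs(t, y):                        # y = [x, y, vx, vy]
    pos, vel = y[:2], y[2:]
    force = -kxx * pos - cxx * vel    # placeholder for the film forces
    return np.concatenate([vel, force / m])

y = np.array([1e-5, 0.0, 0.0, 0.0])  # initial eccentricity
h = 1e-5
for step in range(2000):
    y = rk4_step(rhs, step * h, y, h)
print("journal centre after 20 ms:", y[:2])
```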

  2. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationships between the parameters and specific features, events and processes (FEPs). This report describes the biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objectives of this analysis are to develop BDCFs for the

  3. Analysis of technological, institutional and socioeconomic factors ...

    African Journals Online (AJOL)

    Analysis of technological, institutional and socioeconomic factors that influence poor reading culture among secondary school students in Nigeria. ... This implies that Moyer's method of prediction may have population variations. ... Proliferation and availability of smart phones, chatting culture and social media were identified as technological factors influencing poor reading culture among secondary ...

  4. Factorization method for simulating QCD at finite density

    International Nuclear Information System (INIS)

    Nishimura, Jun

    2003-01-01

    We propose a new method for simulating QCD at finite density. The method is based on a general factorization property of distribution functions of observables, and it is therefore applicable to any system with a complex action. The so-called overlap problem is completely eliminated by the use of constrained simulations. We test this method in a Random Matrix Theory for finite density QCD, where we are able to reproduce the exact results for the quark number density. (author)

  5. Exploratory Analysis of the Factors Affecting Consumer Choice in E-Commerce: Conjoint Analysis

    Directory of Open Access Journals (Sweden)

    Elena Mazurova

    2017-05-01

    Full Text Available According to previous studies of online consumer behaviour, three factors are the most influential on purchasing behavior: brand, colour and position of the product on the screen. However, the simultaneous influence of these three factors on the consumer decision-making process has not been investigated previously. In this work we aim to execute a comprehensive study of the influence of these three factors. In order to answer our main research questions, we conducted an experiment with 96 different combinations of the three attributes and, using statistical analyses such as conjoint analysis, t-test analysis and Kendall analysis, we identified that the most influential factor in the online consumer decision-making process is brand; the second most important attribute is colour, estimated to be half as important as brand; and the least important attribute is the position on the screen. Additionally, we identified the main differences between consumers' stated and revealed preferences regarding these three attributes.
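    Conjoint part-worths of the three attributes are commonly estimated by dummy-coded OLS, with attribute importance taken from the range of part-worths. A sketch on synthetic ratings (the 96 profiles below are random draws, not the paper's design):

```python
# Sketch: conjoint-style estimation of part-worth utilities for brand, colour
# and screen position via dummy-coded OLS, then relative attribute importance
# from the part-worth ranges. Levels and responses are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 96
df = pd.DataFrame({
    "brand":    rng.choice(["known", "unknown"], n),
    "colour":   rng.choice(["red", "blue", "green"], n),
    "position": rng.choice(["top", "middle", "bottom"], n),
})
# Synthetic preference ratings with brand the strongest driver.
df["rating"] = (3.0
                + 2.0 * (df["brand"] == "known")
                + 1.0 * (df["colour"] == "blue")
                + 0.3 * (df["position"] == "top")
                + rng.normal(0, 0.5, n))

model = smf.ols("rating ~ C(brand) + C(colour) + C(position)", data=df).fit()
print(model.params.round(2))   # part-worths relative to the base levels

# Relative attribute importance: range of part-worths per attribute
# (the base level contributes a part-worth of zero).
ranges = {}
for attr in ["brand", "colour", "position"]:
    pw = model.params.filter(like=f"C({attr})")
    ranges[attr] = max(pw.max(), 0.0) - min(pw.min(), 0.0)
total = sum(ranges.values())
for attr, r in ranges.items():
    print(f"{attr}: importance = {100 * r / total:.0f}%")
```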

  6. A factor analysis to find critical success factors in retail brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    2013-03-01

    Full Text Available The present exploratory study aims to find the critical components of a retail brand among some retail stores. The study seeks to understand how a brand name is built at the retail level and to find the important factors affecting it. Customer behavior is largely influenced when the first retail customer experience is formed; such factors have direct impacts on customer experience and satisfaction in the retail industry. The proposed study performs an empirical investigation on two well-known retail stores located in the city of Tehran, Iran. Using a sample of 265 regular customers, the study applies factor analysis and extracts four main factors, namely related brand, product benefits, customer welfare strategy and corporate profits, from the 31 factors in the existing literature.

  7. Seismic design and analysis methods

    International Nuclear Information System (INIS)

    Varpasuo, P.

    1993-01-01

    Seismic load is, in many areas of the world, the most important loading situation from the point of view of structural strength. Taking this into account, it is understandable that there has been a strong allocation of resources to seismic analysis during the past ten years. This study has three focal areas: (1) random vibrations; (2) soil-structure interaction; and (3) methods for determining structural response. The solution of random vibration problems is clarified with the aid of applications; for the mathematical treatment and formulations it is deemed sufficient to give the relevant sources. In the soil-structure interaction analysis the focus has been the significance of frequency-dependent impedance functions. It was found that describing the soil with frequency-dependent impedance functions decreases the structural response, so this is always the preferred method compared with more conservative analysis types. Of the methods for determining the structural response, the following four were tested: (1) the time history method; (2) the complex frequency-response method; (3) the response spectrum method; and (4) the equivalent static force method. The time history method proved to be the most accurate, and the complex frequency-response method had the widest area of application. (orig.). (14 refs., 35 figs.)

  8. Research on the factors of return on equity: empirical analysis in Chinese port industries from 2000-2008

    Science.gov (United States)

    Li, Wei

    2012-01-01

    Port industries are basic industries in the national economy and have become among the most modernized sectors in every country. The development of the port industry is advantageous not only for promoting the optimal arrangement of social resources, but also for promoting the growth of foreign trade volume by enhancing transportation functions. Return on equity (ROE) is a direct indicator related to the maximization of a company's wealth, and it compensates for the shortcomings of earnings per share (EPS). The aim of this paper is to establish the correlation between ROE and other financial indicators, choosing the listed port companies as the research objects and using their data from 2000 to 2008 as the empirical sample, with statistical analysis of charts and coefficients. The analysis method used in the paper combines trend analysis, comparative analysis and ratio-based factor analysis. The paper analyzes and compares all these factors and draws the following conclusions. Firstly, ROE has a positive correlation with total assets turnover, main profit margin and fixed asset ratio, and a negative correlation with the assets-liabilities ratio, total assets growth rate and DOL. Secondly, main profit margin has the greatest positive effect on ROE among all these factors. The second greatest factor is total assets turnover, which shows that operating capacity is also an important indicator after profitability. Thirdly, the assets-liabilities ratio has the greatest negative effect on ROE among all these factors.
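    The link found here between ROE, profit margin, asset turnover and leverage is exactly what the standard DuPont identity expresses. A small worked example with invented figures, not data from the listed port companies:

```python
# Sketch: the standard DuPont identity, which makes explicit the link found
# above between ROE, profit margin, asset turnover and leverage:
#   ROE = (net profit / sales) * (sales / assets) * (assets / equity).
# Figures are illustrative, not from the listed port companies.

def dupont_roe(net_profit, sales, assets, equity):
    margin   = net_profit / sales      # profitability
    turnover = sales / assets          # operating efficiency
    leverage = assets / equity         # financial structure
    return margin * turnover * leverage, (margin, turnover, leverage)

roe, (m, t, l) = dupont_roe(net_profit=120.0, sales=1500.0,
                            assets=2400.0, equity=1000.0)
print(f"margin={m:.1%} turnover={t:.2f} leverage={l:.2f} -> ROE={roe:.1%}")
```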

  9. Replica Analysis for Portfolio Optimization with Single-Factor Model

    Science.gov (United States)

    Shinzato, Takashi

    2017-06-01

    In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of an optimal solution for the case where the return rate is described with a single-factor model and compare the findings obtained from our proposed methods with correlated return rates with those obtained with independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we also compare our approach with analytical procedures for minimizing the investment risk from operations research.

  10. Factors Influencing Achievement in Undergraduate Social Science Research Methods Courses: A Mixed Methods Analysis

    Science.gov (United States)

    Markle, Gail

    2017-01-01

    Undergraduate social science research methods courses tend to have higher than average rates of failure and withdrawal. Lack of success in these courses impedes students' progression through their degree programs and negatively impacts institutional retention and graduation rates. Grounded in adult learning theory, this mixed methods study…

  11. Constructing an Intelligent Patent Network Analysis Method

    Directory of Open Access Journals (Sweden)

    Chao-Chan Wu

    2012-11-01

    Full Text Available Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks of the current method, this study proposes a novel patent analysis method, called the intelligent patent network analysis method, to make a visual network with great precision. Based on artificial intelligence techniques, the proposed method provides an automated procedure for searching patent documents, extracting patent keywords, and determining the weight of each patent keyword in order to generate a sophisticated visualization of the patent network. This study proposes a detailed procedure for generating an intelligent patent network that is helpful for improving the efficiency and quality of patent analysis. Furthermore, patents in the field of Carbon Nanotube Backlight Unit (CNT-BLU were analyzed to verify the utility of the proposed method.

  12. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...
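
    A minimal sketch of an analysis-of-precision check in this spirit: a chi-square-type statistic tests whether the stated uncertainties account for the spread between duplicate results. The statistic's exact form and the data below are assumptions, not taken from the paper.

        # Chi-square-type test of stated precision against duplicate spread;
        # T = sum((x1-x2)^2 / (s1^2+s2^2)) is compared with a chi-square law.
        import numpy as np
        from scipy.stats import chi2

        x1 = np.array([10.2, 8.7, 15.1, 9.9])   # first result of each duplicate
        x2 = np.array([9.8, 9.1, 14.5, 10.4])   # second result
        s1 = np.array([0.3, 0.4, 0.5, 0.3])     # stated standard deviations
        s2 = np.array([0.3, 0.4, 0.5, 0.3])

        T = np.sum((x1 - x2) ** 2 / (s1 ** 2 + s2 ** 2))
        p = chi2.sf(T, df=len(x1))
        print(f"T = {T:.2f}, p = {p:.3f}")  # small p: precision underestimated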

  13. Innovative classification of methods of the Future-oriented Technology Analysis

    OpenAIRE

    HALICKA, Katarzyna

    2016-01-01

    In the era characterized by significant dynamics of the environment traditional methods of anticipating the future, assuming the immutability of the factors affecting the forecasted phenomenon, may be in the long term ineffective. The modern approach of predicting the future of technology, taking into account the multidimensionality of the environment, is, among other things, the Future-Oriented Technology Analysis (FTA). Designing the FTA research procedure is a complex process, both in orga...

  14. Toward a regional power plant siting method: AEC-Maryland regional siting factors study, FY 1974 progress report

    International Nuclear Information System (INIS)

    Yaffee, S.L.; Miller, C.A.

    1974-11-01

    The "AEC-Maryland Regional Siting Factors Study" examines the process of siting in a regional context. It is developing an analysis method to delineate candidate areas for siting of several power plant technology packages, including both fossil-fueled and nuclear options. Tools that are being used include simulation modeling, economic and demographic forecasting, spatial analysis, and computer graphics and numerical manipulation. The approach will describe the trade-offs incurred if a power plant is located in one candidate area rather than in another. In FY 1974, a suitability analysis method was developed which uses engineering and environmental parameters to define a level of environmental cost incurred if a segment of land is used to site a specific technology package. (U.S.)

  15. Studies on thermal neutron perturbation factor needed for bulk sample activation analysis

    CERN Document Server

    Csikai, J; Sanami, T; Michikawa, T

    2002-01-01

    The spatial distribution of thermal neutrons produced by an Am-Be source in a graphite pile was measured via the activation foil method. The results obtained agree well with calculated data using the MCNP-4B code. A previous method used for the determination of the average neutron flux within thin absorbing samples has been improved and extended for a graphite moderator. A procedure developed for the determination of the flux perturbation factor renders the thermal neutron activation analysis of bulky samples of unknown composition possible both in hydrogenous and graphite moderators.

  16. Considering induction factor using BEM method in wind farm layout optimization

    DEFF Research Database (Denmark)

    Ghadirian, Amin; Dehghan, M.; Torabi, F.

    2014-01-01

    For the wind farm layout optimization process, a simple linear model has mostly been used for considering the wake effect of a wind turbine on its downstream turbines. In this model, the wind velocity in the wake behind a turbine is obtained as a function of the turbine induction factor, which was considered to be 0.324 in almost all previous studies. However, this factor is evidently a strong function of turbine blade geometry and operational conditions. In the present study, a new method is introduced by which the induction factor for wind turbines can be calculated based on the Blade Element Momentum theory. By this method, the effect of blade profile, wind speed and angular velocity of the wind turbine on the induction factor can easily be taken into account. The results show that for different blade profiles and operational conditions, the induction factor differs from...
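
    A minimal sketch of a standard BEM fixed-point iteration for the axial induction factor at a single blade section (no tip-loss correction; the geometry, rotor speed and constant lift/drag coefficients are invented placeholders):

        # Standard BEM iteration for the axial induction factor at one section.
        import math

        B, r, chord = 3, 30.0, 2.0              # blades, section radius, chord [m]
        V, omega = 8.0, 1.6                     # wind speed [m/s], rotor speed [rad/s]
        Cl, Cd = 1.0, 0.01                      # section polar, held constant here

        sigma = B * chord / (2 * math.pi * r)   # local solidity
        a, a_tan = 0.3, 0.0                     # initial guesses
        for _ in range(100):
            phi = math.atan2((1 - a) * V, (1 + a_tan) * omega * r)  # inflow angle
            Cn = Cl * math.cos(phi) + Cd * math.sin(phi)
            Ct = Cl * math.sin(phi) - Cd * math.cos(phi)
            a_new = 1.0 / (4 * math.sin(phi) ** 2 / (sigma * Cn) + 1)
            at_new = 1.0 / (4 * math.sin(phi) * math.cos(phi) / (sigma * Ct) - 1)
            if abs(a_new - a) < 1e-8:
                a, a_tan = a_new, at_new
                break
            a, a_tan = a_new, at_new
        print(f"axial induction factor a = {a:.3f}")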

  17. Analysis of factors that influencing the interest of Bali State Polytechnic’s students in entrepreneurship

    Science.gov (United States)

    Ayuni, N. W. D.; Sari, I. G. A. M. K. K.

    2018-01-01

    The high rate of unemployment hampers economic growth. To address this situation, the government tries to change students' mindsets from becoming job seekers to becoming job creators, or entrepreneurs. One real action regularly held at Bali State Polytechnic is the Student Entrepreneurial Program. The purpose of this research is to identify and analyze the factors that influence the interest of Bali State Polytechnic's students in entrepreneurship, especially in the Student Entrepreneurial Program. The method used in this research is factor analysis, including the Bartlett test, the Kaiser-Meyer-Olkin (KMO) measure, the Measure of Sampling Adequacy (MSA), factor extraction using Principal Component Analysis (PCA), factor selection using eigenvalues and the scree plot, and factor rotation using orthogonal varimax rotation. Results show that there are four factors influencing the interest of Bali State Polytechnic's students in entrepreneurship: a Contextual Factor (including Entrepreneurship Training, Academic Support, Perceived Confidence, and Economic Challenge), a Self-Efficacy Factor (including Leadership, Mental Maturity, Relation with Entrepreneurs, and Authority), a Subjective Norm Factor (including Support of Important Relatives, Support of Friends, and Family Role), and an Attitude Factor (including Self-Realization).
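
    The reported pipeline maps closely onto the third-party Python package factor_analyzer; the sketch below is a plausible reconstruction under that assumption, with 'survey.csv' and its item columns as placeholders.

        # Sketch of the reported pipeline: Bartlett and KMO adequacy checks,
        # eigenvalue-based factor retention, then a varimax-rotated solution.
        import pandas as pd
        from factor_analyzer import FactorAnalyzer
        from factor_analyzer.factor_analyzer import (
            calculate_bartlett_sphericity, calculate_kmo)

        items = pd.read_csv("survey.csv")                 # one column per item
        chi2_val, p_val = calculate_bartlett_sphericity(items)
        _, kmo_model = calculate_kmo(items)
        print(f"Bartlett chi2={chi2_val:.1f} (p={p_val:.3f}), KMO={kmo_model:.2f}")

        fa = FactorAnalyzer(n_factors=items.shape[1], rotation=None)
        fa.fit(items)
        ev, _ = fa.get_eigenvalues()
        n_keep = int((ev > 1).sum())                      # Kaiser criterion

        fa = FactorAnalyzer(n_factors=n_keep, rotation="varimax")
        fa.fit(items)
        print(pd.DataFrame(fa.loadings_, index=items.columns))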

  18. An Enhanced Factor Analysis of Performance Degradation Assessment on Slurry Pump Impellers

    Directory of Open Access Journals (Sweden)

    Shilong Sun

    2017-01-01

    Full Text Available Slurry pumps, such as oil sand pumps, are widely used in industry to convert electrical energy into slurry potential and kinetic energy. Because of adverse working conditions, slurry pump impellers are prone to wear, which may result in slurry pump breakdowns. To prevent unexpected breakdowns, slurry pump impeller performance degradation assessment should be conducted immediately to monitor the current health condition and to ensure the safety and reliability of slurry pumps. In this paper, to provide an alternative impeller health indicator, an enhanced factor analysis based impeller indicator (EFABII) is proposed. Firstly, a low-pass filter is employed to improve the signal-to-noise ratio of slurry pump vibration signals. Secondly, redundant statistical features are extracted from the filtered vibration signals. To reduce the redundancy of the statistical features, an enhanced factor analysis is performed to generate new statistical features. Moreover, the statistical features can be automatically grouped and developed into a new indicator called the EFABII. Data collected from industrial oil sand pumps are used to validate the effectiveness of the proposed method. The results show that the proposed method is able to track the current health condition of slurry pump impellers.
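
    A rough sketch of the described stages, not the authors' EFABII algorithm itself: low-pass filtering with SciPy, redundant time-domain features, and a one-component factor analysis that compresses them into a scalar indicator; the sampling rate, cutoff and synthetic records are invented.

        # Pipeline sketch: low-pass filter -> statistical features -> factor
        # analysis -> scalar degradation indicator (EFABII-like, not identical).
        import numpy as np
        from scipy.signal import butter, filtfilt
        from scipy.stats import kurtosis, skew
        from sklearn.decomposition import FactorAnalysis

        def features(x):
            """Redundant time-domain statistics of one vibration record."""
            return [x.std(), np.abs(x).mean(), x.max() - x.min(),
                    kurtosis(x), skew(x), np.sqrt((x ** 2).mean())]

        fs = 10_000                                   # sampling rate [Hz] (assumed)
        b, a = butter(4, 1_000, btype="low", fs=fs)   # 1 kHz low-pass (assumed)

        rng = np.random.default_rng(1)
        records = [rng.normal(0, 1 + 0.1 * k, fs) for k in range(20)]  # stand-ins
        F = np.array([features(filtfilt(b, a, x)) for x in records])

        fa = FactorAnalysis(n_components=1).fit(F)
        indicator = fa.transform(F).ravel()           # degradation trend
        print(np.round(indicator, 2))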

  19. Nonstationary Hydrological Frequency Analysis: Theoretical Methods and Application Challenges

    Science.gov (United States)

    Xiong, L.

    2014-12-01

    Because of its great implications for the design and operation of hydraulic structures under changing environments (whether climate change or anthropogenic change), nonstationary hydrological frequency analysis has become essential. Two important methodological achievements have been made. Without adhering to the consistency assumption of traditional hydrological frequency analysis, the time-varying probability distribution of any hydrological variable can be established by linking the distribution parameters to covariates such as time or physical variables, with the help of powerful tools like the Generalized Additive Model for Location, Scale and Shape (GAMLSS). With the help of copulas, multivariate nonstationary hydrological frequency analysis has also become feasible. However, applying nonstationary hydrological frequency formulae to the design and operation of hydraulic structures under changing environments still faces many challenges in practice. First, formulae with time as the covariate can only be extrapolated for a very short period beyond the latest observation time, because such formulae are not physically constrained and the extrapolated outcomes could be unrealistic. There are two physically reasonable alternatives for changing environments: one is to directly link the quantiles or the distribution parameters to measurable physical factors, and the other is to use derived probability distributions based on hydrological processes. However, both methods carry a certain degree of uncertainty. For the design and operation of hydraulic structures under changing environments, it is recommended that design results from both stationary and nonstationary methods be presented together and compared with each other, to help understand the potential risks of each method.
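
    As a toy illustration of the time-varying-parameter idea (a GAMLSS-style model reduced to its simplest form), the sketch below fits annual maxima with a Gumbel distribution whose location drifts linearly in time, by direct maximum likelihood on synthetic data:

        # Time-varying Gumbel fit: location mu(t) = mu0 + mu1*t, MLE via scipy.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import gumbel_r

        rng = np.random.default_rng(2)
        t = np.arange(50)                                  # years since start
        x = gumbel_r.rvs(loc=100 + 0.8 * t, scale=20, random_state=rng)

        def nll(theta):
            mu0, mu1, log_scale = theta
            return -gumbel_r.logpdf(x, loc=mu0 + mu1 * t,
                                    scale=np.exp(log_scale)).sum()

        fit = minimize(nll, x0=[x.mean(), 0.0, np.log(x.std())],
                       method="Nelder-Mead")
        mu0, mu1, log_s = fit.x
        print(f"location trend: {mu1:.2f} per year, scale: {np.exp(log_s):.1f}")
        # 100-year quantile in year t: gumbel_r.ppf(0.99, loc=mu0+mu1*t, ...)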

  20. An integrating factor matrix method to find first integrals

    International Nuclear Information System (INIS)

    Saputra, K V I; Quispel, G R W; Van Veen, L

    2010-01-01

    In this paper we develop an integrating factor matrix method to derive conditions for the existence of first integrals. We use this novel method to obtain first integrals, along with the conditions for their existence, for two- and three-dimensional Lotka-Volterra systems with constant terms. The results are compared to previous results obtained by other methods.
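
    For orientation, the classical two-dimensional Lotka-Volterra system without constant terms admits a well-known first integral; the paper's conditions for systems with constant terms generalize this situation:

        \dot{x} = x(\alpha - \beta y), \qquad \dot{y} = y(\delta x - \gamma),
        \qquad I(x, y) = \delta x - \gamma \ln x + \beta y - \alpha \ln y,
        \qquad \frac{\mathrm{d}I}{\mathrm{d}t} = 0 \;\text{along trajectories}.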

  1. Trial Sequential Methods for Meta-Analysis

    Science.gov (United States)

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  2. Factor analysis for exercise stress radionuclide ventriculography

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Yasuda, Mitsutaka; Oku, Hisao; Ikuno, Yoshiyasu; Takeuchi, Kazuhide; Takeda, Tadanao; Ochi, Hironobu

    1987-01-01

    Using factor analysis, a new image processing technique in exercise stress radionuclide ventriculography, changes in factors associated with exercise were evaluated in 14 patients with angina pectoris or old myocardial infarction. The patients were imaged in the left anterior oblique projection, and three factor images were presented on a color-coded scale. Abnormal factors (AF) were observed in 6 patients before exercise, 13 during exercise, and 4 after exercise. In 7 patients, the occurrence of AF was associated with exercise. Five of them became free from AF after exercise. Three patients showing AF before exercise had aggravation of AF during exercise. Overall, the occurrence or aggravation of AF was associated with exercise in ten (71%) of the patients. The other three patients, however, had disappearance of AF during exercise. In the last patient, no AF was observed throughout the study. In view of the high incidence of AF associated with exercise, factor analysis may have potential for evaluating cardiac reserve from the viewpoint of left ventricular wall motion abnormality. (Namekawa, K.)

  3. In-depth analysis of the causal factors of incidents reported in the Greek petrochemical industry

    International Nuclear Information System (INIS)

    Konstandinidou, Myrto; Nivolianitou, Zoe; Kefalogianni, Eirini; Caroni, Chrys

    2011-01-01

    This paper presents a statistical analysis of all reported incidents in the Greek petrochemical industry from 1997 to 2003. A comprehensive database has been developed to include industrial accidents (fires, explosions and substance releases), occupational accidents, incidents without significant consequences and near misses. The study concentrates on identifying and analyzing the causal factors related to different consequences of incidents, in particular injury, absence from work and material damage. Methods of analysis include logistic regression with one of these consequences as the dependent variable. The causal factors considered cover four major categories related to organizational issues, equipment malfunctions, human errors (of commission or omission) and external causes. Further analyses aim to confirm the value of recording near misses by comparing their causal factors with those of more serious incidents. The statistical analysis highlights the connection between the human factor and the underlying causes of accidents or incidents. - Highlights: → The research work is original, based on field data collected directly from the petrochemical industry. → It deals with the in-depth statistical analysis of accident data on human-organizational causes. → It researches the underlying causes of accidents and the parameters affecting them. → The causal factors considered cover four major taxonomies. → Near misses are worth recording for comparison of their causal factors with those of more serious incidents.

  4. [Contribution of lifestyle factors to cancer: secondary analysis of Dutch data over 2010 and a projection for 2020

    NARCIS (Netherlands)

    Lanting, C.I.; Vroome, E.M. de; Elias, S.G.; Brandt, P.A. van den; Leeuwen, F.E. van; Kampman, E.; Kiemeney, L.A.L.M.; Peeters, P.H.M.; Vries, E de; Bausch-Goldbohm, R.A.

    2014-01-01

    OBJECTIVE: To calculate the proportion of cancer cases in the Netherlands in 2010 that were attributable to lifestyle factors by using the most recent data. DESIGN: Secondary analysis. METHOD: Lifestyle risk factors studied were tobacco smoking, alcohol consumption, overweight, lack of physical

  5. Analysis of risk factors for non-anastomotic biliary stricture following liver transplantation

    Directory of Open Access Journals (Sweden)

    WU Xiaofeng

    2013-06-01

    Full Text Available Objective: To investigate the risk factors for non-anastomotic biliary stricture (NABS) following liver transplantation. Methods: A retrospective analysis was performed on 175 patients who underwent liver transplantation from January 2004 to December 2010 to analyze the risk factors for NABS, which included sex, age, primary disease, blood type, T-tube placement, acute rejection, biliary tract infection, cytomegalovirus infection, Child-Pugh score, cold ischemia time, warm ischemia time, duration of anhepatic phase, and mean hepatic artery blood flow within one week after operation. These patients were divided into an early group, who underwent operation from January 2004 to December 2006, and a late group, who underwent operation from January 2007 to December 2010; each group was further divided into two subgroups according to whether they developed NABS. The risk factors for NABS were determined by univariate and multivariate logistic regression analyses. Results: The univariate logistic regression analysis showed that the risk factors for NABS were biliary tract infection, T-tube placement, and acute rejection in the early group (P<0.05) and that acute rejection was the risk factor in the late group (P=0.003). The multivariate logistic regression analysis showed that acute rejection was significantly associated with NABS in the early group (P=0.014). Conclusion: Among the risk factors for NABS following liver transplantation from January 2004 to December 2006, biliary tract infection and T-tube placement could be prevented by perioperative interventions, thus reducing the incidence of NABS. The incidence of acute rejection was reduced from January 2007 to December 2010, but it was still significantly associated with NABS.

  6. A hybrid method for quasi-three-dimensional slope stability analysis in a municipal solid waste landfill

    International Nuclear Information System (INIS)

    Yu, L.; Batlle, F.

    2011-01-01

    Highlights: → A quasi-three-dimensional slope stability analysis method was proposed. → The proposed method is a good engineering tool for 3D slope stability analysis. → The factor of safety from 3D analysis is higher than from 2D analysis. → 3D analysis results are more sensitive to cohesion than 2D analysis. - Abstract: Limited space for accommodating the ever increasing mounds of municipal solid waste (MSW) demands that the capacity of MSW landfills be maximized by building landfills to greater heights with steeper slopes. This situation has raised concerns regarding the stability of high MSW landfills. A hybrid method for quasi-three-dimensional slope stability analysis based on finite element stress analysis was applied in a case study at an MSW landfill in north-east Spain. Potential slides can be assumed to be located within the waste mass due to the lack of weak foundation soils and geosynthetic membranes at the landfill base. The only triggering factors of deep-seated slope failure are the higher leachate level and the relatively high and steep slope at the front. The valley-shaped geometry and layered construction procedure at the site make three-dimensional slope stability analyses necessary for this landfill. In the finite element stress analysis, variations of the leachate level during construction and continuous settlement of the landfill were taken into account. The "equivalent" three-dimensional factor of safety (FoS) was computed from the individual results of the two-dimensional analysis for a series of evenly spaced cross sections within the potential sliding body. Results indicate that the hybrid method for quasi-three-dimensional slope stability analysis adopted in this paper is capable of roughly locating the spatial position of the potential sliding mass. This easy-to-use method can serve as an engineering tool in the preliminary estimate of the FoS as well as the approximate position and extent of the potential sliding mass.

  7. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  8. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2003-07-25

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports (BSC 2003 [DIRS 160964]; BSC 2003 [DIRS 160965]; BSC 2003 [DIRS 160976]; BSC 2003 [DIRS 161239]; BSC 2003 [DIRS 161241]) contain detailed descriptions of the model input parameters. This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs and conversion factors for the TSPA. The BDCFs will be used in performance assessment for calculating annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose from beta- and photon-emitting radionuclides.

  9. The surface analysis methods; Les methodes d'analyse des surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Deville, J.P. [Institut de Physique et Chimie, 67 - Strasbourg (France)

    1998-11-01

    Nowadays, there are many surface analysis methods, each having its specificity, its qualities, its constraints (for instance, vacuum) and its limits. Costly in time and in investment, these methods have to be used deliberately. This article is addressed to non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use or the answer to a precise question. After recalling the fundamental principles which govern these analysis methods, based on the interaction between radiation (ultraviolet, X-rays) or particles (ions, electrons) and matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easiest to use and probably the most productive for the analysis of surfaces of industrial materials or samples submitted to treatments in aggressive media. (O.M.) 11 refs.

  10. Analysis of Prognostic Factors After Yttrium-90 Radioembolization of Advanced Hepatocellular Carcinoma

    International Nuclear Information System (INIS)

    Inarrairaegui, Mercedes; Martinez-Cuesta, Antonio; Rodriguez, Macarena; Bilbao, J. Ignacio

    2010-01-01

    Purpose: To analyze which patient-, tumor-, and treatment-related factors may influence outcome after 90Y radioembolization (90Y-RE) for hepatocellular carcinoma (HCC). Patients and Methods: Seventy-two consecutive patients with advanced HCC treated with 90Y-RE were studied to detect which factors may have influenced response to treatment and survival. Results: Median overall survival was 13 months (95% confidence interval, 9.6-16.3 months). In univariate analysis, survival was significantly better in patients with one to five lesions (19 vs. 8 months, p = 0.001) and in patients with alpha-fetoprotein 52 UI/mL, and their survival in the multivariate analysis was significantly worse (hazard ratio, 4.7; 95% confidence interval, 1.73-13) (p = 0.002). Conclusions: Yttrium-90 radioembolization results in control of target lesions in the majority of patients with HCC but does not prevent the development of new lesions. Survival of patients treated with 90Y-RE seems to depend largely on factors related to the aggressiveness of the disease (number of nodules, levels of alpha-fetoprotein, and presence of microscopic disease).

  11. Hand function evaluation: a factor analysis study.

    Science.gov (United States)

    Jarus, T; Poremba, R

    1993-05-01

    The purpose of this study was to investigate hand function evaluations. Factor analysis with varimax rotation was used to assess the fundamental characteristics of the items included in the Jebsen Hand Function Test and the Smith Hand Function Evaluation. The study sample consisted of 144 subjects without disabilities and 22 subjects with Colles fracture. Results suggest a four factor solution: Factor I--pinch movement; Factor II--grasp; Factor III--target accuracy; and Factor IV--activities of daily living. These categories differentiated the subjects without Colles fracture from the subjects with Colles fracture. A hand function evaluation consisting of these four factors would be useful. Such an evaluation that can be used for current clinical purposes is provided.

  12. Salivary SPECT and factor analysis in Sjoegren's syndrome

    International Nuclear Information System (INIS)

    Nakamura, T.; Oshiumi, Y.; Yonetsu, K.; Muranaka, T.; Sakai, K.; Kanda, S.; National Fukuoka Central Hospital

    1991-01-01

    Salivary SPECT and factor analysis in Sjoegren's syndrome were performed in 17 patients and 6 volunteers as controls. The ability of SPECT to detect small differences in the level of uptake can be used to separate glands from background even when uptake is reduced, as in patients with Sjoegren's syndrome. In the control and probable Sjoegren's syndrome groups, the uptake ratio of the submandibular gland to the parotid gland on salivary SPECT (S/P ratio) was less than 1.0. However, in the definite Sjoegren's syndrome group, the ratio was more than 1.0. Moreover, the ratio in all patients with sialectasia, which is characteristic of Sjoegren's syndrome, was more than 1.0. Salivary factor analysis of normal parotid glands showed slowly increasing patterns of uptake, and normal submandibular glands had rapidly increasing patterns of uptake. However, in the definite Sjoegren's syndrome group, the factor analysis patterns were altered, with slowly increasing patterns dominating in both the parotid and submandibular glands. These results suggest that the S/P ratio in salivary SPECT and salivary factor analysis provide additional radiologic criteria for diagnosing Sjoegren's syndrome. (orig.)

  13. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  14. Information technology portfolio in supply chain management using factor analysis

    Directory of Open Access Journals (Sweden)

    Ahmad Jaafarnejad

    2013-11-01

    Full Text Available The adoption of information technology (IT) along with supply chain management (SCM) has become increasingly a necessity among most businesses. This enhances supply chain (SC) performance and helps companies achieve organizational competitiveness. IT systems capture and analyze information and enable management to make decisions by considering a global scope across the entire SC. This paper reviews the existing literature on IT in SCM and considers pertinent criteria. Using principal component analysis (PCA) of factor analysis (FA), a number of related criteria are divided into smaller groups. Finally, SC managers can develop an IT portfolio in SCM using mean values of the few extracted components on the relevance-emergency matrix. A numerical example is provided to explain details of the proposed method.

  15. Reduced angiogenic factor expression in intrauterine fetal growth restriction using semiquantitative immunohistochemistry and digital image analysis.

    Science.gov (United States)

    Alahakoon, Thushari I; Zhang, Weiyi; Arbuckle, Susan; Zhang, Kewei; Lee, Vincent

    2018-05-01

    To localize, quantify and compare the angiogenic factors vascular endothelial growth factor (VEGF) and placental growth factor (PlGF), as well as their receptors, the fms-like tyrosine kinase receptor (Flt-1) and the kinase insert domain receptor (KDR), in the placentas of normal pregnancy and of pregnancies complicated by preeclampsia (PE), intrauterine fetal growth restriction (IUGR) and PE + IUGR. In a prospective cross-sectional case-control study, 30 pregnant women between 24 and 40 weeks of gestation were recruited into four clinical groups. Representative placental samples were stained for VEGF, PlGF, Flt-1 and KDR. Analysis was performed using semiquantitative methods and digital image analysis. Overall, VEGF and Flt-1 were strongly expressed and did not show any conclusive difference in expression between study groups. PlGF and KDR were significantly reduced in expression in the placentas from pregnancies complicated by IUGR compared with normal and preeclamptic pregnancies. The lack of PlGF and KDR may be a cause of the development of IUGR and may explain the loss of vasculature and villous architecture in IUGR. Automated digital image analysis software is a viable alternative to the manual reading of placental immunohistochemical staining. © 2018 Japan Society of Obstetrics and Gynecology.

  16. Risk Assessment Method for Offshore Structure Based on Global Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Zou Tao

    2012-01-01

    Full Text Available Based on global sensitivity analysis (GSA), this paper proposes a new risk assessment method for offshore structure design. The method first quantifies the significance of all random variables and their parameters; by comparing their degrees of importance, minor factors can be neglected, which simplifies the subsequent global uncertainty analysis. Global uncertainty analysis (GUA) is an effective way to study the complexity and randomness of natural events. Since field-measured data and statistical results often carry inevitable errors and uncertainties which lead to inaccurate prediction and analysis, the risk in the design stage of offshore structures caused by uncertainties in environmental loads, sea level, and marine corrosion must be taken into account. In this paper, the multivariate compound extreme value distribution model (MCEVD) is applied to predict the extreme sea state of wave, current, and wind. The maximum structural stress and deformation of a jacket platform are analyzed and compared with different design standards. The calculation results sufficiently demonstrate the new risk assessment method's rationality and security.
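
    A hedged sketch of the screening step using the SALib library (its saltelli/sobol API is real; the three inputs, their bounds and the load model are invented stand-ins for the offshore problem):

        # Sobol sensitivity screening: rank inputs so minor factors can be
        # dropped before the full uncertainty analysis.
        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {
            "num_vars": 3,
            "names": ["wave_height", "current_speed", "corrosion_loss"],
            "bounds": [[2.0, 12.0], [0.2, 2.0], [0.0, 3.0]],
        }
        X = saltelli.sample(problem, 1024)
        # stand-in structural response: stress grows with wave load and wall loss
        Y = 1.5 * X[:, 0] ** 1.2 + 4.0 * X[:, 1] + 0.8 * X[:, 0] * X[:, 2]

        Si = sobol.analyze(problem, Y)
        for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
            print(f"{name:>15}: first-order {s1:.2f}, total {st:.2f}")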

  17. DEPEND-HRA-A method for consideration of dependency in human reliability analysis

    International Nuclear Information System (INIS)

    Cepin, Marko

    2008-01-01

    A consideration of dependencies between human actions is an important issue within the human reliability analysis. A method was developed, which integrates the features of existing methods and the experience from a full scope plant simulator. The method is used on real plant-specific human reliability analysis as a part of the probabilistic safety assessment of a nuclear power plant. The method distinguishes dependency for pre-initiator events from dependency for initiator and post-initiator events. The method identifies dependencies based on scenarios, where consecutive human actions are modeled, and based on a list of minimal cut sets, which is obtained by running the minimal cut set analysis considering high values of human error probabilities in the evaluation. A large example study, which consisted of a large number of human failure events, demonstrated the applicability of the method. Comparative analyses that were performed show that both selection of dependency method and selection of dependency levels within the method largely impact the results of probabilistic safety assessment. If the core damage frequency is not impacted much, the listings of important basic events in terms of risk increase and risk decrease factors may change considerably. More efforts are needed on the subject, which will prepare the background for more detailed guidelines, which will remove the subjectivity from the evaluations as much as it is possible

  18. A comparison of cosegregation analysis methods for the clinical setting.

    Science.gov (United States)

    Rañola, John Michael O; Liu, Quanhui; Rosenthal, Elisabeth A; Shirts, Brian H

    2018-04-01

    Quantitative cosegregation analysis can help evaluate the pathogenicity of genetic variants. However, genetics professionals without statistical training often use simple methods, reporting only qualitative findings. We evaluate the potential utility of quantitative cosegregation in the clinical setting by comparing three methods. One thousand pedigrees each were simulated for benign and pathogenic variants in BRCA1 and MLH1 using United States historical demographic data to produce pedigrees similar to those seen in the clinic. These pedigrees were analyzed using two robust methods, full likelihood Bayes factors (FLB) and cosegregation likelihood ratios (CSLR), and a simpler method, counting meioses. Both FLB and CSLR outperform counting meioses when dealing with pathogenic variants, though counting meioses is not far behind. For benign variants, FLB and CSLR greatly outperform as counting meioses is unable to generate evidence for benign variants. Comparing FLB and CSLR, we find that the two methods perform similarly, indicating that quantitative results from either of these methods could be combined in multifactorial calculations. Combining quantitative information will be important as isolated use of cosegregation in single families will yield classification for less than 1% of variants. To encourage wider use of robust cosegregation analysis, we present a website (http://www.analyze.myvariant.org) which implements the CSLR, FLB, and Counting Meioses methods for ATM, BRCA1, BRCA2, CHEK2, MEN1, MLH1, MSH2, MSH6, and PMS2. We also present an R package, CoSeg, which performs the CSLR analysis on any gene with user supplied parameters. Future variant classification guidelines should allow nuanced inclusion of cosegregation evidence against pathogenicity.

  19. The performance shaping factors influence analysis on the human reliability for NPP operation

    International Nuclear Information System (INIS)

    Farcasiu, M.; Nitoi, M.; Apostol, M.; Florescu, G.

    2008-01-01

    The Human Reliability Analysis (HRA) is an important step in Probabilistic Safety Assessment (PSA) studies and offers an opportunity for concrete improvement of the man-machine-organization interfaces, reliability and safety. The goal of this analysis is to obtain sufficient detail to understand and document all important factors that affect human performance. The purpose of this paper is to estimate the human error probabilities in view of the negative or positive effect of the human performance shaping factors (PSFs) on the mitigation of initiating events which could occur in a Nuclear Power Plant (NPP). Using the THERP and SPAR-H methods, a model of the influence of PSFs on human reliability is developed. This model is applied to the most important activities necessary to mitigate the 'one steam generator tube failure' event at the Cernavoda NPP. The results are joint human error probability (JHEP) values estimated for the following situations: without regard to PSF influence; with PSFs under specific conditions; with PSFs which could have only positive influence; and with PSFs which could have only negative influence. In addition, PSFs with negative influence were identified and, using the DOE method, the activities necessary to change that negative influence were assigned. (authors)

  20. Determining of factors influencing the success and failure of hospital information system and their evaluation methods: a systematic review.

    Science.gov (United States)

    Sadoughi, Farahnaz; Kimiafar, Khalil; Ahmadi, Maryam; Shakeri, Mohammad Taghi

    2013-12-01

    Nowadays, the use of new information technology (IT) has provided remarkable opportunities to decrease medical errors, support health care specialists, and increase the efficiency and even the quality of patient care and safety. The purpose of this study was the identification of Hospital Information System (HIS) success and failure factors and the evaluation methods of these factors. This research emphasizes the need for a comprehensive evaluation of HISs which considers a wide range of success and failure factors in these systems. We searched for relevant English-language studies based on keywords in title and abstract, using PubMed, Ovid Medline (by applying MeSH terms), Scopus, ScienceDirect and Embase (earliest entry to March 17, 2012). Studies which considered success models and success or failure factors, or studied the evaluation models of HISs and related topics, were chosen. Since the studies used in this systematic review were heterogeneous, the combination of extracted data was carried out using the narrative synthesis method. We found 16 articles which required detailed analysis. Finally, the suggested framework includes 12 main factors (functional, organizational, behavioral, cultural, management, technical, strategy, economy, education, legal, ethical and political factors), 67 sub-factors, and 33 suggested methods for the evaluation of these sub-factors. The results of the present research indicate that the emphasis of HIS evaluation is moving from technical subjects to human and organizational subjects, and from objective to subjective issues. This shift therefore entails greater familiarity with qualitative evaluation methods. In most of the reviewed studies, the main focus has been on the necessity of using multi-method approaches and combining methods to obtain more comprehensive and useful results.

  1. Uncertainty Evaluation of the SFR Subchannel Thermal-Hydraulic Modeling Using a Hot Channel Factors Analysis

    International Nuclear Information System (INIS)

    Choi, Sun Rock; Cho, Chung Ho; Kim, Sang Ji

    2011-01-01

    In an SFR core analysis, a hot channel factors (HCF) method is most commonly used to evaluate uncertainty. It was employed in early designs such as the CRBRP and IFR. Alternatively, the improved thermal design procedure (ITDP) calculates the overall uncertainty based on the Root Sum Square technique and sensitivity analyses of each design parameter. The Monte Carlo method (MCM) is also employed to estimate uncertainties. In this method, all the input uncertainties are randomly sampled according to their probability density functions and the resulting distribution of the output quantity is analyzed. Since an uncertainty analysis is basically calculated from the temperature distribution in a subassembly, the core thermal-hydraulic modeling greatly affects the resulting uncertainty. At KAERI, the SLTHEN and MATRA-LMR codes have been utilized to analyze the SFR core thermal-hydraulics. The SLTHEN (steady-state LMR core thermal hydraulics analysis code based on the ENERGY model) code is a modified version of the SUPERENERGY2 code, which conducts a multi-assembly, steady-state calculation based on a simplified ENERGY model. The detailed subchannel analysis code MATRA-LMR (Multichannel Analyzer for Steady-State and Transients in Rod Arrays for Liquid Metal Reactors), an LMR version of MATRA, was also developed specifically for SFR core thermal-hydraulic analysis. This paper describes comparative studies of core thermal-hydraulic models. A subchannel analysis and hot channel factors based uncertainty evaluation system is established to estimate the core thermofluidic uncertainties using the MATRA-LMR code, and the results are compared to those of the SLTHEN code.

  2. Multiple Statistical Models Based Analysis of Causative Factors and Loess Landslides in Tianshui City, China

    Science.gov (United States)

    Su, Xing; Meng, Xingmin; Ye, Weilin; Wu, Weijiang; Liu, Xingrong; Wei, Wanhong

    2018-03-01

    Tianshui City is one of the mountainous cities threatened by severe geo-hazards in Gansu Province, China. Statistical probability models have been widely used in analyzing and evaluating geo-hazards such as landslides. In this research, three approaches (the Certainty Factor Method, the Weight of Evidence Method and the Information Quantity Method) were adopted to quantitatively analyze the relationship between the causative factors and the landslides. The source data used in this study include the SRTM DEM and local geological maps at a scale of 1:200,000. Twelve causative factors (i.e., altitude, slope, aspect, curvature, plan curvature, profile curvature, roughness, relief amplitude, distance to rivers, distance to faults, distance to roads, and stratum lithology) were selected for correlation analysis after a thorough investigation of geological conditions and historical landslides. The results indicate that the outcomes of the three models are fairly consistent.

  3. Different anthropometric adiposity measures and their association with cardiovascular disease risk factors: a meta-analysis

    OpenAIRE

    van Dijk, S. B.; Takken, T.; Prinsen, E. C.; Wittink, H.

    2012-01-01

    Objectives: To investigate which anthropometric adiposity measure has the strongest association with cardiovascular disease (CVD) risk factors in Caucasian men and women without a history of CVD. Design: Systematic review and meta-analysis. Methods: We searched databases for studies reporting correlations between anthropometric adiposity measures and CVD risk factors in Caucasian subjects without a history of CVD. Body mass index (BMI), waist circumference, waist-to-hip ratio, waist-to-height ra...

  4. Fast and accurate methods of independent component analysis: A survey

    Czech Academy of Sciences Publication Activity Database

    Tichavský, Petr; Koldovský, Zbyněk

    2011-01-01

    Roč. 47, č. 3 (2011), s. 426-438 ISSN 0023-5954 R&D Projects: GA MŠk 1M0572; GA ČR GA102/09/1278 Institutional research plan: CEZ:AV0Z10750506 Keywords : Blind source separation * artifact removal * electroencephalogram * audio signal processing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/tichavsky-fast and accurate methods of independent component analysis a survey.pdf

  5. Application of texture analysis method for mammogram density classification

    Science.gov (United States)

    Nithya, R.; Santhi, B.

    2017-07-01

    Mammographic density is considered a major risk factor for developing breast cancer. This paper proposes an automated approach to classify breast tissue types in digital mammogram. The main objective of the proposed Computer-Aided Diagnosis (CAD) system is to investigate various feature extraction methods and classifiers to improve the diagnostic accuracy in mammogram density classification. Texture analysis methods are used to extract the features from the mammogram. Texture features are extracted by using histogram, Gray Level Co-Occurrence Matrix (GLCM), Gray Level Run Length Matrix (GLRLM), Gray Level Difference Matrix (GLDM), Local Binary Pattern (LBP), Entropy, Discrete Wavelet Transform (DWT), Wavelet Packet Transform (WPT), Gabor transform and trace transform. These extracted features are selected using Analysis of Variance (ANOVA). The features selected by ANOVA are fed into the classifiers to characterize the mammogram into two-class (fatty/dense) and three-class (fatty/glandular/dense) breast density classification. This work has been carried out by using the mini-Mammographic Image Analysis Society (MIAS) database. Five classifiers are employed namely, Artificial Neural Network (ANN), Linear Discriminant Analysis (LDA), Naive Bayes (NB), K-Nearest Neighbor (KNN), and Support Vector Machine (SVM). Experimental results show that ANN provides better performance than LDA, NB, KNN and SVM classifiers. The proposed methodology has achieved 97.5% accuracy for three-class and 99.37% for two-class density classification.
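
    As a small illustration of one listed method, the sketch below computes GLCM texture features with scikit-image; the synthetic 8-bit patch stands in for a mammogram region of interest, and the resulting feature vector would feed the classifier stage.

        # GLCM texture features with scikit-image (one of the listed methods).
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops  # 'grey...' in older versions

        rng = np.random.default_rng(3)
        roi = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in ROI

        glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        feats = {p: graycoprops(glcm, p).mean()
                 for p in ("contrast", "homogeneity", "energy", "correlation")}
        print(feats)   # feature vector row for the ANN/SVM classifier stage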

  6. The annual averaged atmospheric dispersion factor and deposition factor according to methods of atmospheric stability classification

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Hae Sun; Jeong, Hyo Joon; Kim, Eun Han; Han, Moon Hee; Hwang, Won Tae [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-09-15

    This study analyzes the differences in the annual averaged atmospheric dispersion factor and ground deposition factor produced using two classification methods of atmospheric stability, which are based on a vertical temperature difference and the standard deviation of horizontal wind direction fluctuation. Daedeok and Wolsong nuclear sites were chosen for an assessment, and the meteorological data at 10 m were applied to the evaluation of atmospheric stability. The XOQDOQ software program was used to calculate atmospheric dispersion factors and ground deposition factors. The calculated distances were chosen at 400 m, 800 m, 1,200 m, 1,600 m, 2,400 m, and 3,200 m away from the radioactive material release points. All of the atmospheric dispersion factors generated using the atmospheric stability based on the vertical temperature difference were shown to be higher than those from the standard deviation of horizontal wind direction fluctuation. On the other hand, the ground deposition factors were shown to be same regardless of the classification method, as they were based on the graph obtained from empirical data presented in the Nuclear Regulatory Commission's Regulatory Guide 1.111, which is unrelated to the atmospheric stability for the ground level release. These results are based on the meteorological data collected over the course of one year at the specified sites; however, the classification method of atmospheric stability using the vertical temperature difference is expected to be more conservative.

  7. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2005-04-28

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis

  8. Discriminant analysis method to determine the power of the boys 11-12 year

    Directory of Open Access Journals (Sweden)

    Mirosława Cieślicka

    2016-10-01

    Full Text Available Purpose: To determine a model of power in boys aged 11-12 years. Material and methods: To achieve the objectives, the following methods were used: analysis of the scientific literature and statistical methods for the analysis of results. The study involved 35 boys aged 11 (n = 35) and 32 boys aged 12 (n = 32). Results: Analysis of the results shows statistically significant differences between the test results of boys aged 11 and 12 in the standing jump and in the number of squats performed in a set time (p < 0.001, p < 0. Conclusions: The structural factors of the discriminant function suggest that the more attention is paid to speed and endurance training, the more likely the boys' power preparation is to improve. The canonical discriminant function can be used to assess and forecast the development of motor skills in boys.

  9. Factors Which Influence the Business Success of Small Business 'Processed Rotan'

    OpenAIRE

    Nasution, Inggrita Gusti Sari; Muchtar, Yasmin Chairunnisa

    2013-01-01

    This research studies the factors which influence the business success of the small business 'processed rotan'. The data employed in the study are primary data for the period of July to August 2013, with 30 research observations collected through the census method. The method of analysis used in the study is multiple linear regression. The results of the analysis showed that the factors of labor, innovation and promotion simultaneously have a positive and significant influence on the business success of the small business 'processed rotan'. The analysis also showed that, partially, labor has a positive and significant influence on business success, while innovation and promotion have a positive but insignificant influence on business success.

  10. Confirmatory Factor Analysis of the ISB - Burnout Syndrome Inventory

    Directory of Open Access Journals (Sweden)

    Ana Maria T. Benevides-Pereira

    2017-05-01

    Full Text Available Aim: Burnout is a dysfunctional reaction to chronic occupational stress. The present study analyses the psychometric qualities of the Burnout Syndrome Inventory (ISB) through Confirmatory Factor Analysis (CFA). Method: Empirical study in a multi-centre and multi-occupational sample (n = 701) using the ISB. Part I assesses antecedent factors: Positive Organizational Conditions (PC) and Negative Organizational Conditions (NC). Part II assesses the syndrome: Emotional Exhaustion (EE), Dehumanization (DE), Emotional Distancing (ED) and Personal Accomplishment (PA). Results: The highest means occurred in the positive scales PC (M = 23.29, SD = 5.89) and PA (M = 14.84, SD = 4.71). Negative conditions showed the greatest variability (SD = 6.03). Reliability indexes were reasonable, with the lowest at .77 for DE and the highest at .91 for PA. The CFA revealed RMSEA = .057 and CFI = .90, with all scale regressions showing significant values (β = .73 to β = .92). Conclusion: The ISB proved a plausible instrument to evaluate burnout. The two parts maintained the initial model and confirmed the theoretical presupposition. This instrument makes possible a more comprehensive picture of the labour context, and either part may be used separately according to the needs and aims of the assessor.

  11. Meta-analysis of the risk factor for endophthalmitis in patients after cataract surgery

    Directory of Open Access Journals (Sweden)

    Fei Wen

    2016-07-01

    Full Text Available AIM: To explore the main risk factors related to the incidence of endophthalmitis in patients after cataract surgery in China and to provide evidence for prevention. METHODS: The results of 5 studies on the main risk factors of endophthalmitis in patients after cataract surgery were analyzed by the meta-analysis method. RESULTS: The pooled odds ratios and 95% CIs of age (≥70), diabetes, vitreous overflow, operative time (≥10 min), common operating room, and duration of topical anesthetic use were 1.81 (95% CI: 1.43-1.69), 3.66 (95% CI: 1.64-8.16), 2.21 (95% CI: 1.46-3.32), 3.54 (95% CI: 2.47-5.06), 2.77 (95% CI: 2.07-3.72) and 2.09 (95% CI: 1.53-2.86), respectively. CONCLUSION: The main risk factors for endophthalmitis were age (≥70), diabetes, vitreous overflow, operative time (≥10 min), common operating room, and duration of topical anesthetic use.

  12. Comparison of Two- and Three-Dimensional Methods for Analysis of Trunk Kinematic Variables in the Golf Swing.

    Science.gov (United States)

    Smith, Aimée C; Roberts, Jonathan R; Wallace, Eric S; Kong, Pui; Forrester, Stephanie E

    2016-02-01

    Two-dimensional methods have been used to compute trunk kinematic variables (flexion/extension, lateral bend, axial rotation) and X-factor (difference in axial rotation between trunk and pelvis) during the golf swing. Recent X-factor studies advocated three-dimensional (3D) analysis due to the errors associated with two-dimensional (2D) methods, but this has not been investigated for all trunk kinematic variables. The purpose of this study was to compare trunk kinematic variables and X-factor calculated by 2D and 3D methods to examine how different approaches influenced their profiles during the swing. Trunk kinematic variables and X-factor were calculated for golfers from vectors projected onto the global laboratory planes and from 3D segment angles. Trunk kinematic variable profiles were similar in shape; however, there were statistically significant differences in trunk flexion (-6.5 ± 3.6°) at top of backswing and trunk right-side lateral bend (8.7 ± 2.9°) at impact. Differences between 2D and 3D X-factor (approximately 16°) could largely be explained by projection errors introduced to the 2D analysis through flexion and lateral bend of the trunk and pelvis segments. The results support the need to use a 3D method for kinematic data calculation to accurately analyze the golf swing.

  13. Text analysis methods, text analysis apparatuses, and articles of manufacture

    Science.gov (United States)

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  14. Confirmatory Factor Analysis of the Structure of Statistics Anxiety Measure: An Examination of Four Alternative Models

    Directory of Open Access Journals (Sweden)

    Hossein Bevrani, PhD

    2011-09-01

    Full Text Available Objective: The aim of this study is to explore the confirmatory factor analysis results of the Persian adaptation of the Statistics Anxiety Measure (SAM), proposed by Earp. Method: The validity and reliability assessments of the scale were performed on 298 college students chosen randomly from Tabriz University in Iran. Confirmatory factor analysis (CFA) was carried out to determine the factor structure of the Persian adaptation of the SAM. Results: As expected, the second-order model provided a better fit to the data than the three alternative models. Conclusions: Hence, the SAM provides an equally valid measure for use among college students. The study both expands and adds support to the existing body of math anxiety literature.

  15. Analysis of stationary availability factor of two-level backbone computer networks with arbitrary topology

    Science.gov (United States)

    Rahman, P. A.

    2018-05-01

    This scientific paper deals with two-level backbone computer networks with arbitrary topology. A specialized method, offered by the author, for calculating the stationary availability factor of such networks is discussed; it is based on Markov reliability models for a set of independent repairable elements with given failure and repair rates, combined with methods of discrete mathematics. A specialized algorithm, offered by the author, for analyzing network connectivity while taking into account different kinds of network equipment failures is also described. Finally, the paper presents an example of calculating the stationary availability factor for a backbone computer network with a given topology.
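
    For a single repairable element with failure rate λ and repair rate μ, the Markov model gives the stationary availability A = μ/(λ + μ); network-level availability then follows from the topology. A minimal sketch under that assumption, with an invented two-level series/parallel example rather than the paper's connectivity algorithm:

```python
# Sketch: stationary availability of independent repairable elements.
def availability(failure_rate, repair_rate):
    return repair_rate / (failure_rate + repair_rate)

def series(*avails):               # all elements must be up
    p = 1.0
    for a in avails:
        p *= a
    return p

def parallel(*avails):             # at least one element up
    p = 1.0
    for a in avails:
        p *= (1.0 - a)
    return 1.0 - p

backbone = availability(1e-4, 1e-2)      # backbone switch (rates per hour)
link_a = availability(5e-4, 2e-2)        # redundant access links
link_b = availability(5e-4, 2e-2)
print(series(backbone, parallel(link_a, link_b)))
```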

  16. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantified risk to enhance the degree of objectivity, in finance for instance, developed quite in parallel with its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (χ² = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
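
    The Hosmer and Lemeshow statistic quoted above can be computed by grouping predicted probabilities into deciles and comparing observed with expected event counts. A sketch on synthetic data, assuming the usual decile-based form of the test (not the paper's dataset):

```python
# Sketch: Hosmer-Lemeshow goodness-of-fit for a fitted logistic regression.
import numpy as np
import pandas as pd
from scipy.stats import chi2

def hosmer_lemeshow(y_true, y_prob, groups=10):
    df = pd.DataFrame({"y": y_true, "p": y_prob})
    df["decile"] = pd.qcut(df["p"], groups, duplicates="drop")
    obs = df.groupby("decile", observed=True)["y"].agg(["sum", "count"])
    exp = df.groupby("decile", observed=True)["p"].sum()
    # HL statistic: sum over groups of (O - E)^2 / (E * (1 - E/n)).
    hl = (((obs["sum"] - exp) ** 2) / (exp * (1 - exp / obs["count"]))).sum()
    dof = len(obs) - 2
    return hl, chi2.sf(hl, dof)

rng = np.random.default_rng(0)
p = rng.uniform(0.05, 0.95, 500)       # predicted probabilities
y = rng.binomial(1, p)                 # outcomes consistent with them
print(hosmer_lemeshow(y, p))           # large p-value => good fit
```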

  17. Application of FEM analysis methods to a cylinder-cylinder intersection structure

    International Nuclear Information System (INIS)

    Xue Liping; Widera, G.E.O.; Sang Zhifu

    2005-01-01

    The objective of this paper is to study a particular cylindrical shell intersection (d/D=0.526) by use of both linear elastic and elastic-plastic stress analyses via the finite element method using the FEA software ANSYS. The former mainly focuses on the calculation of the stress concentration and flexibility factors in the intersection area before the structure experiences plastic behavior. When an elastic-plastic analysis method is employed, the limit load and burst pressure need to be determined. In this study, two different methods, the 'double elastic-slope method' and the 'tangent intersection method' are both employed to determine the limit pressure. To predict the burst pressure and failure location, the 'arc-length method' in ANSYS is used to solve the nonlinear problem. Finally, the FEA results are compared to experimental data and the agreement is shown to be good. (authors)

  18. Radiation analysis devices, radiation analysis methods, and articles of manufacture

    Science.gov (United States)

    Roybal, Lyle Gene

    2010-06-08

    Radiation analysis devices include circuitry configured to determine respective radiation count data for a plurality of sections of an area of interest and to combine the radiation count data of individual sections to determine whether a selected radioactive material is present in the area of interest. An amount of the radiation count data for an individual section is insufficient to determine whether the selected radioactive material is present in that individual section. An article of manufacture includes media comprising programming configured to cause processing circuitry to perform processing comprising: determining one or more correction factors based on a calibration of a radiation analysis device, measuring radiation received by the radiation analysis device using the one or more correction factors, and presenting information relating to an amount of radiation measured by the radiation analysis device having one of a plurality of specified radiation energy levels of a range of interest.

  19. Feasibility analysis of EDXRF method to detect heavy metal pollution in ecological environment

    Science.gov (United States)

    Hao, Zhixu; Qin, Xulei

    2018-02-01

    Changes in the heavy metal content of the water environment, soil and plants reflect changes in heavy metal pollution of the ecological environment, so it is important to monitor the trend of heavy metal pollution in the eco-environment using the heavy metal content of water, soil and plants. However, the content of heavy metals in nature is very low, the background elements of water, soil and plant samples are complex, and there are many interfering factors in the EDXRF system that affect the spectral analysis results and reduce detection accuracy. Through a comparative analysis of several heavy metal detection methods, it is concluded that the EDXRF method is superior to other chemical methods in detection accuracy and feasibility when heavy metal pollution in soil is tested in the ecological environment.

  20. A Rasch and factor analysis of the Functional Assessment of Cancer Therapy-General (FACT-G

    Directory of Open Access Journals (Sweden)

    Selby Peter J

    2007-04-01

    Full Text Available Abstract Background Although the Functional Assessment of Cancer Therapy – General questionnaire (FACT-G) has been validated, few studies have explored the factor structure of the instrument, in particular using non-sample-dependent measurement techniques such as Rasch models. Furthermore, few studies have explored the relationship between item fit to the Rasch model and clinical utility. The aim of this study was to investigate the dimensionality and measurement properties of the FACT-G with Rasch models and factor analysis. Methods A factor analysis and Rasch analysis (Partial Credit Model) were carried out on the FACT-G completed by a heterogeneous sample of cancer patients (n = 465). For the Rasch analysis, item fit (infit mean squares ≥ 1.30), dimensionality and item invariance were assessed. The impact of removing misfitting items on the clinical utility of the subscales and FACT-G total scale was also assessed. Results The factor analysis demonstrated a four-factor structure of the FACT-G which broadly corresponded to the four subscales of the instrument. Internal consistency for these four scales was very good (Cronbach's alpha 0.72-0.85). The Rasch analysis demonstrated that each of the subscales and the FACT-G total scale had misfitting items (infit mean squares ≥ 1.30). All these scales, with the exception of the Social & Family Well-being scale (SFWB), were unidimensional. When misfitting items were removed, the effect sizes and the clinical utility of the instrument were maintained for the subscales and the total FACT-G scores. Conclusion The results of the traditional factor analysis and Rasch analysis of the FACT-G broadly agreed. Caution should be exercised when utilising the Social & Family Well-being scale, and further work is required to determine whether this scale is best represented by two factors. Additionally, removing misfitting items from scales should be performed alongside an assessment of the impact on clinical utility.

  1. Methods for Risk Analysis

    International Nuclear Information System (INIS)

    Alverbro, Karin

    2010-01-01

    Many decision-making situations today affect humans and the environment. In practice, many such decisions are made without an overall view and prioritise one or other of the two areas. Now and then these two areas of regulation come into conflict, e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective and vice versa. This report was prepared within a major project with the aim of developing a framework in which both the environmental aspects and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. The safety risks have to be analysed in order to be successfully avoided and one way of doing this is to use different kinds of risk analysis methods. There is an abundance of existing methods to choose from and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects

  2. Excitation methods for energy dispersive analysis

    International Nuclear Information System (INIS)

    Jaklevic, J.M.

    1976-01-01

    The rapid development in recent years of energy dispersive x-ray fluorescence analysis has been based primarily on improvements in semiconductor detector x-ray spectrometers. However, the whole analysis system performance is critically dependent on the availability of optimum methods of excitation for the characteristic x rays in specimens. A number of analysis facilities based on various methods of excitation have been developed over the past few years. A discussion is given of the features of various excitation methods including charged particles, monochromatic photons, and broad-energy band photons. The effects of the excitation method on background and sensitivity are discussed from both theoretical and experimental viewpoints. Recent developments such as pulsed excitation and polarized photons are also discussed

  3. Resonance self-shielding method using resonance interference factor library for practical lattice physics computations of LWRs

    International Nuclear Information System (INIS)

    Choi, Sooyoung; Khassenov, Azamat; Lee, Deokjung

    2016-01-01

    This paper presents a new method of resonance interference effect treatment using resonance interference factors for high-fidelity analysis of light water reactors (LWRs). Although there have been significant improvements in lattice physics calculations over the past several decades, relatively large errors still exist in the resonance interference treatment, on the order of ~300 pcm in the reactivity prediction of LWRs. In the newly developed method, the impact of resonance interference on the multi-group cross-sections has been quantified and tabulated in a library which can be used in lattice physics calculations as adjustment factors for multi-group cross-sections. The verification of the new method has been performed with the Mosteller benchmark, UO_2 and MOX pin-cell depletion problems, and a 17×17 fuel assembly loaded with gadolinia burnable poison; significant improvements were demonstrated in the accuracy of reactivity and pin power predictions, with reactivity errors down to the order of ~100 pcm. (author)
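
    The library-based correction amounts to multiplying the self-shielded multi-group cross-sections, evaluated for each resonant nuclide alone, by pre-tabulated interference factors. A schematic illustration with invented numbers, not values from the paper's library:

```python
# Sketch of the library-lookup idea behind resonance interference factors.
import numpy as np

# Self-shielded multi-group absorption XS of U-238, computed without
# interference from other resonant nuclides (barns, 3 groups shown; made up).
xs_u238_alone = np.array([2.91, 18.4, 45.2])

# Tabulated interference factors for U-238 in the presence of another
# resonant nuclide at a given background XS and temperature (hypothetical).
rif_u238 = np.array([0.97, 0.88, 0.93])

# Interference-corrected cross-sections used by the lattice code.
xs_corrected = xs_u238_alone * rif_u238
print(xs_corrected)
```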

  4. Noise removal using factor analysis of dynamic structures: application to cardiac gated studies.

    Science.gov (United States)

    Bruyant, P P; Sau, J; Mallet, J J

    1999-10-01

    Factor analysis of dynamic structures (FADS) facilitates the extraction of relevant data, usually with physiologic meaning, from a dynamic set of images. The result of this process is a set of factor images and curves plus some residual activity. The set of factor images and curves can be used to retrieve the original data with reduced noise using an inverse factor analysis process (iFADS). This improvement in image quality is expected because the inverse process does not use the residual activity, which is assumed to consist of noise. The goal of this work is to quantitate and assess the efficiency of this method on gated cardiac images. A computer simulation of a planar cardiac gated study was performed. Noise was added to the simulated images, which were then processed by the FADS-iFADS program. The signal-to-noise ratios (SNRs) were compared between original and processed data. Planar gated cardiac studies from 10 patients were tested. The data processed by FADS-iFADS were subtracted from the original data, and the result of the subtraction was studied to evaluate its noisy nature. The SNR is about five times greater after the FADS-iFADS process. The difference between original and processed data is noise only, i.e., processed data equal original data minus some white noise. The FADS-iFADS process is successful in removing an important part of the noise and is therefore a tool to improve the image quality of cardiac images. This tool does not decrease the spatial resolution (compared with smoothing filters) and does not lose details (compared with frequency filters). Once the number of factors is chosen, this method is not operator dependent.
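
    The denoising logic (keep the factor structure, discard the residual) can be imitated with a generic low-rank surrogate. A sketch using truncated SVD in place of the FADS factor model, on a synthetic two-factor gated series:

```python
# Sketch: low-rank reconstruction of a dynamic image series discards the
# residual (assumed noise), mimicking the FADS/iFADS round trip.
import numpy as np

rng = np.random.default_rng(1)
frames, pixels, n_factors = 16, 1024, 2

# Synthetic "gated study": two temporal curves mixing two spatial images.
curves = np.stack([np.sin(np.linspace(0, np.pi, frames)),
                   np.cos(np.linspace(0, np.pi, frames))])
images = rng.random((2, pixels))
clean = curves.T @ images
noisy = clean + rng.normal(0, 0.2, clean.shape)

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
denoised = U[:, :n_factors] * s[:n_factors] @ Vt[:n_factors]

snr = lambda x: clean.std() / (x - clean).std()
print(f"SNR before: {snr(noisy):.1f}, after: {snr(denoised):.1f}")
```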

  5. Economic cost analysis of service quality as a factor of sustainable development of enterprises

    Directory of Open Access Journals (Sweden)

    Gritsenko Olena Ivanivna

    2015-02-01

    Full Text Available The article considers the possibilities of economic analysis in the context of improving service quality and effectively managing the related costs. An analysis of existing principles of the theory of cognition that are directly related to the economic analysis of quality is conducted. The basic factors in forming and improving service quality, needed to ensure the competitiveness of business entities in domestic and foreign commodity markets, are considered, and the main factors affecting the amount and level of spending on service quality are determined. It is shown that the costs of service quality are a complex economic category, and that the existing methods of accounting generally do not allow such costs to be isolated directly and exactly within enterprise structures. For this purpose it is necessary to conduct a concrete and detailed (empirical) analysis of the structure of these costs and their elements.

  6. Analysis and optimization of the TWINKLE factoring device

    NARCIS (Netherlands)

    Lenstra, A.K.; Shamir, A.; Preneel, B.

    2000-01-01

    We describe an enhanced version of the TWINKLE factoring device and analyse to what extent it can be expected to speed up the sieving step of the Quadratic Sieve and Number Field Sieve factoring algorithms. The bottom line of our analysis is that the TWINKLE-assisted factorization of 768-bit

  7. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA, as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater.

  8. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA, as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose

  9. Identifying Critical Factors in the Eco-Efficiency of Remanufacturing Based on the Fuzzy DEMATEL Method

    Directory of Open Access Journals (Sweden)

    Qianwang Deng

    2015-11-01

    Full Text Available Remanufacturing can bring considerable economic and environmental benefits such as cost savings, conservation of energy and resources, and reduction of emissions. With the increasing awareness of sustainable manufacturing, remanufacturing has gradually become a research priority. Most studies concentrate on the analysis of influencing factors, or on the evaluation of the economic and environmental performance of remanufacturing, while little effort has been devoted to investigating the critical factors influencing the eco-efficiency of remanufacturing. Considering the current development of the remanufacturing industry in China, this paper proposes a set of factors influencing the eco-efficiency of remanufacturing and then utilizes a fuzzy Decision Making Trial and Evaluation Laboratory (DEMATEL) method to establish relation matrices reflecting the interdependent relationships among these factors. Finally, the contributions of each factor to eco-efficiency and the mutual influence values among them are obtained, and the critical factors in the eco-efficiency of remanufacturing are identified. The results of the present work can provide theoretical support for the government to make appropriate policies to improve the eco-efficiency of remanufacturing.
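
    The DEMATEL computation itself is compact: normalize the direct-relation matrix, form the total-relation matrix T = N(I - N)⁻¹, and rank factors by prominence (R + C) and relation (R - C). A sketch with an invented crisp matrix standing in for the paper's defuzzified fuzzy judgments:

```python
# Sketch: core DEMATEL computation for ranking influencing factors.
import numpy as np

# Hypothetical 4x4 direct-influence judgments among factors (0..4 scale).
A = np.array([[0, 3, 2, 1],
              [1, 0, 3, 2],
              [2, 1, 0, 3],
              [1, 2, 1, 0]], dtype=float)

N = A / max(A.sum(axis=1).max(), A.sum(axis=0).max())   # normalize
T = N @ np.linalg.inv(np.eye(4) - N)                    # total relation matrix
R, C = T.sum(axis=1), T.sum(axis=0)                     # row/column influence
for i, (prom, rel) in enumerate(zip(R + C, R - C)):
    role = "cause" if rel > 0 else "effect"
    print(f"factor {i}: prominence={prom:.2f}, relation={rel:+.2f} ({role})")
```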

  10. A Bayesian Nonparametric Approach to Factor Analysis

    DEFF Research Database (Denmark)

    Piatek, Rémi; Papaspiliopoulos, Omiros

    2018-01-01

    This paper introduces a new approach for the inference of non-Gaussian factor models based on Bayesian nonparametric methods. It relaxes the usual normality assumption on the latent factors, widely used in practice, which is too restrictive in many settings. Our approach, on the contrary, does not...

  11. A Novel Double Cluster and Principal Component Analysis-Based Optimization Method for the Orbit Design of Earth Observation Satellites

    Directory of Open Access Journals (Sweden)

    Yunfeng Dong

    2017-01-01

    Full Text Available The weighted sum and genetic algorithm-based hybrid method (WSGA-based HM), which has been applied to multiobjective orbit optimizations, is negatively influenced by human factors, through the artificial choice of the weight coefficients in the weighted sum method, and by the slow convergence of GA. To address these two problems, a cluster and principal component analysis-based optimization method (CPC-based OM) is proposed, in which many candidate orbits are gradually randomly generated until the optimal orbit is obtained using a data mining method, that is, cluster analysis based on principal components. Then, a second cluster analysis of the orbital elements is introduced into CPC-based OM to improve the convergence, yielding a novel double cluster and principal component analysis-based optimization method (DCPC-based OM). In DCPC-based OM, the cluster analysis based on principal components has the advantage of reducing the human influences, and the cluster analysis based on the six orbital elements can reduce the search space to effectively accelerate convergence. The test results from a multiobjective numerical benchmark function and from the orbit design of an Earth observation satellite show that DCPC-based OM converges more efficiently than WSGA-based HM, and that it reduces, to some degree, the influence of the human factors present in WSGA-based HM.
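
    The core CPC step (cluster candidate solutions on principal components of their objectives and keep the best cluster) can be sketched with scikit-learn. The objective values and orbital-element ranges below are invented, and this omits the paper's iteration loop and second elemental clustering:

```python
# Sketch: one selection round of clustering on principal components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
# 500 random candidate orbits: a, e, i, RAAN, arg. of perigee, mean anomaly.
candidates = rng.uniform([6900, 0.0, 0, 0, 0, 0],
                         [7500, 0.05, 180, 360, 360, 360], size=(500, 6))
objectives = rng.random((500, 3))      # e.g. coverage, revisit, lifetime

scores = PCA(n_components=2).fit_transform(objectives)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(scores)

# Keep the cluster whose mean first principal component is best (smallest),
# then densify the random search there on the next iteration.
best = min(range(5), key=lambda k: scores[labels == k, 0].mean())
survivors = candidates[labels == best]
print(survivors.shape)
```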

  12. Transforming Rubrics Using Factor Analysis

    Science.gov (United States)

    Baryla, Ed; Shelley, Gary; Trainor, William

    2012-01-01

    Student learning and program effectiveness is often assessed using rubrics. While much time and effort may go into their creation, it is equally important to assess how effective and efficient the rubrics actually are in terms of measuring competencies over a number of criteria. This study demonstrates the use of common factor analysis to identify…

  13. Rosenberg's Self-Esteem Scale: Two Factors or Method Effects.

    Science.gov (United States)

    Tomas, Jose M.; Oliver, Amparo

    1999-01-01

    Results of a study with 640 Spanish high school students suggest the existence of a global self-esteem factor underlying responses to Rosenberg's (M. Rosenberg, 1965) Self-Esteem Scale, although the inclusion of method effects is needed to achieve a good model fit. Method effects are associated with item wording. (SLD)

  14. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

    It is required to refine human reliability analysis (HRA) methods by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis errors and decision-making errors, as part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHENA which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator cognition process. This report summarizes outcomes obtained from the improvement of the HRA method, in which enhancements were made to evaluate how degraded plant conditions affect the operator cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences to investigate the applicability of the HRA method developed. HEPs of the same accident sequences are also estimated using the THERP method, the most widely used HRA method, and the results obtained using the two methods are compared to depict their differences and the issues to be solved. Important conclusions obtained are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Clarification of factors to be considered in the evaluation of human errors, incorporation of degraded plant safety conditions into HRA, and investigation of HEPs affected by the contents of operator tasks were carried out to improve the HRA method, which integrates an operator cognitive action model into the ATHENA method. In addition, the detailed procedure of the improved method was delineated in the form of a flowchart. (2) Case studies and comparison with the results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies. These cases were also evaluated using

  15. On the use of the hybrid causal logic method in offshore risk analysis

    International Nuclear Information System (INIS)

    Roed, Willy; Mosleh, Ali; Vinnem, Jan Erik; Aven, Terje

    2009-01-01

    In the Norwegian offshore oil and gas industry, risk analyses have been used to provide decision support for more than 20 years. The focus has traditionally been on the planning phase, but in recent years a need for better risk analysis methods for the operational phase has been identified. Such methods should take human and organizational factors into consideration in a more explicit way than traditional risk analysis methods do. Recently, a framework called hybrid causal logic (HCL) has been developed based on traditional risk analysis tools combined with Bayesian belief networks (BBNs), using the aviation industry as a case. This paper reviews this framework and discusses its applicability to the offshore industry, as well as its relationship to existing research projects, such as the barrier and operational risk analysis (BORA) project. The paper also addresses specific features of the framework and suggests a new approach for the probability assignment process. This approach simplifies the assignment process considerably without losing the flexibility that is needed to properly reflect the phenomena being studied.

  16. A Factor Analysis of the BSRI and the PAQ.

    Science.gov (United States)

    Edwards, Teresa A.; And Others

    Factor analysis of the Bem Sex Role Inventory (BSRI) and the Personality Attributes Questionnaire (PAQ) was undertaken to study the independence of the masculine and feminine scales within each instrument. Both instruments were administered to undergraduate education majors. Analysis of primary first and second order factors of the BSRI indicated…

  17. Identification of noise in linear data sets by factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, Ph.K.

    1982-01-01

    A technique which has the ability to identify bad data points, after the data has been generated, is classical factor analysis. The ability of classical factor analysis to identify two different types of data errors makes it ideally suited for scanning large data sets. Since the results yielded by factor analysis indicate correlations between parameters, one must know something about the nature of the data set and the analytical techniques used to obtain it to confidently isolate errors. (author)

  18. Parametric study on single shot peening by dimensional analysis method incorporated with finite element method

    Science.gov (United States)

    Wu, Xian-Qian; Wang, Xi; Wei, Yan-Peng; Song, Hong-Wei; Huang, Chen-Guang

    2012-06-01

    Shot peening is a widely used surface treatment method that generates compressive residual stress near the surface of metallic materials to increase fatigue life and resistance to corrosion fatigue, cracking, etc. Compressive residual stress and dent profile are important factors in evaluating the effectiveness of the shot peening process. In this paper, the influence of dimensionless parameters on the maximum compressive residual stress and the maximum depth of the dent was investigated. Firstly, dimensionless relations of the processing parameters that affect the maximum compressive residual stress and the maximum depth of the dent were deduced by the dimensional analysis method. Secondly, the influence of each dimensionless parameter on the dimensionless variables was investigated by the finite element method. Furthermore, related empirical formulas were given for each dimensionless parameter based on the simulation results. Finally, a comparison was made and good agreement was found between the simulation results and the empirical formulas, which shows that a useful approach is provided in this paper for analyzing the influence of each individual parameter.

  19. The Dematel Method in the Analysis of the Residential Real Estate Market in Bialystok

    Directory of Open Access Journals (Sweden)

    Gołąbeska Elżbieta

    2018-03-01

    Full Text Available The article is of a dual character. On the one hand, it concerns the analysis of the real estate market in terms of investment attractiveness and the influence of particular attributes of a property on that attractiveness. On the other hand, it proposes as a research tool one of the multicriteria methods (DEMATEL), which is an alternative to commonly used statistical methods. The theoretical part of the work is completed by a brief computational example dealing with the analysis of factors affecting the prices of residential real estate on a local scale.

  20. Behavioral determinants of cardiovascular diseases risk factors: A qualitative directed content analysis.

    Science.gov (United States)

    Sabzmakan, Leila; Morowatisharifabad, Mohammad Ali; Mohammadi, Eesa; Mazloomy-Mahmoodabad, Seid Saied; Rabiei, Katayoun; Naseri, Mohammad Hassan; Shakibazadeh, Elham; Mirzaei, Masoud

    2014-03-01

    The PRECEDE model is a useful tool for planners to assess health problems, the behavioral and environmental causes of the problems, and their determinants. This study aims to understand the experiences of patients and health care providers regarding the behavioral causes of cardiovascular disease (CVD) risk factors and their determinants. This qualitative study utilized a directed content analysis approach based on the PRECEDE model. The study was conducted over 6 months in 2012 at the diabetes units of health centers associated with Alborz University of Medical Sciences, located in Karaj, Iran. Data were collected using individual semi-structured interviews with 50 patients and 12 health care providers. Data analysis was performed simultaneously with data collection using the directed content analysis method. Stress, unhealthy eating, and physical inactivity were the behaviors that predict the risk factors for CVD. Most of the patients considered stress as the most important underlying cause of their illness. In this study, 110 primary codes were categorized into seven subcategories, including knowledge, attitude, perceived susceptibility, severity, perceived benefits, barriers, and self-efficacy, which were located in the predisposing category of the PRECEDE model. Among these determinants, perceived barriers and self-efficacy for the mentioned behaviors seemed to be of great importance. Identifying behavioral determinants will help planners design future programs and select the most appropriate methods and applications to address these determinants in order to reduce risky behaviors.

  1. In depth analysis of motivational factors at work in the health industry

    Directory of Open Access Journals (Sweden)

    Sukhminder Jit Singh Bajwa

    2010-01-01

    Full Text Available Background: Motivation of health workers is necessary to generate organizational commitment towards the patients and the hospital, and therefore knowledge about what motivates and satisfies them is essential. The aim of the project was to investigate and analyze the various factors that help motivate health workers while performing their clinical duties in the hospital. Materials and Methods: A simple random study was conducted among 100 employees of our institute, including doctors, staff nurses and paramedical staff; the 100 employees of Gian Sagar Institute were chosen randomly for the purpose of the study. All employees were questioned using the questionnaire method as well as through individual interviews regarding the various motivating and demotivating factors at the workplace. Detailed enquiries were made regarding the various aspects of job factors and work satisfaction, and all answers and findings were observed and recorded. Results: Simple non-parametric statistics such as means, percentages and chi-square tests were employed to analyze the data. The demographic profiles of the employees showed only minor differences, which were statistically non-significant. Skills, task identity, task significance, autonomy, feedback, environment, job security and compensation were observed to be important factors for the motivation of employees. The depth and extent to which these factors were at work in the hospital showed remarkable differences. Conclusion: All the factors studied in this project are an essential basis for organizational commitment, but feedback represents the factor with the highest motivation potential, especially among the younger population.

  2. Analysis of Traffic Crashes Involving Pedestrians Using Big Data: Investigation of Contributing Factors and Identification of Hotspots.

    Science.gov (United States)

    Xie, Kun; Ozbay, Kaan; Kurkcu, Abdullah; Yang, Hong

    2017-08-01

    This study aims to explore the potential of using big data in advancing pedestrian risk analysis, including the investigation of contributing factors and hotspot identification. Massive amounts of data for Manhattan from a variety of sources were collected, integrated, and processed, including taxi trips, subway turnstile counts, traffic volumes, road network, land use, sociodemographic, and social media data. The whole study area was uniformly split into grid cells as the basic geographical units of analysis. The cell-structured framework makes it easy to incorporate rich and diversified data into risk analysis. The cost of each crash, weighted by injury severity, was assigned to the cells based on the relative distance to the crash site using a kernel density function. A tobit model was developed to relate grid-cell-specific contributing factors to crash costs that are left-censored at zero. The potential for safety improvement (PSI), obtained by subtracting the cost of "similar" sites estimated by the tobit model from the actual crash cost, was used as a measure to identify and rank pedestrian crash hotspots. The proposed hotspot identification method takes into account two important factors that are generally ignored, i.e., injury severity and effects of exposure indicators. Big data, on the one hand, enable more precise estimation of the effects of risk factors by providing richer data for modeling, and on the other hand, enable large-scale hotspot identification with higher resolution than conventional methods based on census tracts or traffic analysis zones. © 2017 Society for Risk Analysis.
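
    The kernel-based assignment of severity-weighted crash costs to grid cells can be sketched directly; the coordinates, costs, and Gaussian bandwidth below are illustrative, not the Manhattan data:

```python
# Sketch: spread each crash's severity-weighted cost over nearby grid cells
# with a distance-decay (Gaussian) kernel, yielding the cell-level response.
import numpy as np

cell_xy = np.stack(np.meshgrid(np.arange(20), np.arange(20)), -1).reshape(-1, 2)
crash_xy = np.array([[3.2, 4.1], [3.5, 4.4], [15.0, 9.7]])
crash_cost = np.array([1.0, 5.0, 3.0])     # weighted by injury severity
bandwidth = 1.5

d2 = ((cell_xy[:, None, :] - crash_xy[None, :, :]) ** 2).sum(-1)
kernel = np.exp(-d2 / (2 * bandwidth ** 2))
kernel /= kernel.sum(axis=0)               # each crash distributes its full cost
cell_cost = kernel @ crash_cost            # left-censored-at-zero tobit response
print(cell_cost.reshape(20, 20).round(2))
```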

  3. Factor analysis for the adoption of nuclear technology in diagnosis and treatment of chronic diseases

    International Nuclear Information System (INIS)

    Sato, Renato Cesar; Zouain, Desiree Moraes

    2012-01-01

    To identify and evaluate latent variables (variables that are not directly observed) in the adoption and use of nuclear technologies for the diagnosis and treatment of chronic diseases. The measurement and management of these latent factors are important for health care due to the complexities of the sector. Methods: An exploratory factor analysis study was conducted among 52 physicians practicing in the areas of Cardiology, Neurology and Oncology in the State of Sao Paulo who agreed to participate in the study between 2009 and 2010. Data were collected using an attitude measurement questionnaire and analyzed according to the principal component method with Varimax rotation. Results: The component matrix after factor rotation showed three elucidative groups arranged according to demand for nuclear technology: clinical factors, structural factors, and technological factors. Clinical factors included questionnaire answers referring to medical history, previous interventions, and complexity and chronicity of the disease. Structural factors included patient age, the physician's practice area, and payment ability. Technological factors included prospective growth in the use of nuclear technology and availability of services. Conclusions: The clinical factors dimension identified in the study included patient history, prior interventions, and complexity and chronicity of the disease. This dimension is the main motivation for adopting nuclear technology in the diagnosis and treatment of chronic diseases. (author)
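
    The extraction step (principal-component factoring with varimax rotation) is a few lines with the third-party factor_analyzer package; its API is assumed here, and the questionnaire data frame is a synthetic stand-in for the 52 physicians' responses:

```python
# Sketch: principal-component factoring with varimax rotation.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor_analyzer

rng = np.random.default_rng(3)
items = pd.DataFrame(rng.normal(size=(52, 12)),
                     columns=[f"q{i}" for i in range(1, 13)])

fa = FactorAnalyzer(n_factors=3, method="principal", rotation="varimax")
fa.fit(items)
# Rotated loadings; the factor names mirror the abstract's three groups.
loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=["clinical", "structural", "technological"])
print(loadings.round(2))
```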

  4. "Factor Analysis Using ""R"""

    Directory of Open Access Journals (Sweden)

    A. Alexander Beaujean

    2013-02-01

    Full Text Available R (R Development Core Team, 2011) is a very powerful tool for analyzing data that is gaining in popularity due to its cost (it is free) and flexibility (it is open-source). This article gives a general introduction to using R (i.e., loading the program, using functions, importing data). Then, using data from Canivez, Konold, Collins, and Wilson (2009), this article walks the user through how to use the program to conduct factor analysis, from both an exploratory and a confirmatory approach.

  5. Parametric Methods for Order Tracking Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm

    2017-01-01

    Order tracking analysis is often used to find the critical speeds at which structural resonances are excited by a rotating machine. Typically, order tracking analysis is performed via non-parametric methods. In this report, however, we demonstrate some of the advantages of using a parametric method...

  6. Frequency Analysis Using Bootstrap Method and SIR Algorithm for Prevention of Natural Disasters

    Science.gov (United States)

    Kim, T.; Kim, Y. S.

    2017-12-01

    The frequency analysis of hydrometeorological data is one of the most important factors in responding to natural disaster damage and in setting design standards for disaster prevention facilities. Frequency analysis of hydrometeorological data assumes that the observations are statistically stationary, and a parametric method considering the parameters of a probability distribution is applied. A parametric method requires sufficient reliable data; however, supplementary approaches are needed for snowfall observations in Korea, because the number of snowfall observation days and the mean maximum daily snowfall depth are decreasing due to climate change. In this study, we conducted frequency analysis for snowfall using the bootstrap method and the SIR algorithm, resampling methods that can overcome the problem of insufficient data. For the 58 meteorological stations distributed evenly across Korea, the probability snowfall depth was estimated by non-parametric frequency analysis using the maximum daily snowfall depth data. The results show that the probabilistic daily snowfall depth from frequency analysis decreases at most stations, and the rates of change at most stations were found to be consistent between the parametric and non-parametric frequency analyses. This study shows that resampling methods can perform frequency analysis of snowfall depth from insufficient observed samples, which can be applied to the interpretation of other natural disasters such as summer typhoons with seasonal characteristics. Acknowledgment: This research was supported by a grant (MPSS-NH-2015-79) from the Disaster Prediction and Mitigation Technology Development Program funded by the Korean Ministry of Public Safety and Security (MPSS).
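
    The bootstrap side of such an analysis reduces to resampling the observed maxima and re-estimating the design quantile. A sketch for a hypothetical 100-year snowfall depth; the record is synthetic and the SIR step is omitted:

```python
# Sketch: nonparametric bootstrap of a design snowfall depth (the 99th
# percentile of annual maxima, i.e. the 100-year event) from a short record.
import numpy as np

rng = np.random.default_rng(4)
annual_max = rng.gumbel(15.0, 5.0, size=30)   # 30 years of max daily snowfall (cm)

B, p = 5000, 1 - 1 / 100.0                    # resamples; 100-year quantile
est = np.array([np.quantile(rng.choice(annual_max, annual_max.size, replace=True), p)
                for _ in range(B)])
print(f"100-yr depth: {est.mean():.1f} cm "
      f"(90% CI {np.quantile(est, 0.05):.1f}-{np.quantile(est, 0.95):.1f})")
```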

  7. Dysmenorrhea Characteristics of Female Students of Health School and Affecting Factors and Their Knowledge and Use of Complementary and Alternative Medicine Methods.

    Science.gov (United States)

    Midilli, Tulay Sagkal; Yasar, Eda; Baysal, Ebru

    2015-01-01

    The purpose of this study was to examine the menstruation and dysmenorrhea characteristics of health school students and the factors affecting dysmenorrhea, as well as the knowledge and use of complementary and alternative medicine (CAM) methods by those students with dysmenorrhea. This is a descriptive study. A descriptive analysis was made by calculating numbers, percentages and means, with Pearson χ² and logistic regression analysis. A total of 488 female students participated in the research, and 87.7% (n = 428) of all students experienced dysmenorrhea. It was detected that a family history of dysmenorrhea and regular menstrual cycles of the students were dysmenorrhea-affecting factors (P < .05), and students with dysmenorrhea used CAM methods. Heat application was the CAM method for dysmenorrhea management most commonly used and best known by the students. The students who experienced severe pain used analgesics (P < .05) and CAM methods (P < .05).

  8. A load factor based mean-variance analysis for fuel diversification

    Energy Technology Data Exchange (ETDEWEB)

    Gotham, Douglas; Preckel, Paul; Ruangpattana, Suriya [State Utility Forecasting Group, Purdue University, West Lafayette, IN (United States); Muthuraman, Kumar [McCombs School of Business, University of Texas, Austin, TX (United States); Rardin, Ronald [Department of Industrial Engineering, University of Arkansas, Fayetteville, AR (United States)

    2009-03-15

    Fuel diversification implies the selection of a mix of generation technologies for long-term electricity generation. The goal is to strike a good balance between reduced costs and reduced risk. The method of analysis that has been advocated and adopted for such studies is the mean-variance portfolio analysis pioneered by Markowitz (Markowitz, H., 1952. Portfolio selection. Journal of Finance 7(1), 77-91). However, the standard mean-variance methodology does not account for the ability of various fuels/technologies to adapt to varying loads. Such analysis often provides results that are easily dismissed by regulators and practitioners as unacceptable, since load cycles play critical roles in fuel selection. To account for such issues and still retain the convenience and elegance of the mean-variance approach, we propose a variant of the mean-variance analysis using a decomposition of the load into various types and utilizing the load factors of each load type. We also illustrate the approach using data for the state of Indiana and demonstrate the ability of the model to provide useful insights. (author)
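
    Before the load-factor decomposition, the underlying mean-variance step is an equality-constrained quadratic program with a closed-form solution. A sketch with invented cost means and covariances, not the Indiana data:

```python
# Sketch: minimum-variance generation mix for a target expected cost,
# solved in closed form via Lagrange multipliers.
import numpy as np

mu = np.array([45.0, 60.0, 30.0])            # $/MWh: coal, gas, nuclear (made up)
cov = np.array([[25.0,  5.0,  2.0],
                [ 5.0, 90.0,  1.0],
                [ 2.0,  1.0, 16.0]])         # cost covariance (made up)

target = 45.0                                # desired expected cost
ones = np.ones(3)
inv = np.linalg.inv(cov)
# Stationarity gives w = inv(cov) @ (l1*mu + l2*1); solve for l1, l2 from
# the two constraints mu'w = target and 1'w = 1.
A = np.array([[mu @ inv @ mu, mu @ inv @ ones],
              [ones @ inv @ mu, ones @ inv @ ones]])
lam = np.linalg.solve(A, np.array([target, 1.0]))
w = inv @ (lam[0] * mu + lam[1] * ones)
print(w.round(3), (w @ cov @ w) ** 0.5)      # mix shares and cost std. dev.
```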

  9. Risk factors for severity of pneumothorax after CT-guided percutaneous lung biopsy using the single-needle method.

    Science.gov (United States)

    Kakizawa, Hideaki; Toyota, Naoyuki; Hieda, Masashi; Hirai, Nobuhiko; Tachikake, Toshihiro; Matsuura, Noriaki; Oda, Miyo; Ito, Katsuhide

    2010-09-01

    The purpose of this study is to evaluate the risk factors for the severity of pneumothorax after computed tomography (CT)-guided percutaneous lung biopsy using the single-needle method. We reviewed 91 biopsy procedures for 90 intrapulmonary lesions in 89 patients. Patient factors were age, sex, history of ipsilateral lung surgery and grade of emphysema. Lesion factors were size, location and pleural contact. Procedure factors were position, needle type, needle size, number of pleural punctures, pleural angle, length of needle passes in the aerated lung and number of harvesting samples. The severity of pneumothorax after biopsy was classified into 4 groups: "none", "mild", "moderate" and "severe". The risk factors for the severity of pneumothorax were determined by multivariate analysis of the factors derived from univariate analysis. Pneumothorax occurred in 39 (43%) of the 91 procedures. Mild, moderate, and severe pneumothorax occurred in 24 (26%), 8 (9%) and 7 (8%) of all procedures, respectively. Multivariate analysis showed that location, pleural contact, number of pleural punctures and number of harvesting samples were significantly associated with the severity of pneumothorax (p < 0.05). In conclusion, lower locations and non-pleural contact lesions, increased number of pleural punctures and increased number of harvesting samples presented a higher severity of pneumothorax.

  10. Risk factors for severity of pneumothorax after CT-guided percutaneous lung biopsy using the single-needle method

    International Nuclear Information System (INIS)

    Kakizawa, Hideaki; Hieda, Masashi; Oda, Miyo; Toyota, Naoyuki; Hirai, Nobuhiko; Tachikake, Toshihiro; Matsuura, Noriaki; Ito, Katsuhide

    2010-01-01

    The purpose of this study is to evaluate the risk factors for the severity of pneumothorax after computed tomography (CT)-guided percutaneous lung biopsy using the single-needle method. We reviewed 91 biopsy procedures for 90 intrapulmonary lesions in 89 patients. Patient factors were age, sex, history of ipsilateral lung surgery and grade of emphysema. Lesion factors were size, location and pleural contact. Procedure factors were position, needle type, needle size, number of pleural punctures, pleural angle, length of needle passes in the aerated lung and number of harvesting samples. The severity of pneumothorax after biopsy was classified into 4 groups: 'none', 'mild', 'moderate' and 'severe'. The risk factors for the severity of pneumothorax were determined by multivariate analysis of the factors derived from univariate analysis. Pneumothorax occurred in 39 (43%) of the 91 procedures. Mild, moderate, and severe pneumothorax occurred in 24 (26%), 8 (9%) and 7 (8%) of all procedures, respectively. Multivariate analysis showed that location, pleural contact, number of pleural punctures and number of harvesting samples were significantly associated with the severity of pneumothorax (p<0.05). In conclusion, lower locations and non-pleural contact lesions, increased number of pleural punctures and increased number of harvesting samples presented a higher severity of pneumothorax. (author)

  11. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  12. An analytically based numerical method for computing view factors in real urban environments

    Science.gov (United States)

    Lee, Doo-Il; Woo, Ju-Wan; Lee, Sang-Hyun

    2018-01-01

    A view factor is an important morphological parameter used in parameterizing the in-canyon radiative energy exchange process as well as in characterizing local climate over urban environments. For realistic representation of the in-canyon radiative processes, a complete set of view factors at the horizontal and vertical surfaces of urban facets is required. Various analytical and numerical methods have been suggested to determine the view factors for urban environments, but most of the methods provide only the sky-view factor at the ground level of a specific location or assume a simplified morphology of complex urban environments. In this study, a numerical method that can determine the sky-view factors (ψ_ga and ψ_wa) and wall-view factors (ψ_gw and ψ_ww) at the horizontal and vertical surfaces is presented for application to real urban morphology; the method is derived from an analytical formulation of the view factor between two blackbody surfaces of arbitrary geometry. The established numerical method is validated against analytical sky-view factor estimates for ideal street canyon geometries, showing consolidated confidence in accuracy with errors of less than 0.2%. Using a three-dimensional building database, the numerical method is also demonstrated to be applicable to determining the sky-view factors at the horizontal (roofs and roads) and vertical (walls) surfaces in real urban environments. The results suggest that the analytically based numerical method can be used for the radiative process parameterization of urban numerical models as well as for the characterization of local urban climate.
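
    The double-area integral behind such a method, F_12 = (1/A_1) ∫∫ cosθ_1 cosθ_2 / (π r²) dA_1 dA_2, can be checked numerically on a toy geometry. A sketch for two parallel unit squares, for which the analytical value at unit separation is about 0.1998; this is a verification case, not the paper's building-database computation:

```python
# Sketch: midpoint-rule evaluation of the view-factor double integral
# between two parallel, directly opposed unit squares separated by h.
import numpy as np

n, h = 40, 1.0                                # grid resolution, separation
c = (np.arange(n) + 0.5) / n                  # cell centres in [0, 1]
x, y = np.meshgrid(c, c)                      # same grid on both squares
dA = (1.0 / n) ** 2

F = 0.0
for i in range(n):
    for j in range(n):
        r2 = (x - x[i, j]) ** 2 + (y - y[i, j]) ** 2 + h ** 2
        # For parallel faces, cos(theta1) = cos(theta2) = h / r.
        F += (h * h / (np.pi * r2 ** 2)).sum() * dA * dA
print(f"F_12 ~= {F:.4f}")                     # analytic value ~0.1998 (A_1 = 1)
```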

  13. Human factor analysis and preventive countermeasures in nuclear power plant

    International Nuclear Information System (INIS)

    Li Ye

    2010-01-01

    Based on human error analysis theory and the characteristics of maintenance in a nuclear power plant, the human factors of maintenance in an NPP are divided into three different areas: human, technology, and organization. Individual factors include psychological factors, physiological characteristics, health status, level of knowledge and interpersonal skills. Technical factors include technology, equipment, tools, working order, etc. Organizational factors include management, information exchange, education, working environment, team building, leadership management, etc. The analysis found that organizational factors can directly or indirectly affect the behavior of staff as well as the technical factors, and are the most fundamental human error factor. On this basis, countermeasures for reducing human error in nuclear power plants are proposed. (authors)

  14. Advanced method used for hypertension’s risk factors stratification: support ‎vector machines and gravitational search algorithm

    Directory of Open Access Journals (Sweden)

    Alireza Khosravi

    2015-12-01

    Full Text Available BACKGROUND: The aim of this study is to present an objective method based on support vector machines (SVMs) and the gravitational search algorithm (GSA), initially utilized for recognizing the pattern linking risk factors and hypertension (HTN), to stratify and analyze HTN risk factors in an Iranian urban population. METHODS: This community-based, cross-sectional research was designed based on a probabilistic sample of residents of Isfahan, Iran, aged 19 years or over, from 2001 to 2007. One household member was randomly selected from different age groups. Selected individuals were invited to a predefined health center to be educated on how to collect a 24-hour urine sample, as well as for the measurement of topographic parameters and blood pressure. The data from both the estimated and measured blood pressure [for both systolic blood pressure (SBP) and diastolic blood pressure (DBP)] demonstrated that optimized SVMs have the highest estimation potential. RESULTS: This result was particularly evident when SVM performance was evaluated against regression and generalized linear modeling (GLM) as common methods. Blood pressure risk factor impact analysis shows that age has the highest impact level on SBP, while it ranks second in impact on DBP. The results also showed that body mass index (BMI) ranks first in impact on DBP while having a lower impact on SBP. CONCLUSION: Our analysis suggests that salt intake could efficiently influence both DBP and SBP, with a greater impact on SBP. Therefore, controlling salt intake may lead not only to control of HTN but also to its prevention.
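
    The estimation core (support vector regression of blood pressure on risk factors) can be sketched with scikit-learn; a plain grid search stands in for the paper's gravitational search algorithm, and the data are synthetic:

```python
# Sketch: SVR of systolic blood pressure on risk factors, with hyperparameter
# search replacing the GSA optimization purely for illustration.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 4))          # age, BMI, salt intake, activity (scaled)
sbp = 120 + 8 * X[:, 0] + 5 * X[:, 1] + 4 * X[:, 2] + rng.normal(0, 6, 300)

search = GridSearchCV(SVR(kernel="rbf"),
                      {"C": [1, 10, 100], "gamma": [0.01, 0.1, 1.0]},
                      cv=5, scoring="neg_mean_absolute_error")
search.fit(X, sbp)
print(search.best_params_, -search.best_score_)   # best SVR and its MAE (mmHg)
```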

  15. Comprehensive analysis of the related factors of early hypothyroidism occurring in patients with Graves' disease after 131I treatment

    International Nuclear Information System (INIS)

    Tan Jian; Wang Peng; Zhang Lijuan; He Yajing; Wang Renfei

    2005-01-01

    Objective: To make a comprehensive analysis of the related factors of early hypothyroidism occurring in patients with Graves' disease after ¹³¹I treatment. Methods: Information on ¹³¹I-treated Graves' disease was collected, including general data, clinical observations, laboratory data, thyroid function tests, etc. A retrospective statistical analysis was then carried out, using cluster analysis, factor analysis, discriminant analysis, multivariate regression analysis, etc. Results: 1) Cluster analysis and factor analysis showed that among clinical observations such as clinical course, treatment course, patient state and disease occurrence, the first three factors correlated highly; among laboratory data such as thyrotrophin receptor antibody (TRAb), thyroid-stimulating immunoglobulins (TSI), thyroglobulin antibody (TgAb) and thyroid microsomal antibody (TMAb), both the first two and the last two correlated highly, and each pair of factors had a similar effect. 2) Fisher discriminant analysis showed that among thyroid weight, effective half-life, maximum ¹³¹I uptake percentage, total dose of ¹³¹I and average dose of ¹³¹I per gram of thyroid, the last had the most predictive value for the incidence of early hypothyroidism. 3) Logistic regression analysis showed that among all the related factors of early hypothyroidism occurring after ¹³¹I treatment of Graves' disease, thyroid weight, average dose of ¹³¹I per gram of thyroid, maximum ¹³¹I uptake percentage and the level of TSI were effective factors. Conclusions: The occurrence of early hypothyroidism after ¹³¹I treatment of Graves' disease is probably affected by many factors. If more factors are taken into consideration before therapy and the therapeutic dose is adjusted accordingly, the incidence of early hypothyroidism can be reduced to a certain extent. (authors)

  16. Log Linear Models for Religious and Social Factors affecting the practice of Family Planning Methods in Lahore, Pakistan

    Directory of Open Access Journals (Sweden)

    Farooq Ahmad

    2006-01-01

    Full Text Available This is cross sectional study based on 304 households (couples with wives age less than 48 years, chosen from urban locality (city Lahore. Fourteen religious, demographic and socio-economic factors of categorical nature like husband education, wife education, husband’s monthly income, occupation of husband, household size, husband-wife discussion, number of living children, desire for more children, duration of marriage, present age of wife, age of wife at marriage, offering of prayers, political view, and religiously decisions were taken to understand acceptance of family planning. Multivariate log-linear analysis was applied to identify association pattern and interrelationship among factors. The logit model was applied to explore the relationship between predictor factors and dependent factor, and to explore which are the factors upon which acceptance of family planning is highly depending. Log-linear analysis demonstrate that preference of contraceptive use was found to be consistently associated with factors Husband-Wife discussion, Desire for more children, No. of children, Political view and Duration of married life. While Husband’s monthly income, Occupation of husband, Age of wife at marriage and Offering of prayers resulted in no statistical explanation of adoption of family planning methods.

  17. LISA data analysis using Markov chain Monte Carlo methods

    International Nuclear Information System (INIS)

    Cornish, Neil J.; Crowder, Jeff

    2005-01-01

    The Laser Interferometer Space Antenna (LISA) is expected to simultaneously detect many thousands of low-frequency gravitational wave signals. This presents a data analysis challenge that is very different to the one encountered in ground based gravitational wave astronomy. LISA data analysis requires the identification of individual signals from a data stream containing an unknown number of overlapping signals. Because of the signal overlaps, a global fit to all the signals has to be performed in order to avoid biasing the solution. However, performing such a global fit requires the exploration of an enormous parameter space with a dimension upwards of 50 000. Markov Chain Monte Carlo (MCMC) methods offer a very promising solution to the LISA data analysis problem. MCMC algorithms are able to efficiently explore large parameter spaces, simultaneously providing parameter estimates, error analysis, and even model selection. Here we present the first application of MCMC methods to simulated LISA data and demonstrate the great potential of the MCMC approach. Our implementation uses a generalized F-statistic to evaluate the likelihoods, and simulated annealing to speed convergence of the Markov chains. As a final step we supercool the chains to extract maximum likelihood estimates, and estimates of the Bayes factors for competing models. We find that the MCMC approach is able to correctly identify the number of signals present, extract the source parameters, and return error estimates consistent with Fisher information matrix predictions
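
    The MCMC machinery described here can be reduced to a toy version: Metropolis sampling of a single sinusoid's amplitude and frequency with a simulated-annealing temperature that cools to one. This replaces the paper's generalized F-statistic likelihood with a plain Gaussian one, and the chain is seeded near the mode for brevity:

```python
# Sketch: Metropolis MCMC with simulated annealing on a one-signal toy model.
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 256)
data = 1.5 * np.sin(2 * np.pi * 7.0 * t) + rng.normal(0, 0.5, t.size)

def log_like(theta):
    amp, freq = theta
    resid = data - amp * np.sin(2 * np.pi * freq * t)
    return -0.5 * (resid ** 2).sum() / 0.25       # known noise variance 0.5^2

theta = np.array([1.0, 6.8])                      # seeded near the mode
chain = []
for i in range(20000):
    T = max(1.0, 10.0 * (1 - i / 5000))           # annealing schedule -> 1
    prop = theta + rng.normal(0, [0.05, 0.05])    # random-walk proposal
    if np.log(rng.random()) < (log_like(prop) - log_like(theta)) / T:
        theta = prop
    chain.append(theta)
print(np.mean(chain[10000:], axis=0))             # ~ (1.5, 7.0)
```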

  18. Meta-analysis of the Alpha/Beta Ratio for Prostate Cancer in the Presence of an Overall Time Factor

    DEFF Research Database (Denmark)

    Vogelius, Ivan R; Bentzen, Søren M

    2013-01-01

    PURPOSE: To present a novel method for meta-analysis of the fractionation sensitivity of tumors as applied to prostate cancer in the presence of an overall time factor. METHODS AND MATERIALS: A systematic search for radiation dose-fractionation trials in prostate cancer was performed using PubMed...

  19. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio P [Richland, WA; Cowell, Andrew J [Kennewick, WA; Gregory, Michelle L [Richland, WA; Baddeley, Robert L [Richland, WA; Paulson, Patrick R [Pasco, WA; Tratz, Stephen C [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.

  20. Correlation Relationship of Performance Shaping Factors (PSFs) for Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bheka, M. Khumalo; Kim, Jonghyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-10-15

    At TMI-2, operators permitted thousands of gallons of water to escape from the reactor plant before realizing that the coolant pumps were behaving abnormally. The coolant pumps were then turned off, which in turn led to the destruction of the reactor itself as cooling was completely lost within the core. Humans also play a role in many aspects of complex systems, e.g. in the design and manufacture of hardware, at the interface between human and system, in maintaining such systems, and in coping with unusual events that place the NPP at risk. This is why human reliability analysis (HRA) - an aspect of risk assessment which systematically identifies and analyzes the causes and consequences of human decisions and actions - is important in nuclear power plant operations. A performance shaping factor (PSF) either upgrades or degrades human performance and therefore has an impact on the possibility of error. These PSFs can be used in various HRA methods to estimate human error probabilities (HEPs). Many current HRA methods propose sets of PSFs for the normal operation mode of an NPP. Some of the PSFs in these sets have some degree of dependency and overlap. Overlapping PSFs introduce error into HEP evaluations because some elements are counted more than once; this skews the relationships amongst PSFs and masks the way the elements interact to affect performance. This study uses a causal model that represents dependencies and relationships amongst PSFs for HEP evaluation during normal NPP operational states. The model is built taking into consideration the dependencies among PSFs, thus eliminating overlap. The use of an interdependent model of PSFs is expected to produce more accurate HEPs compared with other current methods. The PSF sets produced in this study can further be used as nodes (variables) and directed arcs (causal influences between nodes) in HEP evaluation methods such as Bayesian belief networks (BNs). This study was done to estimate the relationships
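
    The PSF-multiplier arithmetic referenced here can be sketched compactly using the SPAR-H adjustment formula; the nominal HEP and multiplier values below are illustrative assumptions, not values from this study.

    ```python
    # SPAR-H-style HEP adjustment: a nominal human error probability is scaled
    # by PSF multipliers, with the correction that keeps the result below 1
    # when several negative PSFs stack up. Numbers are illustrative only.
    import math

    def spar_h_hep(nhep, psf_multipliers):
        composite = math.prod(psf_multipliers)
        # SPAR-H adjustment, used when multiple PSFs degrade performance.
        return nhep * composite / (nhep * (composite - 1.0) + 1.0)

    nhep = 0.001              # nominal HEP for a diagnosis task (assumed)
    psfs = [10, 2, 5]         # e.g. time pressure, poor HMI, stress (assumed)
    print(f"adjusted HEP = {spar_h_hep(nhep, psfs):.4f}")
    ```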

  1. Investigation on method of elasto-plastic analysis for piping system (benchmark analysis)

    International Nuclear Information System (INIS)

    Kabaya, Takuro; Kojima, Nobuyuki; Arai, Masashi

    2015-01-01

    This paper provides a method of elasto-plastic analysis for practical seismic design of nuclear piping systems. JSME started a task to establish a method of elasto-plastic analysis for nuclear piping systems; benchmark analyses have been performed in the task to investigate such methods, and our company has participated in them. As a result, we have settled on a method which accurately simulates the results of piping excitation tests. The recommended method of elasto-plastic analysis is as follows: 1) The elasto-plastic analysis is composed of a dynamic analysis of the piping system modeled with beam elements and a static analysis of the deformed elbow modeled with shell elements. 2) A bi-linear curve is applied as the elasto-plastic property: the yield point is the standardized yield point multiplied by 1.2, the second gradient is 1/100 of Young's modulus, and kinematic hardening is used as the hardening rule. 3) The fatigue life is evaluated on strain ranges obtained by the elasto-plastic analysis, using the rainflow method and the fatigue curve of previous studies. (author)
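
    A minimal sketch of the bi-linear property in item 2), assuming monotonic loading and generic steel values; the kinematic-hardening hysteresis of the full analysis is not reproduced here.

    ```python
    # Bi-linear stress-strain idealization: elastic slope E up to 1.2 times
    # the standardized yield stress, then a hardening slope of E/100.
    # E and sy_standard are assumed generic values (MPa), not the paper's.
    def bilinear_stress(strain, E=200e3, sy_standard=250.0):
        sy = 1.2 * sy_standard          # yield point: 1.2 x standardized yield
        eps_y = sy / E                  # yield strain
        if abs(strain) <= eps_y:
            return E * strain           # elastic branch
        hardening = E / 100.0           # second gradient: 1/100 of E
        sign = 1.0 if strain > 0 else -1.0
        return sign * (sy + hardening * (abs(strain) - eps_y))

    for eps in (0.0005, 0.0015, 0.01):  # strains within and beyond yield
        print(eps, bilinear_stress(eps))
    ```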

  2. An analysis of factors that influence the technical efficiency of Malaysian thermal power plants

    International Nuclear Information System (INIS)

    See, Kok Fong; Coelli, Tim

    2012-01-01

    The main objectives of this paper are to measure the technical efficiency levels of Malaysian thermal power plants and to investigate the degree to which various factors influence efficiency levels in these plants. Stochastic frontier analysis (SFA) methods are applied to plant-level data over an eight-year period from 1998 to 2005. This is the first comprehensive analysis (to our knowledge) of technical efficiency in the Malaysian electricity generation industry using a parametric method. Our empirical results indicate that ownership, plant size and fuel type have a significant influence on technical efficiency levels. We find that publicly-owned power plants obtain average technical efficiencies of 0.68, which is lower than privately-owned power plants, which achieve average technical efficiencies of 0.88. We also observe that larger power plants with more capacity and gas-fired power plants tend to be more technically efficient than other power plants. Finally, we find that plant age and peaking plant type have no statistically significant influence on the technical efficiencies of Malaysian thermal power plants. - Highlights: ► We examine the technical efficiency (TE) levels of Malaysian thermal power plants. ► We also investigate the degree to which various factors influence efficiency levels in these plants. ► Stochastic frontier analysis methods are used. ► The average plant would have to increase its TE level by 21% to reach the efficient frontier. ► Ownership, plant size and fuel type have a significant influence on the TE levels.

  3. Fusion integral experiments and analysis and the determination of design safety factors - I: Methodology

    International Nuclear Information System (INIS)

    Youssef, M.Z.; Kumar, A.; Abdou, M.A.; Oyama, Y.; Maekawa, H.

    1995-01-01

    The role of neutronics experimentation and analysis in fusion neutronics research and development programs is discussed. A new methodology was developed to arrive at estimates of design safety factors based on the experimental and analytical results from design-oriented integral experiments. In this methodology, and for a particular nuclear response, R, a normalized density function (NDF) is constructed from the prediction uncertainties, and their associated standard deviations, as found in the various integral experiments where that response, R, is measured. Important statistical parameters are derived from the NDF, such as the global mean prediction uncertainty and the possible spread around it. The method of deriving safety factors from many possible NDFs based on various calculational and measuring methods (among other variants) is also described. Associated with each safety factor is a confidence level, which designers may choose, that the calculated response, R, will not exceed (or will not fall below) the actual measured value. An illustrative example is given of how to construct the NDFs. The methodology is applied in two areas, namely the line-integrated tritium production rate and bulk shielding integral experiments. Conditions under which these factors can be derived and the validity of the method are discussed. 72 refs., 17 figs., 4 tabs
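
    One hedged way to picture the NDF-based derivation: collect calculated-to-experimental (C/E) ratios, summarize bias and spread, and take the percentile matching a chosen confidence level. The C/E values below are invented for illustration, not experimental data.

    ```python
    # Empirical safety factor from a distribution of C/E prediction ratios.
    import numpy as np

    ce = np.array([0.95, 1.02, 0.88, 1.10, 0.97, 1.05, 0.92, 1.00,
                   0.85, 1.08, 0.99, 0.93])   # fabricated C/E ratios

    mean_bias = ce.mean()                     # global mean prediction bias
    spread = ce.std(ddof=1)                   # spread around the mean

    # Safety factor so that, at the chosen confidence, the measured response
    # does not exceed the calculated response times this factor.
    confidence = 0.90
    factor = np.quantile(1.0 / ce, confidence)
    print(f"mean C/E = {mean_bias:.3f}, sigma = {spread:.3f}")
    print(f"safety factor at {confidence:.0%} confidence = {factor:.3f}")
    ```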

  4. Seismic analysis of structures of nuclear power plants by Lanczos mode superposition method

    International Nuclear Information System (INIS)

    Coutinho, A.L.G.A.; Alves, J.L.D.; Landau, L.; Lima, E.C.P. de; Ebecken, N.F.F.

    1986-01-01

    The Lanczos mode superposition method is applied in the seismic analysis of nuclear power plants. The coordinate transformation matrix is generated by the Lanczos algorithm. It is shown that, through a convenient choice of the starting vector of the algorithm, modes with significant participation factors are automatically selected. A response spectrum analysis of a typical reactor building is performed. The results obtained are compared with those determined by the classical approach, stressing the remarkable computational effectiveness of the proposed methodology. (Author) [pt
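
    In modern terms the Lanczos extraction step looks like the sketch below, using scipy's eigsh (a Lanczos implementation) on a toy spring-mass chain; the model is an assumption for illustration, not the reactor-building model of the paper.

    ```python
    # Lowest vibration modes of a fixed-fixed spring-mass chain via the
    # Lanczos algorithm (scipy's eigsh) in shift-invert mode.
    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import eigsh

    n = 100                                  # degrees of freedom (assumed)
    k_s, m_l = 1e6, 10.0                     # spring stiffness, lumped mass
    K = diags([[-k_s] * (n - 1), [2 * k_s] * n, [-k_s] * (n - 1)],
              [-1, 0, 1], format="csc")      # stiffness matrix
    M = diags([[m_l] * n], [0], format="csc")  # mass matrix

    # Solve K phi = omega^2 M phi for the 5 lowest modes.
    vals, vecs = eigsh(K, k=5, M=M, sigma=0.0, which="LM")
    freqs_hz = np.sqrt(vals) / (2 * np.pi)
    print("first natural frequencies (Hz):", np.round(freqs_hz, 2))
    ```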

  5. Analysis of risk factors of pulmonary embolism in diabetic patients

    International Nuclear Information System (INIS)

    Xie Changhui; Ma Zhihai; Zhu Lin; Chi Lianxiang

    2012-01-01

    Objective: To study the related risk factors in diabetic patients with pulmonary embolism (PE). Methods: 58 diabetic cases underwent lower-limb 99m Tc-MAA vein imaging (and/or ultrasonography) and pulmonary perfusion imaging. The related laboratory data [fasting blood glucose (FBG), blood cholesterol, blood long chain triglycerides (LCT)] and clinical information [age, disease course, chest symptoms (chest pain and shortness of breath), lower-limb symptoms (swelling, varicose veins and diabetic foot) and acute complications (diabetic ketoacidosis and hyperosmolar non-ketotic diabetic coma)] were collected simultaneously. SPSS was used for the χ²-test and logistic regression analysis. Results: (1) 28 patients (48.3%) were shown to have lower-limb deep vein thrombosis (DVT), and by 99m Tc-MAA imaging 10 cases (17.2%) had PE. The PE ratio of the patients with DVT (32.1%) was higher than that of those without DVT (3.3%) (χ²=6.53, P<0.05). (2) Univariate analysis showed that some factors were significantly related to PE (χ²≥4.23, P<0.05), while others were not (χ²≤2.76, P>0.05), respectively. (3) Multivariate analysis indicated that the related risk factors for PE included chest symptoms (Score=13.316, P=0.000) and lower-limb symptoms (Score=7.780, P=0.005), with no significant difference for the other factors (Score≤2.494, P>0.114). Conclusion: Severe DM with chest symptoms, lower-limb symptoms and/or DVT must be controlled as early as possible by all kinds of treatment; this will decrease the PE complication rate. (authors)

  6. A Sociological Analysis on the Effective factors in Tends to Hijab

    Directory of Open Access Journals (Sweden)

    Mahmood Sharepour

    2012-12-01

    Full Text Available From a sociological perspective, hijab is formed in the context of social relations, a framework in which women's issues interact with cultural, social, political, economic and religious factors, as well as spiritual, personality and behavioral ones, and merit study from different paradigms and perspectives. The purpose of this study is to examine the social factors associated with the tendency of female students to wear the veil. The statistical population comprised the 13,000 female students studying at the university in 1390 (Iranian calendar); 560 of them were selected by a multi-stage cluster sampling method and responded to a questionnaire with reliability 0.74. Results of multivariable regression analysis showed that the most important variable influencing the tendency to veil was attitude toward feminism; other influential variables were status, lifestyle and location. The analytical model explained only 33% of the factors affecting the tendency to veil, which appeared as moderate to strong among the university students.

  7. Correlation Factor Analysis of Retinal Microvascular Changes in Patients With Essential Hypertension

    Institute of Scientific and Technical Information of China (English)

    Huang Duru; Huang Zhongning

    2006-01-01

    Objectives To investigate the correlation between retinal microvascular signs and essential hypertension classification. Methods The retinal microvascular signs in patients with essential hypertension were assessed with the indirect biomicroscopy lens; the direct and indirect ophthalmoscopes were used to determine hypertensive retinopathy grades and retinal arteriosclerosis grades. Rank correlation analysis was used to analyze the correlation of these grades with the risk factors concerned with hypertension. Results Of 72 cases with essential hypertension, 28 cases were complicated with coronary disease, 20 with diabetes, 41 with stroke, and 17 with renal malfunction. Retinal arteriosclerosis of varying extent was found in 71 cases; there were 1 case of retinal hemorrhage, 2 cases of retinal edema, 4 cases of retinal hard exudation, 5 cases of retinal hemorrhage complicated by hard exudation, 2 cases of retinal hemorrhage complicated by hard exudation and cotton wool spots, 1 case of retinal hemorrhage complicated by hard exudation and microaneurysms, 1 case of retinal edema and hard exudation, 1 case of retinal microaneurysms, and 1 case of branch retinal vein occlusion. The rank correlation analysis showed that both hypertensive retinopathy grades and retinal arteriosclerosis grades were correlated with the risk-factor stratification of hypertension (r=0.25 or 0.31, P<0.05); other correlated factors included age and blood high-density lipoprotein, whereas systolic or diastolic pressure, total cholesterol, triglyceride, low-density lipoprotein cholesterol, fasting blood glucose, blood urea nitrogen and blood creatinine were not confirmed in this correlation analysis (P > 0.05). Conclusions Both hypertensive retinopathy grade and retinal arteriosclerosis grade are closely related to the hypertension risk-factor stratification, suggesting that the fundus examination of patients with

  8. Characteristic Value Method of Well Test Analysis for Horizontal Gas Well

    Directory of Open Access Journals (Sweden)

    Xiao-Ping Li

    2014-01-01

    Full Text Available This paper presents a study of the characteristic value method of well test analysis for horizontal gas wells. Owing to the complicated seepage flow mechanism in horizontal gas wells and the difficulty in the analysis of transient pressure test data, this paper establishes mathematical models of well test analysis for horizontal gas wells with different inner and outer boundary conditions. On the basis of the solutions of the mathematical models, several type curves are plotted with the Stehfest inversion algorithm. For a gas reservoir with closed outer boundary in the vertical direction and infinite outer boundary in the horizontal direction, while considering the effects of wellbore storage and skin, the pseudopressure behavior of the horizontal gas well manifests four characteristic periods: pure wellbore storage period, early vertical radial flow period, early linear flow period, and late horizontal pseudoradial flow period. For a gas reservoir with closed outer boundary in both the vertical and horizontal directions, the pseudopressure behavior adds the pseudosteady state flow period, which appears after the boundary response. For a gas reservoir with closed outer boundary in the vertical direction and constant pressure outer boundary in the horizontal direction, the pseudopressure behavior adds the steady state flow period, which appears after the boundary response. According to the characteristic lines manifested by the pseudopressure derivative curve of each flow period, formulas are developed to obtain horizontal permeability, vertical permeability, skin factor, reservoir pressure, and pore volume of the gas reservoir; thus the characteristic value method of well test analysis for horizontal gas wells is established. Finally, an example study verifies that the new method is reliable. The characteristic value method of well test analysis for horizontal gas wells makes the well test analysis
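
    The Stehfest inversion mentioned here is a short, self-contained algorithm; the sketch below implements the standard Gaver-Stehfest weights and checks them on F(s) = 1/s², whose inverse is f(t) = t. It illustrates the numerical step only, not the paper's reservoir models.

    ```python
    # Gaver-Stehfest numerical inversion of a Laplace-space function F(s).
    import math

    def stehfest_weights(N=12):                      # N must be even
        V = []
        for i in range(1, N + 1):
            s = 0.0
            for k in range((i + 1) // 2, min(i, N // 2) + 1):
                s += (k ** (N // 2) * math.factorial(2 * k)
                      / (math.factorial(N // 2 - k) * math.factorial(k)
                         * math.factorial(k - 1) * math.factorial(i - k)
                         * math.factorial(2 * k - i)))
            V.append((-1) ** (N // 2 + i) * s)
        return V

    def stehfest_invert(F, t, N=12):
        ln2_t = math.log(2.0) / t
        V = stehfest_weights(N)
        return ln2_t * sum(V[i - 1] * F(i * ln2_t) for i in range(1, N + 1))

    # Check: inverse of 1/s^2 is t, so this should print approximately 3.0.
    print(stehfest_invert(lambda s: 1.0 / s ** 2, t=3.0))
    ```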

  9. Confirmatory Factor Analysis of IT-based Competency Questionnaire in Information Science & Knowledge Studies, Based on Job Market Analysis

    Directory of Open Access Journals (Sweden)

    Rahim Shahbazi

    2016-03-01

    Full Text Available The main purpose of the present research is to evaluate the validity of an IT-based competency questionnaire in Information Science & Knowledge Studies. The survey method was used, and the data collection tool was a researcher-made questionnaire. The statistical sample, 315 people, was chosen purposefully from among Iranian faculty members, Ph.D. students, and information center employees. The findings showed that, after eliminating 17 items from the questionnaire, performing confirmatory factor analysis on the rest, and rotating the findings using the varimax method, 8 factors were revealed. The resulting components, and the items which had high factor loadings on them, were considerably consistent with the classifications in the questionnaire and partly consistent with the findings of other researchers. 76 competency indicators (knowledge, skills, and attitudes) were validated and grouped under 8 main categories: 1. “Computer Basics”; 2. “Database Operating, Collection Development of Digital Resources, & Digital Library Management”; 3. “Basics of Computer Networking”; 4. “Basics of Programming & Database Designing”; 5. “Web Designing & Web Content Analysis”; 6. “Library Software & Computerized Organizing”; 7. “Archive of Digital Resources”; and 8. “Attitudes”.
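
    The extraction-plus-varimax step can be sketched with scikit-learn (rotation support assumes version 0.24 or later); the item responses below are random stand-ins, not the 315 questionnaires.

    ```python
    # Factor extraction with varimax rotation on fabricated item responses.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(3)
    n_respondents, n_items, n_factors = 315, 20, 8

    # Fabricate responses driven by a few latent competencies.
    latent = rng.normal(size=(n_respondents, n_factors))
    true_loadings = rng.normal(scale=0.8, size=(n_factors, n_items))
    items = latent @ true_loadings + rng.normal(scale=0.5,
                                                size=(n_respondents, n_items))

    fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(items)
    # Items with high absolute loading would be grouped under each factor.
    for j in range(n_factors):
        top = np.argsort(-np.abs(fa.components_[j]))[:3]
        print(f"factor {j + 1}: items {top + 1}")
    ```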

  10. Risk factors for breast cancer-related upper extremity lymphedema: a meta-analysis

    International Nuclear Information System (INIS)

    Xie Yuhuan; Guo Qi; Liu Fenghua; Zhu Yaqun; Tian Ye

    2014-01-01

    Objective: To systematically evaluate the risk factors for upper extremity lymphedema after breast cancer treatment and the strength of their associations. Methods: PubMed, Ovid, EMbase, and the Cochrane Library were searched to identify clinical trials published up to December 2012. The quality of included studies was assessed by the Newcastle-Ottawa Scale; data analysis was performed by Stata 10.0 and RevMan 5.2; the strength of associations between risk factors and breast cancer-related upper extremity lymphedema was described as odds ratios (OR) with 95% confidence intervals (CI). Results: Twenty-two studies involving 10106 patients were included in the meta-analysis. The risk factors for upper extremity lymphedema after breast cancer treatment mainly included axillary lymph node dissection (OR=2.72, 95% CI=1.06-6.99, P=0.038), hypertension (OR=1.84, 95% CI=1.38-2.44, P=0.000), body mass index (OR=1.68, 95% CI=1.22-2.32, P=0.001), and radiotherapy (OR=1.65, 95% CI=1.20-2.25, P=0.002), while no significant associations were found for such factors as chemotherapy, age, number of positive lymph nodes, and number of dissected lymph nodes. Conclusions: The incidence of upper extremity lymphedema is high among patients with breast cancer after treatment, and axillary lymph node dissection, hypertension, body mass index, and radiotherapy are the main risk factors for lymphedema after breast cancer treatment. (authors)
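
    The pooling behind ORs like these is typically a DerSimonian-Laird random-effects model; a sketch with fabricated per-study estimates follows (the four studies below are not the 22 included trials).

    ```python
    # DerSimonian-Laird random-effects pooling of study odds ratios.
    import numpy as np

    or_est = np.array([2.1, 3.5, 1.8, 2.9])       # fabricated study ORs
    ci_low = np.array([1.2, 1.6, 0.9, 1.5])
    ci_high = np.array([3.7, 7.7, 3.6, 5.6])

    y = np.log(or_est)                            # log odds ratios
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1.0 / se**2                               # fixed-effect weights

    # Between-study variance tau^2 from Cochran's Q.
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
    tau2 = max(0.0, (q - (len(y) - 1))
               / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_star = 1.0 / (se**2 + tau2)                 # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    print(f"pooled OR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(pooled - 1.96 * se_pooled):.2f}-"
          f"{np.exp(pooled + 1.96 * se_pooled):.2f})")
    ```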

  11. Risk factors for deep infection after total knee arthroplasty: a meta-analysis.

    Science.gov (United States)

    Chen, Jie; Cui, Yunying; Li, Xin; Miao, Xiangwan; Wen, Zhanpeng; Xue, Yan; Tian, Jing

    2013-05-01

    This meta-analysis estimated the risk factors for postoperative infection after total knee arthroplasty (TKA) in order to prevent its occurrence. Twelve cohort or case-control studies were collected, comprising 548 infected patients among 57,223 cases. Review Manager 5.0 was used to assess heterogeneity and to give an overall estimate of the association of factors with postoperative infection after TKA. The main factors distinctly associated with infection after TKA were BMI (BMI >30: OR = 2.53, 95 % CI 1.25, 5.13; BMI >40: OR = 4.00, 95 % CI 1.23, 12.98), diabetes mellitus (OR = 3.72, 95 % CI 2.30, 6.01), hypertension (OR = 2.53, 95 % CI 1.07, 5.99), steroid therapy (OR = 2.04, 95 % CI 1.11, 3.74), and rheumatoid arthritis (OR = 1.83; 95 % CI 1.42, 2.36). There was insufficient evidence that gender could lead to infection after TKA. Osteoarthritis appeared to have a moderately protective effect. Statistical analysis revealed no correlation between urinary tract infection, fixation method, ASA, bilateral operation, age, transfusion, antibiotics, or bone graft and infection. There was positive evidence for certain factors which could be targeted for prevention of the onset of infection, but more studies are needed to define the association of some other controversial factors, like osteoarthritis and gender.

  12. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it req...

  13. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    As RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA... For high-throughput quantitative assessment of transcription start sites across the transcriptome, 5’ end capture of RNA is combined with next-generation sequencing by two different methods. The methods presented here allow for functional investigation of coding as well as noncoding RNA and contribute to future...

  14. Sparse Probabilistic Parallel Factor Analysis for the Modeling of PET and Task-fMRI Data

    DEFF Research Database (Denmark)

    Beliveau, Vincent; Papoutsakis, Georgios; Hinrich, Jesper Løve

    2017-01-01

    Modern datasets are often multiway in nature and can contain patterns common to a mode of the data (e.g. space, time, and subjects). Multiway decompositions such as parallel factor analysis (PARAFAC) take into account the intrinsic structure of the data, and sparse versions of these methods improv...

  15. A method to adjust radiation dose-response relationships for clinical risk factors

    DEFF Research Database (Denmark)

    Appelt, Ane Lindegaard; Vogelius, Ivan R

    2012-01-01

    Several clinical risk factors for radiation induced toxicity have been identified in the literature. Here, we present a method to quantify the effect of clinical risk factors on radiation dose-response curves and apply the method to adjust the dose-response for radiation pneumonitis for patients...

  16. Sensitivity analysis of an environmental model: an application of different analysis methods

    International Nuclear Information System (INIS)

    Campolongo, Francesca; Saltelli, Andrea

    1997-01-01

    A parametric sensitivity analysis (SA) was conducted on a well-known model for the production of a key sulphur-bearing compound from algal biota. The model is of interest because of the climatic relevance of the gas (dimethylsulphide, DMS), an initiator of cloud particles. A screening test at low sample size (the Morris method) is applied first, followed by a computationally intensive variance-based measure. Standardised regression coefficients are also computed. The various SA measures are compared with each other, and the use of the bootstrap is suggested to extract empirical confidence bounds on the SA estimators. For some of the input factors, the investigators' guesses about parameter relevance were confirmed; for others, the results shed new light on the system mechanism and on the data parametrisation
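
    A Morris screening of the kind applied first in this record might look like the following, assuming the SALib package and a toy three-factor model in place of the DMS model; the API calls reflect SALib's documented interface but should be checked against the installed version.

    ```python
    # Morris elementary-effects screening with SALib on a toy model.
    from SALib.sample.morris import sample as morris_sample
    from SALib.analyze import morris

    problem = {
        "num_vars": 3,
        "names": ["x1", "x2", "x3"],
        "bounds": [[0, 1], [0, 1], [0, 1]],
    }

    def model(X):
        # Toy response: strongly driven by x1, weakly by x3.
        return 4.0 * X[:, 0] + 0.5 * X[:, 2] + 0.1 * X[:, 0] * X[:, 1]

    X = morris_sample(problem, N=100, num_levels=4)
    Y = model(X)
    res = morris.analyze(problem, X, Y, num_levels=4)
    for name, mu_star in zip(problem["names"], res["mu_star"]):
        print(f"{name}: mu* = {mu_star:.3f}")   # large mu* = influential
    ```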

  17. Q resolution calculation of small angle neutron scattering spectrometer and analysis of form factor

    International Nuclear Information System (INIS)

    Chen Liang; Peng Mei; Wang Yan; Sun Liangwei; Chen Bo

    2011-01-01

    Methods for calculating the Q resolution function of a small-angle neutron scattering (SANS) spectrometer and the associated Q standard deviation are introduced. The effects on the Q standard deviation of the spectrometer geometry layout and the neutron wavelength spread are analysed. The one-dimensional Gaussian Q resolution function is analysed. The form factor curve of an ideal solid sphere was convolved with two different sets of instrument arrangement parameters, and the corresponding smeared form factor curves were obtained. Using the Q resolution function in this way allows more accurate analysis of SANS data. (authors)
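
    The convolution step can be sketched directly: smear an ideal solid-sphere form factor with a Gaussian Q resolution kernel. A constant resolution width is assumed for simplicity, whereas a real instrument has a Q-dependent sigma; all parameter values are illustrative.

    ```python
    # Gaussian Q-resolution smearing of an ideal solid-sphere form factor.
    import numpy as np

    def sphere_form_factor(q, R=50.0):            # R in angstroms (assumed)
        qr = q * R
        return (3.0 * (np.sin(qr) - qr * np.cos(qr)) / qr**3) ** 2

    q = np.linspace(0.005, 0.2, 400)              # 1/angstrom
    ideal = sphere_form_factor(q)

    sigma_q = 0.004                               # assumed constant Q sigma
    smeared = np.empty_like(ideal)
    for i, qi in enumerate(q):
        kernel = np.exp(-0.5 * ((q - qi) / sigma_q) ** 2)
        kernel /= kernel.sum()
        smeared[i] = np.sum(kernel * ideal)       # discrete convolution at qi

    # Smearing fills in the sharp minima of the ideal curve.
    print("minimum depth ideal vs smeared:", ideal.min(), smeared.min())
    ```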

  18. MMOSA – A new approach of the human and organizational factor analysis in PSA

    International Nuclear Information System (INIS)

    Farcasiu, M.; Prisecaru, I.

    2014-01-01

    The results of many Probabilistic Safety Assessment (PSA) studies show a very significant contribution of human errors to the failure of nuclear installations. This paper analyzes both the importance of human performance in PSA studies and the elements that influence it. Starting from the Man–Machine–Organization System (MMOS) concept, a new approach (MMOSA) was developed to allow an explicit incorporation of the human and organizational factor in PSA studies. This method uses established techniques from Human Reliability Analysis (HRA) methods (THERP, SPAR-H) together with new techniques to analyze human performance. The main novelty included in MMOSA is the identification of the machine–organization interfaces (maintenance, modification and aging management plan and state of the man–machine interface) and the evaluation of human performance based on them. A detailed result of the Human Performance Analysis (HPA) using the MMOSA methodology can identify serious deficiencies of human performance, which can usually be corrected through improvement of the related MMOS interfaces. - Highlights: • MMOSA allows the incorporation of the human and organizational factor in PSA. • The method uses old techniques and new techniques to analyze human performance. • The main novelty is the identification of the machine–organization interfaces. • The MMOSA methodology identifies any serious deficiencies which can be corrected

  19. A human error taxonomy and its application to an automatic method of accident analysis

    International Nuclear Information System (INIS)

    Matthews, R.H.; Winter, P.W.

    1983-01-01

    Commentary is provided on the quantification aspects of human factors analysis in risk assessment. Methods for quantifying human error in a plant environment are discussed and their application to system quantification explored. Such a programme entails consideration of the data base and a taxonomy of factors contributing to human error. A multi-levelled approach to system quantification is proposed, each level being treated differently drawing on the advantages of different techniques within the fault/event tree framework. Management, as controller of organization, planning and procedure, is assigned a dominant role. (author)

  20. Applications of factor analysis to electron and ion beam surface techniques

    International Nuclear Information System (INIS)

    Solomon, J.S.

    1987-01-01

    Factor analysis, a mathematical technique for extracting chemical information from matrices of data, is used to enhance Auger electron spectroscopy (AES), core level electron energy loss spectroscopy (EELS), ion scattering spectroscopy (ISS), and secondary ion mass spectroscopy (SIMS) in studies of interfaces, thin films, and surfaces. Several examples of factor analysis enhancement of chemical bonding variations in thin films and at interfaces studied with AES and SIMS are presented. Factor analysis is also shown to be of great benefit in quantifying electron and ion beam doses required to induce surface damage. Finally, examples are presented of the use of factor analysis to reconstruct elemental profiles when peaks of interest overlap each other during the course of depth profile analysis. (author)

  1. Cytogenetic method of determining effect of threshold values of anthropogenic factors on the plant and animal genome

    International Nuclear Information System (INIS)

    Arkhipchuk, V.V.; Romanenko, V.D.; Arkhipchuk, M.V.; Kipnis, L.S.

    1993-01-01

    The use of nucleolar characteristics to assess the action of physical and chemical factors on living objects is a promising trend in the creation of new and highly sensitive biological tests. The advantage of this approach is that the effect of threshold values of anthropogenic factors is recorded as a change in the functional activity of the cell genome and not as a restructuring of the karyotype. The aim of this research was to test a cytogenetic method, based on analysis of quantitative characteristics of the nucleoli, for determining the modifying action of various factors on the plant and animal genome, and to extend its use to different groups of organisms

  2. Analysis Of Factors Causing Delays On Harun Nafsi - Hm Rifadin Street In Samarinda East Kalimantan Maintenance Project

    Directory of Open Access Journals (Sweden)

    Fadli

    2017-12-01

    Full Text Available This study aims to identify, analyze and describe the factors that affect maintenance project delay on Harun Nafsi - HM Rifadin Street in Samarinda, East Kalimantan. The research uses a quantitative method utilizing questionnaires. The 30 participating respondents consist of 14 project implementers and 16 field implementers. The data are analyzed by descriptive statistics, factor analysis and linear regression analysis. The results show that the factors influencing the delay of the maintenance project of Harun Nafsi - HM Rifadin Street include (1) time and workmanship factors, (2) human resources and natural factors, (3) geographical conditions, late approval of plan changes and labor strikes, and (4) non-optimal working levels and changes in the scope of the project while work is still ongoing. Based on multiple linear regression analysis, a coefficient of determination of 0.824 is obtained, meaning that the four factors studied explain 82.4% of project delays and the remaining 17.6% is influenced by other variables outside this study. The results of this study also indicate that the dominant factor in road maintenance project delays is the fourth of the factors mentioned. The effort the contractor needs to undertake is not to expand the employment contract if the project is underway or if the contractor does not have the capability to complete another project.

  3. An SPSS R-Menu for Ordinal Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mario Basto

    2012-01-01

    Full Text Available Exploratory factor analysis is a widely used statistical technique in the social sciences. It attempts to identify underlying factors that explain the pattern of correlations within a set of observed variables. A statistical software package is needed to perform the calculations. However, there are some limitations with popular statistical software packages, like SPSS. The R programming language is a free software package for statistical and graphical computing. It offers many packages written by contributors from all over the world and programming resources that allow it to overcome the dialog limitations of SPSS. This paper offers an SPSS dialog written in the R programming language with the help of some packages, so that researchers with little or no knowledge of programming, or those who are accustomed to making their calculations based on statistical dialogs, have more options when applying factor analysis to their data and hence can adopt a better approach when dealing with ordinal, Likert-type data.

  4. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    Science.gov (United States)

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to the different gas chromatography (GC) conditions and other steps involved in the method, as well as the soil properties. In addition, there are differences in the interpretation of the GC results, which impacts the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors and identify those most significantly impacting the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compare favourably with the experimental results, with a percentage difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting. The method was successfully applied for fast TPH analysis of Bunker C oil contaminated soil. A reduced analytical time would offer many benefits, including improved laboratory reporting times and overall improved clean up

  5. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  6. Analysis of risk factors and risk assessment for ischemic stroke recurrence

    Directory of Open Access Journals (Sweden)

    Xiu-ying LONG

    2016-08-01

    Full Text Available Objective To screen the risk factors for recurrence of ischemic stroke and to assess the risk of recurrence. Methods The Essen Stroke Risk Score (ESRS) was used to evaluate the risk of recurrence in 176 patients with ischemic stroke (96 cases of first onset and 80 cases of recurrence). Univariate and multivariate stepwise logistic regression analyses were used to screen risk factors for recurrence of ischemic stroke. Results There were significant differences between the first-onset group and the recurrence group in age, the proportion aged > 75 years, hypertension, diabetes, coronary heart disease, peripheral angiopathy, transient ischemic attack (TIA) or ischemic stroke, drinking, and ESRS score (P < 0.05 for all). The first-onset group included 1 case with ESRS 0 (1.04%), 8 cases with ESRS 1 (8.33%), 39 cases with ESRS 2 (40.63%), 44 cases with ESRS 3 (45.83%) and 4 cases with ESRS 4 (4.17%). The recurrence group included 2 cases with ESRS 3 (2.50%), 20 cases with ESRS 4 (25%), 37 cases with ESRS 5 (46.25%), 18 cases with ESRS 6 (22.50%) and 3 cases with ESRS 7 (3.75%). There was a significant difference between the 2 groups (Z = -11.376, P = 0.000). Logistic regression analysis showed that an ESRS > 3 was an independent risk factor for recurrence of ischemic stroke (OR = 31.324, 95%CI: 3.934-249.430; P = 0.001). Conclusions An ESRS > 3 is an independent risk factor for recurrence of ischemic stroke. It is important to strengthen risk assessment of recurrence of ischemic stroke; screening for and controlling risk factors is the key to secondary prevention of ischemic stroke. DOI: 10.3969/j.issn.1672-6731.2016.07.011

  7. Evaluation of ASME code flaw analysis procedure using the influence function method for application to PWR primary piping

    International Nuclear Information System (INIS)

    Hong, S.Y.; Yeater, M.L.

    1985-01-01

    This paper discusses stress intensity factor calculations and fatigue analysis for a PWR primary coolant piping system. The influence function method is applied to evaluate the ASME Code Section XI Appendix A "analysis of flaw indication" procedure for application to PWR primary piping. Results of the analysis are discussed in detail. (orig.)
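
    For orientation, the elementary calculation underlying any flaw evaluation is K = Y·σ·√(πa), followed by a Paris-law growth integration; the geometry factor and Paris constants below are generic textbook assumptions, not the paper's influence functions.

    ```python
    # Stress intensity factor and a crude fatigue crack-growth integration.
    import math

    def stress_intensity(sigma_mpa, a_m, Y=1.12):
        # K = Y * sigma * sqrt(pi * a); Y ~ 1.12 for a shallow surface flaw.
        return Y * sigma_mpa * math.sqrt(math.pi * a_m)

    def paris_growth(a0_m, delta_sigma, cycles, C=1e-11, m=3.0, Y=1.12):
        # Integrate da/dN = C * (delta K)^m with a per-cycle Euler step.
        # C, m are assumed generic values for K in MPa*sqrt(m), a in m.
        a = a0_m
        for _ in range(cycles):
            dk = stress_intensity(delta_sigma, a, Y)
            a += C * dk ** m
        return a

    a0 = 0.002                               # initial flaw depth, 2 mm
    print("K at 100 MPa:", stress_intensity(100.0, a0), "MPa*sqrt(m)")
    print("depth after 1e5 cycles:", paris_growth(a0, 100.0, 100000), "m")
    ```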

  8. Biological stability of drinking water: controlling factors, methods and challenges

    Directory of Open Access Journals (Sweden)

    Emmanuelle ePrest

    2016-02-01

    Full Text Available Biological stability of drinking water refers to the concept of providing consumers with drinking water of same microbial quality at the tap as produced at the water treatment facility. However, uncontrolled growth of bacteria can occur during distribution in water mains and premise plumbing, and can lead to hygienic (e.g. development of opportunistic pathogens), aesthetic (e.g. deterioration of taste, odour, colour) or operational (e.g. fouling or biocorrosion of pipes) problems. Drinking water contains diverse microorganisms competing for limited available nutrients for growth. Bacterial growth and interactions are regulated by factors such as (i) type and concentration of available organic and inorganic nutrients, (ii) type and concentration of residual disinfectant, (iii) presence of predators such as protozoa and invertebrates, (iv) environmental conditions such as water temperature, and (v) spatial location of microorganisms (bulk water, sediment or biofilm). Water treatment and distribution conditions in water mains and premise plumbing affect each of these factors and shape bacterial community characteristics (abundance, composition, viability) in distribution systems. Improved understanding of bacterial interactions in distribution systems and of environmental conditions impact is needed for better control of bacterial communities during drinking water production and distribution. This article reviews (i) existing knowledge on biological stability controlling factors and (ii) how these factors are affected by drinking water production and distribution conditions. In addition, (iii) the concept of biological stability is discussed in light of experience with well-established and new analytical methods, enabling high throughput analysis and in-depth characterization of bacterial communities in drinking water. We discuss how knowledge gained from novel techniques will improve design and monitoring of water treatment and distribution systems in order to

  9. Biological Stability of Drinking Water: Controlling Factors, Methods, and Challenges

    Science.gov (United States)

    Prest, Emmanuelle I.; Hammes, Frederik; van Loosdrecht, Mark C. M.; Vrouwenvelder, Johannes S.

    2016-01-01

    Biological stability of drinking water refers to the concept of providing consumers with drinking water of same microbial quality at the tap as produced at the water treatment facility. However, uncontrolled growth of bacteria can occur during distribution in water mains and premise plumbing, and can lead to hygienic (e.g., development of opportunistic pathogens), aesthetic (e.g., deterioration of taste, odor, color) or operational (e.g., fouling or biocorrosion of pipes) problems. Drinking water contains diverse microorganisms competing for limited available nutrients for growth. Bacterial growth and interactions are regulated by factors, such as (i) type and concentration of available organic and inorganic nutrients, (ii) type and concentration of residual disinfectant, (iii) presence of predators, such as protozoa and invertebrates, (iv) environmental conditions, such as water temperature, and (v) spatial location of microorganisms (bulk water, sediment, or biofilm). Water treatment and distribution conditions in water mains and premise plumbing affect each of these factors and shape bacterial community characteristics (abundance, composition, viability) in distribution systems. Improved understanding of bacterial interactions in distribution systems and of environmental conditions impact is needed for better control of bacterial communities during drinking water production and distribution. This article reviews (i) existing knowledge on biological stability controlling factors and (ii) how these factors are affected by drinking water production and distribution conditions. In addition, (iii) the concept of biological stability is discussed in light of experience with well-established and new analytical methods, enabling high throughput analysis and in-depth characterization of bacterial communities in drinking water. We discuss how knowledge gained from novel techniques will improve design and monitoring of water treatment and distribution systems in order

  10. Biological Stability of Drinking Water: Controlling Factors, Methods, and Challenges

    KAUST Repository

    Prest, Emmanuelle I.

    2016-02-01

    Biological stability of drinking water refers to the concept of providing consumers with drinking water of same microbial quality at the tap as produced at the water treatment facility. However, uncontrolled growth of bacteria can occur during distribution in water mains and premise plumbing, and can lead to hygienic (e.g., development of opportunistic pathogens), aesthetic (e.g., deterioration of taste, odor, color) or operational (e.g., fouling or biocorrosion of pipes) problems. Drinking water contains diverse microorganisms competing for limited available nutrients for growth. Bacterial growth and interactions are regulated by factors, such as (i) type and concentration of available organic and inorganic nutrients, (ii) type and concentration of residual disinfectant, (iii) presence of predators, such as protozoa and invertebrates, (iv) environmental conditions, such as water temperature, and (v) spatial location of microorganisms (bulk water, sediment, or biofilm). Water treatment and distribution conditions in water mains and premise plumbing affect each of these factors and shape bacterial community characteristics (abundance, composition, viability) in distribution systems. Improved understanding of bacterial interactions in distribution systems and of environmental conditions impact is needed for better control of bacterial communities during drinking water production and distribution. This article reviews (i) existing knowledge on biological stability controlling factors and (ii) how these factors are affected by drinking water production and distribution conditions. In addition, (iii) the concept of biological stability is discussed in light of experience with well-established and new analytical methods, enabling high throughput analysis and in-depth characterization of bacterial communities in drinking water. We discuss how knowledge gained from novel techniques will improve design and monitoring of water treatment and distribution systems in order

  11. ORION®: A simple and effective method for systemic analysis of clinical events and precursors occurring in hospital practice

    International Nuclear Information System (INIS)

    Debouck, F.; Petit, H.; Ravinet, L.; Rieger, E.; Noel, G.

    2012-01-01

    Purpose. - Morbi-mortality reviews are now recommended by the French Health Authority (Haute Autorite de sante [HAS]) in all hospital settings. They can be complemented by Comites de retour d'experience (CREX), which perform systemic analysis of event precursors that may potentially result in medical damage. Given their current practice, medical teams may not favour systemic analysis of events occurring in their setting; they require an easy-to-use method, more or less intuitive and easy to learn. This is why ORION® has been set up. Methods. - ORION® is based on experience acquired in aeronautics, the main precursor in risk management, since aircraft crashes are considered unacceptable even though the mortality from aircraft crashes is extremely low compared with the mortality from medical errors in hospital settings. The systemic analysis is divided into six steps: (i) collecting data, (ii) rebuilding the chronology of facts, (iii) identifying the gaps, (iv) identifying contributing and influential factors, (v) proposing actions to put in place, and (vi) writing the analysis report. When identifying contributing and influential factors, four kinds of factors favouring the event are considered: the technical domain, the working environment, organisation and procedures, and human factors. Although they are essential, human factors are not always considered correctly. The systemic analysis is done by a pilot, chosen among people trained in the method, gathering information from all categories of people acting in the setting. Results. - ORION® is now used in more than 400 French hospital settings for systemic analysis of either morbi-mortality cases or event precursors. It is used, in particular, in 145 radiotherapy centres for supporting CREX. Conclusion. - Being very simple to use and quasi-intuitive, ORION® is an asset in reaching the objectives defined by the HAS: to set up effective morbi-mortality reviews (RMM) and CREX for improving the quality of care in

  12. Cross-Cultural Validation of the Modified Practice Attitudes Scale: Initial Factor Analysis and a New Factor Model.

    Science.gov (United States)

    Park, Heehoon; Ebesutani, Chad K; Chung, Kyong-Mee; Stanick, Cameo

    2018-01-01

    The objective of this study was to create the Korean version of the Modified Practice Attitudes Scale (K-MPAS) to measure clinicians' attitudes toward evidence-based treatments (EBTs) in the Korean mental health system. Using 189 U.S. therapists and 283 members from the Korean mental health system, we examined the reliability and validity of the MPAS scores. We also conducted the first exploratory and confirmatory factor analysis on the MPAS and compared EBT attitudes across U.S. and Korean therapists. Results revealed that the inclusion of both "reversed-worded" and "non-reversed-worded" items introduced significant method effects that compromised the integrity of the one-factor MPAS model. Problems with the one-factor structure were resolved by eliminating the "non-reversed-worded" items. Reliability and validity were adequate among both Korean and U.S. therapists. Korean therapists also reported significantly more negative attitudes toward EBTs on the MPAS than U.S. therapists. The K-MPAS is the first questionnaire designed to measure Korean service providers' attitudes toward EBTs to help advance the dissemination of EBTs in Korea. The current study also demonstrated the negative impacts that can be introduced by incorporating oppositely worded items into a scale, particularly with respect to factor structure and detecting significant group differences.

  13. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis.

    Science.gov (United States)

    Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L

    2017-07-01

    To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. The conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses. © 2017 The Obesity Society.
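
    A tree-based stratification in this spirit can be sketched with scikit-learn's CART trees standing in for the conditional inference trees (R's ctree) actually used; the data and coefficients below are synthetic assumptions, not the study cohort.

    ```python
    # Risk stratification of functional limitation with a shallow decision
    # tree on fabricated BMI / age / comorbidity data.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(7)
    n = 1951
    bmi = rng.normal(27, 5, n)
    age = rng.uniform(68, 85, n)
    comorbid = rng.integers(0, 4, n)

    # Synthetic outcome: risk rises with BMI, age, and comorbidity count.
    risk = 1 / (1 + np.exp(-(-12 + 0.15 * bmi + 0.08 * age + 0.5 * comorbid)))
    y = rng.random(n) < risk

    X = np.column_stack([bmi, age, comorbid])
    tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=100).fit(X, y)
    # The printed splits define the risk groups, as in the record's four strata.
    print(export_text(tree, feature_names=["bmi", "age", "comorbidity"]))
    ```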

  14. Fault feature analysis of cracked gear based on LOD and analytical-FE method

    Science.gov (United States)

    Wu, Jiateng; Yang, Yu; Yang, Xingkai; Cheng, Junsheng

    2018-01-01

    At present, there are two main ideas for gear fault diagnosis. One is model-based gear dynamic analysis; the other is signal-based gear vibration diagnosis. In this paper, a method for fault feature analysis of gear cracks is presented, which combines the advantages of dynamic modeling and signal processing. Firstly, a new time-frequency analysis method called local oscillatory-characteristic decomposition (LOD) is proposed, which has the attractive feature of extracting fault characteristics efficiently and accurately. Secondly, an analytical-finite element (analytical-FE) method, called the assist-stress intensity factor (assist-SIF) gear contact model, is put forward to calculate the time-varying mesh stiffness (TVMS) under different crack states. Based on the dynamic model of the gear system with 6 degrees of freedom, the dynamic simulation response was obtained for different tooth crack depths. For the dynamic model, the corresponding relation between the characteristic parameters and the degree of the tooth crack is established under a specific condition. On the basis of the methods mentioned above, a novel gear tooth root crack diagnosis method which combines the LOD with the analytical-FE is proposed. Furthermore, empirical mode decomposition (EMD) and ensemble empirical mode decomposition (EEMD) are contrasted with the LOD using gear crack fault vibration signals. The analysis results indicate that the proposed method is effective and feasible for tooth crack stiffness calculation and gear tooth crack fault diagnosis.

  15. Risk factors for technical failure of endoscopic double self-expandable metallic stent placement by partial stent-in-stent method.

    Science.gov (United States)

    Kawakubo, Kazumichi; Kawakami, Hiroshi; Toyokawa, Yoshihide; Otani, Koichi; Kuwatani, Masaki; Abe, Yoko; Kawahata, Shuhei; Kubo, Kimitoshi; Kubota, Yoshimasa; Sakamoto, Naoya

    2015-01-01

    Endoscopic double self-expandable metallic stent (SEMS) placement by the partial stent-in-stent (PSIS) method has been reported to be useful for the management of unresectable hilar malignant biliary obstruction. However, it is technically challenging, and the optimal SEMS for the procedure remains unknown. The aim of this study was to identify the risk factors for technical failure of endoscopic double SEMS placement for unresectable malignant hilar biliary obstruction (MHBO). Between December 2009 and May 2013, 50 consecutive patients with MHBO underwent endoscopic double SEMS placement by the PSIS method. We retrospectively evaluated the rate of successful double SEMS placement and identified the risk factors for technical failure. The technical success rate for double SEMS placement was 82.0% (95% confidence interval [CI]: 69.2-90.2). On univariate analysis, the rate of technical failure was high in patients with metastatic disease and unilateral placement. Multivariate analysis revealed that metastatic disease was a significant risk factor for technical failure (odds ratio: 9.63, 95% CI: 1.11-105.5). The subgroup analysis after double guidewire insertion showed that the rate of technical success was higher in the laser-cut type SEMS with a large mesh and thick delivery system than in the braided type SEMS with a small mesh and thick delivery system. Metastatic disease was a significant risk factor for technical failure of double SEMS placement for unresectable MHBO. The laser-cut type SEMS with a large mesh and thin delivery system might be preferable for the PSIS procedure. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  16. Combining structural-thermal coupled field FE analysis and the Taguchi method to evaluate the relative contributions of multi-factors in a premolar adhesive MOD restoration.

    Science.gov (United States)

    Lin, Chun-Li; Chang, Yen-Hsiang; Lin, Yi-Feng

    2008-08-01

    The aim of this study was to determine the relative contributions of changes in restorative material, cavity dimensions, adhesive layer adaptation, and load conditions to the biomechanical response of an adhesive Class II MOD restoration during oral temperature changes. A validated finite-element (FE) model was used to perform the structural-thermal coupled field analyses, and the Taguchi method was employed to identify the significance of each design factor in controlling the stress. The results indicated that thermal expansion of the restorative material amplified the thermal effect and dominated the tooth stress value (69%) at high temperatures. The percentage contributions of the load conditions, cavity depth, and cement modulus to the tooth stress values increased to 46%, 32%, and 14%, respectively, when the tooth temperature returned to 37 degrees C. Load conditions were also the main factor influencing the resin cement stress values, irrespective of temperature changes. Increased stress values occurred with composite resin, lateral force, a deeper cavity, and a higher luting cement modulus. The combined use of FE analysis and the Taguchi method efficiently identified that a deeper cavity might increase the risk of restored tooth fracture; a ceramic inlay with lower thermal expansion, proper occlusal adjustment to reduce the lateral occlusal force, and a low-modulus luting material to obtain a better force-transmission mechanism are recommended.
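
    The percentage-contribution readout of the Taguchi method can be sketched from sums of squares over an orthogonal array; the small L4 design and response values below are fabricated for illustration and are not the study's FE results.

    ```python
    # Percentage contribution of each factor from a two-level orthogonal array.
    import numpy as np

    # L4 array: 3 two-level factors (names echo the record, values invented).
    design = np.array([[0, 0, 0],
                       [0, 1, 1],
                       [1, 0, 1],
                       [1, 1, 0]])
    response = np.array([12.0, 18.0, 25.0, 31.0])   # e.g. peak stress (MPa)

    grand = response.mean()
    ss_total = np.sum((response - grand) ** 2)
    for j, name in enumerate(["material", "cavity_depth", "load"]):
        ss = sum(
            np.sum(design[:, j] == lvl)
            * (response[design[:, j] == lvl].mean() - grand) ** 2
            for lvl in (0, 1)
        )
        print(f"{name}: contribution = {100 * ss / ss_total:.1f}%")
    ```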

  17. Radiotherapy for carcinoma of the vagina. Immunocytochemical and cytofluorometric analysis of prognostic factors

    Energy Technology Data Exchange (ETDEWEB)

    Blecharz, P. [Maria Sklodowska-Curie Memorial Institute, Krakow (Poland). Dept. of Gynecological Oncology; Reinfuss, M.; Jakubowicz, J. [Maria Sklodowska-Curie Memorial Institute, Krakow (Poland). Dept. of Radiation Oncology; Rys, J. [Maria Sklodowska-Curie Memorial Institute, Krakow (Poland). Dept. of Tumor Pathology Oncology; Skotnicki, P.; Wysocki, W. [Maria Sklodowska-Curie Memorial Institute, Krakow (Poland). Dept. of Oncological Surgery

    2013-05-15

    Background and purpose: The aim of this study was to assess the potential prognostic factors in patients with primary invasive vaginal carcinoma (PIVC) treated with radical irradiation. Patients and methods: The analysis was performed on 77 patients with PIVC treated between 1985 and 2005 in the Maria Sklodowska-Curie Memorial Institute of Oncology, Cancer Center in Krakow. A total of 36 patients (46.8%) survived 5 years with no evidence of disease (NED). The following groups of factors were assessed for potential prognostic value: population-based (age), clinical (Karnofsky Performance Score [KPS], hemoglobin level, primary location of the vaginal lesion, macroscopic type, length of the involved vaginal wall, FIGO stage), microscopic (microscopic type, grade, mitotic index, presence of atypical mitoses, lymphatic vessel invasion, lymphocyte/plasmocyte infiltration, focal necrosis, VAIN-3), immunohistochemical (protein p53 expression, MIB-1 index), and cytofluorometric (ploidy, DI index, S-phase fraction, SG2M proliferation index) factors. Results: Significantly better 5-year NED was observed in patients aged < 60 years, with KPS ≤ 80, FIGO stage I and II, grade G1-2, MIB-1 index < 70, S-phase fraction < 10, and proliferation index < 25. Independent factors for better prognosis in the multivariate Cox analysis were age < 60 years, FIGO stage I or II, and MIB-1 index < 70. Conclusion: Independent prognostic factors in the radically irradiated PIVC patients were as follows: age, FIGO stage, MIB-1 index. (orig.)

  18. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-07-21

    This analysis report, "Disruptive Event Biosphere Dose Conversion Factor Analysis", is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during a volcanic eruption (the eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository, with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere.

  20. Method of reliability allocation based on fault tree analysis and fuzzy math in nuclear power plants

    International Nuclear Information System (INIS)

    Chen Zhaobing; Deng Jian; Cao Xuewu

    2005-01-01

    Reliability allocation is a difficult multi-objective optimization problem. It can be applied not only to determine the reliability characteristics of reactor systems, subsystems, and main components, but also to improve the design, operation, and maintenance of nuclear plants. Fuzzy math, one of the powerful tools for fuzzy optimization, and fault tree analysis, one of the effective methods of reliability analysis, are applied in this paper to the reliability allocation model to address, respectively, the fuzzy character of some factors and the choice of subsystems. We thus develop a failure-rate allocation model on the basis of fault tree analysis and fuzzy math. For the reliability constraint factors, we choose the six most important ones according to the practical needs of the allocation. Selecting subsystems by top-level fault tree analysis avoids allocating reliability to all equipment and components, including unnecessary parts. During the allocation process, some factors can be calculated or measured quantitatively, while others can only be assessed qualitatively by expert rating. We therefore adopt fuzzy decision making and pairwise comparison to realize the reliability allocation with the help of fault tree analysis. Finally, the example of the emergency diesel generator's reliability allocation is used to illustrate the model and show that it is simple and applicable. (authors)
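
    A rough sketch of the allocation idea, under stated assumptions: expert ratings on the six constraint factors are combined into fuzzy composite weights, and the system failure-rate target is split in proportion to those weights. All ratings, weights, and the target rate below are hypothetical; the authors' exact fuzzy decision and pairwise-comparison scheme is not reproduced.

```python
import numpy as np

factors = ["complexity", "criticality", "maturity", "environment", "maintainability", "cost"]
# hypothetical expert ratings (0-1 fuzzy scores) for three subsystems picked by FTA
ratings = np.array([
    [0.8, 0.9, 0.4, 0.6, 0.5, 0.7],   # emergency diesel generator
    [0.5, 0.6, 0.7, 0.4, 0.6, 0.5],   # fuel supply subsystem
    [0.3, 0.4, 0.8, 0.3, 0.7, 0.4],   # control subsystem
])
factor_weights = np.array([0.25, 0.25, 0.15, 0.10, 0.15, 0.10])  # hypothetical

composite = ratings @ factor_weights        # fuzzy composite score per subsystem
alloc = composite / composite.sum()         # normalize to allocation fractions
lambda_system = 1.0e-4                      # assumed system failure-rate target (per hour)
for name, frac in zip(["EDG", "fuel", "control"], alloc):
    print(f"{name}: allocated failure rate = {frac * lambda_system:.2e} /h")
```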

  1. Contraceptive Method Choice Among Young Adults: Influence of Individual and Relationship Factors.

    Science.gov (United States)

    Harvey, S Marie; Oakley, Lisa P; Washburn, Isaac; Agnew, Christopher R

    2018-01-26

    Because decisions related to contraceptive behavior are often made by young adults in the context of specific relationships, the relational context likely influences use of contraceptives. Data presented here are from in-person structured interviews with 536 Black, Hispanic, and White young adults from East Los Angeles, California. We collected partner-specific relational and contraceptive data on all sexual partnerships for each individual, on four occasions, over one year. Using three-level multinomial logistic regression models, we examined individual and relationship factors predictive of contraceptive use. Results indicated that both individual and relationship factors predicted contraceptive use, but factors varied by method. Participants reporting greater perceived partner exclusivity and relationship commitment were more likely to use hormonal/long-acting methods only or a less effective method/no method versus condoms only. Those with greater participation in sexual decision making were more likely to use any method over a less effective method/no method and were more likely to use condoms only or dual methods versus a hormonal/long-acting method only. In addition, for women only, those who reported greater relationship commitment were more likely to use hormonal/long-acting methods or a less effective method/no method versus a dual method. In summary, interactive relationship qualities and dynamics (commitment and sexual decision making) significantly predicted contraceptive use.
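
    A simplified sketch of the modeling step with statsmodels' multinomial logit is shown below. The study fit three-level models that account for repeated measures within partnerships and individuals; this single-level sketch ignores that nesting, and the file and variable names are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("partnership_waves.csv")  # hypothetical long-format file
# method: 0 = condoms only, 1 = hormonal/long-acting, 2 = dual, 3 = less effective/none
X = sm.add_constant(df[["perceived_exclusivity", "commitment", "sexual_decision_making"]])
model = sm.MNLogit(df["method"], X)
result = model.fit(disp=False)
print(result.summary())  # log-odds of each method relative to the baseline category
```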

  2. A replication of a factor analysis of motivations for trapping

    Science.gov (United States)

    Schroeder, Susan; Fulton, David C.

    2015-01-01

    Using a 2013 sample of Minnesota trappers, we employed confirmatory factor analysis to replicate an exploratory factor analysis of trapping motivations conducted by Daigle, Muth, Zwick, and Glass (1998). We employed the same 25 items used by Daigle et al. and tested the same five-factor structure using a recent sample of Minnesota trappers. We also compared motivations in our sample to those reported by Daigle et al.
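
    A minimal confirmatory factor analysis sketch with the factor_analyzer package (not necessarily the authors' software) is shown below; the item-to-factor mapping is a hypothetical, abbreviated stand-in for the 25-item, five-factor structure.

```python
import pandas as pd
from factor_analyzer import ConfirmatoryFactorAnalyzer, ModelSpecificationParser

df = pd.read_csv("trapper_motivations.csv")  # hypothetical: 25 Likert-item columns
model_dict = {  # five hypothesized motivation factors, abbreviated item lists
    "nature": ["item1", "item2"], "lifestyle": ["item3", "item4"],
    "social": ["item5", "item6"], "income": ["item7", "item8"],
    "wildlife_mgmt": ["item9", "item10"],
}
spec = ModelSpecificationParser.parse_model_specification_from_dict(df, model_dict)
cfa = ConfirmatoryFactorAnalyzer(spec, disp=False)
cfa.fit(df.values)
print(cfa.loadings_)  # loadings for the hypothesized five-factor structure
```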

  3. Slepian modeling as a computational method in random vibration analysis of hysteretic structures

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Tarp-Johansen, Niels Jacob

    1999-01-01

    white noise. The computation time for obtaining estimates of relevant statistics on a given accuracy level is decreased by factors of one or more orders of magnitude as compared to the computation time needed for direct elasto-plastic displacement response simulations by vectorial Markov sequence techniques. ... Moreover the Slepian method gives valuable physical insight into the details of the plastic displacement development over time. The paper gives a general self-contained mathematical description of the Slepian method based plastic displacement analysis of Gaussian white noise excited EPOs. Experiences...

  4. Methods for seismic analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Gantenbein, F.

    1990-01-01

    The seismic analysis of a complex structure, such as a nuclear power plant, is done in various steps. An overview of the methods used in each of these steps is given in the following chapters: Seismic analysis of the buildings, taking into account structures with important mass or stiffness. The input to the building analysis, called the ground motion, is described by an accelerogram or a response spectrum; in this step, soil-structure interaction has to be taken into account, and various methods are available (impedance, finite element). The response of the structure can be calculated by the spectral method or by time-history analysis; advantages and limitations of each method are shown. Calculation of the floor response spectra, which are the input data for the equipment analysis; methods to calculate these spectra are described. Seismic analysis of the equipment, covering both monosupported and multisupported equipment. In addition, methods to analyse equipment that presents non-linearities associated with the boundary conditions, such as impacts and sliding, are presented. (author). 30 refs, 15 figs
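
    To make the response spectrum step concrete, the sketch below computes a pseudo-acceleration spectrum from an accelerogram by solving a damped single-degree-of-freedom oscillator for each period with scipy; the accelerogram file and the 5% damping value are assumptions.

```python
import numpy as np
from scipy.signal import lti, lsim

dt = 0.01
accel = np.loadtxt("accelerogram.txt")        # hypothetical ground acceleration (m/s^2)
t = np.arange(len(accel)) * dt
zeta = 0.05                                   # 5 % damping, a common assumption

periods = np.linspace(0.05, 3.0, 60)
spectrum = []
for T in periods:
    w = 2 * np.pi / T
    # relative displacement x of an SDOF oscillator: x'' + 2*zeta*w*x' + w^2*x = -a_g
    system = lti([-1.0], [1.0, 2 * zeta * w, w * w])
    _, x, _ = lsim(system, U=accel, T=t)
    spectrum.append(w * w * np.max(np.abs(x)))  # pseudo-acceleration Sa = w^2 * |x|max

for T, Sa in zip(periods[::10], spectrum[::10]):
    print(f"T = {T:4.2f} s  Sa = {Sa:6.2f} m/s^2")
```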

  5. Assessing the Credit Risk of Corporate Bonds Based on Factor Analysis and Logistic Regression Analysis Techniques: Evidence from New Energy Enterprises in China

    Directory of Open Access Journals (Sweden)

    Yuanxin Liu

    2018-05-01

    Full Text Available In recent years, new energy sources have ushered in tremendous opportunities for development. The financing difficulties of new energy enterprises (NEEs) can be eased by issuing corporate bonds. However, there are few scientific and reasonable methods to assess the credit risk of NEE bonds, which is not conducive to the healthy development of NEEs. Against this background, this paper analyzes the advantages and risks of NEEs issuing bonds and the main factors affecting the credit risk of NEE bonds, constructs a hybrid model for assessing the credit risk of NEE bonds based on factor analysis and logistic regression techniques, and verifies the applicability and effectiveness of the model employing relevant data from 46 Chinese NEEs. The results show that the main factors affecting the credit risk of NEE bonds are internal factors involving the company's profitability, solvency, operational ability, growth potential, asset structure and viability, and external factors including the macroeconomic environment and energy policy support. Based on the empirical results and the exact situation of China's NEE bonds, this article finally puts forward several targeted recommendations.
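
    A minimal sketch of the hybrid pipeline is shown below: factor analysis compresses the financial ratios into common-factor scores, and a logistic regression is fit on those scores. The CSV, column names, and the number of retained factors are assumptions, not the paper's exact setup.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("nee_financials.csv")          # hypothetical data for the 46 NEEs
X_raw = df.drop(columns="distressed")           # profitability, solvency, ... ratios
y = df["distressed"]                            # hypothetical binary risk label

fa = FactorAnalyzer(n_factors=4, rotation="varimax")
fa.fit(X_raw)
scores = fa.transform(X_raw)                    # common-factor scores replace raw ratios

clf = LogisticRegression()
print(cross_val_score(clf, scores, y, cv=5).mean())  # credit-risk classification accuracy
```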

  6. Working Posture Analysis Methods and the Effects of Working Posture on Musculoskeletal Disorders

    Directory of Open Access Journals (Sweden)

    Hatice Esen

    2013-01-01

    Full Text Available Musculoskeletal disorders (MSDs), which cause great health problems and consume substantial social resources, are common problems affecting the working population. MSDs, which top the list of occupational health problems in terms of prevalence, treatment expense, and negative effects on employee labor efficiency, quality of life, and physical and social function, largely result from poor working postures. Observation and analysis of working postures with scientific methods, together with the necessary corrections and workplace arrangements, contribute substantially to maintaining working performance and reducing MSDs. In this study, the risk factors that lead to the emergence of MSDs and the types and symptoms of these disorders are summarized, basic principles for preventing these disorders are presented, and the scientific methods used to determine risk factors are classified and presented.

  7. Choice of Postpartum Contraception: Factors Predisposing Pregnant Adolescents to Choose Less Effective Methods Over Long-Acting Reversible Contraception.

    Science.gov (United States)

    Chacko, Mariam R; Wiemann, Constance M; Buzi, Ruth S; Kozinetz, Claudia A; Peskin, Melissa; Smith, Peggy B

    2016-06-01

    The purposes were to determine contraceptive methods pregnant adolescents intend to use postpartum and to understand factors that predispose intention to use less effective birth control than long-acting reversible contraception (LARC). Participants were 247 pregnant minority adolescents in a prenatal program. Intention was assessed by asking "Which of the following methods of preventing pregnancy do you intend to use after you deliver?" Multinomial logistic regression analysis was used to determine factors associated with intent to use nonhormonal (NH) contraception (male/female condoms, abstinence, withdrawal and no method) or short-/medium-acting hormonal (SMH) contraception (birth control pill, patch, vaginal ring, injectable medroxyprogesterone acetate) compared with LARC (implant and intrauterine device) postpartum. Twenty-three percent intended to use LARC, 53% an SMH method, and 24% an NH method. Participants who intended to use NH or SMH contraceptive methods over LARC were significantly more likely to believe that LARC is not effective at preventing pregnancy, to report that they do not make decisions to help reach their goals and that partners are not important when making contraceptive decisions. Other important factors were having a mother who was aged >19 years at first birth and had not graduated from high school, not having experienced a prior pregnancy or talked with parents about birth control options, and the perception of having limited financial resources. Distinct profiles of factors associated with intending to use NH or SMH contraceptive methods over LARC postpartum were identified and may inform future interventions to promote the use of LARC to prevent repeat pregnancy. Copyright © 2015 The Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  8. Multivariate analysis method for energy calibration and improved mass assignment in recoil spectrometry

    International Nuclear Information System (INIS)

    El Bouanani, Mohamed; Hult, Mikael; Persson, Leif; Swietlicki, Erik; Andersson, Margaretha; Oestling, Mikael; Lundberg, Nils; Zaring, Carina; Cohen, D.D.; Dytlewski, Nick; Johnston, P.N.; Walker, S.R.; Bubb, I.F.; Whitlow, H.J.

    1994-01-01

    Heavy ion recoil spectrometry is rapidly becoming a well established analysis method, but the associated data processing is still not well developed. The pronounced nonlinear response of silicon detectors to heavy ions leads to serious limitations and complications in mass gating, which is the principal factor in obtaining energy spectra with minimal cross-talk between elements. To overcome this limitation, a simple empirical formula with an associated multiple regression method is proposed for the absolute energy calibration of the time-of-flight/energy dispersive detector telescope used in recoil spectrometry. A radical improvement in mass assignment was realized, which allows more accurate depth profiling with the important feature of making the data processing much easier. (orig.)
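
    The paper's empirical calibration formula is not reproduced in the record; as a hedged illustration of the multiple-regression step, the sketch below fits pulse-height and recoil-mass terms to true energy with ordinary least squares. The calibration points and the form of the design matrix are purely illustrative.

```python
import numpy as np

# hypothetical calibration points: (pulse-height channel, recoil mass, true energy MeV)
ch = np.array([120., 210., 305., 150., 260., 390.])
mass = np.array([12., 12., 12., 28., 28., 28.])
E = np.array([2.1, 3.6, 5.2, 2.4, 4.1, 6.0])

# design matrix with channel, mass, and a channel*mass cross term (illustrative choice)
A = np.column_stack([np.ones_like(ch), ch, mass, ch * mass])
coef, *_ = np.linalg.lstsq(A, E, rcond=None)
print("fit coefficients:", coef)
print("residuals (MeV):", E - A @ coef)
```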

  9. Multiway analysis methods applied to the fluorescence excitation-emission dataset for the simultaneous quantification of valsartan and amlodipine in tablets

    Science.gov (United States)

    Dinç, Erdal; Ertekin, Zehra Ceren; Büker, Eda

    2017-09-01

    In this study, excitation-emission matrix datasets, which have strongly overlapping bands, were processed using four different chemometric calibration algorithms, namely parallel factor analysis, Tucker3, three-way partial least squares, and unfolded partial least squares, for the simultaneous quantitative estimation of valsartan and amlodipine besylate in tablets. No preliminary separation step was used before applying the parallel factor analysis, Tucker3, three-way partial least squares, and unfolded partial least squares approaches to the analysis of the related drug substances in samples. A three-way excitation-emission matrix data array was obtained by concatenating the excitation-emission matrices of the calibration set, validation set, and commercial tablet samples. This data array was used to build the parallel factor analysis, Tucker3, three-way partial least squares, and unfolded partial least squares calibrations and to predict the amounts of valsartan and amlodipine besylate in the samples. For all the methods, calibration and prediction of valsartan and amlodipine besylate were performed in the working concentration range of 0.25-4.50 μg/mL. The validity and performance of all the proposed methods were checked using validation parameters. From the analysis results, it was concluded that the described two-way and three-way algorithmic methods are very useful for the simultaneous quantitative resolution and routine analysis of the related drug substances in marketed samples.
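
    Of the four algorithms, PARAFAC has a compact open-source implementation in tensorly; the sketch below shows the decomposition step on a simulated excitation-emission data array. The array shape and the rank are assumptions (rank 2 mirroring the two analytes).

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# hypothetical EEM data array: samples x excitation x emission
eem = np.random.rand(30, 40, 60)               # stand-in for measured spectra
weights, factors = parafac(tl.tensor(eem), rank=2, normalize_factors=True)

sample_scores, excitation, emission = factors
# with rank = 2 (valsartan + amlodipine), sample_scores columns are proportional
# to analyte concentrations and can be regressed against the calibration set
print(sample_scores.shape, excitation.shape, emission.shape)
```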

  10. Influence factors and corrections of low-energy γ-ray penetration in ash analysis

    International Nuclear Information System (INIS)

    Cheng Bo; Tuo Xianguo; Zhou Jianbin; Tong Yunfu

    2002-01-01

    The author introduces a coal ash analyzer system. The system uses the low-energy γ-ray source 241Am, which emits photons at two energies (26.4 keV and 59.6 keV), to analyze the ash content of coal by transmission. The author also describes the factors that influence the accuracy of the ash analysis, such as coal particle size, environmental temperature, the major elements in the coal, and its water content. For each of these factors, correction methods are offered, such as automatic energy-peak holding and automatic compensation. The author also mentions other factors influencing measurement accuracy that must be considered during the experiment. All of this aims at eliminating, through experiment, the factors that degrade measurement accuracy
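
    As a hedged illustration of the dual-energy transmission principle (not the analyzer's actual algorithm), the sketch below solves the 2x2 system linking log-transmission at the two 241Am energies to ash and organic mass thicknesses; all attenuation coefficients and count rates are hypothetical.

```python
import numpy as np

# mass attenuation coefficients (cm^2/g) at 26.4 and 59.6 keV -- hypothetical values
mu = np.array([[0.90, 0.45],    # 26.4 keV: [ash, organic]
               [0.30, 0.20]])   # 59.6 keV: [ash, organic]

I0 = np.array([1.00e5, 2.000e5])  # unattenuated count rates per energy window
I = np.array([2.59e4, 1.154e5])   # measured count rates through the coal sample

# ln(I0/I) at each energy is a linear mix of the two mass thicknesses
m = np.linalg.solve(mu, np.log(I0 / I))      # [ash, organic] mass thickness in g/cm^2
ash_percent = 100 * m[0] / m.sum()
print(f"ash content = {ash_percent:.1f} %")
```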

  11. Decision making model design for antivirus software selection using Factor Analysis and Analytical Hierarchy Process

    OpenAIRE

    Nurhayati Ai; Gautama Aditya; Naseer Muchammad

    2018-01-01

    Virus spread increased significantly through the internet in 2017. One protection method is the use of antivirus software. The wide variety of antivirus software on the market tends to create confusion among consumers, making it difficult to select the right antivirus for their needs. This is the reason we conducted our research. We formulate a decision-making model for antivirus software consumers. The model is constructed using factor analysis and the AHP method. First we spread que...

  12. Factors affecting the HIV/AIDS epidemic: An ecological analysis of ...

    African Journals Online (AJOL)

    Factors affecting the HIV/AIDS epidemic: An ecological analysis of global data. ... Backward multiple linear regression analysis identified the proportion of Muslims, physician density, and adolescent fertility rate as the three most prominent factors linked with the national HIV epidemic. Conclusions: The findings support ...

  13. Analysis of Economic Factors Affecting Stock Market

    OpenAIRE

    Xie, Linyin

    2010-01-01

    This dissertation concentrates on the analysis of economic factors affecting the Chinese stock market by examining the relationship between the stock market index and economic factors. Six economic variables are examined: industrial production, money supply M1, money supply M2, exchange rate, long-term government bond yield, and real estate total value. The stock market comprises fixed-interest stocks and equity shares; in this dissertation, the stock market is restricted to the equity market. The stock price in thi...

  14. Multi-factor analysis on events related to hematological toxicity in 153Sm-EDTMP palliative therapy for skeletal metastases

    International Nuclear Information System (INIS)

    Zhan Hongwei; Yu Xiaoling; Ye Xiaojuan; Bao Chengkan; Sun Da; He Gangqiang

    2006-01-01

    Objective: To investigate the clinical factors related to hematological toxicity induced by intravenous samarium-153 ethylenediaminetetramethylene phosphonic acid (153Sm-EDTMP) treatment. Methods: A total of 206 patients with bone metastases treated with 153Sm-EDTMP were retrospectively analyzed. Logistic regression (SPSS 10.0 for Windows) and correlation analysis were used to evaluate the factors concerned. Results: Patient age, number of bone metastatic lesions, chemotherapy before 153Sm-EDTMP therapy, concurrent radiotherapy, and the number of repeated 153Sm-EDTMP treatments were found to be individual factors related to hematological toxicity. Chemotherapy before 153Sm-EDTMP, concurrent radiotherapy, medication to normalize blood counts, and the number of repeated 153Sm-EDTMP treatments were the hematological toxicity factors in the multi-factor analysis. Conclusion: In 153Sm-EDTMP therapy, several factors were found to be related to hematological toxicity, suggesting that more attention be paid to changes in blood cell counts after the palliative therapy. (authors)

  15. Exploratory factor analysis of the 12-item Functional Assessment of Chronic Illness Therapy-Spiritual Well-Being Scale in people newly diagnosed with advanced cancer.

    Science.gov (United States)

    Bai, Mei; Dixon, Jane K

    2014-01-01

    The purpose of this study was to reexamine the factor pattern of the 12-item Functional Assessment of Chronic Illness Therapy-Spiritual Well-Being Scale (FACIT-Sp-12) using exploratory factor analysis in people newly diagnosed with advanced cancer. Principal components analysis (PCA) and 3 common factor analysis methods were used to explore the factor pattern of the FACIT-Sp-12. Factorial validity was assessed in association with quality of life (QOL). Principal factor analysis (PFA), iterative PFA, and maximum likelihood suggested retrieving 3 factors: Peace, Meaning, and Faith. Both Peace and Meaning positively related to QOL, whereas only Peace uniquely contributed to QOL. This study supported the 3-factor model of the FACIT-Sp-12. Suggestions for revision of items and further validation of the identified factor pattern were provided.
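
    A short sketch of the common-factor step with the factor_analyzer package is shown below; the CSV of item responses, the oblique rotation, and the factor labels follow the record's description, but the details are assumptions.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = pd.read_csv("facit_sp12_items.csv")    # hypothetical: 12 item columns

# maximum-likelihood extraction with an oblique rotation, retaining 3 factors
fa = FactorAnalyzer(n_factors=3, method="ml", rotation="promax")
fa.fit(items)
loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=["Peace", "Meaning", "Faith"])
print(loadings.round(2))
```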

  16. Dimensionality Reduction Methods: Comparative Analysis of methods PCA, PPCA and KPCA

    Directory of Open Access Journals (Sweden)

    Jorge Arroyo-Hernández

    2016-01-01

    Full Text Available Dimensionality reduction methods are algorithms that map a data set into subspaces of fewer dimensions derived from the original space, allowing a description of the data at a lower cost. Because of their importance, they are widely used in machine learning processes. This article presents a comparative analysis of the PCA, PPCA and KPCA dimensionality reduction methods. A reconstruction experiment on worm-shape data was performed using structures of landmarks located on the body contour, with each method retaining different numbers of principal components. The results showed that all the methods can be seen as alternative processes. Nevertheless, thanks to its potential for analysis in the feature space and the presented method for computing its preimage, KPCA offers a better method for recognition processes and pattern extraction
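
    The comparison can be reproduced in outline with scikit-learn, which provides both linear PCA and kernel PCA with a learned preimage map; in the sketch below the landmark matrix is simulated rather than the study's worm-shape data.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

X = np.random.rand(100, 40)                    # stand-in: 20 (x, y) contour landmarks

pca = PCA(n_components=5).fit(X)
X_pca = pca.inverse_transform(pca.transform(X))

kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1,
                 fit_inverse_transform=True)   # learns a preimage map
X_kpca = kpca.inverse_transform(kpca.fit_transform(X))

for name, Xr in [("PCA", X_pca), ("KPCA", X_kpca)]:
    print(name, "reconstruction error:", np.mean((X - Xr) ** 2))
```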

  17. Modification and analysis of engineering hot spot factor of HFETR

    International Nuclear Information System (INIS)

    Hu Yuechun; Deng Caiyu; Li Haitao; Xu Taozhong; Mo Zhengyu

    2014-01-01

    This paper presents the modification and analysis of the engineering hot spot factors of HFETR. The new factors are applied in the fuel temperature analysis and in estimating the safety allowable operating power of HFETR. The results show that the maximum cladding temperature of the fuel is lower when the new factors are used, and the safety allowable operating power of HFETR is higher, thus improving the economic efficiency of HFETR. (authors)

  18. Bayesian switching factor analysis for estimating time-varying functional connectivity in fMRI.

    Science.gov (United States)

    Taghia, Jalil; Ryali, Srikanth; Chen, Tianwen; Supekar, Kaustubh; Cai, Weidong; Menon, Vinod

    2017-07-15

    There is growing interest in understanding the dynamical properties of functional interactions between distributed brain regions. However, robust estimation of temporal dynamics from functional magnetic resonance imaging (fMRI) data remains challenging due to limitations in extant multivariate methods for modeling time-varying functional interactions between multiple brain areas. Here, we develop a Bayesian generative model for fMRI time-series within the framework of hidden Markov models (HMMs). The model is a dynamic variant of the static factor analysis model (Ghahramani and Beal, 2000). We refer to this model as Bayesian switching factor analysis (BSFA) as it integrates factor analysis into a generative HMM in a unified Bayesian framework. In BSFA, brain dynamic functional networks are represented by latent states that are learnt from the data. Crucially, BSFA is a generative model which estimates the temporal evolution of brain states and transition probabilities between states as a function of time. An attractive feature of BSFA is the automatic determination of the number of latent states via Bayesian model selection arising from penalization of excessively complex models. Key features of BSFA are validated using extensive simulations on carefully designed synthetic data. We further validate BSFA using fingerprint analysis of multisession resting-state fMRI data from the Human Connectome Project (HCP). Our results show that modeling temporal dependencies in the generative model of BSFA results in improved fingerprinting of individual participants. Finally, we apply BSFA to elucidate the dynamic functional organization of the salience, central-executive, and default mode networks, three core neurocognitive systems with a central role in cognitive and affective information processing (Menon, 2011). Across two HCP sessions, we demonstrate a high level of dynamic interactions between these networks and determine that the salience network has the highest temporal
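
    BSFA itself is a custom Bayesian model with no stock implementation; as a simplified stand-in, the sketch below fits a plain Gaussian HMM (hmmlearn) to ROI time series to recover latent states and transition probabilities, without BSFA's factor-analytic observation model or automatic selection of the number of states.

```python
import numpy as np
from hmmlearn import hmm

ts = np.random.randn(1200, 15)                 # stand-in: TRs x ROI time series

model = hmm.GaussianHMM(n_components=4, covariance_type="full", n_iter=100)
model.fit(ts)
states = model.predict(ts)                     # most likely state sequence over time
print("transition matrix:\n", model.transmat_.round(2))
print("fraction of time per state:", np.bincount(states) / len(states))
```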

  19. Ranking insurance firms using AHP and Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Khodaei Valahzaghard

    2013-03-01

    Full Text Available The insurance industry forms a significant part of the economy, and it is important to learn more about the capabilities of the different firms active in this industry. In this paper, we present an empirical study to rank insurance firms using the analytic hierarchy process as well as factor analysis. The study considers four criteria: capital adequacy, quality of earnings, quality of cash flow, and quality of firms' assets. The results of the implementation of factor analysis (FA) were verified using the Kaiser-Meyer-Olkin (KMO = 0.573) and Bartlett's chi-square (443.267, p-value = 0.000) tests. According to the results of FA, the first and most important factor, capital adequacy, represents 21.557% of total variance; the second factor, quality of income, represents 20.958% of total variance. In addition, the third factor, quality of cash flow, represents 19.417% of total variance, and the last factor, quality of assets, represents 18.641% of total variance. The study also used the analytic hierarchy process (AHP) to rank the insurance firms. The results of our survey indicate that capital adequacy (0.559) is the most important factor, followed by quality of income (0.235), quality of cash flow (0.144) and quality of assets (0.061). The results of AHP are consistent with the results of FA, which somewhat validates the overall study.
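
    A minimal AHP sketch is shown below: criteria weights are derived from a pairwise comparison matrix via its principal eigenvector, and consistency is checked with Saaty's ratio. The comparison matrix is hypothetical, chosen so that its weights land near those reported above.

```python
import numpy as np

# pairwise comparisons: capital adequacy, income, cash flow, assets (hypothetical)
A = np.array([[1.0, 3.0, 4.0, 7.0],
              [1/3, 1.0, 2.0, 4.0],
              [1/4, 1/2, 1.0, 3.0],
              [1/7, 1/4, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # priority vector
print("weights:", weights.round(3))

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
cr = ci / 0.90                                 # random index RI = 0.90 for n = 4 (Saaty)
print(f"consistency ratio = {cr:.3f} (acceptable if < 0.10)")
```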

  20. [A new method of calibration and positioning in quantitative analysis of multicomponents by single marker].

    Science.gov (United States)

    He, Bing; Yang, Shi-Yan; Zhang, Yan

    2012-12-01

    This paper aims to establish a new method of calibration and positioning in the quantitative analysis of multiple components by a single marker (QAMS), using Shuanghuanglian oral liquid as the research object. Relative correction factors were established, with chlorogenic acid as the reference, for 11 other active components (neochlorogenic acid, cryptochlorogenic acid, caffeic acid, forsythoside A, scutellarin, isochlorogenic acid B, isochlorogenic acid A, isochlorogenic acid C, baicalin, phillyrin and wogonoside) in Shuanghuanglian oral liquid by 3 correction methods (multipoint correction, slope correction and quantitative factor correction). At the same time, chromatographic peaks were positioned by the linear regression method. Only one standard was used to determine the content of 12 components in Shuanghuanglian oral liquid, instead of needing many reference substances for quality control. The results showed that, within the linear ranges, no significant differences were found between the quantitative results for the 12 active constituents in 3 batches of Shuanghuanglian oral liquid determined by the 3 correction methods and by the external standard method (ESM) or standard curve method (SCM). This method is also simpler and quicker than literature methods, and the results were accurate, reliable, and reproducible. Positioning chromatographic peaks by the linear regression method was more accurate than using the relative retention times reported in the literature. The slope and quantitative factor corrections are feasible and accurate for controlling the quality of traditional Chinese medicine.
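
    As a hedged numerical illustration of the QAMS idea (not the paper's validated procedure), the sketch below uses one reference standard and per-component relative correction factors to quantify other components; all peak areas, concentrations, and factor values are hypothetical.

```python
# QAMS in outline: the relative correction factor f_i = k_ref / k_i links each
# analyte's detector response k_i to the single reference standard, so one
# calibration curve quantifies all components.

# calibration: reference standard (chlorogenic acid) at a known concentration
area_ref, conc_ref = 5.20e5, 40.0               # peak area, ug/mL (hypothetical)
k_ref = area_ref / conc_ref                     # response factor of the reference

# relative correction factors, measured once from mixed standards (hypothetical)
f = {"baicalin": 0.82, "forsythoside_A": 1.15}

# routine analysis: quantify other components from their areas + the reference curve
sample_areas = {"baicalin": 7.8e5, "forsythoside_A": 2.9e5}
for name, area in sample_areas.items():
    conc = f[name] * area / k_ref               # c_i = f_i * A_i / k_ref
    print(f"{name}: {conc:.1f} ug/mL")
```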