WorldWideScience

Sample records for factor analysis techniques

  1. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, questions remain about how these methods perform in within-subjects P-technique factor analysis. A…
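
    The abstract cuts off before naming the criteria, but one widely used rule in the R-technique setting it references is Horn's parallel analysis: retain factors whose correlation-matrix eigenvalues exceed those of comparable random data. A minimal sketch of that rule, assuming numpy and synthetic data (nothing here comes from the study itself):

    ```python
    import numpy as np

    def parallel_analysis(X, n_sims=100, seed=0):
        """Horn's parallel analysis: count factors whose correlation-matrix
        eigenvalues exceed the mean eigenvalues of same-shaped random data."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        obs_eig = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
        rand_eig = np.zeros(p)
        for _ in range(n_sims):
            R = np.corrcoef(rng.standard_normal((n, p)), rowvar=False)
            rand_eig += np.linalg.eigvalsh(R)[::-1]
        rand_eig /= n_sims
        return int(np.sum(obs_eig > rand_eig))

    # Two correlated blocks of three variables each -> expect two factors
    rng = np.random.default_rng(1)
    f = rng.standard_normal((500, 2))
    X = np.hstack([f[:, [0]] + 0.5 * rng.standard_normal((500, 3)),
                   f[:, [1]] + 0.5 * rng.standard_normal((500, 3))])
    print(parallel_analysis(X))  # typically prints 2
    ```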

  2. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  3. Applications of factor analysis to electron and ion beam surface techniques

    International Nuclear Information System (INIS)

    Solomon, J.S.

    1987-01-01

    Factor analysis, a mathematical technique for extracting chemical information from matrices of data, is used to enhance Auger electron spectroscopy (AES), core level electron energy loss spectroscopy (EELS), ion scattering spectroscopy (ISS), and secondary ion mass spectroscopy (SIMS) in studies of interfaces, thin films, and surfaces. Several examples of factor analysis enhancement of chemical bonding variations in thin films and at interfaces studied with AES and SIMS are presented. Factor analysis is also shown to be of great benefit in quantifying electron and ion beam doses required to induce surface damage. Finally, examples are presented of the use of factor analysis to reconstruct elemental profiles when peaks of interest overlap each other during the course of depth profile analysis. (author)

  4. The development of human factors technologies -The development of human behaviour analysis techniques-

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Heui; Park, Keun Ok; Chun, Se Woo; Suh, Sang Moon; Park, Jae Chang [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    In order to contribute to human error reduction through studies on human-machine interaction in nuclear power plants, this project has the objectives of developing SACOM (Simulation Analyzer with a Cognitive Operator Model) and techniques for human error analysis and application. In this year, we studied the following: (1) Site investigation of operator tasks, (2) Development of operator task micro structure and revision of micro structure, (3) Development of knowledge representation software and SACOM prototype, (4) Development of performance assessment methodologies in task simulation and analysis of the effects of performance shaping factors. For human error analysis and application techniques: (1) Classification of error shaping factors (ESFs) and development of software for ESF evaluation, (2) Analysis of human error occurrences and revision of analysis procedure, (3) Experiment for human error data collection using a compact nuclear simulator, (4) Development of a prototype database system of the analyzed information on trip cases. 55 figs, 23 tabs, 33 refs. (Author).

  5. The development of human factors technologies -The development of human behaviour analysis techniques-

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Heui; Park, Keun Ok; Chun, Se Woo; Suh, Sang Moon; Park, Jae Chang

    1995-07-01

    In order to contribute to human error reduction through studies on human-machine interaction in nuclear power plants, this project has the objectives of developing SACOM (Simulation Analyzer with a Cognitive Operator Model) and techniques for human error analysis and application. In this year, we studied the following: 1) Site investigation of operator tasks, 2) Development of operator task micro structure and revision of micro structure, 3) Development of knowledge representation software and SACOM prototype, 4) Development of performance assessment methodologies in task simulation and analysis of the effects of performance shaping factors. For human error analysis and application techniques: 1) Classification of error shaping factors (ESFs) and development of software for ESF evaluation, 2) Analysis of human error occurrences and revision of analysis procedure, 3) Experiment for human error data collection using a compact nuclear simulator, 4) Development of a prototype database system of the analyzed information on trip cases. 55 figs, 23 tabs, 33 refs. (Author)

  6. A Beginner’s Guide to Factor Analysis: Focusing on Exploratory Factor Analysis

    Directory of Open Access Journals (Sweden)

    An Gie Yong

    2013-10-01

    The following paper discusses exploratory factor analysis and gives an overview of the statistical technique and how it is used in various research designs and applications. A basic outline of how the technique works, its criteria, and its main assumptions is given, along with a discussion of when it should be used. Mathematical theories are explored to show students how exploratory factor analysis works, an example of how to run an exploratory factor analysis in SPSS is given, and finally a section on how to write up the results is provided. This will allow readers to develop a better understanding of when to employ factor analysis and how to interpret the tables and graphs in the output.
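
    The paper's worked example uses SPSS; for readers working in Python instead, a rough equivalent of the extract-and-rotate workflow on synthetic data is sketched below (scikit-learn's FactorAnalysis is assumed; its varimax rotation option requires scikit-learn 0.24 or later):

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Synthetic responses: two latent traits driving six observed items
    rng = np.random.default_rng(0)
    latent = rng.standard_normal((300, 2))
    loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                         [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
    X = latent @ loadings.T + 0.4 * rng.standard_normal((300, 6))

    fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
    print(np.round(fa.components_.T, 2))  # item-by-factor loading matrix
    ```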

  7. The development of human factors technologies -The development of human factors experimental evaluation techniques-

    International Nuclear Information System (INIS)

    Shim, Bong Sik; Oh, In Suk; Cha, Kyung Hoh; Lee, Hyun Chul

    1995-07-01

    In this year, we studied the following: 1) Development of operator mental workload evaluation techniques, 2) Development of a prototype for preliminary human factors experiments, 3) Suitability testing of information display on a large-scale display panel, 4) Development of guidelines for VDU-based control room design, 5) Development of an integrated test facility (ITF), and 6) Establishment of an eye tracking system. We obtained the following results: 1) Mental workload evaluation techniques for MMI evaluation, 2) PROTOPEX (PROTOtype for Preliminary human factors EXperiment) for preliminary human factors experiments, 3) Usage methods for the APTEA (Analysis-Prototyping-Training-Experiment-Analysis) experiment design, 4) Design guidelines for human factors verification, 5) Detailed design requirements and a development plan for the ITF, and 6) An eye movement measurement system. 38 figs, 20 tabs, 54 refs. (Author)

  8. Aggregation factor analysis for protein formulation by a systematic approach using FTIR, SEC and design of experiments techniques.

    Science.gov (United States)

    Feng, Yan Wen; Ooishi, Ayako; Honda, Shinya

    2012-01-05

    A simple systematic approach using Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC) and design of experiments (DOE) techniques was applied to the analysis of aggregation factors for protein formulations in stress and accelerated testing. FTIR and SEC were used to evaluate protein conformational and storage stabilities, respectively. DOE was used to determine the suitable formulation and to analyze both the main effects of single factors and the interaction effects of combined factors on aggregation. Our results indicated that (i) analysis at a low protein concentration is not always applicable to high-concentration formulations; (ii) investigating the interaction effects of combined factors as well as the main effects of single factors is effective for improving the conformational stability of proteins; (iii) with the exception of pH, the results of stress testing with regard to aggregation factors can stand in for time-consuming accelerated testing when selecting a suitable formulation; (iv) a suitable pH condition should be determined not in stress testing but in accelerated testing, because of the inconsistent effects of pH on conformational and storage stabilities. In summary, we propose a three-step strategy, using FTIR, SEC and DOE techniques, to effectively analyze aggregation factors and rapidly screen for suitable protein formulation conditions. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Factor analysis

    CERN Document Server

    Gorsuch, Richard L

    2013-01-01

    Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.

  10. An easy guide to factor analysis

    CERN Document Server

    Kline, Paul

    2014-01-01

    Factor analysis is a statistical technique widely used in psychology and the social sciences. With the advent of powerful computers, factor analysis and other multivariate methods are now available to many more people. An Easy Guide to Factor Analysis presents and explains factor analysis as clearly and simply as possible. The author, Paul Kline, carefully defines all statistical terms and demonstrates step-by-step how to work out a simple example of principal components analysis and rotation. He further explains other methods of factor analysis, including confirmatory and path analysis, a

  11. Factors influencing patient compliance with therapeutic regimens in chronic heart failure: A critical incident technique analysis.

    Science.gov (United States)

    Strömberg, A; Broström, A; Dahlström, U; Fridlund, B

    1999-01-01

    The aim of this study was to identify factors influencing compliance with prescribed treatment in patients with chronic heart failure. A qualitative design with a critical incident technique was used. Incidents were collected through interviews with 25 patients with heart failure strategically selected from a primary health care clinic, a medical ward, and a specialist clinic. Two hundred sixty critical incidents were identified in the interviews and 2 main areas emerged in the analysis: inward factors and outward factors. The inward factors described how compliance was influenced by the personality of the patient, the disease, and the treatment. The outward factors described how compliance was influenced by social activities, social relationships, and health care professionals. By identifying the inward and outward factors influencing patients with chronic heart failure, health care professionals can assess whether intervention is needed to increase compliance.

  12. Ideal, nonideal, and no-marker variables: The confirmatory factor analysis (CFA) marker technique works when it matters.

    Science.gov (United States)

    Williams, Larry J; O'Boyle, Ernest H

    2015-09-01

    A persistent concern in the management and applied psychology literature is the effect of common method variance on observed relations among variables. Recent work (i.e., Richardson, Simmering, & Sturman, 2009) evaluated 3 analytical approaches to controlling for common method variance, including the confirmatory factor analysis (CFA) marker technique. Their findings indicated significant problems with this technique, especially with nonideal marker variables (those with theoretical relations with substantive variables). Based on their simulation results, Richardson et al. concluded that not correcting for method variance provides more accurate estimates than using the CFA marker technique. We reexamined the effects of using marker variables in a simulation study and found the degree of error in estimates of a substantive factor correlation was relatively small in most cases, and much smaller than the error associated with making no correction. Further, in instances in which the error was large, the correlations between the marker and substantive scales were higher than those found in organizational research with marker variables. We conclude that in most practical settings, the CFA marker technique yields parameter estimates close to their true values, and that the criticisms made by Richardson et al. are overstated. (c) 2015 APA, all rights reserved.
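
    Reproducing the CFA marker technique itself requires structural equation modeling software, but the logic being debated — a marker chosen to carry only method variance, used to correct an inflated substantive correlation — can be illustrated with the simpler partial-correlation variant of the marker idea. A hedged numpy sketch with invented loadings, not the authors' simulation design:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, true_r = 5000, 0.30

    # Two substantive constructs with a known correlation, plus a shared method factor
    f1, f2 = rng.multivariate_normal([0, 0], [[1, true_r], [true_r, 1]], size=n).T
    method = rng.standard_normal(n)

    x = 0.8 * f1 + 0.4 * method + 0.4 * rng.standard_normal(n)  # substantive scale 1
    y = 0.8 * f2 + 0.4 * method + 0.4 * rng.standard_normal(n)  # substantive scale 2
    m = 0.6 * method + 0.8 * rng.standard_normal(n)             # marker: method variance only

    def partial_r(a, b, c):
        """Correlation of a and b after partialling out c."""
        rab = np.corrcoef(a, b)[0, 1]
        rac = np.corrcoef(a, c)[0, 1]
        rbc = np.corrcoef(b, c)[0, 1]
        return (rab - rac * rbc) / np.sqrt((1 - rac**2) * (1 - rbc**2))

    print(f"true r      = {true_r}")
    print(f"observed r  = {np.corrcoef(x, y)[0, 1]:.3f}")  # inflated by method variance
    print(f"corrected r = {partial_r(x, y, m):.3f}")        # pulled back toward the truth
    ```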

  13. Factors and factorizations of graphs proof techniques in factor theory

    CERN Document Server

    Akiyama, Jin

    2011-01-01

    This book chronicles the development of graph factors and factorizations. It pursues a comprehensive approach, addressing most of the important results from hundreds of findings over the last century. One of the main themes is the observation that many theorems can be proved using only a few standard proof techniques. This stands in marked contrast to the seemingly countless, complex proof techniques offered by the extant body of papers and books. In addition to covering the history and development of this area, the book offers conjectures and discusses open problems. It also includes numerous explanatory figures that enable readers to progressively and intuitively understand the most important notions and proofs in the area of factors and factorization.

  14. Analysis of mineral phases in coal utilizing factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, P.K.

    1982-01-01

    The mineral phase inclusions of coal are discussed. Their contribution to a coal sample is determined utilizing several techniques. Neutron activation analysis in conjunction with coal washability studies has produced some information on the general trends of elemental variation in the mineral phases. These results have been enhanced by the use of various statistical techniques. Target transformation factor analysis (TTFA) is specifically discussed and shown to be able to produce elemental profiles of the mineral phases in coal. A data set consisting of physically fractionated coal samples was generated. These samples were analyzed by neutron activation analysis and their elemental concentrations then examined using TTFA. Information concerning the mineral phases in coal can thus be acquired from factor analysis even with limited data. Additional data may permit the resolution of additional mineral phases as well as refinement of those already identified

  15. Assessing the Credit Risk of Corporate Bonds Based on Factor Analysis and Logistic Regression Analysis Techniques: Evidence from New Energy Enterprises in China

    Directory of Open Access Journals (Sweden)

    Yuanxin Liu

    2018-05-01

    In recent years, new energy sources have ushered in tremendous opportunities for development. The financing difficulties of new energy enterprises (NEEs) can be eased through issuing corporate bonds. However, there are few scientific and reasonable methods to assess the credit risk of NEE bonds, which is not conducive to the healthy development of NEEs. Against this background, this paper analyzes the advantages and risks of NEEs issuing bonds and the main factors affecting the credit risk of NEE bonds, constructs a hybrid model for assessing the credit risk of NEE bonds based on factor analysis and logistic regression analysis techniques, and verifies the applicability and effectiveness of the model employing relevant data from 46 Chinese NEEs. The results show that the main factors affecting the credit risk of NEE bonds are internal factors involving the company's profitability, solvency, operational ability, growth potential, asset structure and viability, and external factors including the macroeconomic environment and energy policy support. Based on the empirical results and the exact situation of China's NEE bonds, this article finally puts forward several targeted recommendations.
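
    As a sketch of the two-stage design the abstract describes — compress correlated financial ratios into a few common factors, then regress default risk on the factor scores — the following uses scikit-learn on synthetic data; the ten features and the outcome are invented stand-ins, not the 46-enterprise data set:

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Hypothetical financial ratios (profitability, solvency, growth, ...) and a default flag
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    y = (X[:, :3].sum(axis=1) + 0.5 * rng.standard_normal(200) > 0).astype(int)

    # Stage 1: factor analysis reduces the ratios to common-factor scores;
    # Stage 2: logistic regression maps the scores to default probability.
    model = make_pipeline(StandardScaler(),
                          FactorAnalysis(n_components=3),
                          LogisticRegression())
    print(cross_val_score(model, X, y, cv=5).mean())
    ```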

  16. The development of human factors experimental evaluation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Bong Shick; Oh, In Suk; Cha, Kyung Ho; Lee, Hyun Chul; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon

    1997-07-01

    New human factors issues related to HMI designs based on VDUs, such as the evaluation of information navigation, the consideration of operator characteristics, and operator performance assessment, are being raised. In order to address these issues, this project aims to establish experimental technologies, including techniques for experimental design, experimental measurement, data collection and analysis, and to develop an ITF (Integrated Test Facility) suitable for experiments on HMI design evaluation. To establish experimental data analysis and evaluation methodologies, we developed the following: (1) a paradigm for human factors experimentation, including experimental designs, procedures, and data analysis; (2) methods for the assessment of operators' mental workload; and (3) DAEXESS (data analysis and experiment evaluation supporting system). We also established experiment execution technologies through preliminary experiments, such as the suitability evaluation of information display on an LSDP, the evaluation of computerized operation procedures, and an experiment on an advanced alarm system (ADIOS). Finally, we developed the ITF, including a human-machine simulator, a telemetry system, an eye tracking system, an audio/video data measurement system, and a three-dimensional micro-behaviour analysis system. (author). 81 refs., 68 tabs., 73 figs.

  17. Identification of noise in linear data sets by factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, Ph.K.

    1982-01-01

    A technique which has the ability to identify bad data points after the data have been generated is classical factor analysis. The ability of classical factor analysis to identify two different types of data errors makes it ideally suited for scanning large data sets. Since the results yielded by factor analysis indicate correlations between parameters, one must know something about the nature of the data set and the analytical techniques used to obtain it to confidently isolate errors. (author)
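
    The report predates today's software, but the idea of flagging observations that a factor model explains poorly translates directly; a loose sketch using scikit-learn's FactorAnalysis and its per-sample log-likelihood, on synthetic data with deliberately corrupted rows:

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Correlated data plus a few grossly corrupted rows ("bad data points")
    rng = np.random.default_rng(0)
    scores = rng.standard_normal((300, 2))
    X = scores @ rng.standard_normal((2, 8)) + 0.3 * rng.standard_normal((300, 8))
    X[:5] += 6.0 * rng.standard_normal((5, 8))  # inject errors into rows 0-4

    fa = FactorAnalysis(n_components=2).fit(X)
    ll = fa.score_samples(X)          # log-likelihood of each row under the model
    suspects = np.argsort(ll)[:5]     # the least likely rows
    print(sorted(suspects.tolist()))  # typically recovers rows 0-4
    ```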

  18. Human factors assessment in PRA using task analysis linked evaluation technique (TALENT)

    International Nuclear Information System (INIS)

    Wells, J.E.; Banks, W.W.

    1990-01-01

    Human error is a primary contributor to risk in complex high-reliability systems. A 1985 U.S. Nuclear Regulatory Commission (USNRC) study of licensee event reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Since then, the USNRC has initiated research to fully and properly integrate human errors into the probabilistic risk assessment (PRA) process. The resulting implementation procedure is known as the Task Analysis Linked Evaluation Technique (TALENT). As indicated, TALENT is a broad-based method for integrating human factors expertise into the PRA process. This process achieves results which: (1) provide more realistic estimates of the impact of human performance on nuclear power safety, (2) can be fully audited, (3) provide a firm technical base for equipment-centered and personnel-centered retrofit/redesign of plants enabling them to meet internally and externally imposed safety standards, and (4) yield human and hardware data capable of supporting inquiries into human performance issues that transcend the individual plant. The TALENT procedure is being field-tested to verify its effectiveness and utility. The objectives of the field-test are to examine (1) the operability of the process, (2) its acceptability to the users, and (3) its usefulness for achieving measurable improvements in the credibility of the analysis. The field-test will provide the information needed to enhance the TALENT process

  19. First course in factor analysis

    CERN Document Server

    Comrey, Andrew L

    2013-01-01

    The goal of this book is to foster a basic understanding of factor analytic techniques so that readers can use them in their own research and critically evaluate their use by other researchers. Both the underlying theory and correct application are emphasized. The theory is presented through the mathematical basis of the most common factor analytic models and several methods used in factor analysis. On the application side, considerable attention is given to the extraction problem, the rotation problem, and the interpretation of factor analytic results. Hence, readers are given a background of

  20. Human factors analysis and design methods for nuclear waste retrieval systems. Volume III. User's guide for the computerized event-tree analysis technique

    International Nuclear Information System (INIS)

    Casey, S.M.; Deretsky, Z.

    1980-08-01

    This document provides detailed instructions for using the Computerized Event-Tree Analysis Technique (CETAT), a program designed to assist a human factors analyst in predicting event probabilities in complex man-machine configurations found in waste retrieval systems. The instructions contained herein describe how to (a) identify the scope of a CETAT analysis, (b) develop operator performance data, (c) enter an event-tree structure, (d) modify a data base, and (e) analyze event paths and man-machine system configurations. Designed to serve as a tool for developing, organizing, and analyzing operator-initiated event probabilities, CETAT simplifies the tasks of the experienced systems analyst by organizing large amounts of data and performing cumbersome and time-consuming arithmetic calculations. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and to evaluate alternative equipment designs and operator tasks. As with any automated technique, however, the value of the output will be a function of the knowledge and skill of the analyst using the program
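
    CETAT itself is not publicly distributed, but the bookkeeping it automates — walking every branch of an event tree and multiplying branch probabilities along each path — is easy to sketch. Plain Python, with hypothetical event names and probabilities, and the simplifying assumption that branch probabilities do not depend on the path taken:

    ```python
    from itertools import product

    # Each operator-initiated event has a success probability; failure is the complement.
    events = {"detect": 0.95, "diagnose": 0.90, "act": 0.98}

    # Enumerate every success/failure path through the tree with its probability.
    paths = {}
    for outcome in product([True, False], repeat=len(events)):
        p = 1.0
        for (name, p_success), ok in zip(events.items(), outcome):
            p *= p_success if ok else 1.0 - p_success
        paths[outcome] = p

    p_all_ok = paths[(True, True, True)]
    print(f"P(all steps succeed)  = {p_all_ok:.4f}")             # 0.8379
    print(f"P(at least one fails) = {1 - p_all_ok:.4f}")
    print(f"sum over all paths    = {sum(paths.values()):.4f}")  # sanity check: 1.0
    ```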

  1. Data-Mining Techniques in Detecting Factors Linked to Academic Achievement

    Science.gov (United States)

    Martínez Abad, Fernando; Chaparro Caso López, Alicia A.

    2017-01-01

    In light of the emergence of statistical analysis techniques based on data mining in education sciences, and the potential they offer to detect non-trivial information in large databases, this paper presents a procedure used to detect factors linked to academic achievement in large-scale assessments. The study is based on a non-experimental,…

  2. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess operators' physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs

  3. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess operators' physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  4. Linking human factors to corporate strategy with cognitive mapping techniques.

    Science.gov (United States)

    Village, Judy; Greig, Michael; Salustri, Filippo A; Neumann, W Patrick

    2012-01-01

    For human factors (HF) to avoid being considered of "side-car" status, it needs to be positioned within the organization in such a way that it affects business strategies and their implementation. Tools are needed to support this effort. This paper explores the feasibility of applying a technique from operational research called cognitive mapping to link HF to corporate strategy. Using a single case study, a cognitive map is drawn to reveal the complex relationships between human factors and achieving an organization's strategic goals. Analysis of the map for central concepts and reinforcing loops enhances understanding that can lead to discrete initiatives to facilitate integration of HF. It is recommended that this technique be used with senior managers to understand the organization's strategic goals and enhance understanding of the potential for HF to contribute to them.

  5. Diffraction analysis of customized illumination technique

    Science.gov (United States)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received deep attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized illumination shape. With the help of this technique, it was found that a layout change does not change the shape of the customized illumination mode.

  6. INSTRUMENTS MEASURING PERCEIVED RACISM/RACIAL DISCRIMINATION: REVIEW AND CRITIQUE OF FACTOR ANALYTIC TECHNIQUES

    Science.gov (United States)

    Atkins, Rahshida

    2015-01-01

    Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis. PMID:25626225

  7. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow one to i) determine natural groups or clusters of control strategies with a similar behaviour, ii) find and interpret hidden, complex and causal relation features in the data set and iii) identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...

  8. A Comparative Analysis of Techniques for PAPR Reduction of OFDM Signals

    Directory of Open Access Journals (Sweden)

    M. Janjić

    2014-06-01

    In this paper the problem of high Peak-to-Average Power Ratio (PAPR) in Orthogonal Frequency-Division Multiplexing (OFDM) signals is studied. Besides describing three techniques for PAPR reduction, SeLective Mapping (SLM), Partial Transmit Sequence (PTS) and Interleaving, a detailed analysis of the performance of these techniques for various values of the relevant parameters (number of phase sequences, number of interleavers, number of phase factors, number of subblocks, depending on the applied technique) is carried out. Simulation of these techniques is run in Matlab software. Results are presented in the form of Complementary Cumulative Distribution Function (CCDF) curves for the PAPR of 30000 randomly generated OFDM symbols. Simulations are performed for OFDM signals with 32 and 256 subcarriers, oversampled by a factor of 4. A detailed comparison of these techniques is made based on the Matlab simulation results.
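
    The record does not reproduce the Matlab code; a compact numpy rendering of the two core computations — the PAPR of an oversampled OFDM symbol and the SLM trick of keeping the best of several random phase rotations — under the paper's 32-subcarrier, 4x-oversampling setup:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, L = 32, 4  # subcarriers and oversampling factor, as in the paper

    def papr_db(freq_symbols, L=4):
        """PAPR of one OFDM symbol; oversampling via a zero-padded IFFT."""
        x = np.fft.ifft(freq_symbols, n=L * len(freq_symbols))
        power = np.abs(x) ** 2
        return 10 * np.log10(power.max() / power.mean())

    def slm_papr_db(freq_symbols, n_seqs=8, L=4):
        """SeLective Mapping: try random phase sequences, keep the lowest PAPR."""
        candidates = []
        for _ in range(n_seqs):
            phases = np.exp(1j * rng.uniform(0, 2 * np.pi, len(freq_symbols)))
            candidates.append(papr_db(freq_symbols * phases, L))
        return min(candidates)

    # One random QPSK symbol on N subcarriers: plain vs. SLM-reduced PAPR
    sym = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
    print(f"plain: {papr_db(sym, L):.2f} dB   SLM: {slm_papr_db(sym, L=L):.2f} dB")
    ```

    A CCDF curve like those in the paper would come from repeating this over many thousands of random symbols and plotting, for each PAPR level, the fraction of symbols that exceed it.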

  9. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at the tract, county, and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors, ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis
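
    In GIS terms, the corridor estimates described here come down to buffering the route geometry and accumulating the population of the census units that fall inside; a toy sketch assuming shapely and invented block data (a real TIGER workflow would intersect full block polygons rather than centroids):

    ```python
    from shapely.geometry import LineString, Point

    # Hypothetical highway route and census blocks as (centroid, population) pairs
    route = LineString([(0, 0), (10, 2), (20, 0)])
    blocks = [(Point(2, 1), 1200), (Point(9, 5), 800),
              (Point(15, 0.5), 2500), (Point(18, 9), 600)]

    def corridor_population(route, blocks, half_width):
        """Total population of blocks whose centroid lies within the corridor."""
        corridor = route.buffer(half_width)
        return sum(pop for centroid, pop in blocks if corridor.contains(centroid))

    for hw in (0.5, 2, 5, 10):  # analogous to the 0.5- to 20-mile corridor widths
        print(hw, corridor_population(route, blocks, hw))
    ```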

  10. Human errors identification using the human factors analysis and classification system technique (HFACS)

    Directory of Open Access Journals (Sweden)

    G. A. Shirali

    2013-12-01

    Results: In this study, 158 reports of accidents in the Ahvaz steel industry were analyzed using the HFACS technique. This analysis showed that most of the human errors were related, at the first level, to skill-based errors; at the second, to the physical environment; at the third, to inadequate supervision; and at the fourth, to resource management. Conclusion: Studying and analyzing past events using the HFACS technique can identify the major and root causes of accidents and can be effective in preventing the repetition of such mishaps. It can also be used as a basis for developing strategies to prevent future events in steel industries.

  11. Uranium solution mining cost estimating technique: means for rapid comparative analysis of deposits

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    Twelve graphs provide a technique for determining relative cost ranges for uranium solution mining projects. The use of the technique can provide a consistent framework for rapid comparative analysis of various properties of mining situations. The technique is also useful to determine the sensitivities of cost figures to incremental changes in mining factors or deposit characteristics

  12. Evaluation of Landslide Mapping Techniques and LiDAR-based Conditioning Factors

    Science.gov (United States)

    Mahalingam, R.; Olsen, M. J.

    2014-12-01

    Landslides are a major geohazard, which results in significant human, infrastructure, and economic losses. Landslide susceptibility mapping can help communities plan and prepare for these damaging events. Mapping landslide-susceptible locations using GIS and remote sensing techniques has gained popularity over the past three decades. These efforts use a wide variety of procedures and consider a wide range of factors. Unfortunately, each study is often completed differently and independently of others. Further, the quality of the datasets used varies in terms of source, data collection, and generation, which can propagate errors or inconsistencies into the resulting output maps. Light detection and ranging (LiDAR) has proved to have higher accuracy in representing the continuous topographic surface, which can help minimize this uncertainty. The primary objectives of this paper are to investigate the applicability and performance of terrain factors in landslide hazard mapping, determine if LiDAR-derived datasets (slope, slope roughness, terrain roughness, stream power index and compound topographic index) can be used for predictive mapping without data representing other common landslide conditioning factors, and evaluate the differences in landslide susceptibility mapping using widely-used statistical approaches. The aforementioned factors were used to produce landslide susceptibility maps for a 140 km2 study area in northwest Oregon using six representative techniques: frequency ratio, weights of evidence, logistic regression, discriminant analysis, artificial neural network, and support vector machine. Most notably, the research showed an advantage in selecting fewer critical conditioning factors. The most reliable factors could all be derived from a single LiDAR DEM, reducing the need for laborious and costly data gathering. Most of the six techniques showed similar statistical results; however, ANN showed less accuracy for predictive mapping. Keywords: Li

  13. A BWR 24-month cycle analysis using multicycle techniques

    International Nuclear Information System (INIS)

    Hartley, K.D.

    1993-01-01

    Boiling water reactor (BWR) fuel cycle design analyses have become increasingly challenging in the past several years. As utilities continue to seek improved capacity factors, reduced power generation costs, and reduced outage costs, longer cycle lengths and fuel design optimization become important considerations. Accurate multicycle analysis techniques are necessary to determine the viability of fuel designs and cycle operating strategies to meet reactor operating requirements, e.g., thermal and reactivity margin constraints, while minimizing overall fuel cycle costs. Siemens Power Corporation (SPC), Nuclear Division, has successfully employed multicycle analysis techniques with realistic rodded cycle depletions to demonstrate equilibrium fuel cycle performance in 24-month cycles. Analyses have been performed for a BWR/5 reactor at both rated and uprated power conditions

  14. An inter-battery factor analysis of the comrey personality scales and the 16 personality factor questionnaire

    OpenAIRE

    Gideon P. de Bruin

    2000-01-01

    The scores of 700 Afrikaans-speaking university students on the Comrey Personality Scales and the 16 Personality Factor Questionnaire were subjected to an inter-battery factor analysis. This technique uses only the correlations between two sets of variables and reveals only the factors that they have in common. Three of the Big Five personality factors were revealed, namely Extroversion, Neuroticism and Conscientiousness. However, the Conscientiousness factor contained a relatively strong uns...
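
    The record is truncated, but the method it names is well defined: Tucker's inter-battery factor analysis reduces to a singular value decomposition of the cross-correlation matrix between the two batteries. A numpy sketch on synthetic stand-ins for the two questionnaires (the SVD formulation is the textbook one, not code from this study):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 700                          # mirrors the study's sample size
    g = rng.standard_normal((n, 3))  # three latent factors shared by both batteries

    # Two test batteries loading on the shared factors plus unique noise
    X = g @ rng.standard_normal((3, 8)) + rng.standard_normal((n, 8))
    Y = g @ rng.standard_normal((3, 6)) + rng.standard_normal((n, 6))

    def interbattery(X, Y, k):
        """Common-factor loadings from the cross-correlation matrix alone."""
        Zx = (X - X.mean(0)) / X.std(0)
        Zy = (Y - Y.mean(0)) / Y.std(0)
        Rxy = Zx.T @ Zy / len(X)     # battery-1 by battery-2 correlations
        U, s, Vt = np.linalg.svd(Rxy)
        return U[:, :k] * np.sqrt(s[:k]), Vt[:k].T * np.sqrt(s[:k])

    Lx, Ly = interbattery(X, Y, 3)
    print(Lx.shape, Ly.shape)        # (8, 3) and (6, 3) loading matrices
    ```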

  15. Incomplete factorization technique for positive definite linear systems

    International Nuclear Information System (INIS)

    Manteuffel, T.A.

    1980-01-01

    This paper describes a technique for solving the large sparse symmetric linear systems that arise from the application of finite element methods. The technique combines an incomplete factorization method called the shifted incomplete Cholesky factorization with the method of generalized conjugate gradients. The shifted incomplete Cholesky factorization produces a splitting of the matrix A that is dependent upon a parameter α. It is shown that if A is positive definite, then there is some α for which this splitting is possible and that this splitting is at least as good as the Jacobi splitting. The method is shown to be more efficient on a set of test problems than either direct methods or explicit iteration schemes
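
    scipy ships no shifted incomplete Cholesky, but the overall scheme described here — an incomplete factorization of A used as a preconditioner for the conjugate gradient iteration — can be approximated with scipy's incomplete LU on a finite-element-style test matrix:

    ```python
    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import LinearOperator, cg, spilu

    # Symmetric positive definite test system: a 1-D Poisson matrix
    n = 2000
    A = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")
    b = np.ones(n)

    # Incomplete factorization of A, wrapped as a preconditioner M ~ inverse of A
    ilu = spilu(A, drop_tol=1e-4)
    M = LinearOperator((n, n), matvec=ilu.solve)

    iterations = 0
    def count(xk):
        global iterations
        iterations += 1

    x, info = cg(A, b, M=M, callback=count)
    print(info, iterations)  # info == 0 on convergence, in far fewer iterations than plain CG
    ```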

  16. Factors Associated With Healthcare-Acquired Catheter-Associated Urinary Tract Infections: Analysis Using Multiple Data Sources and Data Mining Techniques.

    Science.gov (United States)

    Park, Jung In; Bliss, Donna Z; Chi, Chih-Lin; Delaney, Connie W; Westra, Bonnie L

    The purpose of this study was to identify factors associated with healthcare-acquired catheter-associated urinary tract infections (HA-CAUTIs) using multiple data sources and data mining techniques. Three data sets were integrated for analysis: electronic health record data from a university hospital in the Midwestern United States was combined with staffing and environmental data from the hospital's National Database of Nursing Quality Indicators and a list of patients with HA-CAUTIs. Three data mining techniques were used for identification of factors associated with HA-CAUTI: decision trees, logistic regression, and support vector machines. Fewer total nursing hours per patient-day, lower percentage of direct care RNs with specialty nursing certification, higher percentage of direct care RNs with associate's degree in nursing, and higher percentage of direct care RNs with BSN, MSN, or doctoral degree are associated with HA-CAUTI occurrence. The results also support the association of the following factors with HA-CAUTI identified by previous studies: female gender; older age (>50 years); longer length of stay; severe underlying disease; glucose lab results (>200 mg/dL); longer use of the catheter; and RN staffing. Additional findings from this study demonstrated that the presence of more nurses with specialty nursing certifications can reduce HA-CAUTI occurrence. While there may be valid reasons for leaving in a urinary catheter, findings show that having a catheter in for more than 48 hours contributes to HA-CAUTI occurrence. Finally, the findings suggest that more nursing hours per patient-day are related to better patient outcomes.

  17. The application of two recently developed human reliability techniques to cognitive error analysis

    International Nuclear Information System (INIS)

    Gall, W.

    1990-01-01

    Cognitive error can lead to catastrophic consequences for manned systems, including those whose design renders them immune to the effects of physical slips made by operators. Four such events, pressurized water and boiling water reactor accidents which occurred recently, were analysed. The analysis identifies the factors which contributed to the errors and suggests practical strategies for error recovery or prevention. Two types of analysis were conducted: an unstructured analysis based on the analyst's knowledge of psychological theory, and a structured analysis using two recently-developed human reliability analysis techniques. In general, the structured techniques required less effort to produce results and these were comparable to those of the unstructured analysis. (author)

  18. Performance of dental impression materials: Benchmarking of materials and techniques by three-dimensional analysis.

    Science.gov (United States)

    Rudolph, Heike; Graf, Michael R S; Kuhn, Katharina; Rupf-Köhler, Stephanie; Eirich, Alfred; Edelmann, Cornelia; Quaas, Sebastian; Luthardt, Ralph G

    2015-01-01

    Among other factors, the precision of dental impressions is an important and determining factor for the fit of dental restorations. The aim of this study was to examine the three-dimensional (3D) precision of gypsum dies made using a range of impression techniques and materials. Ten impressions of a steel canine were fabricated for each of the 24 material-method combinations and poured with type 4 die stone. The dies were optically digitized, aligned to the CAD model of the steel canine, and 3D differences were calculated. The results were statistically analyzed using one-way analysis of variance. Depending on material and impression technique, the mean values ranged between +10.9/-10.0 µm (SD 2.8/2.3) and +16.5/-23.5 µm (SD 11.8/18.8). Qualitative analysis using color-coded graphs showed a characteristic location of deviations for different impression techniques. Three-dimensional analysis provided a comprehensive picture of the achievable precision. Processing aspects and impression technique were of significant influence.

  19. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques, deals with the characterisation and understanding of the outer layers of substrates, how they react, look and function which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  20. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D'Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D'Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell'Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); INFN-Gran Sasso Science Institute, L'Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); Universita dell'Aquila, Dipartimento di Scienze Fisiche e Chimiche, L'Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O'Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of ¹³⁰Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires improving the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  1. Development of human factors evaluation techniques for nuclear power plants

    International Nuclear Information System (INIS)

    Oh, I.S.; Lee, Y.H.; Lee, J.W.; Sim, B.S.

    1999-01-01

    This paper describes the development of an operator task simulation analyzer and of human factors evaluation techniques, performed recently at the Korea Atomic Energy Research Institute. The first is SACOM (Simulation Analyzer with a Cognitive Operator Model), for the assessment of task performance by simulating control room operation. The second has two objectives: to establish a human factors experiment facility, the Integrated Test Facility (ITF), and to establish techniques for human factors experiments. (author)

  2. Testing all six person-oriented principles in dynamic factor analysis.

    Science.gov (United States)

    Molenaar, Peter C M

    2010-05-01

    All six person-oriented principles identified by Sterba and Bauer's Keynote Article can be tested by means of dynamic factor analysis in its current form. In particular, it is shown how complex interactions and interindividual differences/intraindividual change can be tested in this way. In addition, the necessity to use single-subject methods in the analysis of developmental processes is emphasized, and attention is drawn to the possibility to optimally treat developmental psychopathology by means of new computational techniques that can be integrated with dynamic factor analysis.
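
    Dynamic factor analysis of a single subject's multivariate time series can be tried directly in statsmodels; a small sketch on synthetic data (one AR(1) latent factor; DynamicFactor is statsmodels' general implementation, not software from this article):

    ```python
    import numpy as np
    from statsmodels.tsa.statespace.dynamic_factor import DynamicFactor

    # One subject's 3-variable time series driven by a single AR(1) factor
    rng = np.random.default_rng(0)
    T = 300
    f = np.zeros(T)
    for t in range(1, T):                # latent factor with lag-1 dynamics
        f[t] = 0.7 * f[t - 1] + rng.standard_normal()
    Y = np.outer(f, [1.0, 0.8, 0.6]) + 0.5 * rng.standard_normal((T, 3))

    model = DynamicFactor(Y, k_factors=1, factor_order=1)
    result = model.fit(disp=False)
    print(result.summary())              # loadings and the factor's AR coefficient
    ```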

  3. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book complements traditional texts on methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate data bases

  4. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H S; Kim, J H; Lee, J C; Choi, Y R; Moon, S S

    2000-12-01

    Reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. A survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few that are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagrams, Markov models, the combinational method, and the simulation method. 2. A survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. A survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. A collection of case studies of robot reliability and safety analysis performed in foreign countries. The results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  5. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    International Nuclear Information System (INIS)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S.

    2000-12-01

    Reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. A survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few that are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagrams, Markov models, the combinational method, and the simulation method. 2. A survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. A survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. A collection of case studies of robot reliability and safety analysis performed in foreign countries. The results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  6. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have for long debated on the most appropriate data analysis techniques that can be employed in conducting empirical researches in the domain of services marketing. On the basis of an exhaustive review of literature, the present paper attempts to provide a concise and schematic portrayal of generally followed data analysis techniques in the field of services quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher order multivariate techniques viz. confirmatory factor analysis, structural equation modeling etc. to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as mean, t-Test, ANOVA and correlation. The marked shift in orientation of researchers towards using sophisticated analytical techniques can largely be attributed to the competition within the community of researchers in social sciences in general and those working in the area of service quality in particular, as also growing demands of reviewers of journals. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  7. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    Directory of Open Access Journals (Sweden)

    James Baglin

    2014-06-01

    Full Text Available Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many guidelines have been proposed with the aim to improve application. Unfortunately, implementing recommended EFA practices has been restricted by the range of options available in commercial statistical packages and, perhaps, due to an absence of clear, practical 'how-to' demonstrations. Consequently, this article describes the application of methods recommended to get the most out of your EFA. The article focuses on dealing with the common situation of analysing ordinal data as derived from Likert-type scales. These methods are demonstrated using the free, stand-alone, easy-to-use and powerful EFA package FACTOR (http://psico.fcep.urv.es/utilitats/factor/; Lorenzo-Seva & Ferrando, 2006). The demonstration applies the recommended techniques using an accompanying dataset, based on the Big 5 personality test. The outcomes obtained by the EFA using the recommended procedures through FACTOR are compared to the default techniques currently available in SPSS.

  8. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-01-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, "A Technique for Human Error Analysis" (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  9. Confirmatory factor analysis using Microsoft Excel.

    Science.gov (United States)

    Miles, Jeremy N V

    2005-11-01

    This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out simple CFA. It is possible, with few or no programming skills, to effectively program a CFA analysis and, thus, to gain insight into the workings of the procedure.
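
    To make the underlying computation concrete, here is a minimal sketch (in Python rather than a spreadsheet, and not the article's own implementation) of a one-factor CFA fitted by unweighted least squares: the free parameters are the loadings and unique variances, chosen so that the model-implied covariance matches the sample covariance.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def one_factor_cfa(S):
        """Fit loadings lam and unique variances psi so that S ≈ lam lam' + diag(psi)."""
        p = S.shape[0]

        def loss(theta):
            lam, psi = theta[:p], theta[p:]
            resid = S - (np.outer(lam, lam) + np.diag(psi))
            return np.sum(resid ** 2)  # unweighted least squares discrepancy

        x0 = np.concatenate([np.full(p, 0.5), np.full(p, 0.5)])
        bounds = [(None, None)] * p + [(1e-6, None)] * p  # keep psi positive
        res = minimize(loss, x0, bounds=bounds)
        return res.x[:p], res.x[p:]

    # toy example: a correlation matrix with one dominant factor
    S = np.array([[1.0, 0.60, 0.50],
                  [0.60, 1.0, 0.55],
                  [0.50, 0.55, 1.0]])
    loadings, uniquenesses = one_factor_cfa(S)
    print(loadings.round(3), uniquenesses.round(3))
    ```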

  10. A comparison study on detection of key geochemical variables and factors through three different types of factor analysis

    Science.gov (United States)

    Hoseinzade, Zohre; Mokhtari, Ahmad Reza

    2017-10-01

    Large numbers of variables have been measured to explain different phenomena. Factor analysis has widely been used in order to reduce the dimension of datasets. Additionally, the technique has been employed to highlight underlying factors hidden in a complex system. As geochemical studies benefit from multivariate assays, application of this method is widespread in geochemistry. However, the conventional protocols for implementing factor analysis have some drawbacks in spite of their advantages. In the present study, a geochemical dataset including 804 soil samples, collected from a mining area in central Iran in order to search for MVT-type Pb-Zn deposits, was considered in order to compare various factor analysis methods. Routine factor analysis, sequential factor analysis, and staged factor analysis were applied to the dataset, after opening the data with the alr (additive logratio) transformation, to extract the mineralization factor. A comparison between these methods indicated that sequential factor analysis most clearly revealed the MVT paragenesis elements in surface samples, with nearly 50% variation in F1. In addition, staged factor analysis gave acceptable results while being easy to apply: it could detect mineralization-related elements while assigning them larger factor loadings, resulting in a clearer expression of the mineralization.
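
    As a rough illustration of the pre-processing step described above, the sketch below applies an alr transformation to simulated compositional data and then runs an ordinary factor analysis; the data, the choice of divisor part and the number of factors are assumptions for illustration, not the study's.

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    X = rng.dirichlet(alpha=[5, 3, 2, 1], size=200)  # compositional rows summing to 1

    # alr: log of each part relative to a chosen divisor part (here the last column)
    alr = np.log(X[:, :-1] / X[:, [-1]])

    fa = FactorAnalysis(n_components=2).fit(alr)
    print(fa.components_)  # factor loadings on the alr-transformed variables
    ```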

  11. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support hazard analysis, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, like the motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  12. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support hazard analysis, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, like the motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  13. Application of functional analysis techniques to supervisory systems

    International Nuclear Information System (INIS)

    Lambert, Manuel; Riera, Bernard; Martel, Gregory

    1999-01-01

    The aim of this paper is, firstly, to apply two interesting functional analysis techniques to the design of supervisory systems for complex processes and, secondly, to discuss the strengths and weaknesses of each of them. Two functional analysis techniques, SADT (Structured Analysis and Design Technique) and FAST (Functional Analysis System Technique), have been applied to an example process, a Water Supply Process Control (WSPC) system. These techniques allow a functional description of industrial processes. The paper briefly discusses the functions of a supervisory system and some advantages of the application of functional analysis for the design of a 'human-centered' supervisory system. Then the basic principles of the two techniques applied to the WSPC system are presented. Finally, the different results obtained from the two techniques are discussed.

  14. A virtual crack-closure technique for calculating stress intensity factors for cracked three dimensional bodies

    Science.gov (United States)

    Shivakumar, K. N.; Tan, P. W.; Newman, J. C., Jr.

    1988-01-01

    A three-dimensional virtual crack-closure technique is presented which calculates the strain energy release rates and the stress intensity factors using only nodal forces and displacements from a standard finite element analysis. The technique is an extension of the Rybicki-Kanninen (1977) method, and it assumes that any continuous function can be approximated by a finite number of straight line segments. Results obtained by the method for surface cracked plates with and without notches agree favorably with previous results.
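
    A back-of-the-envelope sketch of the bookkeeping involved, under the usual two-dimensional, unit-thickness simplification (the paper's formulation is three-dimensional, and all numbers below are invented): the energy release rate follows from the crack-tip nodal force and the opening displacement one element behind the tip, and can then be converted to a stress intensity factor.

    ```python
    # Mode-I virtual crack closure for one element of length da (unit thickness)
    F_y = 120.0     # nodal force holding the crack tip closed (N)
    dv = 2.0e-5     # relative opening displacement behind the tip (m)
    da = 1.0e-3     # virtual crack extension = element length (m)

    G_I = F_y * dv / (2.0 * da)                 # mode-I energy release rate (J/m^2)

    E, nu = 70.0e9, 0.33                        # elastic constants (plane strain)
    K_I = (G_I * E / (1.0 - nu ** 2)) ** 0.5    # stress intensity factor (Pa*sqrt(m))
    print(G_I, K_I)
    ```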

  15. The development of human factors experimental evaluation techniques -The development of human factors technologies-

    International Nuclear Information System (INIS)

    Sim, Bong Shick; Oh, In Seok; Cha, Kyeong Ho; Lee, Hyun Chul

    1994-04-01

    In the 2nd year of the research project for the development of human factors evaluation techniques, we first defined the experimental target systems by a comparison study of the advanced control rooms proposed by foreign countries, in order to make the experiments feasible and realistic for the 10 experimental items selected in the first year of the project. We then decided to confine our research to the big board overview panel and operator workstations. Following the development of selection criteria for our research interest, we identified, by functional analysis, the design variables which may influence the performance of the operator. The experimental variables which will be used for the evaluation of the proposed items are then defined by a relational analysis between evaluation items and design variables, and they are classified by the characteristics of the measurement data. The functional requirements of the ITF are developed to accommodate the functions necessary for carrying out the 10 evaluation items. The functional requirements for each sub-system of the ITF have been developed with the experimental paradigm of APTEA. Finally, we have reviewed the compact nuclear simulator (CNS) at KAERI from the point of view of human factors guidelines/principles and proposed two possible layouts of the experimental apparatus for the evaluation of display alternatives and operational procedures. (Author)

  16. An SPSS R-Menu for Ordinal Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mario Basto

    2012-01-01

    Full Text Available Exploratory factor analysis is a widely used statistical technique in the social sciences. It attempts to identify underlying factors that explain the pattern of correlations within a set of observed variables. A statistical software package is needed to perform the calculations. However, there are some limitations with popular statistical software packages, like SPSS. The R programming language is a free software package for statistical and graphical computing. It offers many packages written by contributors from all over the world and programming resources that allow it to overcome the dialog limitations of SPSS. This paper offers an SPSS dialog written in the R programming language with the help of some packages, so that researchers with little or no knowledge of programming, or those who are accustomed to making their calculations based on statistical dialogs, have more options when applying factor analysis to their data and hence can adopt a better approach when dealing with ordinal, Likert-type data.

  17. Correction factor for hair analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Castro Faria, L.V. de; Paschoa, A.S.

    1980-01-01

    The application of the Particle Induced X-ray Emission (PIXE) technique to quantitatively analyse the elemental composition of hair specimens brings about some difficulties in the interpretation of the data. The present paper proposes a correction factor to account for the effects of the energy loss of the incident particle with penetration depth, and of X-ray self-absorption, when a particular geometrical distribution of elements in hair is assumed for calculational purposes. The correction factor has been applied to the analysis of the Zn, Cu and Ca contents of hair as a function of the energy of the incident particle. (orig.)

  18. Correction factor for hair analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Castro Faria, L.V. de; Paschoa, A.S.

    1979-06-01

    The application of the Particle Induced X-ray Emission (PIXE) technique to quantitatively analyse the elemental composition of hair specimens brings about some difficulties in the interpretation of the data. The present paper proposes a correction factor to account for the effects of the energy loss of the incident particle with penetration depth, and of X-ray self-absorption, when a particular geometrical distribution of elements in hair is assumed for calculational purposes. The correction factor has been applied to the analysis of the Zn, Cu and Ca contents of hair as a function of the energy of the incident particle. (Author) [pt

  19. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials.

    Science.gov (United States)

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-12

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.

  20. Analysis of stress intensity factors for surface cracks in pre/post penetration

    International Nuclear Information System (INIS)

    Miyoshi, Toshiro; Yoshida, Yuichiro

    1988-01-01

    It is important to evaluate the penetration of surface cracks in a Leak-Before-Break analysis. Because the stress intensity factors for surface cracks in pre/post penetration had not yet been analyzed, the authors carried out three-dimensional boundary element analyses in order to obtain them. First, the authors developed a technique of nodal breakdown appropriate for cracks with short ligament length in a two-dimensional boundary element analysis. Next, analyses of the stress intensity factors for surface cracks in pre/post penetration were carried out using the technique of nodal breakdown for cracks with short ligament length and the three-dimensional boundary element code BEM3D, which was designed for a supercomputer. (author)

  1. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). γ-techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil in shredded sugar cane. (U.K.)

  2. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)

  3. Foundations of factor analysis

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis; Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions; Composite Variables and Linear Transformations; Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi

  4. Quantitative EDXS analysis of organic materials using the ζ-factor method

    International Nuclear Information System (INIS)

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. - Highlights: • The ζ-factor method is used for quantitative EDXS analysis of light elements. • We describe the process of determining ζ-factors from a single standard in detail. • Organic semiconducting materials are successfully quantified
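
    For orientation, a minimal sketch of the quantification step of the ζ-factor method as it is usually stated (mass fractions from intensities and ζ-factors, mass thickness from the electron dose); the numerical values below are invented, not taken from the study.

    ```python
    import numpy as np

    def zeta_quantify(I, zeta, dose):
        """Mass fractions C_i and mass thickness rho*t from X-ray intensities I_i,
        sensitivity factors zeta_i and total electron dose D_e."""
        weighted = zeta * I
        C = weighted / weighted.sum()      # C_i = zeta_i I_i / sum_j zeta_j I_j
        rho_t = weighted.sum() / dose      # rho*t = sum_i zeta_i I_i / D_e
        return C, rho_t

    I = np.array([1200.0, 800.0, 300.0])    # background-subtracted counts (hypothetical)
    zeta = np.array([1.1e3, 9.5e2, 2.0e3])  # zeta-factors from a standard (hypothetical)
    C, rho_t = zeta_quantify(I, zeta, dose=1.5e10)
    print(C.round(3), rho_t)
    ```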

  5. Exploratory Factor Analysis With Small Samples and Missing Data.

    Science.gov (United States)

    McNeish, Daniel

    2017-01-01

    Exploratory factor analysis (EFA) is an extremely popular method for determining the underlying factor structure for a set of variables. Due to its exploratory nature, EFA is notorious for being conducted with small sample sizes, and recent reviews of psychological research have reported that between 40% and 60% of applied studies have 200 or fewer observations. Recent methodological studies have addressed small size requirements for EFA models; however, these models have only considered complete data, which are the exception rather than the rule in psychology. Furthermore, the extant literature on missing data techniques with small samples is scant, and nearly all existing studies focus on topics that are not of primary interest to EFA models. Therefore, this article presents a simulation to assess the performance of various missing data techniques for EFA models with both small samples and missing data. Results show that deletion methods do not extract the proper number of factors and estimate the factor loadings with severe bias, even when data are missing completely at random. Predictive mean matching is the best method overall when considering extracting the correct number of factors and estimating factor loadings without bias, although 2-stage estimation was a close second.
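
    A hedged sketch of the workflow the article evaluates, assuming statsmodels' MICE implementation (whose default univariate imputer is predictive mean matching) and a simulated one-factor dataset; variable names and missingness rate are ours.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.imputation.mice import MICEData
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(1)
    f = rng.normal(size=(150, 1))                         # one latent factor
    X = f @ rng.uniform(0.5, 0.9, (1, 6)) + rng.normal(scale=0.6, size=(150, 6))
    X[rng.random(X.shape) < 0.1] = np.nan                 # 10% missing completely at random

    df = pd.DataFrame(X, columns=[f"x{i}" for i in range(6)])
    imp = MICEData(df)                                    # PMM is the default method
    imp.update_all(2)                                     # a couple of imputation cycles
    loadings = FactorAnalysis(n_components=1).fit(imp.data).components_
    print(loadings.round(2))
    ```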

  6. Risk factor analysis of pulmonary hemorrhage complicating CT-guided lung biopsy in coaxial and non-coaxial core biopsy techniques in 650 patients

    Energy Technology Data Exchange (ETDEWEB)

    Nour-Eldin, Nour-Eldin A., E-mail: nour410@hotmail.com [Institute for Diagnostic and Interventional Radiology, Johan Wolfgang Goethe – University Hospital, Theodor-Stern-Kai 7, 60590 Frankfurt am Main (Germany); Diagnostic and Interventional Radiology Department, Cairo University Hospital, Cairo (Egypt); Alsubhi, Mohammed [Institute for Diagnostic and Interventional Radiology, Johan Wolfgang Goethe – University Hospital, Theodor-Stern-Kai 7, 60590 Frankfurt am Main (Germany); Naguib, Nagy N. [Institute for Diagnostic and Interventional Radiology, Johan Wolfgang Goethe – University Hospital, Theodor-Stern-Kai 7, 60590 Frankfurt am Main (Germany); Diagnostic and Interventional Radiology Department, Alexandria University Hospital, Alexandria (Egypt); Lehnert, Thomas; Emam, Ahmed; Beeres, Martin; Bodelle, Boris; Koitka, Karen; Vogl, Thomas J.; Jacobi, Volkmar [Institute for Diagnostic and Interventional Radiology, Johan Wolfgang Goethe – University Hospital, Theodor-Stern-Kai 7, 60590 Frankfurt am Main (Germany)

    2014-10-15

    Purpose: To evaluate the risk factors involved in the development of pulmonary hemorrhage complicating CT-guided biopsy of pulmonary lesions in coaxial and non-coaxial techniques. Materials and methods: This retrospective study included CT-guided percutaneous lung biopsies in 650 consecutive patients (407 males, 243 females; mean age 54.6 years, SD: 5.2) from November 2008 to June 2013. Patients were classified according to lung biopsy technique into a coaxial group (318 lesions) and a non-coaxial group (332 lesions). Exclusion criteria for biopsy were: lesions <5 mm in diameter, uncorrectable coagulopathy, positive-pressure ventilation, severe respiratory compromise, pulmonary arterial hypertension or refusal of the procedure. Risk factors for pulmonary hemorrhage complicating lung biopsy were classified into: (a) patient-related risk factors, (b) lesion-related risk factors and (c) technical risk factors. Radiological assessments were performed by two radiologists in consensus. The Mann–Whitney U test and Fisher's exact test were used for statistical analysis; p values <0.05 were considered statistically significant. Results: The incidence of pulmonary hemorrhage was 19.6% (65/332) in the non-coaxial group and 22.3% (71/318) in the coaxial group. The difference in incidence between both groups was statistically insignificant (p = 0.27). Hemoptysis developed in 5.4% (18/332) and in 6.3% (20/318) in the non-coaxial and coaxial groups respectively. Traversing pulmonary vessels in the needle biopsy track was a significant risk factor for the development of pulmonary hemorrhage (incidence: 55.4% (36/65), p = 0.0003, in the non-coaxial group and 57.7% (41/71), p = 0.0013, in the coaxial group). Other significant risk factors included: lesions of less than 2 cm (p values of 0.01 and 0.02 in the non-coaxial and coaxial groups respectively), basal and middle zonal lesions in comparison to upper zonal lung lesions (p = 0.002 and 0.03 in the non-coaxial and coaxial groups respectively), increased lesion

  7. Risk factor analysis of pulmonary hemorrhage complicating CT-guided lung biopsy in coaxial and non-coaxial core biopsy techniques in 650 patients

    International Nuclear Information System (INIS)

    Nour-Eldin, Nour-Eldin A.; Alsubhi, Mohammed; Naguib, Nagy N.; Lehnert, Thomas; Emam, Ahmed; Beeres, Martin; Bodelle, Boris; Koitka, Karen; Vogl, Thomas J.; Jacobi, Volkmar

    2014-01-01

    Purpose: To evaluate the risk factors involved in the development of pulmonary hemorrhage complicating CT-guided biopsy of pulmonary lesions in coaxial and non-coaxial techniques. Materials and methods: This retrospective study included CT-guided percutaneous lung biopsies in 650 consecutive patients (407 males, 243 females; mean age 54.6 years, SD: 5.2) from November 2008 to June 2013. Patients were classified according to lung biopsy technique into a coaxial group (318 lesions) and a non-coaxial group (332 lesions). Exclusion criteria for biopsy were: lesions <5 mm in diameter, uncorrectable coagulopathy, positive-pressure ventilation, severe respiratory compromise, pulmonary arterial hypertension or refusal of the procedure. Risk factors for pulmonary hemorrhage complicating lung biopsy were classified into: (a) patient-related risk factors, (b) lesion-related risk factors and (c) technical risk factors. Radiological assessments were performed by two radiologists in consensus. The Mann–Whitney U test and Fisher's exact test were used for statistical analysis; p values <0.05 were considered statistically significant. Results: The incidence of pulmonary hemorrhage was 19.6% (65/332) in the non-coaxial group and 22.3% (71/318) in the coaxial group. The difference in incidence between both groups was statistically insignificant (p = 0.27). Hemoptysis developed in 5.4% (18/332) and in 6.3% (20/318) in the non-coaxial and coaxial groups respectively. Traversing pulmonary vessels in the needle biopsy track was a significant risk factor for the development of pulmonary hemorrhage (incidence: 55.4% (36/65), p = 0.0003, in the non-coaxial group and 57.7% (41/71), p = 0.0013, in the coaxial group). Other significant risk factors included: lesions of less than 2 cm (p values of 0.01 and 0.02 in the non-coaxial and coaxial groups respectively), basal and middle zonal lesions in comparison to upper zonal lung lesions (p = 0.002 and 0.03 in the non-coaxial and coaxial groups respectively), increased lesion

  8. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. It introduces molecular activation analysis, micro-PIXE and micro-probe analysis, X-ray fluorescence analysis and accelerator mass spectrometry. The applications of these nuclear analysis techniques in the environmental sciences are presented and reviewed.

  9. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
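
    A quick sketch of the detection-performance idea mentioned above: sweep a threshold over a test statistic computed on "healthy" and "faulty" simulated signals and tabulate the probability of detection against the probability of false alarm. The distributions and thresholds are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    healthy = rng.normal(1.0, 0.2, 1000)    # e.g. band-limited RMS of a good machine
    faulty = rng.normal(1.5, 0.3, 1000)     # the same statistic with a defect present

    for thr in (1.2, 1.3, 1.4):
        p_fa = np.mean(healthy > thr)       # probability of false alarm
        p_d = np.mean(faulty > thr)         # probability of detection
        print(thr, round(p_fa, 3), round(p_d, 3))
    ```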

  10. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
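
    A minimal sketch of the dependency-tree idea under stated assumptions (hypothetical metric names, simulated data): fit a regression tree that explains a KPI from lower-level process and QoS metrics, then inspect which metrics the tree splits on.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor, export_text

    rng = np.random.default_rng(2)
    n = 500
    response_time = rng.exponential(1.0, n)      # QoS metric (hypothetical)
    queue_length = rng.poisson(5, n)             # process metric (hypothetical)
    kpi = 10 - 2.5 * response_time - 0.3 * queue_length + rng.normal(0, 0.5, n)

    X = np.column_stack([response_time, queue_length])
    tree = DecisionTreeRegressor(max_depth=3).fit(X, kpi)
    print(export_text(tree, feature_names=["response_time", "queue_length"]))
    ```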

  11. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
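
    The merge criterion can be pictured with a toy test (our simplification, not BaTMAn's actual statistic): two spatial elements are merged when their signals agree within the quoted errors.

    ```python
    def consistent(s1, e1, s2, e2, k=1.0):
        """True if two measurements are statistically indistinguishable,
        judged by a chi-square-like comparison of the difference to the errors."""
        return (s1 - s2) ** 2 <= k ** 2 * (e1 ** 2 + e2 ** 2)

    print(consistent(1.00, 0.10, 1.05, 0.08))   # True: merge the elements
    print(consistent(1.00, 0.02, 1.20, 0.03))   # False: keep them separate
    ```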

  12. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their applicability to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends developing now a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)
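
    A toy version of the kind of screening test mentioned above: a Kruskal-Wallis test of whether dose differs across tertiles of one sampled input parameter. The model linking the parameter to dose is invented, not SYVAC's.

    ```python
    import numpy as np
    from scipy.stats import kruskal

    rng = np.random.default_rng(7)
    param = rng.uniform(0.0, 1.0, 300)                    # sampled input parameter
    dose = np.exp(2.0 * param) + rng.normal(0, 0.5, 300)  # synthetic dose response

    bins = np.quantile(param, [1 / 3, 2 / 3])
    groups = [dose[param <= bins[0]],
              dose[(param > bins[0]) & (param <= bins[1])],
              dose[param > bins[1]]]
    stat, p = kruskal(*groups)
    print(stat, p)   # a small p-value flags the parameter as influential
    ```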

  13. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques which range over several different aspects of materials research are covered in this volume. They are concerned with property evaluation at 4.0 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments, and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  14. Factors Influencing Choice of Inguinal Hernia Repair Technique ...

    African Journals Online (AJOL)

    Background: Inguinal hernia repair surgery is one of the most frequently performed surgical procedures worldwide. This study sought to highlight factors that may influence decisions concerning inguinal hernia repair techniques. Methods: This descriptive crosssectional study was carried out in September 2014 among ...

  15. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered as alternate and complementary with respect to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE concerning the determination of high-molecular compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Also presented are methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed. The possibility of CE application in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  16. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review of the implementation of, and results obtained with, different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely: segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also carried out. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are herein classified according to the instrumental detection technique used, with the aim to facilitate their study and obtain an overall scope. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  17. Phasor analysis of binary diffraction gratings with different fill factors

    International Nuclear Information System (INIS)

    MartInez, Antonio; Sanchez-Lopez, Ma del Mar; Moreno, Ignacio

    2007-01-01

    In this work, we present a simple analysis of binary diffraction gratings with different slit widths relative to the grating period. The analysis is based on a simple phasor technique directly derived from the Huygens principle. By introducing a slit phasor and a grating phasor, the intensity of the diffracted orders and the grating's resolving power can be easily obtained without applying the usual Fourier transform operations required for these calculations. The proposed phasor technique is mathematically equivalent to the Fourier transform calculation of the diffraction order amplitude, and it can be useful to explain binary diffraction gratings in a simple manner in introductory physics courses. This theoretical analysis is illustrated with experimental results using a liquid crystal device to display diffraction gratings with different fill factors
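
    The phasor result can be cross-checked against the standard Fourier expression: for a binary amplitude grating of fill factor f, diffraction order m carries relative intensity |f sinc(mf)|². A short numerical sketch (our illustration, not taken from the paper):

    ```python
    import numpy as np

    def order_intensities(fill_factor, orders):
        m = np.asarray(orders, dtype=float)
        c = fill_factor * np.sinc(m * fill_factor)  # np.sinc(x) = sin(pi x)/(pi x)
        return np.abs(c) ** 2

    orders = range(-3, 4)
    for f in (0.25, 0.5):
        print(f, order_intensities(f, orders).round(4))
    # at f = 0.5 the even orders (m = ±2) vanish, as the phasor picture predicts
    ```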

  18. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  19. Theoretical Bound of CRLB for Energy Efficient Technique of RSS-Based Factor Graph Geolocation

    Science.gov (United States)

    Kahar Aziz, Muhammad Reza; Heriansyah; Saputra, Efa Maydhona; Musa, Ardiansyah

    2018-03-01

    To support the growth of wireless geolocation development as a key technology of the future, this paper proposes a theoretical bound derivation, i.e., the Cramer-Rao lower bound (CRLB), for the energy efficient received signal strength (RSS)-based factor graph wireless geolocation technique. The theoretical bound derivation is crucially important to evaluate whether the energy efficient technique of RSS-based factor graph wireless geolocation is effective, as well as to open the opportunity for further innovation of the technique. The CRLB is derived in this paper by using the Fisher information matrix (FIM) of the main formula of the RSS-based factor graph geolocation technique, which relies on the Jacobian matrix. The simulation result shows that the derived CRLB has the highest accuracy as a bound, shown by its root mean squared error (RMSE) curve lying below the RMSE curve of the RSS-based factor graph geolocation technique. Hence, the derived CRLB becomes the lower bound for the energy efficient technique of RSS-based factor graph wireless geolocation.
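
    To illustrate the kind of bound involved, the sketch below computes a CRLB for RSS positioning under a textbook log-distance path-loss model; the anchor layout, path-loss exponent and shadowing level are our assumptions, and the paper's own derivation is specific to its factor graph formulation.

    ```python
    import numpy as np

    anchors = np.array([[0.0, 0.0], [20.0, 0.0], [0.0, 20.0], [20.0, 20.0]])
    target = np.array([7.0, 5.0])
    n_path, sigma = 3.0, 4.0                       # path-loss exponent, shadowing std (dB)

    diff = target - anchors                        # (x - x_i, y - y_i) per anchor
    d2 = np.sum(diff ** 2, axis=1)                 # squared anchor-target distances
    J = -(10.0 * n_path / np.log(10.0)) * diff / d2[:, None]  # dP_i/d(x, y)

    fim = J.T @ J / sigma ** 2                     # Fisher information matrix
    crlb = np.trace(np.linalg.inv(fim))            # lower bound on position MSE
    print(np.sqrt(crlb))                           # RMSE bound in metres
    ```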

  20. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available The quality and condition of a road surface is of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are therefore widely performed for controlling geometric parameters and detecting defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions concerned in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement, and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate surface 3D reconstruction, and it allows analysing the characteristics of road texture and monitoring pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  1. An experimental extrapolation technique using the Gafchromic EBT3 film for relative output factor measurements in small x-ray fields

    Energy Technology Data Exchange (ETDEWEB)

    Morales, Johnny E., E-mail: johnny.morales@lh.org.au [Department of Radiation Oncology, Chris O’Brien Lifehouse, 119-143 Missenden Road, Camperdown, NSW 2050, Australia and School of Chemistry, Physics, and Mechanical Engineering, Queensland University of Technology, Level 4 O Block, Garden’s Point, QLD 4001 (Australia); Butson, Martin; Hill, Robin [Department of Radiation Oncology, Chris O’Brien Lifehouse, 119-143 Missenden Road, Camperdown, NSW 2050, Australia and Institute of Medical Physics, University of Sydney, NSW 2006 (Australia); Crowe, Scott B. [School of Chemistry, Physics, and Mechanical Engineering, Queensland University of Technology, Level 4 O Block, Garden’s Point, QLD 4001, Australia and Cancer Care Services, Royal Brisbane and Women’s Hospital, Butterfield Street, Herston, QLD 4029 (Australia); Trapp, J. V. [School of Chemistry, Physics, and Mechanical Engineering, Queensland University of Technology, Level 4 O Block, Garden’s Point, QLD 4001 (Australia)

    2016-08-15

    Purpose: An experimental extrapolation technique is presented, which can be used to determine the relative output factors for very small x-ray fields using Gafchromic EBT3 film. Methods: Relative output factors were measured for the Brainlab SRS cones ranging in diameter from 4 to 30 mm on a Novalis Trilogy linear accelerator with 6 MV SRS x-rays. The relative output factor was determined from an experimental reducing circular region of interest (ROI) extrapolation technique developed to remove the effects of volume averaging. This was achieved by scanning the EBT3 film measurements with a high scanning resolution of 1200 dpi. From the high resolution scans, the size of the circular regions of interest was varied to produce a plot of relative output factors versus area of analysis. The plot was then extrapolated to zero to determine the relative output factor corresponding to zero volume. Results: Results have shown that for a 4 mm field size, the extrapolated relative output factor was measured as a value of 0.651 ± 0.018, as compared to 0.639 ± 0.019 and 0.633 ± 0.021 for 0.5 and 1.0 mm diameter of analysis values, respectively. This showed a change in the relative output factors of 1.8% and 2.8% at these comparative regions of interest sizes. In comparison, the 25 mm cone had negligible differences in the measured output factor between zero extrapolation and the 0.5 and 1.0 mm diameter ROIs, respectively. Conclusions: This work shows that for very small fields such as 4.0 mm cone sizes, a measurable difference can be seen in the relative output factor based on the circular ROI and the size of the area of analysis using radiochromic film dosimetry. The authors recommend scanning the Gafchromic EBT3 film at a resolution of 1200 dpi for cone sizes less than 7.5 mm and utilizing an extrapolation technique for the output factor measurements of very small field dosimetry.
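
    The extrapolation step reduces to a simple fit: plot the measured relative output factor against ROI area and read off the zero-area intercept. In the sketch below, only the 0.5 and 1.0 mm values echo those quoted above; the remaining points and the linear form are illustrative assumptions.

    ```python
    import numpy as np

    diameters_mm = np.array([0.5, 1.0, 1.5, 2.0])    # ROI diameters used for analysis
    areas = np.pi * (diameters_mm / 2.0) ** 2        # ROI areas (mm^2)
    rof = np.array([0.639, 0.633, 0.627, 0.620])     # measured ROFs (partly illustrative)

    slope, intercept = np.polyfit(areas, rof, 1)     # linear fit of ROF versus area
    print(intercept)                                 # extrapolated ROF at zero area
    ```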

  2. Analysis of the socio-economic factors associated with gum Arabic ...

    African Journals Online (AJOL)

    The study is an analysis of the socio-economic factors associated with gum arabic collectors in Northern Guinea Savanna Zone of Adamawa State, Nigeria through a questionnaire survey on a sample of 100 respondents obtained through a multi stage sampling technique. Data collected were analyzed using descriptive ...

  3. Factors Influencing Choice of Inguinal Hernia Repair Technique

    African Journals Online (AJOL)

    This study sought to highlight factors that may influence ... Experienced peers play a major role in training on the various repair ... Reasons influencing choice of repair technique include training ... (after appendectomy) accounting for 10 to 15% of all surgical ..... in medical school [as undergraduate students or as residents].

  4. Different techniques of multispectral data analysis for vegetation fraction retrieval

    Science.gov (United States)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. In respect of farmlands, the assessment of crop condition constitutes the basis of monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, leaf area index, etc. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the soil-vegetation system spectral signatures. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from soil-crop pattern reflectance. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and the various spectral estimators.
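
    Of the techniques listed, linear spectral unmixing is the most direct to sketch: with known endmember spectra, the cover fractions of a mixed pixel follow from constrained least squares. The toy spectra below are invented, not measured.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    endmembers = np.array([[0.08, 0.10, 0.35, 0.40],   # vegetation spectrum (toy)
                           [0.20, 0.25, 0.30, 0.32]])  # soil spectrum (toy)
    pixel = 0.3 * endmembers[0] + 0.7 * endmembers[1]  # synthetic mixed pixel

    fractions, _ = nnls(endmembers.T, pixel)           # non-negative least squares
    fractions /= fractions.sum()                       # enforce sum-to-one (approx.)
    print(fractions)                                   # ≈ [0.3, 0.7]
    ```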

  5. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre

  6. Phasor analysis of binary diffraction gratings with different fill factors

    Energy Technology Data Exchange (ETDEWEB)

    MartInez, Antonio [Departamento de Ciencia de Materiales, Optica y TecnologIa Electronica, Universidad Miguel Hernandez, 03202 Elche (Spain); Sanchez-Lopez, Ma del Mar [Instituto de BioingenierIa y Departamento de Fisica y Arquitectura de Computadores, Universidad Miguel Hernandez, 03202 Elche (Spain); Moreno, Ignacio [Departamento de Ciencia de Materiales, Optica y TecnologIa Electronica, Universidad Miguel Hernandez, 03202 Elche (Spain)

    2007-09-11

    In this work, we present a simple analysis of binary diffraction gratings with different slit widths relative to the grating period. The analysis is based on a simple phasor technique directly derived from the Huygens principle. By introducing a slit phasor and a grating phasor, the intensity of the diffracted orders and the grating's resolving power can be easily obtained without applying the usual Fourier transform operations required for these calculations. The proposed phasor technique is mathematically equivalent to the Fourier transform calculation of the diffraction order amplitude, and it can be useful to explain binary diffraction gratings in a simple manner in introductory physics courses. This theoretical analysis is illustrated with experimental results using a liquid crystal device to display diffraction gratings with different fill factors.

  7. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother
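
    As a flavour of the techniques named, here is a minimal one-sided CUSUM over a simulated series of inventory differences; the slack and threshold values are illustrative, not tuned to any real accountability data.

    ```python
    import numpy as np

    def cusum(series, slack=0.5, threshold=4.0):
        """Return the running one-sided CUSUM path and the indices where it alarms."""
        s, alarms, path = 0.0, [], []
        for i, x in enumerate(series):
            s = max(0.0, s + x - slack)   # accumulate positive drift above the slack
            path.append(s)
            if s > threshold:
                alarms.append(i)
        return np.array(path), alarms

    rng = np.random.default_rng(3)
    ids = rng.normal(0, 1, 30).tolist() + rng.normal(1.5, 1, 10).tolist()  # loss starts at t = 30
    path, alarms = cusum(ids)
    print(alarms[:1])   # first alarm index, ideally shortly after 30
    ```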

  8. Human factors assessment in PRA using Task Analysis Linked Evaluation Technique (TALENT)

    International Nuclear Information System (INIS)

    Wells, J.E.; Banks, W.W.

    1991-01-01

    Thirty years ago the US military and US aviation industry, and more recently, in response to the US Three Mile Island and USSR Chernobyl accidents, the US commercial nuclear power industry, acknowledged that human error, as an immediate precursor, and as a latent or indirect influence in the form of training, maintainability, inservice test, and surveillance programs, is a primary contributor to unreliability and risk in complex high-reliability systems. A 1985 Nuclear Regulatory Commission (NRC) study of Licensee Event Reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Despite the magnitude and nature of human error cited in that study, there has been limited attention to personnel-centered issues, especially person-to-person issues involving group processes, management and organizational environment. The paper discusses NRC integration and applications research with respect to the Task Analysis Linked Evaluation Technique (TALENT) in risk assessment applications.

  9. Vibration impact acoustic emission technique for identification and analysis of defects in carbon steel tubes: Part A Statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halim, Zakiah Abd [Universiti Teknikal Malaysia Melaka (Malaysia); Jamaludin, Nordin; Junaidi, Syarif [Faculty of Engineering and Built, Universiti Kebangsaan Malaysia, Bangi (Malaysia); Yahya, Syed Yusainee Syed [Universiti Teknologi MARA, Shah Alam (Malaysia)

    2015-04-15

    Current steel tube inspection techniques are invasive, and the interpretation and evaluation of inspection results are done manually by skilled personnel. This paper presents a statistical analysis of high frequency stress wave signals captured with a newly developed noninvasive, non-destructive tube inspection technique known as the vibration impact acoustic emission (VIAE) technique. Acoustic emission (AE) signals were introduced into ASTM A179 seamless steel tubes using an impact hammer, and the AE wave propagation was captured using an AE sensor. Specifically, a healthy steel tube as the reference and four steel tubes with through-hole artificial defects at different locations were used in this study. The AE features extracted from the captured signals are rise time, peak amplitude, duration and count. The VIAE technique also analysed the AE signals using statistical features such as root mean square (r.m.s.), energy, and crest factor. It was evident that duration, count, r.m.s., energy and crest factor could be used to automatically identify the presence of defects in carbon steel tubes using AE signals captured with the non-invasive VIAE technique.
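
    The statistical features named above have standard signal-processing definitions; a short sketch computes them for a synthetic AE-like burst (not data from the study):

    ```python
    import numpy as np

    t = np.linspace(0.0, 1e-3, 2000)
    signal = np.exp(-t / 2e-4) * np.sin(2 * np.pi * 150e3 * t)  # decaying 150 kHz burst

    rms = np.sqrt(np.mean(signal ** 2))               # root mean square
    energy = np.sum(signal ** 2)                      # signal energy
    crest_factor = np.max(np.abs(signal)) / rms       # peak-to-r.m.s. ratio
    print(rms, energy, crest_factor)
    ```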

  10. An analysis of main factors in electron beam flue gas purification

    International Nuclear Information System (INIS)

    Zhang Ming; Xu Guang

    2003-01-01

    The electron beam flue gas purification method has been developing very quickly in recent years. Based on the experimental setting for electron beam flue gas purification at the Institute of Nuclear Energy and Technology, Tsinghua University, how the technical factors affect the desulphurization and denitrogenation ratios is described. Radiation dose (D), temperature (T), humidity (H), injected ammonia quantity (α) and the initial concentrations of SO₂ (C_SO₂) and NOₓ (C_NOₓ) are the main factors influencing flue gas purification. Using the methods of correlation analysis and regression analysis, the primary factors are identified and regression equations are established to optimize the system process, simplify the system structure and predict the experimental results. (authors)
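
    A hedged sketch of the regression step: ordinary least squares relating a removal efficiency to dose, temperature, humidity and ammonia ratio. All numbers are simulated placeholders, not the experimental data.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 60
    D = rng.uniform(5, 15, n)        # dose (kGy)
    T = rng.uniform(60, 90, n)       # temperature (deg C)
    H = rng.uniform(8, 14, n)        # humidity (vol %)
    a = rng.uniform(0.7, 1.0, n)     # ammonia stoichiometric ratio
    eta = 40 + 2.5 * D - 0.3 * T + 1.2 * H + 20 * a + rng.normal(0, 2, n)  # synthetic removal %

    X = np.column_stack([np.ones(n), D, T, H, a])    # design matrix with intercept
    coef, *_ = np.linalg.lstsq(X, eta, rcond=None)
    print(coef)   # intercept and the fitted effect of each factor
    ```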

  11. Factors that Affect Poverty Areas in North Sumatera Using Discriminant Analysis

    Science.gov (United States)

    Nasution, D. H.; Bangun, P.; Sitepu, H. R.

    2018-04-01

    In Indonesia, and especially in North Sumatera, poverty is one of the fundamental problems on which both central and local government focus. Although the poverty rate has decreased, the fact is that many people are still poor. Poverty covers several aspects such as education, health, demographics, and also structural and cultural factors. This research discusses several factors, such as population density, unemployment rate, GDP per capita at constant prices (ADHK), GDP per capita at current prices (ADHB), economic growth and life expectancy, that affect poverty in Indonesia. To determine the factors that most influence and differentiate the level of poverty across the regencies/cities of North Sumatra, the discriminant analysis method was used. Discriminant analysis is a multivariate analysis technique used to classify data into groups based on the dependent variable and independent variables. Using discriminant analysis, it is evident that the factor affecting poverty is the unemployment rate.
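    A minimal sketch of the classification step, using scikit-learn's linear discriminant analysis on invented regency-level indicators (the study's actual data are not reproduced here):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Invented rows: [population density, unemployment rate, GDP per capita,
# life expectancy]; labels: 0 = low-poverty area, 1 = high-poverty area.
X = np.array([
    [120.0,  4.1, 35.0, 70.2],
    [ 95.0,  8.3, 22.0, 66.5],
    [300.0,  5.0, 41.0, 71.0],
    [ 60.0,  9.8, 18.0, 65.1],
    [210.0,  3.9, 38.5, 69.8],
    [ 75.0, 10.5, 16.2, 64.0],
])
y = np.array([0, 1, 0, 1, 0, 1])

lda = LinearDiscriminantAnalysis().fit(X, y)

# The discriminant coefficients show which variable separates the groups
# most strongly (the abstract singles out the unemployment rate).
print("discriminant coefficients:", lda.coef_)
print("predicted group for a new area:",
      lda.predict([[150.0, 7.0, 25.0, 68.0]]))
```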

  12. FACTORS & ELEMENTAL ANALYSIS OF SIX THINKING HATS TECHNIQUE USING ABCD FRAMEWORK

    OpenAIRE

    Dr. P. S. Aithal; V. T. Shailashree; Dr. P. M. Suresh Kumar

    2017-01-01

    De Bono's Six Thinking Hats technique suggests different types of thinking corresponding to six thinking roles for the analyst, associated with hats of six different colors. The technique correlates the different thinking styles used in a systematic problem-solving procedure with differently coloured hats. Alternatively, by conceptualizing each type of hat, the person focuses on the style of thinking associated with each colour so that the problem can be analysed from different angles and frame of re...

  13. Identification of Key Success Factors in the Marketing of Cosmetics Based on Knowledge, Attitude and Practice (KAP Analysis Using Topsis Technique (The Case of Iran

    Directory of Open Access Journals (Sweden)

    Mehdi Mohammadzadeh, Shirin Hashemi, Faranak Salmannejad, Tayebeh Ghari

    2017-09-01

    Full Text Available Background: Cosmetic products are one of the most important fields of the consumer market. Strategic marketing planning and the creation of competitive advantages through the recognition of key success factors have become a core competency of active firms in this area. On this basis, the aim of our study was to identify the key success factors of cosmetic product marketing in the Iranian market. Methods: To do this, the knowledge, attitude, and practice (KAP) of consumers in Iran were evaluated and key success factors were identified based on marketing mix theory. In-depth interviews and closed-ended questionnaires were used to collect data. The randomized sample population of this study was 1200 people. The results of the KAP analysis were classified into seven clusters and the TOPSIS technique was then used to analyze each cluster. Results: The results showed a significant relationship between attitude and practice and also between knowledge and practice, with t-values greater than 1.96 and path coefficients greater than 0.1. Moreover, the results indicated that the most and the least important factors for the success of cosmetics marketing are place (distribution and dispensing) and price, with sorted closeness coefficients (CLi) of 0.9 and 0.1 respectively. Conclusion: This demonstrates that appropriate sales and distribution strategies, sound scientific information and strong marketing at the point of purchase are the most important key success factors in the marketing of cosmetics, while price has a minimal effect on cosmetics marketing.
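    The TOPSIS step can be sketched as follows; the decision matrix, weights and criteria below are invented for illustration and are not the study's data.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Closeness coefficient CLi of each alternative (rows of `matrix`).

    benefit[j] is True when more of criterion j is better, False for
    cost criteria. CLi lies in [0, 1]; larger means closer to the ideal.
    """
    m = matrix / np.sqrt((matrix ** 2).sum(axis=0))  # vector normalisation
    v = m * weights                                   # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_plus = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_minus = np.sqrt(((v - anti) ** 2).sum(axis=1))
    return d_minus / (d_plus + d_minus)

# Invented example: four marketing-mix factors scored on three criteria.
decision = np.array([
    [7.0, 8.0, 6.0],   # product
    [4.0, 5.0, 3.0],   # price
    [9.0, 8.5, 9.0],   # place (distribution)
    [6.0, 7.0, 7.5],   # promotion
])
weights = np.array([0.4, 0.35, 0.25])
benefit = np.array([True, True, True])
for name, cl in zip(["product", "price", "place", "promotion"],
                    topsis(decision, weights, benefit)):
    print(f"{name}: CLi = {cl:.2f}")
```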

  14. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Science.gov (United States)

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS 18 and DEAP 2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996 respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. The productivity rate of the hospitals generally had an increasing trend. However, the total average productivity decreased in the hospitals. Moreover, among the components of total productivity, variation in technological efficiency had the greatest impact on the decrease in total average productivity.
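    The single-period building block of the Malmquist index is a DEA efficiency score. Below is a minimal sketch of the input-oriented CCR envelopment model, solved as a linear program with invented hospital data; the Malmquist index itself additionally combines such scores across the two periods being compared.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0.

    X: (m inputs) x (n units), Y: (s outputs) x (n units).
    Solves min theta s.t. X @ lam <= theta * x0, Y @ lam >= y0, lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    x0, y0 = X[:, j0], Y[:, j0]
    c = np.r_[1.0, np.zeros(n)]                 # decision vector [theta, lam]
    A_in = np.hstack([-x0[:, None], X])         # X @ lam - theta * x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y @ lam <= -y0
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -y0],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

# Invented data: 2 inputs (beds, staff) and 1 output (treated patients).
X = np.array([[100.0, 120.0, 90.0, 150.0, 110.0],
              [ 50.0,  70.0, 40.0,  90.0,  60.0]])
Y = np.array([[800.0, 900.0, 850.0, 1000.0, 700.0]])
for j in range(X.shape[1]):
    print(f"hospital {j + 1}: efficiency = {ccr_efficiency(X, Y, j):.3f}")
```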

  15. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    Science.gov (United States)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potency of two GIS-based data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), for analyzing groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor zone of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by the good and very good potential zones, respectively. The validation of outcomes displayed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, which shows that the SI method performs slightly better than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and scrutinizing the complicated relation between groundwater occurrence and the groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, it can be concluded that these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
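    The statistical index method reduces to a density ratio: the SI weight of each class of a conditioning factor is the log of the spring density in that class relative to the density over the whole area. A minimal sketch with invented class areas and spring counts:

```python
import numpy as np

# One conditioning factor (say, slope class): class areas in km^2 and the
# number of springs observed in each class. All numbers are invented.
class_area = np.array([120.0, 300.0, 250.0, 80.0])
springs = np.array([10, 160, 250, 76])

# SI_i = ln( (springs_i / area_i) / (total springs / total area) )
density = springs / class_area
overall = springs.sum() / class_area.sum()
si = np.log(density / overall)

for i, w in enumerate(si, 1):
    print(f"class {i}: SI = {w:+.2f}")
# Summing the SI weights of all conditioning factors cell by cell yields
# the groundwater potential map, which is then validated with the AUC.
```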

  16. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
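    The measurement rests on the Beer-Lambert law: the transmitted intensity through a sample of mass thickness rho*t is I = I0 * exp(-(mu/rho) * rho * t), and mu/rho near the Au K-edge varies with gold content. A sketch of the calibration-curve approach, with invented transmission data:

```python
import numpy as np

# Invented calibration data: gold fraction of alloy standards versus the
# measured mass attenuation coefficient (cm^2/g) near the Au K-edge.
gold_fraction = np.array([0.40, 0.55, 0.70, 0.85, 1.00])
mu_rho = np.array([2.10, 2.45, 2.80, 3.15, 3.50])

# Linear calibration curve: mu/rho = a * fraction + b.
a, b = np.polyfit(gold_fraction, mu_rho, 1)

def mass_attenuation(i0, i, rho_t):
    """mu/rho from incident/transmitted counts and mass thickness (g/cm^2)."""
    return np.log(i0 / i) / rho_t

# Unknown sample: counts without and with the sample in the beam.
mu_unknown = mass_attenuation(i0=120000.0, i=38000.0, rho_t=0.40)
print(f"mu/rho = {mu_unknown:.2f} cm^2/g")
print(f"estimated gold fraction = {(mu_unknown - b) / a:.2f}")
```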

  17. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme The situation analysis, as a separate component of the strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand and also on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components, the internal environment that is made from specific variables within the organization and the external environment that is made from variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The special literature emphasizes that the differences in performance from one organization to another are primarily dependent not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such

  18. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques... the methods in two: those which assume independence between the variables and thus use a diagonal estimate of the within-class covariance matrix, and those which assume dependence between the variables and thus use an estimate of the within-class covariance matrix which also estimates the correlations between... variables. The two groups of methods are compared and the pros and cons are exemplified using different cases of simulated data. The results illustrate that the estimate of the covariance matrix is an important factor with respect to choice of method, and the choice of method should thus be driven by the nature...
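    The diagonal-versus-full covariance distinction can be seen in a toy simulation. The paper's own sparse methods are not reproduced here; instead, scikit-learn's LDA stands in for the dependence-assuming group and Gaussian naive Bayes (per-class diagonal covariance) for the independence-assuming group.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

# Two classes whose features are strongly correlated within class: a full
# covariance estimate can exploit this, a diagonal estimate ignores it.
cov = np.array([[1.0, 0.9],
                [0.9, 1.0]])
X = np.vstack([rng.multivariate_normal([0, 0], cov, size=200),
               rng.multivariate_normal([1, 1], cov, size=200)])
y = np.r_[np.zeros(200), np.ones(200)]

train, test = slice(0, None, 2), slice(1, None, 2)
full = LinearDiscriminantAnalysis().fit(X[train], y[train])
diag = GaussianNB().fit(X[train], y[train])
print("full-covariance accuracy:", full.score(X[test], y[test]))
print("diagonal-assumption accuracy:", diag.score(X[test], y[test]))
```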

  19. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.
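    The quantitative core of NAA is the activation equation: the induced activity is A = N * sigma * phi * (1 - exp(-lambda * t_irr)) * exp(-lambda * t_d). A sketch with textbook-level constants (illustrative only; a real analysis would take its nuclear data from evaluated libraries):

```python
import numpy as np

N_A = 6.022e23  # Avogadro's number

def induced_activity(mass_g, molar_mass, abundance, sigma_barn, flux,
                     half_life_s, t_irr_s, t_decay_s):
    """Induced activity (Bq) from the standard activation equation."""
    lam = np.log(2) / half_life_s
    n_atoms = mass_g / molar_mass * N_A * abundance
    production = n_atoms * sigma_barn * 1e-24 * flux  # barn -> cm^2
    return production * (1 - np.exp(-lam * t_irr_s)) * np.exp(-lam * t_decay_s)

# Illustrative case: 1 mg of natural sodium (23Na, ~100% abundant,
# sigma ~ 0.53 b) in a flux of 1e13 n/cm^2/s for 1 h, counted 10 min
# later; the 24Na half-life is ~15 h.
a = induced_activity(1e-3, 23.0, 1.0, 0.53, 1e13,
                     15.0 * 3600, 3600, 600)
print(f"24Na activity: {a:.3e} Bq")
```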

  20. Elemental analysis techniques using proton microbeam

    International Nuclear Information System (INIS)

    Sakai, Takuro; Oikawa, Masakazu; Sato, Takahiro

    2005-01-01

    The proton microbeam is a powerful tool for two-dimensional elemental analysis. The analysis is based on the Particle Induced X-ray Emission (PIXE) and Particle Induced Gamma-ray Emission (PIGE) techniques. The paper outlines the principles and instruments, and describes a dental application performed at JAERI Takasaki. (author)

  1. The influence of "C-factor" and light activation technique on polymerization contraction forces of resin composite

    Directory of Open Access Journals (Sweden)

    Sérgio Kiyoshi Ishikiriama

    2012-12-01

    Full Text Available OBJECTIVES: This study evaluated the influence of the cavity configuration factor ("C-Factor") and the light activation technique on the polymerization contraction forces of a Bis-GMA-based composite resin (Charisma, Heraeus Kulzer). MATERIAL AND METHODS: Three different pairs of steel moving bases were connected to a universal testing machine (EMIC DL 500): groups A and B, 2x2 mm (CF=0.33); groups C and D, 3x2 mm (CF=0.66); groups E and F, 6x2 mm (CF=1.5). After adjustment of the height between each pair of bases so that the resin had a volume of 12 mm³ in all groups, the material was inserted and polymerized by two different methods: pulse delay (100 mW/cm² for 5 s, a 40 s interval, then 600 mW/cm² for 20 s) and continuous pulse (600 mW/cm² for 20 s). Each configuration was light cured with both techniques. The tensions generated during polymerization were recorded for 120 s. The values were expressed as curves (Force (N) x Time (s)) and the averages compared by statistical analysis (ANOVA and Tukey's test, p<0.05). RESULTS: For the 2x2 and 3x2 bases, with a reduced C-Factor, significant differences were found between the light curing methods. For the 6x2 base, with a high C-Factor, the light curing method did not influence the contraction forces of the composite resin. CONCLUSIONS: The pulse delay technique can produce less stress at the tooth/restoration interface of adhesive restorations only when a reduced C-Factor is present.

  2. Investigations on the comparator technique used in epithermal neutron activation analysis

    International Nuclear Information System (INIS)

    Bereznai, T.; Bodizs, D.; Keoemley, G.

    1977-01-01

    The possible extension of the comparator technique of reactor neutron activation analysis into the field of epithermal neutron activation has been investigated. Ruthenium was used as a multi-isotopic comparator. Experiments show that the conversion of the so-called reference k-factors, determined by irradiation with reactor neutrons, into k(epi)-factors usable for activation under a cadmium filter can be performed with fair accuracy. The sources and extent of errors, and their contribution to the final error of the analysis, are discussed. For equal irradiation and counting times the advantage of ENAA for several elements is obvious: the much lower background activity permitted the sample to be measured closer to the detector, under better geometry conditions, consequently permitting several elements to be determined quantitatively. The number of elements determined and the sensitivity of the method depend strongly on the experimental conditions, especially on the composition of the sample, the epithermal flux Φe, the irradiation time and the efficiency of the Ge(Li) detector. (T.G.)

  3. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems in nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be incorporated in a complementary way, the SSA combination becomes more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the software safety analysis techniques in use today, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes reflecting their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plan. With the proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive, due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness and complexity.

  4. Analysis of Defective Pipings in Nuclear Power Plants and Applications of Guided Ultrasonic Wave Techniques

    International Nuclear Information System (INIS)

    Koo, Dae Seo; Cheong, Yong Moo; Jung, Hyun Kyu; Park, Chi Seung; Park, Jae Suck; Choi, H. R.; Jung, S. S.

    2006-07-01

    In order to apply guided ultrasonic techniques to the pipes in nuclear power plants, cases of defective pipes in nuclear power plants were investigated. It was confirmed that geometric factors of the pipes, such as location, shape, and the available space, complicate the application of guided ultrasonic techniques to pipes in nuclear power plants. The condition of the pipes and supports, the signal analysis of weldments/defects, and the acquisition of accurate defect signals also make it difficult to apply guided ultrasonic techniques to pipes in nuclear power plants. Thus, a piping mock-up representing the pipes in nuclear power plants was designed and fabricated. Artificial flaws will be fabricated on the piping mock-up, and the guided ultrasonic wave signals from these artificial flaws will be analyzed. The guided ultrasonic techniques will then be applied to the inspection of pipes in nuclear power plants on the basis of the signal analysis of the artificial flaws in the piping mock-up.

  5. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, metallography, uranium series dating and Rutherford backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses about the analysis of different Mexican and Mesoamerican archaeological sites are also cited. (Author)

  6. Radio immune analysis of endocrine system state at exposure to different physical factors

    International Nuclear Information System (INIS)

    Rozdyil's'ka, O.M.; Rozdtil's'kij, S.Yi.; Bratchuk, O.M.; Yakimova, T.P.

    1999-01-01

    Radioimmune analysis allows the mechanisms of action of physical factors to be defined more exactly and may be used for the validation of new therapeutic techniques. Ultra-high frequency magnetotherapy, sinusoidal modulated current and direct current influence the state of the endocrine system differently, but within the physiological norm. The most prominent action of the physical factors is observed in the transcerebral mode.

  7. Sample application of sensitivity/uncertainty analysis techniques to a groundwater transport problem. National Low-Level Waste Management Program

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rood, A.S.; Harris, G.A.; Maheras, S.J.; Kotecki, M.

    1991-06-01

    The primary objective of this document is to provide sample applications of selected sensitivity and uncertainty analysis techniques within the context of the radiological performance assessment process. These applications were drawn from the companion document Guidelines for Sensitivity and Uncertainty Analyses of Low-Level Radioactive Waste Performance Assessment Computer Codes (S. Maheras and M. Kotecki, DOE/LLW-100, 1990). Three techniques are illustrated in this document: one-factor-at-a-time (OFAT) analysis, fractional factorial design, and Latin hypercube sampling. The report also illustrates the differences in sensitivity and uncertainty analysis at the early and later stages of the performance assessment process, and potential pitfalls that can be encountered when applying the techniques. The emphasis is on the application of the techniques as opposed to the actual results, since the results are hypothetical and are not based on site-specific conditions.
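    Of the three techniques, Latin hypercube sampling is the easiest to sketch: each parameter's range is split into as many equal-probability strata as there are samples, and each stratum is hit exactly once. The transport parameters and ranges below are invented placeholders, not values from the companion document.

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, seed=None):
    """Latin hypercube sample on the unit cube [0, 1]^n_vars."""
    rng = np.random.default_rng(seed)
    u = rng.random((n_samples, n_vars))  # random position within each stratum
    perms = np.column_stack([rng.permutation(n_samples)
                             for _ in range(n_vars)])
    return (perms + u) / n_samples

# Ten samples of two hypothetical groundwater-transport parameters,
# rescaled from the unit cube to physical ranges.
lhs = latin_hypercube(10, 2, seed=42)
kd = 1.0 + lhs[:, 0] * (100.0 - 1.0)   # distribution coefficient, mL/g
v = 0.1 + lhs[:, 1] * (10.0 - 0.1)     # groundwater velocity, m/yr
for a, b in zip(kd, v):
    print(f"Kd = {a:6.1f} mL/g, v = {b:5.2f} m/yr")
```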

  8. Towards automatic analysis of dynamic radionuclide studies using principal-components factor analysis

    International Nuclear Information System (INIS)

    Nigran, K.S.; Barber, D.C.

    1985-01-01

    A method is proposed for the automatic analysis of dynamic radionuclide studies using the mathematical technique of principal-components factor analysis. This method is considered as a possible alternative to the conventional manual regions-of-interest method that is widely used. The method emphasises the importance of introducing a priori information into the analysis about the physiology of at least one of the functional structures in a study. Information is added by using suitable mathematical models to describe the underlying physiological processes. A single physiological factor is extracted representing the particular dynamic structure of interest. Two spaces, 'study space, S' and 'theory space, T', are defined in forming the concept of the intersection of spaces. A one-dimensional intersection space is computed. An example from a dynamic 99mTc-DTPA kidney study is used to demonstrate the principle inherent in the proposed method. The method requires no correction for the blood background activity, which is necessary when processing by the manual method. Careful isolation of the kidney by means of a region of interest is not required. The method is therefore less prone to operator influence and can be automated. (author)
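    The decomposition step can be sketched with a singular value decomposition of the frame-by-pixel matrix; the leading components span the space of underlying time-activity curves, which the factor analysis then rotates, using a priori physiological models, into interpretable factors. The synthetic "study" below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented dynamic study: 60 frames x 500 pixels mixing two physiological
# time-activity curves (kidney uptake/washout and clearing blood background).
t = np.linspace(0, 20, 60)
kidney = t * np.exp(-t / 5.0)
blood = np.exp(-t / 3.0)
mixing = rng.random((2, 500))
frames = np.outer(kidney, mixing[0]) + np.outer(blood, mixing[1])
frames += 0.01 * rng.normal(size=frames.shape)

# Principal components of the centered data.
centered = frames - frames.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = (S ** 2) / (S ** 2).sum()
print("variance explained by first 3 components:", np.round(explained[:3], 3))
```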

  9. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to check a stress analysis technique based on 3D models, also making a comparison with the traditional technique which utilizes a model built directly in the stress analysis program. This comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database containing the idealized model obtained using ANSYS, working directly from documentation without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (produced at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Then, each of the three databases will be used according to arising necessities. The main objective is to develop a parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state of the art achieved in this field.

  10. Fundamentals of functional imaging II: emerging MR techniques and new methods of analysis.

    Science.gov (United States)

    Luna, A; Martín Noguerol, T; Mata, L Alcalá

    2018-05-01

    Current multiparameter MRI protocols integrate structural, physiological, and metabolic information about cancer. Emerging techniques such as arterial spin-labeling (ASL), blood oxygen level dependent (BOLD), MR elastography, chemical exchange saturation transfer (CEST), and hyperpolarization provide new information and will likely be integrated into daily clinical practice in the near future. Furthermore, there is great interest in the study of tumor heterogeneity as a prognostic factor and in relation to resistance to treatment, and this interest is leading to the application of new methods of analysis of multiparametric protocols. In parallel, new oncologic biomarkers that integrate the information from MR with clinical, laboratory, genetic, and histologic findings are being developed, thanks to the application of big data and artificial intelligence. This review analyzes different emerging MR techniques that are able to evaluate the physiological, metabolic, and mechanical characteristics of cancer, as well as the main clinical applications of these techniques. In addition, it summarizes the most novel methods of analysis of functional radiologic information in oncology. Copyright © 2018 SERAM. Publicado por Elsevier España, S.L.U. All rights reserved.

  11. Absorption correction factor in X-ray fluorescent quantitative analysis

    International Nuclear Information System (INIS)

    Pimjun, S.

    1994-01-01

    An experiment on the absorption correction factor in X-ray fluorescence quantitative analysis was carried out. Standard samples were prepared from mixtures of Fe2O3 and tapioca flour at various concentrations of Fe2O3 ranging from 5% to 25%. The unknown samples were kaolin containing 3.5% to 50% Fe2O3. The kaolin samples were diluted with tapioca flour in order to reduce the absorption of FeKα radiation and make them easy to prepare. Pressed samples of 0.150 g/cm2 and 2.76 cm in diameter were used in the experiment. The absorption correction factor is related to the total mass absorption coefficient (χ), which varies with sample composition. In a known sample, χ can be conveniently calculated by the formula. In an unknown sample, however, χ can be determined by the emission-transmission method. It was found that the relationship between the corrected FeKα intensity and the Fe2O3 content in these samples was linear. This result indicates that the correction factor can be used to improve the accuracy of the X-ray intensity. This correction factor is therefore essential in the quantitative analysis of the elements in any sample by the X-ray fluorescence technique.

  12. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, the isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have not yet reached a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis, by electron bombardment of atoms or molecules (gas ion source) and by thermal effect (thermo-ionic source), are compared, revealing some inconsistency between the quantity of sample necessary for the analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermo-ionization spectrometer, while the sample consumption is 10^5 to 10^6 times greater. This shows that almost the entire sample is not needed for the measurement; it is only required because of the introduction system of the gas spectrometer. The new analysis technique, referred to as ''microfluorination'', corrects this anomaly and exploits the advantages of the electron bombardment method of ionization.

  13. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  14. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics, and computer vision as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the importance of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  15. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  16. The Analysis of Factors Influencing Effectivenes of Property Taxes in Karanganyar Regency

    Directory of Open Access Journals (Sweden)

    Endang Brotojoyo

    2018-03-01

    Full Text Available The purpose of this study was to test empirically the effect of compensation, motivation and external factors on the performance of property tax collection officers in the Matesih district of Karanganyar. The analysis techniques used were validity and reliability testing, a linearity test, regression analysis, path analysis, t-tests, an F-test, the coefficient of determination and correlation analysis. The hypothesis test results show that compensation significantly influences the effectiveness of tax collection, and motivation significantly influences the effectiveness of tax collection, while external factors have no significant effect on the effectiveness of tax collection. Compensation has a significant effect on officer performance, and motivation has a significant effect on the performance of the property tax collection officers, while external factors have no significant effect on officer performance. The effectiveness of tax collection has a significant effect on officer performance. From the F-test results it can be concluded that the variables compensation, motivation, and external factors jointly affect the effectiveness of tax collection performance. The total R2 of 0.974 means that 97.4% of the performance of the property tax collection officers in the Matesih district of Karanganyar is explained by the compensation, motivation, external factor and tax collection effectiveness variables. The results of the path analysis showed that compensation and motivation are effective through the direct path, while external factors are effective through neither the direct nor the indirect path.

  17. Evaluation of landslide susceptibility mapping techniques using lidar-derived conditioning factors (Oregon case study

    Directory of Open Access Journals (Sweden)

    Rubini Mahalingam

    2016-11-01

    Full Text Available Landslides are a significant geohazard which frequently results in significant human, infrastructure, and economic losses. Landslide susceptibility mapping using GIS and remote sensing can help communities prepare for these damaging events. Current mapping efforts utilize a wide variety of techniques and consider multiple factors. Unfortunately, each study is relatively independent of others in the technique applied and the factors considered, resulting in inconsistencies. Further, input data quality often varies in terms of source, data collection, and generation, leading to uncertainty. This paper investigates whether lidar-derived data-sets (slope, slope roughness, terrain roughness, stream power index, and compound topographic index) can be used for predictive mapping without other landslide conditioning factors. This paper also assesses the differences in landslide susceptibility mapping using several widely used statistical techniques. Landslide susceptibility maps were produced from the aforementioned lidar-derived data-sets for a small study area in Oregon using six representative statistical techniques. Most notably, the results show that only a few factors were necessary to produce satisfactory maps with high predictive capability (area under the curve >0.7). The sole use of lidar digital elevation models and their derivatives can support landslide mapping using most statistical techniques without requiring additional detailed data-sets that are often difficult to obtain or of lower quality.

  18. Techniques for the thermal/hydraulic analysis of LMFBR check valves

    International Nuclear Information System (INIS)

    Cho, S.M.; Kane, R.S.

    1979-01-01

    A thermal/hydraulic analysis of the check valves in liquid sodium service in LMFBR plants is required to provide temperature data for the thermal stress analysis of the valves under specified transient conditions. Because of the complex three-dimensional flow pattern within the valve, the heat transfer analysis techniques for less complicated shapes could not be used. This paper discusses the thermal analysis techniques used to assure that the valve stress analysis is conservative. These techniques include a method for evaluating the recirculating flow patterns and for selecting appropriately conservative heat transfer correlations in the various regions of the valve.

  19. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis, including texture, phase identification and strain measurements. X-ray micro-diffraction is primarily used for phase analysis and residual strain measurements of areas between 10 µm and 100 µm. For areas this small, glass capillary optics are used to produce a usable collimated X-ray beam. These optics are designed to reflect X-rays below the critical angle, allowing a larger solid acceptance angle at the X-ray source and resulting in brighter, smaller X-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal, which destroys electrical continuity. Being able to determine the residual stress helps industry predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional X-ray diffractometer; however, utilizing a 30 µm glass capillary, these small areas are readily accessible for analysis. Kossel produces a wide-angle diffraction pattern from fluorescent X-rays generated in the sample by an e-beam in an SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and X-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has been

  20. Experimental device for obtaining calibration factor for the total count technique

    International Nuclear Information System (INIS)

    Gonçalves, Eduardo R.; Braz, Delson; Brandão, Luís Eduardo B.

    2017-01-01

    Nuclear technologies have been widely used in industrial plants to help solve process/design troubles or simply to obtain information about them. The Total Count technique for flow measurement has as its main advantages: being an absolute technique, because it is independent of additional device readings apart from the one directly used for recording the radioactive cloud, requiring only a single detector to provide the final result; independence from the internal volume of the transport duct, so it can be applied in the presence or absence of obstructions; no restriction as to the nature of the product or material being conveyed; and being a noninvasive technique that allows real-time diagnostics. To use the Total Count technique, knowledge of a geometric calibration factor is required. Called factor F, it is obtained in the laboratory using an experimental apparatus that faithfully reproduces the geometry of the detection system and of the pipeline being analyzed, using the same radiotracer; therefore, its value is constant for each specific measuring system under survey. The experimental apparatus for obtaining the factor F consists of a 2″ PVC pipe, which simulates a transmission line, in which 500 ml of oil were deposited; using a pipette specific for viscous fluids, aliquots of (50.00 ± 0.01) µl of radiotracer (198Au, photopeak energy of 411.8 keV) were added sequentially, and the data were analyzed by three distinct detection systems composed of 1″ x 1″ NaI scintillation detectors and a data acquisition system. (author)

  1. Experimental device for obtaining calibration factor for the total count technique

    Energy Technology Data Exchange (ETDEWEB)

    Gonçalves, Eduardo R.; Braz, Delson [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear; Brandão, Luís Eduardo B. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Divisao de Reatores

    2017-07-01

    Nuclear technologies have been widely used in industrial plants to help solve process/design troubles or simply to obtain information about them. The Total Count technique for flow measurement has as its main advantages: being an absolute technique, because it is independent of additional device readings apart from the one directly used for recording the radioactive cloud, requiring only a single detector to provide the final result; independence from the internal volume of the transport duct, so it can be applied in the presence or absence of obstructions; no restriction as to the nature of the product or material being conveyed; and being a noninvasive technique that allows real-time diagnostics. To use the Total Count technique, knowledge of a geometric calibration factor is required. Called factor F, it is obtained in the laboratory using an experimental apparatus that faithfully reproduces the geometry of the detection system and of the pipeline being analyzed, using the same radiotracer; therefore, its value is constant for each specific measuring system under survey. The experimental apparatus for obtaining the factor F consists of a 2″ PVC pipe, which simulates a transmission line, in which 500 ml of oil were deposited; using a pipette specific for viscous fluids, aliquots of (50.00 ± 0.01) µl of radiotracer (198Au, photopeak energy of 411.8 keV) were added sequentially, and the data were analyzed by three distinct detection systems composed of 1″ x 1″ NaI scintillation detectors and a data acquisition system. (author)
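    A minimal numeric sketch of how the factor F is used, assuming the conventional total-count relation Q = A * F / N (the paper's own notation is not reproduced, and every number below is invented):

```python
# Calibration step on the mock-up: a known activity concentration in the
# pipe geometry produces a measurable count rate, which defines F.
count_rate = 450.0       # counts/s on the mock-up detector
concentration = 12.5     # Bq/mL in the oil-filled pipe section
F = count_rate / concentration   # (counts/s) per (Bq/mL)

# Field step: inject activity A and integrate the counts N while the
# radioactive cloud passes the detector.
A = 2.0e6                # injected activity, Bq
N = 1.8e5                # total counts under the passage curve
Q = A * F / N            # volumetric flow rate, mL/s
print(f"F = {F:.1f} (counts/s)/(Bq/mL); Q = {Q:.1f} mL/s")
```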

  2. Multivariate analysis of structure and contribution per shares made by potential risk factors at malignant neoplasms in trachea, bronchial tubes and lung

    Directory of Open Access Journals (Sweden)

    G.T. Aydinov

    2017-03-01

    Full Text Available The article gives the results of multivariate analysis of structure and contribution per shares made by potential risk factors at malignant neoplasms in trachea, bronchial tubes and lung. The authors used specialized databases comprising personified records on oncologic diseases in Taganrog, Rostov region, over 1986-2015 (30,684 registered cases of malignant neoplasms, including 3,480 cases of trachea cancer, bronchial tubes cancer, and lung cancer. When carrying out analytical research we applied both multivariate statistical techniques (factor analysis and hierarchical cluster correlation analysis and conventional techniques of epidemiologic analysis including etiologic fraction calculation (EF, as well as an original technique of assessing actual (epidemiologic risk. Average long-term morbidity with trachea, bronchial tubes and lung cancer over 2011-2015 amounts to 46.64 o / oooo . Over the last 15 years a stable decreasing trend has formed, annual average growth being – 1.22 %. This localization holds the 3rd rank place in oncologic morbidity structure, its specific weight being 10.02 %. We determined etiological fraction (EF for smoking as a priority risk factor causing trachea, bronchial tubes and lung cancer; this fraction amounts to 76.19 % for people aged 40 and older, and to 81.99 % for those aged 60 and older. Application of multivariate statistical techniques (factor analysis and cluster correlation analysis in this research enabled us to make factor structure more simple; namely, to highlight, interpret, give a quantitative estimate of self-descriptiveness and rank four group (latent potential risk factors causing lung cancer.

  3. Structural Image Analysis of the Brain in Neuropsychology Using Magnetic Resonance Imaging (MRI) Techniques.

    Science.gov (United States)

    Bigler, Erin D

    2015-09-01

    Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative as well as quantitative methods to relate brain structure with neuropsychological outcome and are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness and shape. The challenge for neuropsychology is which metric to use, for which disorder and the timing of when image analysis methods are applied to assess brain structure and pathology. A basic overview is provided as to the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.

  4. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies

    Directory of Open Access Journals (Sweden)

    Hero Alfred

    2010-11-01

    Full Text Available Abstract Background Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Results Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Conclusions Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.

  5. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies.

    Science.gov (United States)

    Chen, Bo; Chen, Minhua; Paisley, John; Zaas, Aimee; Woods, Christopher; Ginsburg, Geoffrey S; Hero, Alfred; Lucas, Joseph; Dunson, David; Carin, Lawrence

    2010-11-09

    Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.
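    The nonparametric Bayesian samplers themselves are too long to sketch here, but the underlying model-selection question (how many factors does the likelihood support?) can be illustrated with a simpler, non-Bayesian stand-in: maximum-likelihood factor analysis scored by cross-validated log-likelihood over candidate factor counts. The synthetic "expression" data below are invented.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic data: 5 true factors, 50 samples, 200 "genes", sparse loadings.
n, p, k_true = 50, 200, 5
scores = rng.normal(size=(n, k_true))
loadings = rng.normal(size=(k_true, p)) * (rng.random((k_true, p)) < 0.1)
X = scores @ loadings + 0.5 * rng.normal(size=(n, p))

# Pick the factor count that maximizes the cross-validated log-likelihood.
candidates = range(1, 11)
ll = [cross_val_score(FactorAnalysis(n_components=k), X, cv=5).mean()
      for k in candidates]
print("inferred number of factors:", list(candidates)[int(np.argmax(ll))])
```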

  6. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    International Nuclear Information System (INIS)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves

    2009-01-01

    The THERP methodology is classified as a first-generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step in the development of HRA techniques in the industry. Because it is a first-generation technique, THERP's quantification tables of human errors are based on a taxonomy that does not take the mechanisms of human error into account. Concerning the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that happen at the perceptual-motor level (stimulus-response). At the rule-based level, this technique can work better using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators. Nevertheless, this is done without processing any error mechanisms. Another deficiency is the fact that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probability) standard scenarios, together with the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates the human error probabilities when applying HRA techniques to nuclear power plants. The critical analysis is accomplished through a qualitative comparison between THERP, an HRA technique of the first generation, and CREAM as well as ATHEANA, which are HRA techniques of the second generation. (author)

  7. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    Energy Technology Data Exchange (ETDEWEB)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)], e-mail: bayout@cnen.gov.br, e-mail: rfonseca@cnen.gov.br

    2009-07-01

    The THERP methodology is classified as a first-generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step in the development of HRA techniques in the industry. Because it is a first-generation technique, THERP's quantification tables of human errors are based on a taxonomy that does not take the mechanisms of human error into account. Concerning the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that happen at the perceptual-motor level (stimulus-response). At the rule-based level, this technique can work better using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators. Nevertheless, this is done without processing any error mechanisms. Another deficiency is the fact that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probability) standard scenarios, together with the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates the human error probabilities when applying HRA techniques to nuclear power plants. The critical analysis is accomplished through a qualitative comparison between THERP, an HRA technique of the first generation, and CREAM as well as ATHEANA, which are HRA techniques of the second generation. (author)

  8. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are dominant here. Metabolomic studies also require the application of proper analytical techniques for the determination of endogenous metabolites present in the biological matrix at trace concentration levels. Due to the reproducibility of data, precision, the relatively low cost of the appropriate analysis, the simplicity of the determination, and the possibility of directly combining these techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Kinematics analysis technique fouettes 720° classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has proved that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté, and its execution depends on a high level of technique from the performer during the rotation. Performing this element requires not only good physical condition of the dancer, but also mastery of correct technique. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° performed by the best Chinese dancers. For the analysis, the method of stereoscopic imaging and theoretical analysis were used.

  10. Nuclear techniques for bulk and surface analysis of materials

    International Nuclear Information System (INIS)

    D'Agostino, M.D.; Kamykowski, E.A.; Kuehne, F.J.; Padawer, G.M.; Schneid, E.J.; Schulte, R.L.; Stauber, M.C.; Swanson, F.R.

    1978-01-01

    A review is presented summarizing several nondestructive bulk and surface analysis nuclear techniques developed in the Grumman Research Laboratories. Bulk analysis techniques include 14-MeV-neutron activation analysis and accelerator-based neutron radiography. The surface analysis techniques include resonant and non-resonant nuclear microprobes for the depth profile analysis of light elements (H, He, Li, Be, C, N, O and F) in the surface of materials. Emphasis is placed on the description and discussion of the unique nuclear microprobe analytical capabilities of immediate importance to a number of current problems facing materials specialists. The resolution and contrast of neutron radiography were illustrated with an operating heat pipe system. The figure shows that the neutron radiograph has a resolution of better than 0.04 cm, with sufficient contrast to indicate Freon 21 on the inner capillaries of the heat pipe and pooling of the liquid at the bottom. (T.G.)

  11. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs
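
    To make the record concrete, the following minimal object-oriented sketch enumerates event tree sequences and their probabilities; it illustrates the general technique only, not SQUIMP's actual design, and all class names and numbers are invented.

```python
from dataclasses import dataclass

# Object-oriented sketch of event tree quantification: each top event
# branches every accident sequence into success/failure paths, and the
# sequence probability is the product along the path.

@dataclass
class TopEvent:
    name: str
    failure_prob: float      # probability that the safety function fails

@dataclass
class Sequence:
    outcomes: dict           # {event name: True if the function failed}
    probability: float

def enumerate_sequences(initiator_freq, top_events):
    """Expand the event tree into all branch sequences."""
    sequences = [Sequence({}, initiator_freq)]
    for ev in top_events:
        expanded = []
        for seq in sequences:
            for failed, p in ((False, 1 - ev.failure_prob),
                              (True, ev.failure_prob)):
                expanded.append(Sequence({**seq.outcomes, ev.name: failed},
                                         seq.probability * p))
        sequences = expanded
    return sequences

tree = [TopEvent("reactor trip", 1e-4), TopEvent("aux feedwater", 1e-3)]
for s in enumerate_sequences(initiator_freq=0.1, top_events=tree):
    print(s.outcomes, f"{s.probability:.3e}")
```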

  12. Exploratory Bi-factor Analysis: The Oblique Case

    OpenAIRE

    Jennrich, Robert L.; Bentler, Peter M.

    2011-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford (1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler (2011) introduced an exploratory form of bi-factor analysis that does not require one to provide an explicit bi-factor structure a priori. They use exploratory factor analysis and a bi-factor rotation criterion designed to produce a rotated loading matrix...

  13. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding using fiber-optic capture of light and a low-cost CCD-based spectrometer show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
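
    The temperature-estimation step the authors describe can be illustrated with a standard Boltzmann plot: for several emission lines of one species, ln(Iλ/gA) is linear in the upper-level energy with slope -1/(kTe). The sketch below uses invented line data and plain least squares; it is not the authors' LPO and cubic-spline pipeline.

```python
import numpy as np

# Boltzmann-plot estimate of the plasma electronic temperature from the
# relative intensities of several lines of one atomic species:
#   ln(I * lambda / (g * A)) = -E_k / (k * T_e) + const.
# All line data below are invented; a real analysis would take E_k, g
# and A from a spectroscopic database.

K_B_EV = 8.617333e-5                        # Boltzmann constant, eV/K

lam = np.array([480e-9, 510e-9, 540e-9])    # wavelengths (m), hypothetical
intensity = np.array([1.00, 0.42, 0.15])    # measured peak intensities
E_k = np.array([2.5, 3.1, 3.8])             # upper-level energies (eV)
g = np.array([5, 3, 7])                     # statistical weights
A = np.array([4e7, 6e7, 9e7])               # transition probabilities (1/s)

y = np.log(intensity * lam / (g * A))
slope, intercept = np.polyfit(E_k, y, 1)    # slope = -1 / (k * T_e)
T_e = -1.0 / (K_B_EV * slope)
print(f"Estimated electronic temperature: {T_e:.0f} K")
```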

  14. MMOSA – A new approach of the human and organizational factor analysis in PSA

    International Nuclear Information System (INIS)

    Farcasiu, M.; Prisecaru, I.

    2014-01-01

    The results of many Probabilistic Safety Assessment (PSA) studies show a very significant contribution of human errors to nuclear installation failures. This paper analyzes both the importance of human performance in PSA studies and the elements that influence it. Starting from the Man–Machine–Organization System (MMOS) concept, a new approach (MMOSA) was developed to allow an explicit incorporation of the human and organizational factor in PSA studies. This method uses established techniques from Human Reliability Analysis (HRA) methods (THERP, SPAR-H) together with new techniques to analyze human performance. The main novelty included in MMOSA is the identification of the machine–organization interfaces (maintenance, modification and aging management plan, and state of the man–machine interface) and the evaluation of human performance based on them. A detailed Human Performance Analysis (HPA) using the MMOSA methodology can identify serious deficiencies of human performance, which can usually be corrected through the improvement of the related MMOS interfaces. - Highlights: • MMOSA allows the incorporation of the human and organizational factor in PSA. • The method uses established techniques and new techniques to analyze human performance. • The main novelty is the identification of the machine–organization interfaces. • The MMOSA methodology identifies serious deficiencies which can be corrected.

  15. Driven Factors Analysis of China’s Irrigation Water Use Efficiency by Stepwise Regression and Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Renfu Jia

    2016-01-01

    This paper introduces an integrated approach to identifying the major factors influencing the efficiency of irrigation water use in China. It combines multiple stepwise regression (MSR) and principal component analysis (PCA) to obtain more realistic results. In real-world case studies, a classical linear regression model often involves too many explanatory variables, and linear correlation among the variables cannot be eliminated. Linearly correlated variables invalidate the results of a factor analysis. To overcome this issue and reduce the number of variables, the PCA technique is combined with MSR. On this basis, the irrigation water use status in China was analysed to identify the five major factors that have significant impacts on irrigation water use efficiency. To illustrate the performance of the proposed approach, calculations based on real data were conducted and the results are presented in this paper.
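
    A minimal sketch of the PCA-then-regression idea on synthetic data follows; it is not the paper's exact MSR procedure, only an illustration of how orthogonal components remove the collinearity that would invalidate the factor analysis.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# PCA-then-regression sketch on synthetic data: six collinear
# explanatory variables are replaced by three orthogonal components
# before the regression is fitted.

rng = np.random.default_rng(0)
n = 200
base = rng.normal(size=(n, 3))
X = np.hstack([base,
               base @ rng.normal(size=(3, 3)) + 0.1 * rng.normal(size=(n, 3))])
y = base @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.2, size=n)

pca = PCA(n_components=3).fit(X)     # orthogonal components, no collinearity
Z = pca.transform(X)
model = LinearRegression().fit(Z, y)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
print("R^2 on components:", round(model.score(Z, y), 3))
```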

  16. Human cost as a factor used in the cost-benefit analysis

    Directory of Open Access Journals (Sweden)

    Karol RÁSTOČNÝ

    2008-01-01

    Cost-benefit analysis (CBA) is a prescriptive technique performed for the purpose of informing policy makers about what they ought to do. The paper discusses the problem of assigning a monetary value to human life (lifesaving) or quality of life as an important factor used in CBA. The ideas presented come from the SELCAT project, carried out within the 6th Framework Programme.

  17. Coke drums inspection and evaluation using stress and strain analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Haraguchi, Marcio Issamu [Tricom Tecnologia e Servicos de Manutencao Industrial Ltda., Piquete, SP (Brazil); Samman, Mahmod [Houston Engineering Solutions, Houston, TX (United States); Tinoco, Ediberto Bastos; Marangone, Fabio de Castro; Silva, Hezio Rosa da; Barcelos, Gustavo de Carvalho [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Coke drums deform due to a complex combination of mechanical and thermal cyclic stresses. Bulges have a progressive behavior and represent the main maintenance problem related to these drums. Bulge failures typically result in through-wall cracks, leaks, and sometimes fires. Such failures generally do not represent a great risk to personnel. However, repairs needed to maintain the reliability of these vessels might require extensive interruption of operation, which in turn considerably impacts the profitability of the unit. Therefore the condition, progression and severity of these bulges should be closely monitored. Coke drums can be inspected during turnaround with 3D Laser Scanning and Remote Visual Inspection (RVI) tools, resulting in a detailed dimensional and visual evaluation of the internal surface. A typical project has several goals: inspecting the equipment to generate maintenance or inspection recommendations, and comparing results with previous inspections and baseline data. Until recently, coke drum structural analysis has traditionally been performed by analyzing Stress Concentration Factors (SCF) through Finite Element Analysis methods; however, this technique has some serious technical and practical limitations. To avoid these shortcomings, the new strain analysis technique PSI (Plastic Strain Index) was developed. This method, which is based on the API 579/ASME FFS standard failure limit and represents the state of the art in coke drum bulging severity assessment, shows an excellent correlation with failure history. (author)

  18. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24 to 26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  19. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24 to 26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  20. Limitations of transient power loads on DEMO and analysis of mitigation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Maviglia, F., E-mail: francesco.maviglia@euro-fusion.org [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Consorzio CREATE, University Napoli Federico II – DIETI, 80125 Napoli (Italy); Federici, G. [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Strohmayer, G. [Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Wenninger, R. [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Bachmann, C. [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Albanese, R. [Consorzio CREATE, University Napoli Federico II – DIETI, 80125 Napoli (Italy); Ambrosino, R. [Consorzio CREATE University Napoli Parthenope, Naples (Italy); Li, M. [Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Loschiavo, V.P. [Consorzio CREATE, University Napoli Federico II – DIETI, 80125 Napoli (Italy); You, J.H. [Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Zani, L. [CEA, IRFM, F-13108 St Paul-Lez-Durance (France)

    2016-11-01

    Highlights: • A parametric thermo-hydraulic analysis of the candidate DEMO divertor is presented. • The operational space assessment is presented under static and transient heat loads. • Strike-point sweeping is analyzed as a divertor power exhaust mitigation technique. • Results are presented on the installed power required for sweeping, AC losses and thermal fatigue. - Abstract: The present European standard DEMO divertor target technology is based on a water-cooled tungsten mono-block with a copper alloy heat sink. This paper presents the assessment of the operational space of this technology under static and transient heat loads. A transient thermo-hydraulic analysis was performed using the code RACLETTE, which allowed a broad parametric scan of the target geometry and coolant conditions. The limiting factors considered were the coolant critical heat flux (CHF) and the temperature limits of the materials. The second part of the work is devoted to the study of plasma strike-point sweeping as a mitigation technique for the divertor power exhaust. The RACLETTE code was used to evaluate the impact of a large range of sweeping frequencies and amplitudes. A reduced subset of cases, which complied with the constraints, was benchmarked with a 3D FEM model. A reduction of the heat flux to the coolant, up to a factor of ∼4, and lower material temperatures were found for an incident heat flux in the range (15–30) MW/m². Finally, preliminary assessments were performed on the installed power required for the sweeping, the AC losses in the superconductors, and thermal fatigue. No evident show-stoppers were found.

  1. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    Science.gov (United States)

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important parameter in the identification of human remains in forensic examinations. The present study aimed to compare the reliability and accuracy of stature estimation, and to demonstrate the variability between estimated stature and actual stature, using the multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements (hand length, hand breadth, foot length and foot breadth), taken on the left side in each subject, were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for the estimation of stature from hand and foot dimensions. The derived multiplication factors and regression formulae were applied to the hand and foot measurements in the study sample. The stature estimated from the multiplication factors and from the regression analysis was compared with the actual stature to find the error in estimated stature. The results indicate that the range of error in stature estimation by the regression analysis method is smaller than that of the multiplication factor method, confirming that regression analysis is better than multiplication factor analysis for stature estimation. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
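
    The comparison can be reproduced in miniature on synthetic data: compute a multiplication factor as the sample mean of stature divided by foot length, fit a linear regression on the same sample, and compare estimation errors. The numbers below are invented, not the study's North Indian measurements.

```python
import numpy as np

# Multiplication factor versus linear regression on synthetic data;
# the relation between foot length and stature below is invented.

rng = np.random.default_rng(1)
foot_len = rng.normal(24.5, 1.2, 300)                  # cm
stature = 6.0 * foot_len + 20 + rng.normal(0, 4, 300)  # cm, synthetic

mf = np.mean(stature / foot_len)                     # multiplication factor
slope, intercept = np.polyfit(foot_len, stature, 1)  # regression model

err_mf = np.abs(mf * foot_len - stature)
err_reg = np.abs(slope * foot_len + intercept - stature)
print(f"mean abs error, multiplication factor: {err_mf.mean():.2f} cm")
print(f"mean abs error, regression:            {err_reg.mean():.2f} cm")
```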

  2. Factors affecting construction performance: exploratory factor analysis

    Science.gov (United States)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors, with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data gave rise to 10 factors, with 57 items, affecting construction performance. The findings further reveal the items constituting ten key performance factors (KPIs), namely: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multidimensional performance evaluation framework for an effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technological aspects. It is important to build a multidimensional performance evaluation framework that includes all key factors affecting the construction performance of a company, so that management can effectively plan and implement a performance development plan that matches the mission and vision of the company.
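
    The EFA step itself is compact; in the sketch below a small public dataset stands in for the 57-item survey responses, and the varimax-rotated loadings are what one would inspect to name the factors.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# EFA sketch: standardize item responses, fit a factor model with a
# varimax rotation, and inspect the loadings to name the factors.
# The iris measurements stand in for the survey items.

X = StandardScaler().fit_transform(load_iris().data)
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
print("loadings (items x factors):")
print(np.round(fa.components_.T, 2))
```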

  3. Recurrent tricuspid insufficiency: is the surgical repair technique a risk factor?

    Science.gov (United States)

    Kara, Ibrahim; Koksal, Cengiz; Cakalagaoglu, Canturk; Sahin, Muslum; Yanartas, Mehmet; Ay, Yasin; Demir, Serdar

    2013-01-01

    This study compares the medium-term results of the De Vega, modified De Vega, and ring annuloplasty techniques for the correction of tricuspid insufficiency and investigates the risk factors for recurrent grades 3 and 4 tricuspid insufficiency after repair. In our clinic, 93 patients with functional tricuspid insufficiency underwent surgical tricuspid repair from May 2007 through October 2010. The study was retrospective, and all the data pertaining to the patients were retrieved from hospital records. Functional capacity, recurrent tricuspid insufficiency, and risk factors aggravating the insufficiency were analyzed for each patient. In the medium term (25.4 ± 10.3 mo), the rates of grades 3 and 4 tricuspid insufficiency in the De Vega, modified De Vega, and ring annuloplasty groups were 31%, 23.1%, and 6.1%, respectively. Logistic regression analysis revealed that chronic obstructive pulmonary disease and left ventricular dysfunction (reduced ejection fraction) were risk factors for recurrent tricuspid insufficiency. Medium-term survival was 90.6% for the De Vega group, 96.3% for the modified De Vega group, and 97.1% for the ring annuloplasty group. Ring annuloplasty provided the best relief from recurrent tricuspid insufficiency when compared with De Vega annuloplasty. Modified De Vega annuloplasty might be a suitable alternative to ring annuloplasty when rings are not available.

  4. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Developing a national brand is one of the most important issues in brand development. In this study, we present a factor analysis to detect the most important factors in building a national brand. The proposed study uses factor analysis to extract the most influential factors, with the sample drawn from two major automakers in Iran, Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is calculated as 0.84, which is well above the minimum desirable limit of 0.70. The implementation of factor analysis provides six factors including “cultural image of customers”, “exciting characteristics”, “competitive pricing strategies”, “perception image” and “previous perceptions”.

  5. Development of Instruments Through Confirmatory Factor Analysis (CFA) in Appropriate Intensity Assessment

    Directory of Open Access Journals (Sweden)

    Ari Saptono

    2017-06-01

    The research aims to develop a valid and reliable instrument for measuring entrepreneurship intention in vocational secondary school students. Multi-stage random sampling was used to determine the sample (300 respondents). The research method was research and development with confirmatory factor analysis (CFA). The results of second-order CFA with the robust maximum likelihood method show a valid and reliable instrument, with all factor loadings above 0.5 and all t-values above 1.96. Reliability testing shows a combined construct reliability (CR) of 0.97 and a variance extracted (VE) of 0.52, which exceed the acceptance limits CR ≥ 0.70 and VE ≥ 0.50. In conclusion, the instrument measuring entrepreneurship intention, with three dimensions and 31 items, met the standards of validity and reliability required by the instrument development process.
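
    The two reliability criteria quoted in the abstract can be computed directly from standardized CFA loadings; a sketch follows, with invented loadings.

```python
import numpy as np

# Composite reliability (CR) and variance extracted (VE) from
# standardized CFA loadings:
#   CR = (sum lambda)^2 / ((sum lambda)^2 + sum error variances)
#   VE = mean of squared loadings
# The loadings below are invented for illustration.

loadings = np.array([0.72, 0.68, 0.81, 0.59, 0.75])
errors = 1 - loadings**2                   # error variance per indicator

cr = loadings.sum()**2 / (loadings.sum()**2 + errors.sum())
ve = np.mean(loadings**2)
print(f"CR = {cr:.2f} (accept if >= 0.70), VE = {ve:.2f} (accept if >= 0.50)")
```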

  6. Review and classification of variability analysis techniques with clinical applications

    Science.gov (United States)

    2011-01-01

    Analysis of patterns of variation of time series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made the interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time series. Our aims are to summarize a broad literature, to promote a shared vocabulary that would improve the exchange of ideas, and to ease the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  7. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made the interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time series. Our aims are to summarize a broad literature, to promote a shared vocabulary that would improve the exchange of ideas, and to ease the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis.
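
    As a taste of the domains in the proposed classification, the sketch below computes one statistical measure (standard deviation), one time-domain heart-rate-variability measure (RMSSD) and one informational measure (a naive sample entropy) on a synthetic series; parameter choices are illustrative.

```python
import numpy as np

# Three variability measures from different domains of the proposed
# classification: statistical (SD), time-domain HRV (RMSSD), and
# informational (a naive sample entropy). The series is synthetic.

def rmssd(x):
    """Root mean square of successive differences."""
    return np.sqrt(np.mean(np.diff(x) ** 2))

def sample_entropy(x, m=2, r=0.2):
    """Naive sample entropy: -ln(A/B) with tolerance r * std(x)."""
    x = np.asarray(x, float)
    tol = r * x.std()
    def matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            c += np.sum(d <= tol) - 1        # exclude the self-match
        return c
    return -np.log(matches(m + 1) / matches(m))

rng = np.random.default_rng(2)
rr = 800 + 50 * np.sin(np.linspace(0, 12, 300)) + rng.normal(0, 10, 300)
print(f"SD = {rr.std():.1f} ms, RMSSD = {rmssd(rr):.1f} ms, "
      f"SampEn = {sample_entropy(rr):.2f}")
```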

  8. Anatomy-based transmission factors for technique optimization in portable chest x-ray

    Science.gov (United States)

    Liptak, Christopher L.; Tovey, Deborah; Segars, William P.; Dong, Frank D.; Li, Xiang

    2015-03-01

    Portable x-ray examinations often account for a large percentage of all radiographic examinations. Currently, portable examinations do not employ automatic exposure control (AEC). To aid in the design of a size-specific technique chart, acrylic slabs of various thicknesses are often used to estimate x-ray transmission for patients of various body thicknesses. This approach, while simple, does not account for patient anatomy, tissue heterogeneity, and the attenuation properties of the human body. To better account for these factors, in this work, we determined x-ray transmission factors using computational patient models that are anatomically realistic. A Monte Carlo program was developed to model a portable x-ray system. Detailed modeling was done of the x-ray spectrum, detector positioning, collimation, and source-to-detector distance. Simulations were performed using 18 computational patient models from the extended cardiac-torso (XCAT) family (9 males, 9 females; age range: 2-58 years; weight range: 12-117 kg). The ratio of air kerma at the detector with and without a patient model was calculated as the transmission factor. Our study showed that the transmission factor decreased exponentially with increasing patient thickness. For the range of patient thicknesses examined (12-28 cm), the transmission factor ranged from approximately 21% to 1.9% when the air kerma used in the calculation represented an average over the entire imaging field of view. The transmission factor ranged from approximately 21% to 3.6% when the air kerma used in the calculation represented the average signals from two discrete AEC cells behind the lung fields. These exponential relationships may be used to optimize imaging techniques for patients of various body thicknesses to aid in the design of clinical technique charts.
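
    The reported exponential fall-off suggests a simple use in technique-chart design: fit T = a*exp(-mu*t) to transmission-versus-thickness points and scale the tube output accordingly. The data points and the mAs-scaling rule below are invented for illustration, not the paper's XCAT results.

```python
import numpy as np

# Fit T = a * exp(-mu * t) to transmission-versus-thickness points in
# the reported range, then scale the tube output (mAs) to hold the
# detector air kerma constant. All numbers are invented.

thickness = np.array([12, 16, 20, 24, 28])            # cm
transmission = np.array([0.21, 0.11, 0.058, 0.031, 0.019])

slope, ln_a = np.polyfit(thickness, np.log(transmission), 1)
mu = -slope
print(f"mu = {mu:.3f} 1/cm, a = {np.exp(ln_a):.2f}")

def relative_mas(t_cm, t_ref=20):
    """mAs multiplier relative to a 20 cm reference patient."""
    return np.exp(mu * (t_cm - t_ref))

for t in (12, 20, 28):
    print(f"{t} cm patient -> mAs factor {relative_mas(t):.2f}")
```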

  9. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method, with the velocity components defined over an Eulerian mesh. A system of massless interface markers is defined, where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Several applications of nuclear engineering interest are reported together with available results. The present technique is capable of predicting the interface profile near the wall, which is important in reactor subchannel analysis
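
    The massless-marker idea reduces to advecting marker positions with the locally sampled grid velocity; a minimal sketch follows (nearest-node sampling and a uniform flow, both invented for illustration).

```python
import numpy as np

def advect_markers(markers, u, v, dx, dt):
    """Move massless interface markers with the local grid velocity
    (nearest-node sampling for brevity; real codes interpolate)."""
    moved = []
    for x, y in markers:
        i, j = int(round(x / dx)), int(round(y / dx))
        moved.append((x + dt * u[i, j], y + dt * v[i, j]))
    return moved

# Uniform upward flow on a 10 x 10 grid: the marked interface rises.
u = np.zeros((10, 10))
v = np.full((10, 10), 0.05)
markers = [(0.3, 0.1), (0.5, 0.1), (0.7, 0.1)]
for _ in range(3):
    markers = advect_markers(markers, u, v, dx=0.1, dt=0.1)
print(markers)
```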

  10. Recurrent Tricuspid Insufficiency: Is the Surgical Repair Technique a Risk Factor?

    OpenAIRE

    Kara, Ibrahim; Koksal, Cengiz; Cakalagaoglu, Canturk; Sahin, Muslum; Yanartas, Mehmet; Ay, Yasin; Demir, Serdar

    2013-01-01

    This study compares the medium-term results of De Vega, modified De Vega, and ring annuloplasty techniques for the correction of tricuspid insufficiency and investigates the risk factors for recurrent grades 3 and 4 tricuspid insufficiency after repair.

  11. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  12. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared side by side using data from two major customer service research projects. A central concern is what conclusions, if any, might differ due solely to the analysis...
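
    The two protocols are easy to state side by side: GA scores each attribute by performance minus importance, while IP places each attribute in a quadrant relative to the sample means. The ratings below are invented for illustration.

```python
import numpy as np

# Gap scores (performance minus importance) and importance-performance
# quadrants computed side by side. All ratings are invented.

attributes = ["cleanliness", "signage", "staff", "parking"]
importance = np.array([4.6, 3.8, 4.4, 3.2])    # 1-5 scale
performance = np.array([4.1, 4.0, 3.6, 3.4])

gap = performance - importance                 # GA protocol
imp_cut, perf_cut = importance.mean(), performance.mean()

for a, i, p, g in zip(attributes, importance, performance, gap):
    quadrant = ("concentrate here" if i >= imp_cut and p < perf_cut else
                "keep up the good work" if i >= imp_cut else
                "low priority" if p < perf_cut else
                "possible overkill")
    print(f"{a:12s} gap = {g:+.1f}  IP quadrant: {quadrant}")
```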

  13. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  14. Poster — Thur Eve — 03: Application of the non-negative matrix factorization technique to [¹¹C]-DTBZ dynamic PET data for the early detection of Parkinson's disease

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong-Chang [CancerCare Manitoba, Winnipeg, MB (Canada); Jans, Hans; McEwan, Sandy; Riauka, Terence [Department of Oncology, University of Alberta, Edmonton, AB (Canada); Cross Cancer Institute, Alberta Health Services, Edmonton, AB (Canada); Martin, Wayne; Wieler, Marguerite [Division of Neurology, University of Alberta, Edmonton, AB (Canada)

    2014-08-15

    In this work, a class of non-negative matrix factorization (NMF) techniques known as alternating non-negative least squares, combined with the projected gradient method, is used to analyze twenty-five [¹¹C]-DTBZ dynamic PET/CT brain datasets. For each subject, a two-factor model is assumed, and two factors representing the striatum (factor 1) and the non-striatum (factor 2) tissues are extracted using the proposed NMF technique and the commercially available factor analysis software “Pixies”. The extracted factor 1 and factor 2 curves represent the binding site of the radiotracer and describe the uptake and clearance of the radiotracer by soft tissues in the brain, respectively. The proposed NMF technique uses prior information about the dynamic data to obtain sample time-activity curves representing the striatum and the non-striatum tissues. These curves are then used for “warm” starting the optimization. Factor solutions from the two methods are compared graphically and quantitatively. In healthy subjects, radiotracer uptake by factors 1 and 2 is approximately 35–40% and 60–65%, respectively. The solutions are also used to develop a factor-based metric for the detection of early, untreated Parkinson's disease. The metric stratifies healthy subjects from suspected Parkinson's patients (based on the graphical method). The analysis shows that both techniques produce comparable results with similar computational time. The “semi-automatic” approach used by the NMF technique allows clinicians to manually set a starting condition for “warm” starting the optimization, in order to facilitate control and efficient interaction with the data.
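
    The decomposition itself can be sketched with an off-the-shelf NMF solver in place of the authors' alternating non-negative least squares with projected gradient; the synthetic data and the warm-start curves below are invented stand-ins for the PET time-activity data.

```python
import numpy as np
from sklearn.decomposition import NMF

# Two-factor NMF of a (voxels x time frames) matrix. The synthetic data
# mix two "tissue" time-activity curves; the warm start mirrors the
# abstract's use of prior sample curves.

rng = np.random.default_rng(3)
t = np.linspace(0, 60, 30)                        # minutes, 30 frames
striatum = t * np.exp(-t / 40)                    # slow washout (binding)
background = t * np.exp(-t / 10)                  # fast uptake and clearance
curves = np.vstack([striatum, background])

weights = np.abs(rng.normal(size=(500, 2)))       # 500 voxels
data = weights @ curves + np.abs(rng.normal(0, 0.01, (500, 30)))

model = NMF(n_components=2, init="custom", max_iter=500)
W0 = np.abs(rng.normal(size=(500, 2))) + 1e-3     # random voxel weights
H0 = curves + 1e-3                                # "warm" start with priors
W = model.fit_transform(data, W=W0, H=H0)
print("recovered factor curves shape:", model.components_.shape)
```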

  15. Factor analysis of multivariate data

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, A.A.; Mahadevan, R.

    A brief introduction to factor analysis is presented. A FORTRAN program that can perform Q-mode and R-mode factor analysis and the singular value decomposition of a given data matrix is presented in Appendix B. This computer program uses...
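
    The core computation such a program performs in R-mode can be sketched via the SVD of the standardized data matrix, with loadings read off the right singular vectors; the data below are synthetic.

```python
import numpy as np

# R-mode factor analysis via the singular value decomposition:
# standardize the data matrix, factor it, and form variable loadings
# from the right singular vectors. Data are synthetic.

rng = np.random.default_rng(4)
X = rng.normal(size=(50, 6))
X[:, 3] = X[:, 0] + 0.1 * rng.normal(size=50)     # induce shared variance

Z = (X - X.mean(0)) / X.std(0)                    # standardize (R-mode)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)

k = 2                                             # retain two factors
loadings = Vt[:k].T * s[:k] / np.sqrt(len(Z))     # variable loadings
scores = U[:, :k] * np.sqrt(len(Z))               # sample factor scores
print("R-mode loadings:\n", np.round(loadings, 2))
```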

  16. Modular techniques for dynamic fault-tree analysis

    Science.gov (United States)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems, built upon current techniques in Markov theory and combinatorial analysis, is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.

  17. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
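
    The encryption scheme under analysis is compact enough to state directly: a random phase mask multiplies the image in the input plane, a second mask multiplies its Fourier transform, and decryption reverses the steps with the conjugate keys. The numpy sketch below uses random data in place of an image.

```python
import numpy as np

# Double random phase encoding/decoding with FFTs. The two phase masks
# are the keys; decryption applies their conjugates in reverse order.

rng = np.random.default_rng(5)
img = rng.random((64, 64))                            # stand-in for an image

phase1 = np.exp(2j * np.pi * rng.random(img.shape))   # key 1 (input plane)
phase2 = np.exp(2j * np.pi * rng.random(img.shape))   # key 2 (Fourier plane)

encrypted = np.fft.ifft2(np.fft.fft2(img * phase1) * phase2)
decrypted = (np.fft.ifft2(np.fft.fft2(encrypted) * np.conj(phase2))
             * np.conj(phase1))

print("max reconstruction error:", np.abs(decrypted - img).max())
```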

  18. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    This scientific article presents a technique for the statistical analysis of the investment appeal of a region for direct foreign investment. A definition of the statistical analysis technique is given, the stages of the analysis are described, and the mathematico-statistical tools are considered.

  19. Factor analysis and scintigraphy

    International Nuclear Information System (INIS)

    Di Paola, R.; Penel, C.; Bazin, J.P.; Berche, C.

    1976-01-01

    The goal of factor analysis is usually to achieve a reduction of a large set of data, extracting essential features without a prior hypothesis. Owing to the development of computerized systems, the use of larger samplings, the possibility of sequential data acquisition, and the increase in dynamic studies, the problem of data compression is now encountered routinely. Thus, results obtained for the compression of scintigraphic images are presented first. Then the possibilities offered by factor analysis for scan processing are discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, is considered for compression and processing [fr

  20. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

    The ultimate goal of this project is to establish analysis techniques to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques were established for PWR and CANDU nuclear power plants (NPPs), by which the dynamic characteristics of reactor internals and SPND instrumentation can be identified, and noise databases corresponding to each plant (both Korean and foreign) were constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulation, can be used to establish a knowledge-based expert system to diagnose a NPP's abnormal conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  1. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish analysis techniques to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques were established for PWR and CANDU nuclear power plants (NPPs), by which the dynamic characteristics of reactor internals and SPND instrumentation can be identified, and noise databases corresponding to each plant (both Korean and foreign) were constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulation, can be used to establish a knowledge-based expert system to diagnose a NPP's abnormal conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)
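
    The project's own analysis tools are not reproduced here, but the basic spectral step of reactor noise analysis can be sketched with a Welch power spectral density estimate; the detector signal below is synthetic, with an assumed 8 Hz structural resonance.

```python
import numpy as np
from scipy.signal import welch

# Estimate the power spectral density of a (synthetic) neutron detector
# signal to expose a structural resonance; real diagnostics would track
# such spectral peaks over time for shifts.

fs = 100.0                                  # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(6)
signal = 0.2 * np.sin(2 * np.pi * 8.0 * t) + rng.normal(0, 1, t.size)

f, pxx = welch(signal, fs=fs, nperseg=1024)
print("dominant peak at %.1f Hz" % f[np.argmax(pxx[1:]) + 1])
```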

  2. Improved streaming analysis technique: spherical harmonics expansion of albedo data

    International Nuclear Information System (INIS)

    Albert, T.E.; Simmons, G.L.

    1979-01-01

    An improved albedo scattering technique was implemented with a three-dimensional Monte Carlo transport code for use in analyzing radiation streaming problems. The improvement was based on a shifted spherical Harmonics expansion of the doubly differential albedo data base. The result of the improvement was a factor of 3 to 10 reduction in data storage requirements and approximately a factor of 3 to 6 increase in computational speed. Comparisons of results obtained using the technique with measurements are shown for neutron streaming in one- and two-legged square concrete ducts
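
    The paper's shifted spherical-harmonics machinery is not reproduced here, but the compression idea can be illustrated in one angular dimension with a Legendre expansion: a tabulated angular albedo is replaced by a handful of coefficients. The tabulated values below are invented.

```python
import numpy as np
from numpy.polynomial import legendre

# Replace a tabulated angular albedo distribution with a few Legendre
# coefficients and evaluate it on demand. The "albedo" is a stand-in.

mu = np.linspace(-1, 1, 41)                       # cosine of exit angle
albedo = 0.3 * (1 + 0.8 * mu + 0.3 * mu**2)       # tabulated angular data

coeffs = legendre.legfit(mu, albedo, deg=3)       # 4 numbers replace 41
reconstructed = legendre.legval(mu, coeffs)
print("max relative error:", np.max(np.abs(reconstructed - albedo) / albedo))
```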

  3. Determination of true coincidence correction factors using Monte-Carlo simulation techniques

    Directory of Open Access Journals (Sweden)

    Chionis Dionysios A.

    2014-01-01

    The aim of this work is the numerical calculation of the true coincidence correction factors by means of Monte-Carlo simulation techniques. For this purpose, the Monte Carlo computer code PENELOPE was used, and the main program PENMAIN was properly modified in order to include the effect of the true coincidence phenomenon. The modified main program was used for the full energy peak efficiency determination of an XtRa Ge detector with a relative efficiency of 104%, and the results obtained for the 1173 keV and 1332 keV photons of 60Co were found consistent with the respective experimental ones. The true coincidence correction factors were calculated as the ratio of the full energy peak efficiencies determined from the original main program PENMAIN and the modified main program PENMAIN. The developed technique was applied to 57Co, 88Y, and 134Cs and to two source-to-detector geometries. The results obtained were compared with the true coincidence correction factors calculated with the "TrueCoinc" program, and the relative bias was found to be less than 2%, 4%, and 8% for 57Co, 88Y, and 134Cs, respectively.

  4. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision-based techniques and spectral signatures is described. The vision instruments for food analysis, as well as the datasets of the food items... used in this thesis, are described. The methodological strategies are outlined, including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis, and linear versus non-linear approaches. One supervised feature selection algorithm... (SSPCA) and DCT-based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods, together with some other state-of-the-art statistical and mathematical analysis techniques, are applied to datasets of different food items: meat, dairy, fruits...

  5. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods has been extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases can be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  6. Paediatric lower limb deformity correction using the Ilizarov technique: a statistical analysis of factors affecting the complication rate.

    Science.gov (United States)

    Oostenbroek, Hubert J; Brand, Ronald; van Roermund, Peter M; Castelein, René M

    2014-01-01

    Limb length discrepancy (LLD) and other patient factors are thought to influence the complication rate in (paediatric) limb deformity correction; the information in the literature is conflicting. This study was performed to identify clinical factors that affect the complication rate in paediatric lower-limb lengthening. A consecutive group of 37 children was analysed. The median proportionate LLD was 15 (4-42)%. Several patient factors that may complicate the treatment or end result were analysed using a polytomous logistic regression model. The factors analysed were proportionate LLD, cause of deformity, location of the corrected bone, and the classification of the deformity according to an overall classification that includes the LLD and all concomitant deformity factors. The median age at the start of the treatment was 11 (6-17) years. The median lengthening index was 1.5 (0.8-3.8) months per centimetre of lengthening. The obstacle and complication rate was 69% per lengthened bone. Proportionate LLD was the only statistically significant predictor of the occurrence of complications. Concomitant deformities did not influence the complication rate. From these data we constructed a simple graph that shows the relationship between proportionate LLD and the risk of complications. This study shows that only relative LLD is a predictor of the risk of complications. The additional value of this analysis is the production of a simple graph. Constructing this graph using data from a patient group (for example, your own) may allow a more realistic comparison with results in the literature than has been possible before.
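
    The risk graph described above amounts to a single-predictor logistic regression; the sketch below fits one to simulated patients (not the study's 37 children) and reads off predicted complication risks at a few LLD values.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Logistic regression of complication occurrence on proportionate LLD.
# The patient data and the assumed true risk curve are simulated.

rng = np.random.default_rng(7)
lld = rng.uniform(4, 42, 120)                      # proportionate LLD, %
p_true = 1 / (1 + np.exp(-(0.15 * lld - 2.0)))     # assumed risk curve
complication = rng.random(120) < p_true

model = LogisticRegression().fit(lld.reshape(-1, 1), complication)
for x in (5, 15, 30):
    risk = model.predict_proba([[x]])[0, 1]
    print(f"LLD {x:2d}% -> predicted complication risk {risk:.2f}")
```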

  7. Neutron activation analysis: an emerging technique for conservation/preservation

    International Nuclear Information System (INIS)

    Sayre, E.V.

    1976-01-01

    The diverse applications of neutron activation in the analysis, preservation, and documentation of art works and artifacts are described, with illustrations for each application. The uses of this technique to solve problems of attribution and authentication, to reveal the inner structure and composition of art objects, and, in some instances, to recreate details of the objects are described. A brief discussion of the theory and techniques of neutron activation analysis is also included

  8. Dispersion-theoretical analysis of the nucleon electromagnetic form factors

    Energy Technology Data Exchange (ETDEWEB)

    Belushkin, M.

    2007-09-29

    The structure of the proton and the neutron is of fundamental importance for the study of the strong interaction dynamics over a wide range of momentum transfers. The nucleon form factors encode information on the internal structure of the nucleon as probed by the electromagnetic interaction, and, to a certain extent, reflect the charge and magnetisation distributions within the proton and the neutron. In this thesis we report on our investigation of the electromagnetic form factors of the proton and the neutron with dispersion relation techniques, including known experimental input on the ππ, K anti K and the ρπ continua and perturbative QCD constraints. We include new experimental data on the pion form factor and the nucleon form factors in our simultaneous analysis of all four form factors in both the space- and the timelike regions for all momentum transfers, and perform Monte Carlo sampling in order to obtain theoretical uncertainty bands. Finally, we discuss the implications of our results on the pion cloud of the nucleon, the nucleon radii and the Okubo-Zweig-Iizuka rule, and present our results of a model-independent approach to estimating two-photon effects in elastic electron-proton scattering. (orig.)

  9. Dispersion-theoretical analysis of the nucleon electromagnetic form factors

    International Nuclear Information System (INIS)

    Belushkin, M.

    2007-01-01

    The structure of the proton and the neutron is of fundamental importance for the study of the strong interaction dynamics over a wide range of momentum transfers. The nucleon form factors encode information on the internal structure of the nucleon as probed by the electromagnetic interaction, and, to a certain extent, reflect the charge and magnetisation distributions within the proton and the neutron. In this thesis we report on our investigation of the electromagnetic form factors of the proton and the neutron with dispersion relation techniques, including known experimental input on the ππ, K anti K and the ρπ continua and perturbative QCD constraints. We include new experimental data on the pion form factor and the nucleon form factors in our simultaneous analysis of all four form factors in both the space- and the timelike regions for all momentum transfers, and perform Monte Carlo sampling in order to obtain theoretical uncertainty bands. Finally, we discuss the implications of our results on the pion cloud of the nucleon, the nucleon radii and the Okubo-Zweig-Iizuka rule, and present our results of a model-independent approach to estimating two-photon effects in elastic electron-proton scattering. (orig.)

  10. Multivariate factor analysis of Girgentana goat milk composition

    Directory of Open Access Journals (Sweden)

    Pietro Giaccone

    2010-01-01

    ...kidding and parity on common factors, while no differences were found between goats with one or more kids. The multivariate factor analysis technique was effective in describing the quality of Girgentana milk with a low number of new latent variables. These new variables have been useful in the study of the effect of some technical factors, such as parity and season of kidding, on the quantitative and qualitative aspects of milk production in this goat breed.

  11. Influence of Topographic and Hydrographic Factors on the Spatial Distribution of Leptospirosis Disease in São Paulo County, Brazil: An Approach Using Geospatial Techniques and GIS Analysis

    Science.gov (United States)

    Ferreira, M. C.; Ferreira, M. F. M.

    2016-06-01

    Leptospirosis is a zoonosis caused by bacteria of the genus Leptospira. Rodents, especially Rattus norvegicus, are the most frequent hosts of this microorganism in cities. Human transmission occurs through contact with the urine, blood or tissues of the rodent, or with water or mud contaminated by rodent urine. Spatial patterns of concentration of leptospirosis are related to multiple environmental and socioeconomic factors, such as housing near flooding areas, domestic garbage disposal sites, and high densities of people living in slums located near river channels. We used geospatial techniques and a geographical information system (GIS) to analyse the spatial relationship between the distribution of leptospirosis cases and distance from rivers, river density in the census sector, and terrain slope factors in Sao Paulo County, Brazil. To test this methodology we used a sample of 183 geocoded leptospirosis cases confirmed in 2007, ASTER GDEM2 data, and hydrography and census sector shapefiles. Our results showed that GIS and geospatial analysis techniques improved the mapping of the disease and permitted identification of the spatial pattern of association between the location of cases and the spatial distribution of the environmental variables analyzed. This study also showed that leptospirosis cases may be more related to census sectors located in higher river density areas and to households situated at shorter distances from rivers. On the other hand, it was not possible to assert that terrain slope contributes significantly to the location of leptospirosis cases.

  12. Layer-splitting technique for testing the recursive scheme for multilayer shields gamma ray buildup factors

    International Nuclear Information System (INIS)

    Alkhatib, Sari F.; Park, Chang Je; Jeong, Hae Yong; Lee, Yongdeok

    2016-01-01

    Highlights: • A simple formalism is suggested for the recursive approach and is then used to produce buildup factors for certain multilayer shields. • The newly suggested layer-splitting technique is implemented on the studied cases to test the performance of the suggested formalism. • The buildup factors are generated using cubic polynomial fitting functions produced from previous well-acknowledged data. - Abstract: This study illustrates the implementation of the newly suggested layer-splitting testing technique. This technique is introduced in order to be implemented in examining suggested formalisms for the recursive scheme (or iterative scheme). The recursive scheme is a concept used in treating and producing the gamma-ray buildup factors in the case of multilayer shields. The layer-splitting technique simply forces the scheme to treat a single layer of one material as two separate layers with similar characteristics. It thus subjects the scheme to an abnormal definition of the multilayer shield that tests its performance in treating the successive layers, acting as a method of verification for the approximations and assumptions taken into consideration. A simple formalism was suggested for the recursive scheme, and the splitting technique was then implemented on it. The results of implementing both the suggested formalism and the splitting technique are illustrated and discussed. Throughout this study, cubic polynomial fitting functions were used to generate the buildup factor data for the basic single media that constitute the multilayer shields under study. This study is limited to the cases of multiple shields consisting of repeated consecutive thin layers of lead–water and iron–water shields for 1 MeV gamma rays. The buildup factor values produced through the implementation of the suggested formalism showed good consistency with the Monte Carlo simulation results of Lin and Jiang's work. In the implementation of

  13. Decomposition techniques

    Science.gov (United States)

    Chao, T.T.; Sanzolone, R.F.

    1992-01-01

    Sample decomposition is a fundamental and integral step in the procedure of geochemical analysis. It is often the limiting factor for sample throughput, especially with the recent application of fast, modern multi-element measurement instrumentation. The complexity of geological materials makes it necessary to choose a sample decomposition technique that is compatible with the specific objective of the analysis. When selecting a decomposition technique, consideration should be given to the chemical and mineralogical characteristics of the sample, the elements to be determined, precision and accuracy requirements, sample throughput, the technical capability of personnel, and time constraints. This paper addresses these concerns and discusses the attributes and limitations of many techniques of sample decomposition, along with examples of their application to geochemical analysis. The chemical properties of reagents in their function as decomposition agents are also reviewed. The section on acid dissolution techniques addresses the various inorganic acids that are used individually or in combination in both open and closed systems. Fluxes used in sample fusion are discussed. The promising microwave-oven technology and the emerging field of automation are also examined. A section on applications highlights the use of decomposition techniques for the determination of Au, platinum group elements (PGEs), Hg, U, hydride-forming elements, rare earth elements (REEs), and multi-elements in geological materials. Partial dissolution techniques used for geochemical exploration, which have been treated in detail elsewhere, are not discussed here; nor are fire-assaying for noble metals and decomposition techniques for X-ray fluorescence or nuclear methods. © 1992.

  14. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    One of the most significant steps in building-structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria were considered in the inspection: physical condition of the building system (PC), effect on asset (EA), effect on occupants (EO) and maintenance cost (MC). The building was divided into nine systems regarded as alternatives. Expert Choice software was used to compare the importance of the criteria against the main objective, whereas a structured proforma was used to quantify the defects observed on all building systems against each criterion. The defect severity score of each building system was identified and then multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was considered the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning; however, the results of this study indicate that the technique could also be used in prioritizing building systems for maintenance planning
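
    The weighting step behind this kind of multi-criteria ranking is classically done, as in AHP, by taking the principal eigenvector of a pairwise comparison matrix and then forming weighted sums of defect scores. The judgments and scores below are invented, not the paper's Expert Choice inputs.

```python
import numpy as np

# Derive criteria weights from a pairwise comparison matrix as its
# principal eigenvector, check consistency, and form a risk value as a
# weighted sum. Criteria order: PC, EA, EO, MC (Saaty 1-9 scale).

A = np.array([
    [1.0, 3.0, 2.0, 5.0],
    [1/3, 1.0, 1/2, 2.0],
    [1/2, 2.0, 1.0, 3.0],
    [1/5, 1/2, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio check (random index RI = 0.9 for a 4x4 matrix).
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
print("weights:", np.round(weights, 3), " CR:", round(ci / 0.9, 3))

scores = np.array([0.20, 0.10, 0.15, 0.05])   # hypothetical defect scores
print("risk value:", round(float(weights @ scores), 3))
```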

  15. Scaling Transformation in the Rembrandt Technique

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn; Leleur, Steen

    2013-01-01

    This paper examines a decision support system (DSS) for the appraisal of complex decision problems using multi-criteria decision analysis (MCDA). The DSS makes use of a structured hierarchical approach featuring the multiplicative AHP also known as the REMBRANDT technique. The paper addresses...... of a conventional AHP calculation in order to examine what impact the choice of progression factors as well as the choice of technique have on the decision making. Based on this a modified progression factor for the calculation of scores for the alternatives in REMBRANDT is suggested while the progression factor...
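
    For context, the REMBRANDT score calculation discussed above maps integer pairwise judgements onto ratio estimates through a progression factor and derives scores as row geometric means. A minimal sketch under common assumptions (progression factor √2 for alternative scores, 2 for criterion weights); the modified progression factor proposed in the paper is not reproduced here:

        import numpy as np

        def rembrandt_scores(delta, progression=np.sqrt(2.0)):
            """Scores from a skew-symmetric integer judgement matrix (scale -8..8)."""
            ratios = progression ** delta                 # pairwise ratio estimates
            scores = np.exp(np.log(ratios).mean(axis=1))  # row geometric means
            return scores / scores.sum()

        # assumed judgements for three alternatives under one criterion
        delta = np.array([[ 0,  2,  4],
                          [-2,  0,  2],
                          [-4, -2,  0]])
        print(rembrandt_scores(delta))                    # alternative scores, factor sqrt(2)
        print(rembrandt_scores(delta, progression=2.0))   # same judgements, criteria-style factor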

  16. A new detrended semipartial cross-correlation analysis: Assessing the important meteorological factors affecting API

    International Nuclear Information System (INIS)

    Shen, Chen-Hua

    2015-01-01

    To analyze the unique contribution of meteorological factors to the air pollution index (API), a new method, the detrended semipartial cross-correlation analysis (DSPCCA), is proposed. Based on both a detrended cross-correlation analysis and a DFA-based multivariate-linear-regression (DMLR), this method is improved by including a semipartial correlation technique, which is used to indicate the unique contribution of an explanatory variable to multiple correlation coefficients. The advantages of this method in handling nonstationary time series are illustrated by numerical tests. To further demonstrate the utility of this method in environmental systems, new evidence of the primary contribution of meteorological factors to API is provided through DMLR. Results show that the most important meteorological factors affecting API are wind speed and diurnal temperature range, and the explanatory ability of meteorological factors to API gradually strengthens with increasing time scales. The results suggest that DSPCCA is a useful method for addressing environmental systems. - Highlights: • A detrended multiple linear regression is shown. • A detrended semipartial cross correlation analysis is proposed. • The important meteorological factors affecting API are assessed. • The explanatory ability of meteorological factors to API gradually strengthens with increasing time scales.
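
    The semipartial step itself is the paper's contribution and is not reproduced here, but the underlying detrended cross-correlation coefficient can be sketched. The version below follows the standard DCCA recipe (integrated profiles, window-wise linear detrending, covariance of the residuals); DSPCCA would additionally regress out, via DMLR, the parts of an explanatory variable shared with the others:

        import numpy as np

        def _profile(x):
            return np.cumsum(x - x.mean())

        def dcca_coefficient(x, y, n):
            """Detrended cross-correlation coefficient at window scale n."""
            X, Y = _profile(x), _profile(y)
            t = np.arange(n + 1)
            f2x = f2y = f2xy = 0.0
            for i in range(len(X) - n):
                # linear detrending within each overlapping window of size n+1
                rx = X[i:i + n + 1] - np.polyval(np.polyfit(t, X[i:i + n + 1], 1), t)
                ry = Y[i:i + n + 1] - np.polyval(np.polyfit(t, Y[i:i + n + 1], 1), t)
                f2x += (rx * rx).mean()
                f2y += (ry * ry).mean()
                f2xy += (rx * ry).mean()
            return f2xy / np.sqrt(f2x * f2y)

        rng = np.random.default_rng(0)
        z = rng.standard_normal(2000)
        x = z + rng.standard_normal(2000)   # two series sharing a common component
        y = z + rng.standard_normal(2000)
        print(dcca_coefficient(x, y, n=32)) # positive, well below 1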

  17. A new detrended semipartial cross-correlation analysis: Assessing the important meteorological factors affecting API

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Chen-Hua, E-mail: shenandchen01@163.com [College of Geographical Science, Nanjing Normal University, Nanjing 210046 (China); Jiangsu Center for Collaborative Innovation in Geographical Information Resource, Nanjing 210046 (China); Key Laboratory of Virtual Geographic Environment of Ministry of Education, Nanjing 210046 (China)

    2015-12-04

    To analyze the unique contribution of meteorological factors to the air pollution index (API), a new method, the detrended semipartial cross-correlation analysis (DSPCCA), is proposed. Based on both a detrended cross-correlation analysis and a DFA-based multivariate-linear-regression (DMLR), this method is improved by including a semipartial correlation technique, which is used to indicate the unique contribution of an explanatory variable to multiple correlation coefficients. The advantages of this method in handling nonstationary time series are illustrated by numerical tests. To further demonstrate the utility of this method in environmental systems, new evidence of the primary contribution of meteorological factors to API is provided through DMLR. Results show that the most important meteorological factors affecting API are wind speed and diurnal temperature range, and the explanatory ability of meteorological factors to API gradually strengthens with increasing time scales. The results suggest that DSPCCA is a useful method for addressing environmental systems. - Highlights: • A detrended multiple linear regression is shown. • A detrended semipartial cross correlation analysis is proposed. • The important meteorological factors affecting API are assessed. • The explanatory ability of meteorological factors to API gradually strengthens with increasing time scales.

  18. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Proposal analysis techniques. 15.404-1 Section 15.404-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... assistance of other experts to ensure that an appropriate analysis is performed. (6) Recommendations or...

  19. [Analysis of virulence factors of Porphyromonas endodontalis based on comparative proteomics technique].

    Science.gov (United States)

    Li, H; Ji, H; Wu, S S; Hou, B X

    2016-12-09

    Objective: To analyze the protein expression profile and the potential virulence factors of Porphyromonas endodontalis (Pe) via comparison with two strains of Porphyromonas gingivalis (Pg) of high and low virulence, respectively. Methods: The whole-cell comparative proteome of Pe ATCC35406 was examined and compared with those of the high-virulence strain Pg W83 and the low-virulence strain Pg ATCC33277, respectively. Isobaric tags for relative and absolute quantitation (iTRAQ) combined with nano liquid chromatography-tandem mass spectrometry (Nano-LC-MS/MS) were adopted to identify and quantitate the proteins of Pe and the two Pg strains of differing virulence, using isotopically labeled peptides, mass spectrometric detection and bioinformatics analysis. The biological functions of proteins expressed similarly by Pe ATCC35406 and the two Pg strains were quantified and analyzed. Results: In total, 1 210 proteins were identified when Pe was compared with Pg W83. There were 130 proteins (10.74% of the total) expressed similarly, including 89 proteins of known function and 41 of unknown function. In total, 1 223 proteins were identified when Pe was compared with Pg ATCC33277. There were 110 proteins (8.99% of the total) expressed similarly, including 72 proteins of known function and 38 of unknown function. The similarly expressed proteins in Pe and the Pg strains of differing virulence mainly involved catalytic activity and binding functions, including recombination activation gene (RagA), lipoprotein, chaperonin DnaK, Clp family proteins (ClpC and ClpX) and various iron-binding proteins. They were involved in metabolism and cellular processes. In addition, the type and number of similar virulence proteins between Pe and the high-virulence Pg were higher than those between Pe and the low-virulence Pg. Conclusions: Lipoprotein, oxygen resistance protein and iron-binding proteins were probably the potential virulence factors of Pe ATCC35406. It was

  20. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head is freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  1. TECHNIQUES AND FACTORS CONTRIBUTING TO DEVELOPING CRITICAL THINKING SKILLS

    Directory of Open Access Journals (Sweden)

    Irina Vladimirovna Glukhova

    2015-01-01

    Full Text Available The paper deals with the development and introduction, into the educational process of higher educational institutions, of an innovative technology for developing the critical thinking skills of future specialists. The research is aimed at revealing the factors promoting the formation of students' critical thinking in higher education, and at the search for strategies and techniques that actualize students' creative abilities and help form an active, independent personality. The author argues that educating such a person, and developing the abilities of objective reflection, interpretation of phenomena, formulation of adequate conclusions and well-founded evaluation, requires setting up a creative educational environment and establishing a positive dialogue between teacher and trainee. Methods. The methods involve the analysis of the philosophical, psycho-pedagogical and methodological literature and of scientific periodical publications; generalisation of Russian and foreign experience; and classification, arrangement and observation of the considered issues. Results. Current approaches to the definition of critical thinking and the problem of its formation in the scientific literature are considered; the concept of the «creative educational environment» is specified; ways of increasing the efficiency of the educational process are shown. Scientific novelty. A complex of procedures and conditions promoting the effective development of critical thinking skills is theoretically substantiated on the basis of the analysis of various information sources. Practical significance. The research outcomes and the recommended methods of critical thinking skills formation can be useful for professors and lecturers of higher education institutions in optimizing subject matter selection and the techniques and methods of education under the conditions of a dynamically updated educational process.

  2. Ranking factors involved in product design using a hybrid model of Quality Function Deployment, Data Envelopment Analysis and TOPSIS technique

    Directory of Open Access Journals (Sweden)

    Davood Feiz

    2014-08-01

    Full Text Available Quality function deployment (QFD) is an extremely important quality management tool, useful in product design and development. Traditionally, QFD rates the design requirements (DRs) with respect to customer requirements (CRs) and aggregates the ratings to get relative importance scores of the DRs. An increasing number of studies emphasize the need to incorporate additional factors, such as cost and environmental impact, while calculating the relative importance of DRs. However, there are different methodologies for deriving the relative importance of DRs when several additional factors are considered. TOPSIS (technique for order preference by similarity to ideal solution) is suggested for the purpose of this research. This research proposes a new TOPSIS-based approach that considers the rating of DRs with respect to CRs and several additional factors simultaneously. The proposed method is illustrated using a step-by-step procedure and was applied at the Sanam Electronic Company in Iran.
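
    TOPSIS itself is a standard algorithm: normalise the decision matrix, weight it, and rank alternatives by relative closeness to the ideal solution. A minimal sketch with hypothetical DR ratings; the cost and environmental-impact columns are assumptions for illustration:

        import numpy as np

        def topsis(matrix, weights, benefit):
            """Rank alternatives (rows) by closeness to the ideal solution.
            benefit[j] is True if larger values of criterion j are better."""
            norm = matrix / np.linalg.norm(matrix, axis=0)   # vector normalisation
            v = norm * weights
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_pos = np.linalg.norm(v - ideal, axis=1)
            d_neg = np.linalg.norm(v - anti, axis=1)
            return d_neg / (d_pos + d_neg)                   # closeness: higher is better

        # hypothetical DR ratings: [importance to CRs, cost, environmental impact]
        drs = np.array([[7.0, 120.0, 3.0],
                        [5.0,  80.0, 2.0],
                        [9.0, 200.0, 5.0]])
        cc = topsis(drs, weights=np.array([0.5, 0.3, 0.2]),
                    benefit=np.array([True, False, False]))  # cost/impact: smaller is better
        print(np.argsort(-cc), cc)                           # ranking and closeness scores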

  3. Calculation of the flux attenuation and multiple scattering correction factors in time of flight technique for double differential cross section measurements

    International Nuclear Information System (INIS)

    Martin, G.; Coca, M.; Capote, R.

    1996-01-01

    Using the Monte Carlo technique, a computer code was developed that simulates the time-of-flight experiment for measuring double differential cross sections. The correction factors for flux attenuation and multiple scattering, which distort the measured spectrum, were calculated. The energy dependence of the correction factors was determined, and a comparison with other works is shown. Calculations for Fe-56 at two different scattering angles were made. We also reproduced the experiment performed at the Nuclear Analysis Laboratory for C-12 at 25 °C, and the calculated correction factor for the measured spectrum is shown. We found a linear relation between the scatterer size and the correction factor for flux attenuation
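
    As a rough illustration of the approach (not the authors' code), the flux-attenuation correction for a slab sample can be estimated by analog Monte Carlo: sample a single-scattering depth, attenuate along the inward and outward path segments, and take the reciprocal of the mean surviving weight. The geometry and cross-section values below are assumptions:

        import numpy as np

        def attenuation_correction(sigma_t, thickness, angle_deg, n=200_000, seed=1):
            """Analog MC estimate of the flux-attenuation correction for a slab.
            Neutrons scatter once at a random depth; attenuation acts on the
            inward and outward path segments (transmission geometry assumed)."""
            rng = np.random.default_rng(seed)
            depth = rng.uniform(0.0, thickness, n)       # scattering-point depth
            mu = np.cos(np.radians(angle_deg))
            path_out = (thickness - depth) / mu          # exit path toward detector
            weight = np.exp(-sigma_t * (depth + path_out))
            return 1.0 / weight.mean()                   # ideal flux / attenuated flux

        print(attenuation_correction(sigma_t=0.3, thickness=2.0, angle_deg=25.0))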

  4. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  5. Impact of pre-analytical factors on the proteomic analysis of formalin-fixed paraffin-embedded tissue.

    Science.gov (United States)

    Thompson, Seonaid M; Craven, Rachel A; Nirmalan, Niroshini J; Harnden, Patricia; Selby, Peter J; Banks, Rosamonde E

    2013-04-01

    Formalin-fixed paraffin-embedded (FFPE) tissue samples represent a tremendous potential resource for biomarker discovery, with large numbers of samples in hospital pathology departments and links to clinical information. However, the cross-linking of proteins and nucleic acids by formalin fixation has hampered analysis and proteomic studies have been restricted to using frozen tissue, which is more limited in availability as it needs to be collected specifically for research. This means that rare disease subtypes cannot be studied easily. Recently, improved extraction techniques have enabled analysis of FFPE tissue by a number of proteomic techniques. As with all clinical samples, pre-analytical factors are likely to impact on the results obtained, although overlooked in many studies. The aim of this review is to discuss the various pre-analytical factors, which include warm and cold ischaemic time, size of sample, fixation duration and temperature, tissue processing conditions, length of storage of archival tissue and storage conditions, and to review the studies that have considered these factors in more detail. In those areas where investigations are few or non-existent, illustrative examples of the possible importance of specific factors have been drawn from studies using frozen tissue or from immunohistochemical studies of FFPE tissue.

  6. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When that status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failures, and human errors. Automated analysis is extremely fast and frees the analyst from routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system
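
    As a small illustration of the quantitative side of fault tree analysis (not tied to any particular code mentioned above), the top-event probability of a toy tree can be computed exactly by enumerating basic-event states:

        import itertools
        import math

        # hypothetical small tree: TOP = (A AND B) OR C
        p = {"A": 0.01, "B": 0.02, "C": 0.001}

        def top_event(s):
            return (s["A"] and s["B"]) or s["C"]

        # exact top-event probability by enumerating all 2^3 basic-event states
        exact = 0.0
        for bits in itertools.product([False, True], repeat=len(p)):
            states = dict(zip(p, bits))
            if top_event(states):
                exact += math.prod(p[e] if on else 1 - p[e] for e, on in states.items())
        print(f"P(top) = {exact:.7f}")  # pA*pB + pC - pA*pB*pC = 0.0011998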

  7. Exploratory Bi-Factor Analysis: The Oblique Case

    Science.gov (United States)

    Jennrich, Robert I.; Bentler, Peter M.

    2012-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford ("Psychometrika" 2:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler ("Psychometrika" 76:537-549, 2011) introduced an exploratory form of bi-factor…

  8. Factor Analysis of Drawings: Application to college student models of the greenhouse effect

    Science.gov (United States)

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-09-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance, suggesting that 4 archetype models of the greenhouse effect dominate thinking within this population. Factor scores, indicating the extent to which each student's drawing aligned with representative models, were compared to performance on conceptual understanding and attitudes measures, demographics, and non-cognitive features of drawings. Student drawings were also compared to drawings made by scientists to ascertain the extent to which models reflect more sophisticated and accurate models. Results indicate that student and scientist drawings share some similarities, most notably the presence of some features of the most sophisticated non-scientific model held among the study population. Prior knowledge, prior attitudes, gender, and non-cognitive components are also predictive of an individual student's model. This work presents a new technique for analyzing drawings, with general implications for the use of drawings in investigating student conceptions.
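
    A hedged sketch of the analysis pipeline: code each drawing as a binary feature vector, fit a rotated 4-factor model, and read the factor scores as each student's alignment with the archetype models. The data below are random stand-ins, so the variance share will not match the 62% reported:

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(3)
        # hypothetical coding: 200 drawings x 12 binary features (sun, rays, CO2 layer, ...)
        features = (rng.random((200, 12)) < 0.4).astype(float)

        fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0)
        scores = fa.fit_transform(features)  # per-student alignment with each archetype
        loadings = fa.components_            # which features define each archetype

        share = (loadings ** 2).sum() / features.var(axis=0).sum()
        print(f"rough share of variance captured by the loadings: {share:.0%}")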

  9. Analysis of operational events by ATHEANA framework for human factor modelling

    International Nuclear Information System (INIS)

    Bedreaga, Luminita; Constntinescu, Cristina; Doca, Cezar; Guzun, Basarab

    2007-01-01

    In the area of human reliability assessment, experts recognise that current methods have not correctly represented the role of humans in preventing, initiating and mitigating accidents in nuclear power plants. This deficiency arises because the current methods used in modelling the human factor have not taken into account human performance and reliability as observed in operational events. ATHEANA - A Technique for Human Error ANAlysis - is a new methodology for human reliability analysis that incorporates specific data from operational events as well as psychological models of human behaviour. The method includes new elements such as the unsafe action and error mechanisms. In this paper we present the application of the ATHEANA framework to the analysis of operational events that occurred in different nuclear power plants during 1979-2002. The analysis of operational events consisted of: - identification of the unsafe actions; - classification of the unsafe actions into a category, omission or commission; - establishing the type of error corresponding to the unsafe action: slip, lapse, mistake and circumvention; - establishing the influence of performance shaping factors and some corrective actions. (authors)

  10. Nonlinear analysis techniques of block masonry walls in nuclear power plants

    International Nuclear Information System (INIS)

    Hamid, A.A.; Harris, H.G.

    1986-01-01

    Concrete masonry walls have been used extensively in nuclear power plants as non-load bearing partitions serving as pipe supports, fire walls, radiation shielding barriers, and similar heavy construction separations. When subjected to earthquake loads, these walls should maintain their structural integrity. However, some of the walls do not meet design requirements based on working stress allowables. Consequently, utilities have used non-linear analysis techniques, such as the arching theory and the energy balance technique, to qualify such walls. This paper presents a critical review of the applicability of non-linear analysis techniques for both unreinforced and reinforced block masonry walls under seismic loading. These techniques are critically assessed in light of the performance of walls from limited available test data. It is concluded that additional test data are needed to justify the use of nonlinear analysis techniques to qualify block walls in nuclear power plants. (orig.)

  11. Radon remedial techniques in buildings - analysis of French actual cases

    International Nuclear Information System (INIS)

    Dupuis, M.

    2004-01-01

    The IRSN has compiled a collection of solutions from data provided by the various decentralised government services in 31 French departments. Contributors were asked to provide a description of the building, as well as details of measured radon levels, the type of reduction technique adopted and the cost. Illustrative layouts, technical drawings and photographs were also requested, when available. Of the cases recorded, 85% are establishments open to the public (schools (70%), city halls (4%) and combined city halls and school houses (26%)), 11% are houses and 4% industrial buildings. IRSN obtained 27 real cases of remedial techniques used. The data were presented in the form of fact sheets. The primary aim of this exercise was to illustrate each of the radon reduction techniques that can be used in the different building types (with basement, ground bearing slab, crawl space). This investigation not only enabled us to show that combining passive and active techniques reduces the operating cost of the installation, but above all that it considerably improves the efficiency. The passive technique reduces the amount of radon in the building and thus reduces the necessary ventilation rate, which directly affects the cost of operating the installation. For the 27 cases recorded, we noted: (a) the application of 7 passive techniques: sealing of floors and semi-buried walls, together with improved aeration by installing ventilation openings or ventilation strips in the windows. Radon concentrations were reduced on average by a factor of 4.7. No measurement in excess of 400 Bq.m-3 (the limit recommended by the French public authorities) was obtained following completion of the works; (b) the application of 15 active techniques: depressurization of the underlying ground, crawl space or basement and/or pressurization of the building. Radon concentrations were reduced on average by a factor of 13.8. Radon concentrations of over 400 Bq.m-3 were measured in only 4 cases

  12. A noise reduction technique based on nonlinear kernel function for heart sound analysis.

    Science.gov (United States)

    Mondal, Ashok; Saxena, Ishan; Tang, Hong; Banerjee, Poulami

    2017-02-13

    The main difficulty encountered in the interpretation of cardiac sounds is interference from noise. The contaminating noise obscures relevant information that is useful for the recognition of heart diseases. The unwanted signals are produced mainly by the lungs and the surrounding environment. In this paper, a novel heart sound de-noising technique is introduced based on a combined framework of the wavelet packet transform (WPT) and singular value decomposition (SVD). The most informative node of the wavelet tree is selected on the criterion of mutual information measurement. Next, the coefficients corresponding to the selected node are processed by the SVD technique to suppress the noisy component of the heart sound signal. To justify the efficacy of the proposed technique, several experiments were conducted with a heart sound dataset, including normal and pathological cases at different signal-to-noise ratios. The significance of the method is validated by statistical analysis of the results. The biological information preserved in the de-noised heart sound (HS) signal is evaluated by the k-means clustering algorithm and Fit Factor calculation. The overall results show that the proposed method is superior to the baseline methods.
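
    A minimal sketch of the combined WPT+SVD idea in Python, with two stated simplifications: node selection uses coefficient energy as a stand-in for the paper's mutual-information criterion, and the low-rank filtering is applied to a Hankel-style embedding of the node coefficients:

        import numpy as np
        import pywt

        def wpt_svd_denoise(signal, wavelet="db4", level=3, rank=1):
            wp = pywt.WaveletPacket(signal, wavelet, maxlevel=level)
            nodes = [node.path for node in wp.get_level(level)]
            best = max(nodes, key=lambda path: np.sum(wp[path].data ** 2))  # energy proxy
            c = wp[best].data
            # Hankel-style embedding of the node coefficients, then rank truncation
            m = len(c) // 2
            H = np.array([c[i:i + m] for i in range(len(c) - m + 1)])
            U, s, Vt = np.linalg.svd(H, full_matrices=False)
            s[rank:] = 0.0
            Hd = (U * s) @ Vt
            # average anti-diagonals back into a 1-D coefficient sequence
            c_den = np.array([Hd[::-1, :].diagonal(k).mean()
                              for k in range(-Hd.shape[0] + 1, Hd.shape[1])])
            wp[best].data = c_den
            return wp.reconstruct(update=True)[:len(signal)]

        t = np.linspace(0.0, 1.0, 1024)
        clean = np.sin(2 * np.pi * 30 * t) * np.exp(-5 * t)  # toy heart-sound-like burst
        noisy = clean + 0.3 * np.random.default_rng(0).standard_normal(t.size)
        print(np.std(noisy - clean), np.std(wpt_svd_denoise(noisy) - clean))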

  13. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. The methods being developed are bulk analysis and particle analysis. In the bulk analysis, an Inductively-Coupled Plasma Mass Spectrometer or a Thermal Ionization Mass Spectrometer is used to measure nuclear materials after chemical treatment of the sample. In the particle analysis, an Electron Probe Micro Analyzer and a Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  14. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  15. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

    Nuclear techniques are favourable for continuous on-line analysis because they are fast and non-intrusive, and they can be used in the adverse circumstances of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries

  16. Magnetic separation techniques in sample preparation for biological analysis: a review.

    Science.gov (United States)

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with the advantages of superparamagnetism, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advances in magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis are reviewed. The strategy of magnetic separation techniques is summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles are reviewed in detail. Characterization of magnetic materials is also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of proteins, nucleic acids, cells and bioactive compounds, and for the immobilization of enzymes, are described. Finally, the existing problems and possible future trends of magnetic separation techniques for biological analysis are discussed.

  17. Performance analysis of clustering techniques over microarray data: A case study

    Science.gov (United States)

    Dash, Rasmita; Misra, Bijan Bihari

    2018-03-01

    Handling big data is one of the major issues in the field of statistical data analysis. In such investigations, cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem, a grading approach is introduced over many clustering techniques to identify a stable technique. The grading approach, however, depends on the characteristics of the dataset as well as on the validity indices, so a two-stage grading approach is implemented. In this study the grading approach is implemented over five clustering techniques: hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is also established by the Nemenyi post-hoc hypothesis test.
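
    A reduced sketch of the comparison stage: run two of the five techniques on stand-in data and score them with three common validity indices (the study used seven indices and five microarray datasets; ranks per index would then feed the two-stage grading and the Nemenyi post-hoc test):

        import numpy as np
        from sklearn.cluster import AgglomerativeClustering, KMeans
        from sklearn.datasets import make_blobs
        from sklearn.metrics import (calinski_harabasz_score, davies_bouldin_score,
                                     silhouette_score)

        # stand-in data in place of a microarray expression matrix
        X, _ = make_blobs(n_samples=300, centers=4, n_features=20, random_state=7)

        models = {
            "k-means": KMeans(n_clusters=4, n_init=10, random_state=0),
            "AGNES": AgglomerativeClustering(n_clusters=4, linkage="average"),
        }
        for name, model in models.items():
            labels = model.fit_predict(X)
            # higher silhouette/CH and lower DB indicate better clusterings
            print(f"{name:8s} silhouette={silhouette_score(X, labels):.3f} "
                  f"CH={calinski_harabasz_score(X, labels):.1f} "
                  f"DB={davies_bouldin_score(X, labels):.3f}")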

  18. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  19. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably

  20. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

    Starting from 1989, a new technique known as the wavelet transform (WT) has been applied successfully to the analysis of different types of spectra. WT offers certain advantages over Fourier transforms for the analysis of signals. A review of the use of this technique in different fields of elemental analysis is presented

  1. Analysis technique for controlling system wavefront error with active/adaptive optics

    Science.gov (United States)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
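
    The core computation such systems rely on can be sketched as a least-squares fit: given an actuator influence matrix mapping commands to wavefront contributions, solve for the commands that minimise the residual system WFE. The matrix and measurements below are random stand-ins, not SigFit output:

        import numpy as np

        rng = np.random.default_rng(5)
        n_nodes, n_act = 500, 12
        A = rng.standard_normal((n_nodes, n_act))  # actuator influence functions
        w = rng.standard_normal(n_nodes)           # measured system wavefront error

        # least-squares actuator commands minimising the residual RMS WFE
        c, *_ = np.linalg.lstsq(A, -w, rcond=None)
        residual = w + A @ c
        print(f"RMS WFE: {w.std():.3f} -> {residual.std():.3f}")  # modest for random data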

  2. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, and improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, the main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, and pole and zero identification. The preliminary general scheme of a digital MCA is discussed, as well as some other important techniques for its engineering design. All these lay the foundation for developing homemade digital nuclear spectrometers. (authors)
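
    As an illustration of one of the listed algorithms, the recursive trapezoidal shaper of Jordanov and Knoll can be written in a few lines (shown here in Python rather than the MATLAB used in the paper); with the deconvolution constant M matched to the pulse decay, the flat-top gain is k·(M+1):

        import numpy as np

        def trapezoidal_shaper(v, k, l, M):
            """Recursive trapezoid: k = rise time, l = k + flat top (l > k),
            M = deconvolution constant matched to the exponential decay."""
            v = np.asarray(v, dtype=float)
            d = v - np.concatenate([np.zeros(k), v[:-k]])  # v(n) - v(n-k)
            d -= np.concatenate([np.zeros(l), d[:-l]])     # ... - v(n-l) + v(n-k-l)
            s = np.cumsum(np.cumsum(d) + M * d)            # accumulator + pole-zero term
            return s / (k * (M + 1.0))                     # normalise by flat-top gain

        n = np.arange(600)
        tau = 50.0                                         # pulse decay, in samples
        pulse = np.where(n >= 100, np.exp(-(n - 100) / tau), 0.0)
        M = 1.0 / (np.exp(1.0 / tau) - 1.0)                # matched to tau
        out = trapezoidal_shaper(pulse, k=40, l=60, M=M)
        print(round(out.max(), 3))                         # ~1.0: recovered pulse height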

  3. Clinical education and training: Using the nominal group technique in research with radiographers to identify factors affecting quality and capacity

    International Nuclear Information System (INIS)

    Williams, P.L.; White, N.; Klem, R.; Wilson, S.E.; Bartholomew, P.

    2006-01-01

    There are a number of group-based research techniques available to determine the views or perceptions of individuals in relation to specific topics. This paper reports on one method, the nominal group technique (NGT), which was used to collect the views of important stakeholders on the factors affecting the quality of, and capacity to provide, clinical education and training in diagnostic imaging and radiotherapy and oncology departments in the UK. Inclusion criteria were devised to recruit learners, educators, practitioners and service managers to the nominal groups. Eight regional groups comprising a total of 92 individuals were enrolled; the numbers in each group varied between 9 and 13. A total of 131 items (factors) were generated across the groups (mean = 16.4). Each group was then asked to select the top three factors from their original list. Consensus on the important factors amongst groups found that all eight groups agreed on one item: staff attitude, motivation and commitment to learners. The 131 items were organised into themes using content analysis, from which five main categories and a number of subcategories emerged. The study concluded that the NGT provided data congruent with the issues faced by practitioners and learners in their daily work, which is of vital importance if the findings are to be regarded as credible. Further advantages and limitations of the method are discussed; however, it is argued that the NGT is a useful technique for gathering relevant opinion, selecting priorities and reaching consensus on a wide range of issues

  4. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty

  5. A computational intelligent approach to multi-factor analysis of violent crime information system

    Science.gov (United States)

    Liu, Hongbo; Yang, Chao; Zhang, Meng; McLoone, Seán; Sun, Yeqing

    2017-02-01

    Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. The relationship between bi-factors has also been extensively studied including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour and as such there is a need to have a greater level of insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). Identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.

  6. Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures

    Science.gov (United States)

    Noor, Ahmed K.; Peters, Jeanne M.

    1987-01-01

    An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
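
    A minimal sketch of the idea with SciPy's conjugate gradient solver, where for brevity the "orthotropic part" is represented by the decoupled (diagonal) terms of a toy stiffness matrix; in a real model it would be the full orthotropic stiffness matrix, factorized once and reused:

        import numpy as np
        from scipy.sparse.linalg import LinearOperator, cg

        rng = np.random.default_rng(2)
        n = 200
        d = rng.uniform(1.0, 3.0, n)            # decoupled "orthotropic" stiffness terms
        C = 0.01 * rng.standard_normal((n, n))
        K = np.diag(d) + (C + C.T) / 2.0        # symmetric, weakly anisotropic, SPD
        f = rng.standard_normal(n)

        # preconditioner: the orthotropic part alone, anisotropic terms set to zero
        M = LinearOperator((n, n), matvec=lambda x: x / d)

        u, info = cg(K, f, M=M)
        print(info, np.linalg.norm(K @ u - f))  # info == 0 on convergence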

  7. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  8. Using Human Factors Techniques to Design Text Message Reminders for Childhood Immunization

    Science.gov (United States)

    Ahlers-Schmidt, Carolyn R.; Hart, Traci; Chesser, Amy; Williams, Katherine S.; Yaghmai, Beryl; Shah-Haque, Sapna; Wittler, Robert R.

    2012-01-01

    This study engaged parents to develop concise, informative, and comprehensible text messages for an immunization reminder system using Human Factors techniques. Fifty parents completed a structured interview including demographics, technology questions, willingness to receive texts from their child's doctor, and health literacy. Each participant…

  9. Analysis of nasopharyngeal carcinoma risk factors with Bayesian networks.

    Science.gov (United States)

    Aussem, Alex; de Morais, Sérgio Rodrigues; Corbex, Marilys

    2012-01-01

    We propose a new graphical framework for extracting the relevant dietary, social and environmental risk factors that are associated with an increased risk of nasopharyngeal carcinoma (NPC) in a case-control epidemiologic study that consists of 1289 subjects and 150 risk factors. This framework builds on the use of Bayesian networks (BNs) for representing statistical dependencies between the random variables. We discuss a novel constraint-based procedure, called Hybrid Parents and Children (HPC), that recursively builds a local graph including all the relevant features statistically associated with NPC, without having to find the whole BN first. The local graph is afterwards directed by the domain expert according to his knowledge. It provides a statistical profile of the recruited population, and meanwhile helps identify the risk factors associated with NPC. Extensive experiments on synthetic data sampled from known BNs show that HPC outperforms state-of-the-art algorithms that have appeared in the recent literature. From a biological perspective, the present study confirms that chemical products, pesticides and domestic fume intake from incomplete combustion of coal and wood are significantly associated with NPC risk. These results suggest that industrial workers are often exposed to noxious chemicals and poisonous substances that are used in the course of manufacturing. This study also supports previous findings that the consumption of a number of preserved food items, like house-made proteins and sheep fat, is a major risk factor for NPC. BNs are valuable data mining tools for the analysis of epidemiologic data. They can explicitly combine both expert knowledge from the field and information inferred from the data. These techniques therefore merit consideration as valuable alternatives to traditional multivariate regression techniques in epidemiologic studies.

  10. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, built for example with regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize an RSM but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method
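
    One way to realize the idea of working directly with existing code runs, offered here as an assumption rather than the paper's actual procedure, is importance reweighting: reuse the original input-output samples and reweight them by the ratio of the alternative to the baseline input density, so no new code calculations are needed:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        x = rng.normal(0.0, 1.0, 5000)       # inputs sampled from the baseline f0
        y = np.sin(x) + 0.1 * x ** 3         # stand-in for the code output

        f0 = stats.norm(0.0, 1.0)            # baseline input distribution
        f1 = stats.norm(0.3, 1.2)            # alternative input assumption
        w = f1.pdf(x) / f0.pdf(x)            # importance weights from the density ratio
        w /= w.sum()

        print("baseline output mean:    ", y.mean())
        print("reweighted mean under f1:", float(np.sum(w * y)))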

  11. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    Castro, Walber Amorim

    2002-01-01

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses TLD-700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system. This system uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the response of the detectors; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analysing the TLD response were investigated and compared, involving dose values in this interval. These techniques include thermal pre-treatment, and different kinds of glow curve analysis methods were investigated. The results showed the necessity of developing specific software that permits automatic background subtraction from the glow curves for each dosemeter. This software was developed and has been tested. Preliminary results showed that the software increases the response reproducibility. (author)

  12. Analysis of psychological factors for quality assessment of interactive multimodal service

    Science.gov (United States)

    Yamagishi, Kazuhisa; Hayashi, Takanori

    2005-03-01

    We proposed a subjective quality assessment model for interactive multimodal services. First, psychological factors of an audiovisual communication service were extracted by using the semantic differential (SD) technique and factor analysis. Forty subjects participated in subjective tests and performed point-to-point conversational tasks on a PC-based TV phone that exhibits various network qualities. The subjects assessed those qualities on the basis of 25 pairs of adjectives. Two psychological factors, i.e., an aesthetic feeling and a feeling of activity, were extracted from the results. Then, quality impairment factors affecting these two psychological factors were analyzed. We found that the aesthetic feeling is mainly affected by IP packet loss and video coding bit rate, and the feeling of activity depends on delay time and video frame rate. We then proposed an opinion model derived from the relationships among quality impairment factors, psychological factors, and overall quality. The results indicated that the estimation error of the proposed model is almost equivalent to the statistical reliability of the subjective score. Finally, using the proposed model, we discuss guidelines for quality design of interactive audiovisual communication services.

  13. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  14. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

    This paper presents a reliability analysis based on an optimization technique using the PNET (Probabilistic Network Evaluation Technique) method for highly redundant structures having a large number of collapse modes. The approach makes the best use of the merits of the optimization technique, within which the idea of the PNET method is used. The analytical process involves the minimization of the safety index of the representative mode, subject to satisfaction of the mechanism condition and of positive external work. The procedure entails the sequential performance of a series of NLP (Nonlinear Programming) problems, where the correlation condition of the PNET method pertaining to the representative mode is taken as an additional constraint for the next analysis. Upon succeeding iterations, the final analysis is reached when the collapse probability of the subsequent mode is far smaller than the value for the first mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes classified by the extent of correlation. Then, in order to confirm the validity of the proposed method, a conventional Monte Carlo simulation is also performed using the collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)

  15. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D N; Prawer, S; Gonon, P; Walker, R; Dooley, S; Bettiol, A; Pearce, J [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1997-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  16. Nuclear techniques of analysis in diamond synthesis and annealing

    International Nuclear Information System (INIS)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J.

    1996-01-01

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs

  17. Human factors analysis and design methods for nuclear waste retrieval systems. Human factors design methodology and integration plan

    Energy Technology Data Exchange (ETDEWEB)

    Casey, S.M.

    1980-06-01

    The purpose of this document is to provide an overview of the recommended activities and methods to be employed by a team of human factors engineers during the development of a nuclear waste retrieval system. This system, as it is presently conceptualized, is intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository. This document, and the others in this series, have been developed for the purpose of implementing human factors engineering principles during the design and construction of the retrieval system facilities and equipment. The methodology presented has been structured around a basic systems development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Within each of these phases, the recommended activities of the human engineering team have been stated, along with descriptions of the human factors engineering design techniques applicable to the specific design issues. Explicit examples of how the techniques might be used in the analysis of human tasks and equipment required in the removal of spent fuel canisters have been provided. Only those techniques having possible relevance to the design of the waste retrieval system have been reviewed. This document is intended to provide the framework for integrating human engineering with the rest of the system development effort. The activities and methodologies reviewed in this document have been discussed in the general order in which they will occur, although the time frame (the total duration of the development program in years and months) in which they should be performed has not been discussed.

  18. Human factors analysis and design methods for nuclear waste retrieval systems. Human factors design methodology and integration plan

    International Nuclear Information System (INIS)

    Casey, S.M.

    1980-06-01

    The purpose of this document is to provide an overview of the recommended activities and methods to be employed by a team of human factors engineers during the development of a nuclear waste retrieval system. This system, as it is presently conceptualized, is intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository. This document, and the others in this series, have been developed for the purpose of implementing human factors engineering principles during the design and construction of the retrieval system facilities and equipment. The methodology presented has been structured around a basic systems development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Within each of these phases, the recommended activities of the human engineering team have been stated, along with descriptions of the human factors engineering design techniques applicable to the specific design issues. Explicit examples of how the techniques might be used in the analysis of human tasks and equipment required in the removal of spent fuel canisters have been provided. Only those techniques having possible relevance to the design of the waste retrieval system have been reviewed. This document is intended to provide the framework for integrating human engineering with the rest of the system development effort. The activities and methodologies reviewed in this document have been discussed in the general order in which they will occur, although the time frame (the total duration of the development program in years and months) in which they should be performed has not been discussed

  19. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    The magnetic jack type Control Rod Drive Mechanism (CRDM) of a pressurized water reactor (PWR) plant operates control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e. the intervals between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data vary widely with characteristics such as plant condition and plant type, so applying a single analysis technique to all conditions and plants limits the accuracy of the existing motion analysis. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates this wide variation and improves analysis accuracy. (author)
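
    The paper does not publish its feature set or model parameters, so the following is only a minimal sketch of the Random Forests idea on hypothetical data: regress an armature response time on descriptors of the coil current waveform and score on held-out records. All names and values are illustrative assumptions.

```python
# Hedged sketch: Random Forest regression of armature closed time on
# hypothetical coil-current features (e.g. peak current, dip time, plant code).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))                              # hypothetical waveform features
y = 0.05 + 0.01 * X[:, 1] + 0.002 * rng.normal(size=n)   # armature closed time [s]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out R^2:", model.score(X_te, y_te))
```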

  20. Spatial epidemiological techniques in cholera mapping and analysis towards a local scale predictive modelling

    Science.gov (United States)

    Rasam, A. R. A.; Ghazali, R.; Noor, A. M. M.; Mohd, W. M. N. W.; Hamid, J. R. A.; Bazlan, M. J.; Ahmad, N.

    2014-02-01

    Cholera spatial epidemiology is the study of the spread and control of the disease's spatial pattern and epidemics. Previous studies have shown that multi-factorial causes such as human behaviour, ecology and other infectious risk factors influence disease outbreaks. Understanding the spatial pattern of the outbreaks and the possible interrelationships among these factors therefore warrants in-depth study. This study focuses on integrating geographical information system (GIS) and epidemiological techniques for exploratory analysis of the cholera spatial pattern and distribution in selected districts of Sabah. Spatial Statistics and Pattern tools in ArcGIS and Microsoft Excel were used to map and analyze the reported cholera cases and other data, while a cohort study, an epidemiological technique, was applied to investigate multiple outcomes of disease exposure. The general spatial pattern of cholera was highly clustered, showing that the disease spreads easily from a place or person to others, especially within 1500 meters of an infected person or location. Although the cholera outbreaks in the districts are not critical, the disease could become endemic in crowded areas, unhygienic environments, and places close to contaminated water. It is also strongly believed that the coastal water of the study areas is related to cholera transmission and phytoplankton blooms, since these areas recorded higher case numbers. GIS proves to be a vital spatial epidemiological technique for determining the distribution pattern of the disease and generating hypotheses about it. Future research will apply more advanced geo-analysis methods and additional disease risk factors to produce a significant local-scale predictive risk model of the disease in Malaysia.

  1. Spatial epidemiological techniques in cholera mapping and analysis towards a local scale predictive modelling

    International Nuclear Information System (INIS)

    Rasam, A R A; Ghazali, R; Noor, A M M; Mohd, W M N W; Hamid, J R A; Bazlan, M J; Ahmad, N

    2014-01-01

    Cholera spatial epidemiology is the study of the spread and control of the disease's spatial pattern and epidemics. Previous studies have shown that multi-factorial causes such as human behaviour, ecology and other infectious risk factors influence disease outbreaks. Understanding the spatial pattern of the outbreaks and the possible interrelationships among these factors therefore warrants in-depth study. This study focuses on integrating geographical information system (GIS) and epidemiological techniques for exploratory analysis of the cholera spatial pattern and distribution in selected districts of Sabah. Spatial Statistics and Pattern tools in ArcGIS and Microsoft Excel were used to map and analyze the reported cholera cases and other data, while a cohort study, an epidemiological technique, was applied to investigate multiple outcomes of disease exposure. The general spatial pattern of cholera was highly clustered, showing that the disease spreads easily from a place or person to others, especially within 1500 meters of an infected person or location. Although the cholera outbreaks in the districts are not critical, the disease could become endemic in crowded areas, unhygienic environments, and places close to contaminated water. It is also strongly believed that the coastal water of the study areas is related to cholera transmission and phytoplankton blooms, since these areas recorded higher case numbers. GIS proves to be a vital spatial epidemiological technique for determining the distribution pattern of the disease and generating hypotheses about it. Future research will apply more advanced geo-analysis methods and additional disease risk factors to produce a significant local-scale predictive risk model of the disease in Malaysia.

  2. Technique of sample preparation for analysis of gasoline and lubricating oils by X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Avila P, P.

    1990-03-01

    The X-ray fluorescence laboratory of the National Institute of Nuclear Research, lacking a technique for the analysis of oils, has aimed in this work to develop a sample preparation technique for the determination of the metals Pb, Cr, Ni, V and Mo in gasolines and oils by X-ray fluorescence spectrometry. The results obtained will be of great utility for that laboratory. (Author)

  3. Diffusion MRI of the neonate brain: acquisition, processing and analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pannek, Kerstin [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, School of Medicine, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); Guzzetta, Andrea [IRCCS Stella Maris, Department of Developmental Neuroscience, Calambrone Pisa (Italy); Colditz, Paul B. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Perinatal Research Centre, Brisbane (Australia); Rose, Stephen E. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); University of Queensland Centre for Clinical Research, Royal Brisbane and Women's Hospital, Brisbane (Australia)

    2012-10-15

    Diffusion MRI (dMRI) is a popular noninvasive imaging modality for the investigation of the neonate brain. It enables the assessment of white matter integrity, and is particularly suited for studying white matter maturation in the preterm and term neonate brain. Diffusion tractography allows the delineation of white matter pathways and assessment of connectivity in vivo. In this review, we address the challenges of performing and analysing neonate dMRI. Of particular importance in dMRI analysis is adequate data preprocessing to reduce image distortions inherent to the acquisition technique, as well as artefacts caused by head movement. We present a summary of techniques that should be used in the preprocessing of neonate dMRI data, and demonstrate the effect of these important correction steps. Furthermore, we give an overview of available analysis techniques, ranging from voxel-based analysis of anisotropy metrics including tract-based spatial statistics (TBSS) to recently developed methods of statistical analysis addressing issues of resolving complex white matter architecture. We highlight the importance of resolving crossing fibres for tractography and outline several tractography-based techniques, including connectivity-based segmentation, the connectome and tractography mapping. These techniques provide powerful tools for the investigation of brain development and maturation. (orig.)

  4. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
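
    As an illustration of the contrast-extraction step, the sketch below computes a normalized contrast evolution between a defect region of interest and a sound reference region of the IR video. The array layout and the exact normalization are assumptions; the tool's definition may differ.

```python
# Normalized contrast vs. time between a defect ROI and a reference ROI.
# `frames` is assumed to be a (time, height, width) temperature array.
import numpy as np

def normalized_contrast(frames, defect_roi, ref_roi):
    r0, r1, c0, c1 = defect_roi
    s0, s1, t0, t1 = ref_roi
    t_def = frames[:, r0:r1, c0:c1].mean(axis=(1, 2))
    t_ref = frames[:, s0:s1, t0:t1].mean(axis=(1, 2))
    return (t_def - t_ref) / t_ref   # peak value and peak time become thermal features
```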

  5. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and the life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conference on Techniques of Nuclear and Conventional Analysis and Applications (TANCA) is registered in the national strategy of opening the universities and national research centers to their local, national and international environments. The conference aims to: promote nuclear and conventional analytical techniques; contribute to creating synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a program of inter-laboratory comparison between Moroccan laboratories on the one hand and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  6. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    Science.gov (United States)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop.

  7. Identifying sources of atmospheric fine particles in Havana City using Positive Matrix Factorization technique

    International Nuclear Information System (INIS)

    Pinnera, I.; Perez, G.; Ramos, M.; Guibert, R.; Aldape, F.; Flores M, J.; Martinez, M.; Molina, E.; Fernandez, A.

    2011-01-01

    In a previous study, a set of samples of fine and coarse airborne particulate matter collected in an urban area of Havana City was analyzed by the Particle-Induced X-ray Emission (PIXE) technique. The concentrations of 14 elements (S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Br and Pb) were consistently determined in both particle sizes. The analytical database provided by PIXE was statistically analyzed in order to determine the local pollution sources. The Positive Matrix Factorization (PMF) technique was applied to the fine particle data in order to identify possible pollution sources. These sources were further verified by enrichment factor (EF) calculations. A general discussion of these results is presented in this work. (Author)
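
    As a rough illustration of the factorization idea only, the sketch below uses scikit-learn's unweighted NMF as a simplified stand-in for PMF; true PMF additionally weights residuals by the measurement uncertainties. The data matrix here is random, not the Havana data.

```python
# Simplified PMF-like decomposition: X (samples x elements) ~ G @ F,
# with G >= 0 source contributions and F >= 0 source element profiles.
import numpy as np
from sklearn.decomposition import NMF

X = np.abs(np.random.default_rng(1).normal(size=(100, 14)))  # stand-in for PIXE data
model = NMF(n_components=4, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)   # contribution of each source to each sample
F = model.components_        # element profile of each source (compare with EFs)
```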

  8. NMR and modelling techniques in structural and conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, R J [Liverpool Univ. (United Kingdom)

    1994-12-31

    The use of Lanthanide Induced Shifts (L.I.S.) and modelling techniques in conformational analysis is presented. The use of Co(III) porphyrins as shift reagents is discussed, with examples of their use in the conformational analysis of some heterocyclic amines. (author) 13 refs., 9 figs.

  9. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The results were a 38% system-wide savings, if incorporated, and a shipping container that is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas in both the government and the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? What would that cost? Using logic and a disciplined approach, Value Analysis yields a design that performs the necessary functions at high quality and the lowest overall cost.

  10. Use of Atomic and Nuclear Techniques in Elemental and Isotopic Analysis

    International Nuclear Information System (INIS)

    2008-01-01

    This book is divided into four chapters, presented by six of the best Arab specialists, who have long used atomic and nuclear techniques and recognized their importance and capabilities in scientific research. Atomic and nuclear techniques are very successful in the field of analysis because they are often the only way to carry out an analysis with the required accuracy, while being the cheapest at the same time. A number of these techniques were collected in this book on the basis of their accuracy and of how widely they are used in analyzing the components of materials, especially when the elements of interest are present in insignificant proportions, as in toxicology, archaeology, nutrition, medicine and other applications.

  11. The Infinitesimal Jackknife with Exploratory Factor Analysis

    Science.gov (United States)

    Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.

    2012-01-01

    The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…

  12. Left ventricular wall motion abnormalities evaluated by factor analysis as compared with Fourier analysis

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Ikuno, Yoshiyasu; Nishikimi, Toshio

    1986-01-01

    Factor analysis was applied to multigated cardiac pool scintigraphy to evaluate its ability to detect left ventricular wall motion abnormalities in 35 patients with old myocardial infarction (MI) and in 12 control cases with normal left ventriculography. All cases were also evaluated by conventional Fourier analysis. In most cases with normal left ventriculography, factor analysis extracted the ventricular and atrial factors. In cases with MI, a third factor corresponding to the wall motion abnormality was obtained in the left ventricle. Each case was scored according to the agreement between the findings of ventriculography and those of factor analysis or Fourier analysis. Scores were recorded for three items: the existence, location, and degree of asynergy. In cases of MI, the detection rate of asynergy was 94% by factor analysis and 83% by Fourier analysis, and the agreement with respect to location was 71% and 66%, respectively. Factor analysis had higher scores than Fourier analysis, but the difference was not significant. The interobserver error of factor analysis was less than that of Fourier analysis. Factor analysis can display the locations and dynamic motion curves of asynergy, and it is regarded as a useful method for detecting and evaluating left ventricular wall motion abnormalities. (author)
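
    For readers unfamiliar with the comparison method, the sketch below shows a generic first-harmonic Fourier analysis of a gated blood-pool series: per-pixel amplitude and phase images, in which asynergy appears as phase shifts. It is an illustration of the standard technique, not the authors' implementation.

```python
# First-harmonic Fourier analysis of a gated series `frames` (time, H, W).
import numpy as np

def first_harmonic(frames):
    coeff = np.fft.fft(frames, axis=0)[1]           # first temporal harmonic per pixel
    amplitude = 2 * np.abs(coeff) / frames.shape[0]
    phase = np.angle(coeff)                         # regional asynergy shifts the phase
    return amplitude, phase
```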

  13. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  14. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    Science.gov (United States)

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis, with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.
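
    One of the retention methods named above, Horn's parallel analysis, fits in a few lines: retain the factors whose observed eigenvalues exceed the mean eigenvalues obtained from random data of the same shape. A minimal sketch:

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Number of factors whose eigenvalues beat those of random data."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.zeros(p)
    for _ in range(n_iter):
        r = rng.normal(size=(n, p))
        rand += np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    return int(np.sum(obs > rand / n_iter))
```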

  15. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  16. Contributions to fuzzy polynomial techniques for stability analysis and control

    OpenAIRE

    Pitarch Pérez, José Luis

    2014-01-01

    The present thesis employs fuzzy-polynomial control techniques in order to improve the stability analysis and control of nonlinear systems. Initially, it reviews the more extended techniques in the field of Takagi-Sugeno fuzzy systems, such as the more relevant results about polynomial and fuzzy polynomial systems. The basic framework uses fuzzy polynomial models by Taylor series and sum-of-squares techniques (semidefinite programming) in order to obtain stability guarantees...

  17. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  18. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, André de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Hélène H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  19. Lithuanian Population Aging Factors Analysis

    Directory of Open Access Journals (Sweden)

    Agnė Garlauskaitė

    2015-05-01

    Full Text Available The aim of this article is to identify the factors that determine the aging of Lithuania's population and to assess their influence. The analysis consists of two main parts: the first describes population aging and its characteristics in theoretical terms, while the second assesses the trends and demographic factors that influence population aging and analyses its determinants in Lithuania. The article concludes that the decline in the birth rate and the excess of emigrants over immigrants have the greatest impact on population aging, so managing these demographic processes deserves particular attention.

  20. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)
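
    As a minimal illustration of what such a package evaluates, the toy below combines hypothetical basic-event probabilities through AND/OR gates to obtain a top-event probability; the real package adds graphical tree construction and a generic failure-rate data bank.

```python
# Toy fault-tree evaluation with independent basic events.
def gate_or(*p):        # at least one input event occurs
    out = 1.0
    for q in p:
        out *= 1.0 - q
    return 1.0 - out

def gate_and(*p):       # all input events occur
    out = 1.0
    for q in p:
        out *= q
    return out

valve, backup, sensor = 1e-3, 1e-2, 5e-4   # hypothetical failure probabilities
top = gate_or(gate_and(valve, backup), sensor)
print(f"top event probability: {top:.2e}")
```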

  1. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  2. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and carry various kinds of provenance information and age characteristics. Analyzing and researching ancient ceramics with modern analytical methods is the scientific foundation of the study of Chinese porcelain. In view of the properties of nuclear analysis techniques, their functions and applications in this field are discussed. (authors)

  3. Application of energy dispersive x-ray techniques for water analysis

    International Nuclear Information System (INIS)

    Funtua, I. I.

    2000-07-01

    Energy dispersive X-ray fluorescence (EDXRF) is a class of emission spectroscopic techniques that depends upon the emission of characteristic X-rays following excitation of the atomic electron energy levels by tube or isotopic-source X-rays. The technique has found a wide range of applications that include the determination of the chemical elements of water and water pollutants. Three EDXRF systems, the isotopic source, the secondary target and total reflection (TXRF), are available at the Centre for Energy Research and Training. These systems have been applied to the analysis of sediments, suspensions, ground water, river water and rainwater. The isotopic source is based on 55Fe, 109Cd and 241Am excitation, while the secondary target and total reflection systems utilize a Mo X-ray tube. Sample preparation requirements for water analysis range from physical and chemical pre-concentration steps to direct analysis, and elements from Al to U can be determined with these systems. The EDXRF techniques, TXRF in particular with its multielement capability, low detection limit and possibility of direct analysis of water, have a competitive edge over the traditional methods of atomic absorption and flame photometry.

  4. Human Factors Virtual Analysis Techniques for NASA's Space Launch System Ground Support using MSFC's Virtual Environments Lab (VEL)

    Science.gov (United States)

    Searcy, Brittani

    2017-01-01

    Using virtual environments to assess complex large-scale human tasks provides timely and cost-effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance with human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine whether a specific task can be safely performed by both a 5th-percentile (approx. 5 ft) female and a 95th-percentile (approx. 6 ft 1 in) male. In addition, the analysis helps identify any tools or other accommodations that may be needed to complete the task. These assessments are critical for the safety of ground support engineers and for keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame-by-frame basis, while virtual reality gives the actor (the person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors in SLS and the benefits of analyzing tasks in NASA MSFC's VEL.

  5. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of NoC. Hence, performance and cost evaluation of these parameter-dependent NoC is crucial in different design-stages, but the requirements on performance analysis differ from stage to stage. In an early design-stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design-stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing-models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. Conducting NoC-experiments with NoC-sizes from 9 to 36 functional units and various traffic patterns, characteristics of these experiments concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  6. Factor analysis with a priori knowledge - application in dynamic cardiac SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Sitek, A.; Di Bella, E.V.R.; Gullberg, G.T. [Medical Imaging Research Laboratory, Department of Radiology, University of Utah, CAMT, 729 Arapeen Drive, Salt Lake City, UT 84108-1218 (United States)

    2000-09-01

    Two factor analysis of dynamic structures (FADS) methods for the extraction of time-activity curves (TACs) from cardiac dynamic SPECT data sequences were investigated. One method was based on a least squares (LS) approach which was subject to positivity constraints. The other method was the well known apex-seeking (AS) method. A post-processing step utilizing a priori information was employed to correct for the non-uniqueness of the FADS solution. These methods were used to extract 99mTc-teboroxime TACs from computer simulations and from experimental canine and patient studies. In computer simulations, the LS and AS methods, which are completely different algorithms, yielded very similar and accurate results after application of the correction for non-uniqueness. FADS-obtained blood curves correlated well with curves derived from region of interest (ROI) measurements in the experimental studies. The results indicate that the factor analysis techniques can be used for semi-automatic estimation of activity curves derived from cardiac dynamic SPECT images, and that they can be used for separation of physiologically different regions in dynamic cardiac SPECT studies. (author)
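
    The LS-with-positivity idea can be approximated with off-the-shelf nonnegative matrix factorization, as in the sketch below on random stand-in data; the paper's post-processing step, which uses a priori information to resolve the non-uniqueness of the solution, is omitted here.

```python
# Factor a dynamic sequence A (time frames x voxels) into nonnegative
# time-activity curves and coefficient images.
import numpy as np
from sklearn.decomposition import NMF

A = np.abs(np.random.default_rng(2).normal(size=(60, 4096)))  # stand-in dynamic data
nmf = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
tacs = nmf.fit_transform(A)      # columns: time-activity curves of the factors
coef_imgs = nmf.components_      # rows: corresponding (flattened) factor images
```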

  7. Factor Economic Analysis at Forestry Enterprises

    Directory of Open Access Journals (Sweden)

    M.Yu. Chik

    2018-03-01

    Full Text Available Drawing on the scientific works of domestic and foreign scientists, the article examines the importance of economic analysis. The influence of individual factors on the change in the cost of harvested timber products is calculated by cost item, and the influence of factors on the change in costs per 1 UAH of sold products is determined using the full cost of sold products. Variable and fixed costs and their distribution are distinguished, since they affect the calculation of the impact of factors on cost changes per 1 UAH of sold products; the overall results of this calculation are also presented. Based on the factor analysis performed, a list of reserves for reducing production costs at forest enterprises is proposed, and the main sources of these reserves are investigated.

  8. Critical evaluation of sample pretreatment techniques.

    Science.gov (United States)

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  9. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for solving the system of nonlinear equations encountered in the MENT has been developed and tested. The capabilities of the MENT are demonstrated by the doublet structure analysis of noisy experimental data. The MENT results are compared with the results of the Fourier algorithm technique without regularization: the tolerable noise level is 30% for the MENT but only 0.1% for the Fourier algorithm.
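
    A toy version of the maximum-entropy idea is sketched below: recover a doublet from blurred, noisy data by trading the squared misfit against the entropy of the nonnegative solution with a fixed penalty weight. The MENT solves the exact constrained nonlinear system instead, so this is an analogy, not the published algorithm.

```python
# Penalized maximum-entropy deconvolution of a synthetic doublet.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
x = np.linspace(-1.0, 1.0, 64)
kernel = np.exp(-x**2 / 0.02); kernel /= kernel.sum()
truth = np.exp(-(x - 0.15)**2 / 0.002) + np.exp(-(x + 0.15)**2 / 0.002)
data = np.convolve(truth, kernel, mode="same") + 0.01 * rng.normal(size=x.size)

def objective(f, lam=0.05):
    p = f / f.sum()
    entropy = -(p * np.log(p + 1e-12)).sum()
    resid = np.convolve(f, kernel, mode="same") - data
    return np.sum(resid**2) - lam * entropy   # misfit minus weighted entropy

sol = minimize(objective, np.full(x.size, 0.5), method="L-BFGS-B",
               bounds=[(1e-9, None)] * x.size).x
```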

  10. Motor current and leakage flux signature analysis technique for condition monitoring

    International Nuclear Information System (INIS)

    Pillai, M.V.; Moorthy, R.I.K.; Mahajan, S.C.

    1994-01-01

    Till recently, analysis of vibration signals was the only means available to predict the state of health of plant equipment. Motor current and leakage magnetic flux signature analysis is acquiring importance as a technique for detecting incipient damage in electrical machines and as a supplementary technique for the diagnostics of driven equipment such as centrifugal and reciprocating pumps. The state of health of the driven equipment is assessed by analysing the time signal, the frequency spectrum and trends. For example, the pump vane frequency, piston stroke frequency, gear frequency and bearing frequencies are indicated in the current and flux spectra. By maintaining a periodic record of the amplitudes of the various frequency lines in the frequency spectra, it is possible to follow the trend of deterioration of parts and components of the pump. All problems arising out of inappropriate mechanical alignment of vertical pumps are easily identified by a combined analysis of current, flux and vibration signals. Current signature analysis is found to be a sufficient method in itself for the analysis of the state of health of reciprocating pumps and compressors. (author). 10 refs., 4 figs
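
    The trending step described above can be sketched as follows: compute the amplitude spectrum of the motor current and track the line at a known fault frequency across periodic records. The sampling rate and vane-passing frequency below are illustrative assumptions.

```python
# Amplitude of the spectral line nearest a fault frequency of interest.
import numpy as np

def line_amplitude(current, fs, f_line, df=0.5):
    spec = np.abs(np.fft.rfft(current * np.hanning(len(current)))) / len(current)
    freqs = np.fft.rfftfreq(len(current), 1.0 / fs)
    band = (freqs > f_line - df) & (freqs < f_line + df)
    return spec[band].max()

# Trend the vane-passing line over successive measurement records:
# amplitudes = [line_amplitude(rec, fs=5000.0, f_line=147.0) for rec in records]
```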

  11. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J R; Hutton, J T; Habermehl, M A [Adelaide Univ., SA (Australia); Van Moort, J [Tasmania Univ., Sandy Bay, TAS (Australia)

    1997-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  12. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  13. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
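
    A minimal version of such a pipeline on a hypothetical profile matrix (observations by depth bins) is sketched below; KMeans stands in for the fuzzy c-means used in the study, since scikit-learn provides no fuzzy clustering.

```python
# Standardize, reduce with PCA, then cluster the ADCP-like profiles.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

profiles = np.random.default_rng(3).normal(size=(2000, 40))  # stand-in ADCP data
Z = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(profiles))
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Z)
```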

  14. Bayesian linkage and segregation analysis: factoring the problem.

    Science.gov (United States)

    Matthysse, S

    2000-01-01

    Complex segregation analysis and linkage methods are mathematical techniques for the genetic dissection of complex diseases. They are used to delineate complex modes of familial transmission and to localize putative disease susceptibility loci to specific chromosomal locations. The computational problem of Bayesian linkage and segregation analysis is one of integration in high-dimensional spaces. In this paper, three available techniques for Bayesian linkage and segregation analysis are discussed: Markov Chain Monte Carlo (MCMC), importance sampling, and exact calculation. The contribution of each to the overall integration will be explicitly discussed.
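
    Of the three techniques, importance sampling is the simplest to illustrate. The toy below estimates an expectation under a target density from draws of a broader proposal, with self-normalized weights; real linkage problems replace these one-dimensional densities with high-dimensional genetic likelihoods.

```python
# Self-normalized importance sampling: E[x^2] under N(0, 1), sampled from N(0, 2).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
proposal = stats.norm(0, 2)
x = proposal.rvs(size=100_000, random_state=rng)
w = stats.norm(0, 1).pdf(x) / proposal.pdf(x)   # importance weights target/proposal
estimate = np.sum(w * x**2) / np.sum(w)         # exact answer is 1.0
```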

  15. Structural studies of formic acid using partial form-factor analysis

    International Nuclear Information System (INIS)

    Swan, G.; Dore, J.C.; Bellissent-Funel, M.C.

    1993-01-01

    Neutron diffraction measurements have been made of liquid formic acid using H/D isotopic substitution. Data are recorded for samples of DCOOD, HCOOD and a (H/D)COOD mixture (α_D = 0.36). A first-order difference method is used to determine the intra-molecular contribution through the introduction of a partial form-factor analysis technique incorporating a hydrogen-bond term. The method improves the sensitivity of the parameters defining the molecular geometry and avoids some of the ambiguities arising from terms involving spatial overlap of inter- and intra-molecular features. The possible application to other systems is briefly reviewed. (authors). 8 figs., 2 tabs., 8 refs

  16. Air pollution studies in Tianjing city using neutron activation analysis techniques

    International Nuclear Information System (INIS)

    Ni Bangfa; Tian Weizhi; Nie Nuiling; Wang Pingsheng

    1999-01-01

    Airborne samples were collected at two sites, in an industrial and a residential area of Tianjing city, during February and June using a PM-10 sampler and analyzed by NAA techniques. A comparison of air pollution between urban and rural areas of Tianjing city was made using neutron activation analysis and other data analysis techniques. (author)

  17. Application of Avco data analysis and prediction techniques (ADAPT) to prediction of sunspot activity

    Science.gov (United States)

    Hunter, H. E.; Amato, R. A.

    1972-01-01

    The results are presented of the application of Avco Data Analysis and Prediction Techniques (ADAPT) to derivation of new algorithms for the prediction of future sunspot activity. The ADAPT derived algorithms show a factor of 2 to 3 reduction in the expected 2-sigma errors in the estimates of the 81-day running average of the Zurich sunspot numbers. The report presents: (1) the best estimates for sunspot cycles 20 and 21, (2) a comparison of the ADAPT performance with conventional techniques, and (3) specific approaches to further reduction in the errors of estimated sunspot activity and to recovery of earlier sunspot historical data. The ADAPT programs are used both to derive regression algorithm for prediction of the entire 11-year sunspot cycle from the preceding two cycles and to derive extrapolation algorithms for extrapolating a given sunspot cycle based on any available portion of the cycle.

  18. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time-domain analysis of the crosstalk effect in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite-difference time-domain (FDTD) technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used to estimate the crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. The analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results: the crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
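
    The leapfrog update at the core of the technique can be sketched for a single lossy RLC line; the coupled bundle adds mutual inductance and capacitance terms to the same scheme. The per-unit-length parameters below are illustrative, not the paper's 32 nm interconnect data.

```python
# 1D FDTD for the telegrapher's equations on a lossy RLC line.
import numpy as np

R, L, C = 1e3, 1e-6, 1e-10          # per-unit-length R [ohm/m], L [H/m], C [F/m]
nx, dx = 200, 1e-4                  # cells and spatial step [m]
dt = 0.9 * dx * np.sqrt(L * C)      # time step limited by the Courant condition

v = np.zeros(nx + 1)                # node voltages
i = np.zeros(nx)                    # branch currents on a staggered half-grid
a = (L / dt - R / 2) / (L / dt + R / 2)
b = 1.0 / (dx * (L / dt + R / 2))

for _ in range(2000):
    i = a * i - b * (v[1:] - v[:-1])               # current update
    v[1:-1] -= dt / (C * dx) * (i[1:] - i[:-1])    # interior voltage update
    v[0] = 1.0                                      # unit-step source at the near end
    v[-1] += dt / (C * dx) * i[-1]                  # crude open-circuit far end
```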

  19. Colour and shape analysis techniques for weed detection in cereal fields

    DEFF Research Database (Denmark)

    Pérez, A.J; López, F; Benlloch, J.V.

    2000-01-01

    Information on weed distribution within the field is necessary to implement spatially variable herbicide application. This paper deals with the development of near-ground image capture and processing techniques in order to detect broad-leaved weeds in cereal crops under actual field conditions. The proposed methods use colour information to discriminate between vegetation and background, whilst shape analysis techniques are applied to distinguish between crop and weeds. The determination of crop row position helps to reduce the number of objects to which shape analysis techniques are applied. The performance of the algorithms was assessed by comparing the results with a human classification, providing an acceptable success rate. The study has shown that, despite the difficulties in accurately determining the number of seedlings (as in visual surveys), it is feasible to use image processing techniques for this purpose.
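
    The colour step can be illustrated with a common vegetation index; the sketch below uses the excess-green index with a fixed threshold, which is an assumption for illustration, not necessarily the index or threshold used by the authors. Shape features would then be computed per connected object in the resulting mask.

```python
# Vegetation/background separation with the excess-green index.
import numpy as np

def vegetation_mask(rgb, threshold=0.1):
    """rgb: float array (H, W, 3) scaled to [0, 1]."""
    s = rgb.sum(axis=2) + 1e-9
    r, g, b = rgb[..., 0] / s, rgb[..., 1] / s, rgb[..., 2] / s
    exg = 2 * g - r - b          # excess-green chromatic index
    return exg > threshold       # True where vegetation is likely
```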

  20. A Retrospective Analysis of Factors Affecting Early Stoma Complications.

    Science.gov (United States)

    Koc, Umit; Karaman, Kerem; Gomceli, Ismail; Dalgic, Tahsin; Ozer, Ilter; Ulas, Murat; Ercan, Metin; Bostanci, Erdal; Akoglu, Musa

    2017-01-01

    Despite advances in surgical techniques and products for stoma care, stoma-related complications are still common. A retrospective analysis was performed of the medical records of 462 consecutive patients (295 [63.9%] female, 167 [36.1%] male; mean age 55.5 ± 15.1 years; mean body mass index [BMI] 25.1 ± 5.2) who had undergone stoma creation at the Gastroenterological Surgery Clinic of Turkiye Yuksek İhtisas Teaching and Research Hospital between January 2008 and December 2012 to examine the incidence of early (ie, within 30 days after surgery) stoma complications and identify potential risk factors. Variables abstracted included gender, age, and BMI; existence of malignant disease; comorbidities (diabetes mellitus, hypertension, coronary artery disease, chronic respiratory disease); use of neoadjuvant chemoradiotherapy; permanent or temporary stoma; type of stoma (loop/end stoma); stoma localization; and the use of preoperative marking of the stoma site. Data were entered and analyzed using statistical software. Descriptive statistics, chi-squared, and Mann-Whitney U tests were used to describe and analyze all variables, and logistic regression analysis was used to determine independent risk factors for stoma complications. Ostomy-related complications developed in 131 patients (28.4%). Of these, superficial mucocutaneous separation was the most frequent complication (90 patients, 19.5%), followed by stoma retraction (15 patients, 3.2%). In univariate analysis, malignant disease (P = .025), creation of a colostomy (P = .002), and left lower quadrant stoma location (P < .05) were associated with the development of a stoma complication. Only stoma location was an independent risk factor for the development of a stoma complication (P = .044). The rate of stoma complications was not significantly different between patients who underwent nonemergent surgery (30% in patients preoperatively sited versus 28.4% not sited) and patients who underwent emergency surgery (27.1%). Early stoma complication rates were higher

  1. Sensitivity Analysis Techniques Applied in Video Streaming Service on Eucalyptus Cloud Environments

    Directory of Open Access Journals (Sweden)

    Rosangela Melo

    2018-01-01

    Full Text Available Nowadays, several streaming servers are available to provide a variety of multimedia applications such as Video on Demand in cloud computing environments. These environments hold business potential because of the pay-per-use model, as well as the advantages of easy scalability and up-to-date packages and programs. This paper uses hierarchical modeling and different sensitivity analysis techniques to determine the parameters that cause the greatest impact on the availability of a Video on Demand service. The results show that distinct approaches provide similar results regarding the sensitivity ranking, with specific exceptions. A combined evaluation indicates that system availability may be improved effectively by focusing on a reduced set of factors that produce large variation in the measure of interest.
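
    A toy version of the idea: rank the parameters of a simple series-availability model by the improvement each yields when varied. The components and numbers below are hypothetical and far simpler than the paper's hierarchical models of a streaming service.

```python
# One-at-a-time sensitivity ranking on a series availability model.
mtbf = {"frontend": 800.0, "streaming": 500.0, "storage": 1200.0}  # hours
mttr = {"frontend": 2.0, "streaming": 4.0, "storage": 8.0}         # hours

def availability(mtbf, mttr):
    a = 1.0
    for k in mtbf:
        a *= mtbf[k] / (mtbf[k] + mttr[k])
    return a

base = availability(mtbf, mttr)
gains = {k: availability(mtbf, {**mttr, k: mttr[k] / 2}) - base for k in mttr}
for k in sorted(gains, key=gains.get, reverse=True):
    print(f"halving MTTR of {k}: availability gain {gains[k]:.5f}")
```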

  2. Are factor analytical techniques used appropriately in the validation of health status questionnaires?

    DEFF Research Database (Denmark)

    de Vet, Henrica C W; Adér, Herman J; Terwee, Caroline B

    2005-01-01

    Factor analysis is widely used to evaluate whether questionnaire items can be grouped into clusters representing different dimensions of the construct under study. This review focuses on the appropriate use of factor analysis. The Medical Outcomes Study Short Form-36 (SF-36) is used as an example...... of the results and conclusions was often incomplete. Some of our results are specific for the SF-36, but the finding that both the application and the reporting of factor analysis leaves much room for improvement probably applies to other health status questionnaires as well. Optimal reporting and justification...

  3. Using Separable Nonnegative Matrix Factorization Techniques for the Analysis of Time-Resolved Raman Spectra

    Science.gov (United States)

    Luce, R.; Hildebrandt, P.; Kuhlmann, U.; Liesen, J.

    2016-09-01

    The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for non-negative matrix factorization which is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed.
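
    The matrix view of the approach can be sketched with scikit-learn's NMF as a stand-in for the separable-NMF algorithm of the paper (which additionally exploits the required non-overlapping marker bands): a series of spectra D (times by wavenumbers) is factored into concentration profiles and component spectra.

```python
import numpy as np
from sklearn.decomposition import NMF

D = np.abs(np.random.default_rng(5).normal(size=(50, 1200)))  # stand-in spectra
nmf = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
conc = nmf.fit_transform(D)   # concentration vs. time for each species (kinetics)
spectra = nmf.components_     # component Raman spectra
# Rate constants would then be fitted to `conc`, e.g. with scipy.optimize.curve_fit.
```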

  4. Nuclear techniques for on-line analysis in the mineral and energy industries

    International Nuclear Information System (INIS)

    Sowerby, B.D.; Watt, J.S.

    1994-01-01

    Nuclear techniques are the basis of many on-line analysis systems which are now widely used in the mineral and energy industries. Some of the systems developed by the CSIRO depend entirely on nuclear techniques; others use a combination of nuclear techniques and microwave, capacitance, or ultrasonic techniques. The continuous analysis and rapid response of these CSIRO systems has led to improved control of mining, processing and blending operations, with increased productivity valued at A$50 million per year to Australia, and $90 million per year worldwide. This paper reviews developments in nuclear on-line analysis systems by the On-Line Analysis Group in CSIRO at Lucas Heights. Commercialised systems based on this work analyse mineral and coal slurries and determine the ash and moisture contents of coal and coke on conveyors. This paper also reviews two on-line nuclear analysis systems recently developed and licensed to industry, firstly for the determination of the mass flow rates of oil/water/gas mixtures in pipelines, and secondly for determination of the moisture, specific energy, ash and fouling index in low rank coals. 8 refs., 3 tabs., 4 figs

  5. Quality-assurance techniques used with automated analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Killian, E.W.; Koeppen, L.D.; Femec, D.A.

    1994-01-01

    In the course of developing gamma-ray spectrum analysis algorithms for use by the Radiation Measurements Laboratory at the Idaho National Engineering Laboratory (INEL), several techniques have been developed that enhance and verify the quality of the analytical results. The use of these quality-assurance techniques is critical when gamma-ray analysis results from low-level environmental samples are used in risk assessment or site restoration and cleanup decisions. This paper describes four of the quality-assurance techniques that are in routine use at the laboratory. They are used for all types of samples, from reactor effluents to environmental samples. The techniques include: (1) the use of precision pulsers (with subsequent removal) to validate the correct operation of the spectrometer electronics for each and every spectrum acquired, (2) the use of naturally occurring and cosmically induced radionuclides in samples to help verify that the data acquisition and analysis were performed properly, (3) the use of an ambient background correction technique that involves superimposing ("mapping") sample photopeak fitting parameters onto multiple background spectra for accurate and more consistent quantification of the background activities, (4) the use of interactive, computer-driven graphics to review the automated locating and fitting of photopeaks and to allow for manual fitting of photopeaks

  6. Automated processing of first-pass radionuclide angiocardiography by factor analysis of dynamic structures

    Energy Technology Data Exchange (ETDEWEB)

    Cavailloles, F.; Valette, H.; Hebert, J.-L.; Bazin, J.-P.; Di Paola, R.; Capderou, A.

    1987-05-01

    A method for automatic processing of cardiac first-pass radionuclide studies is presented. This technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies. A description of the factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative information derived from the factors (cardiac output and mean transit time) was compared to that obtained by the region-of-interest (ROI) method. Using FADS, a higher correlation with cardiac catheterization was found for cardiac output calculation. Thus, compared to the ROI method, FADS presents obvious advantages: a good separation of overlapping cardiac chambers is obtained, and this operator-independent method provides more objective and reproducible results.

  7. Automated processing of first-pass radionuclide angiocardiography by factor analysis of dynamic structures

    International Nuclear Information System (INIS)

    Cavailloles, F.; Valette, H.; Hebert, J.-L.; Bazin, J.-P.; Di Paola, R.; Capderou, A.

    1987-01-01

    A method for automatic processing of cardiac first-pass radionuclide studies is presented. This technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies. A description of the factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative information derived from the factors (cardiac output and mean transit time) was compared to that obtained by the region-of-interest (ROI) method. Using FADS, a higher correlation with cardiac catheterization was found for cardiac output calculation. Thus, compared to the ROI method, FADS presents obvious advantages: a good separation of overlapping cardiac chambers is obtained, and this operator-independent method provides more objective and reproducible results. (author)

  8. Analysis of rocks involving the x-ray diffraction, infrared and thermal gravimetric techniques

    International Nuclear Information System (INIS)

    Ikram, M.; Rauf, M.A.; Munir, N.

    1998-01-01

    The chemical composition of rocks and minerals is usually obtained by a number of analytical techniques. The purpose of the present work is to investigate the chemical composition of the rock samples and to find out how closely the results obtained by different instrumental methods are related. Chemical tests were performed before using the instrumental techniques in order to determine the nature of these rocks. The chemical analysis indicated mainly the presence of carbonate and hence the carbonate nature of these rocks. The X-ray diffraction, infrared spectroscopy and thermal gravimetric analysis techniques were used for the determination of the chemical composition of these samples. The results obtained by using these techniques have shown a great deal of similarity. (author)

  9. Analysis of factors affecting the effect of stope leaching

    International Nuclear Information System (INIS)

    Xie Wangnan; Dong Chunming

    2014-01-01

    The industrial test and industrial trial production of stope leaching were carried out at the Taoshan orefield of the Dabu deposit. The results of the test and trial production showed obvious differences in leaching rate and leaching time. Compared with the industrial trial production of stope leaching, the leaching rate of the industrial test was higher, and the leaching time was shorter. According to the analysis, the blasting method and liquid arrangement were considered the main factors affecting the leaching rate and leaching time. We therefore put forward the following suggestions: the technique of deep hole slicing tight-face blasting should be used to reduce the yield of lump ores, effective liquid arrangement methods should be adopted to make the lixiviant infiltrate throughout the whole ore heap, and bacterial leaching should be introduced. (authors)

  10. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting performance assessments, as well as its propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over some orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally and owing to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as the input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site.
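
    The sampling-and-summary workflow this abstract describes can be illustrated in a few lines. The following is a minimal, hypothetical example (parameter names, ranges and the transfer function are invented for illustration): parameters spanning orders of magnitude are sampled log-uniformly, and the mean, median and 90th percentile of the resulting dose distribution are compared:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000
        # Hypothetical model: dose depends on two parameters ranging over orders
        # of magnitude, so they are sampled log-uniformly (illustrative only).
        perm = 10 ** rng.uniform(-9, -6, n)    # permeability-like parameter
        sorp = 10 ** rng.uniform(0, 3, n)      # sorption-like parameter
        dose = 1e-3 * perm / (1e-9 * sorp)     # stand-in transfer function

        for name, val in [("mean", dose.mean()),
                          ("median", np.median(dose)),
                          ("90th percentile", np.quantile(dose, 0.9))]:
            print(f"{name}: {val:.3g}")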

  11. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  12. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  13. Confirmatory factors analysis of science teacher leadership in the Thailand world-class standard schools

    Science.gov (United States)

    Thawinkarn, Dawruwan

    2018-01-01

    This research aims to analyze factors of science teacher leadership in the Thailand World-Class Standard Schools. The research instrument was a five-point rating scale questionnaire with a reliability of 0.986. The sample group included 500 science teachers from World-Class Standard Schools who had been selected by using the stratified random sampling technique. Factor analysis of science teacher leadership in the Thailand World-Class Standard Schools was conducted by using Mplus for Windows. The results are as follows: The confirmatory factor analysis of science teacher leadership in the Thailand World-Class Standard Schools revealed that the model significantly correlated with the empirical data. The fit index values were χ2 = 105.655, df = 88, P-value = 0.086, TLI = 0.997, CFI = 0.999, RMSEA = 0.022, and SRMR = 0.019. The factor loadings of science teacher leadership were positive, with statistical significance at the 0.01 level. The loadings of the six factors were between 0.871 and 0.996. The highest factor loading was the professional learning community, followed by child-centered instruction, participation in development, the role model in teaching, transformational leaders, and self-development, with factor loadings of 0.996, 0.928, 0.911, 0.907, 0.901, and 0.871, respectively. The reliability of each factor was 99.1%, 86.0%, 83.0%, 82.2%, 81.0%, and 75.8%, respectively.

  14. Measurement of g factors of excited states in radioactive beams by the transient field technique: 132Te

    International Nuclear Information System (INIS)

    Benczer-Koller, N.; Kumbartzki, G.; Gurdal, G; Gross, Carl J; Krieger, B; Hatarik, Robert; O'Malley, Patrick; Pain, S. D.; Segen, L.; Baktash, Cyrus; Bingham, C. R.; Danchev, M.; Grzywacz, R.; Mazzocchi, C.

    2008-01-01

    The g factor of the first excited 2+ state in 132Te (Z = 52), E(2+) = 0.9739 MeV, τ = 2.6 ps, was measured by the transient field technique applied to a radioactive beam. The development of an experimental approach necessary for work in radioactive beam environments is described. The result g = 0.28(15) agrees with the previous measurement by the recoil-in-vacuum technique, but here the sign of the g factor is measured as well.

  15. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume aims at a wide range of readers and researchers in the area of Big Data, presenting the recent advances in the field of Big Data analysis as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data analysis and recent techniques and environments for Big Data analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting parallel, grid, and cloud computing environments.

  16. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...
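
    A minimal sketch of such a comparison using scikit-learn (the data set here is synthetic; the cited paper used public bug data sets and its own selection of learners):

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import GaussianNB
        from sklearn.tree import DecisionTreeClassifier

        # Stand-in for a public bug data set: features play the role of code
        # metrics, the target is defect-prone yes/no (imbalanced on purpose).
        X, y = make_classification(n_samples=500, n_features=20,
                                   weights=[0.8], random_state=0)

        for clf in (GaussianNB(),
                    DecisionTreeClassifier(random_state=0),
                    RandomForestClassifier(random_state=0)):
            scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
            print(type(clf).__name__, round(scores.mean(), 3))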

  17. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Due to the complexity of the motion, shot put technique is described in phases for easier analysis, easier learning of the technique and error correction. It is complete, so that in its implementation the transition from phase to phase is not noticed. In the aforementioned and described phases of the O'Brian spinal shot put technique, a large distance, emptiness and disconnection appear between the initial position phase and the phase of overtaking the device, which in the training methods and technique training in primary and secondary education, as well as for students and athletes who are beginners in shot put, represents a major problem regarding connecting, training and technique advancement. Therefore, this work is aimed at facilitating the methods of training of shot put technique, extending the analysis from four to six phases, which have been described and include the complete O'Brian technique.

  18. A Rasch and factor analysis of the Functional Assessment of Cancer Therapy-General (FACT-G

    Directory of Open Access Journals (Sweden)

    Selby Peter J

    2007-04-01

    Background: Although the Functional Assessment of Cancer Therapy - General questionnaire (FACT-G) has been validated, few studies have explored the factor structure of the instrument, in particular using non-sample-dependent measurement techniques such as Rasch models. Furthermore, few studies have explored the relationship between item fit to the Rasch model and clinical utility. The aim of this study was to investigate the dimensionality and measurement properties of the FACT-G with Rasch models and factor analysis. Methods: A factor analysis and Rasch analysis (Partial Credit Model) were carried out on the FACT-G completed by a heterogeneous sample of cancer patients (n = 465). For the Rasch analysis, item fit (infit mean squares ≥ 1.30), dimensionality and item invariance were assessed. The impact of removing misfitting items on the clinical utility of the subscales and the FACT-G total scale was also assessed. Results: The factor analysis demonstrated a four-factor structure of the FACT-G which broadly corresponded to the four subscales of the instrument. Internal consistency for these four scales was very good (Cronbach's alpha 0.72-0.85). The Rasch analysis demonstrated that each of the subscales and the FACT-G total scale had misfitting items (infit mean squares ≥ 1.30). All these scales, with the exception of the Social & Family Well-being scale (SFWB), were unidimensional. When misfitting items were removed, the effect sizes and the clinical utility of the instrument were maintained for the subscales and the total FACT-G scores. Conclusion: The results of the traditional factor analysis and Rasch analysis of the FACT-G broadly agreed. Caution should be exercised when utilising the Social & Family Well-being scale, and further work is required to determine whether this scale is best represented by two factors. Additionally, removing misfitting items from scales should be performed alongside an assessment of the impact on clinical utility.

  19. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization techniques.

    Science.gov (United States)

    Chen, Shyi-Ming; Manalu, Gandhi Maruli Tua; Pan, Jeng-Shyang; Liu, Hsiang-Chuan

    2013-06-01

    In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization (PSO) techniques. First, we fuzzify the historical training data of the main factor and the secondary factor, respectively, to form two-factors second-order fuzzy logical relationships. Then, we group the two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Then, we obtain the optimal weighting vector for each fuzzy-trend logical relationship group by using PSO techniques to perform the forecasting. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index and the NTD/USD exchange rates. The experimental results show that the proposed method gets better forecasting performance than the existing methods.
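
    The PSO component can be illustrated independently of the fuzzy-trend machinery. Below is a generic global-best PSO in Python minimizing a toy stand-in for forecast error over a weight vector; the inertia and acceleration constants are common textbook defaults, not the authors' settings:

        import numpy as np

        rng = np.random.default_rng(2)

        def pso(objective, dim, n_particles=30, iters=200, bounds=(0.0, 1.0)):
            # Canonical global-best PSO: inertia plus cognitive/social pulls.
            lo, hi = bounds
            x = rng.uniform(lo, hi, (n_particles, dim))
            v = np.zeros_like(x)
            pbest, pval = x.copy(), np.array([objective(p) for p in x])
            g = pbest[pval.argmin()]
            for _ in range(iters):
                r1, r2 = rng.random((2, n_particles, dim))
                v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)
                f = np.array([objective(p) for p in x])
                better = f < pval
                pbest[better], pval[better] = x[better], f[better]
                g = pbest[pval.argmin()]
            return g, pval.min()

        # Toy objective standing in for forecast error as a function of weights.
        target = np.array([0.2, 0.5, 0.3])
        w, err = pso(lambda p: np.sum((p / p.sum() - target) ** 2), dim=3)
        print(w / w.sum(), err)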

  20. Multiple factor analysis by example using R

    CERN Document Server

    Pagès, Jérôme

    2014-01-01

    Multiple factor analysis (MFA) enables users to analyze tables of individuals and variables in which the variables are structured into quantitative, qualitative, or mixed groups. Written by the co-developer of this methodology, Multiple Factor Analysis by Example Using R brings together the theoretical and methodological aspects of MFA. It also includes examples of applications and details of how to implement MFA using an R package (FactoMineR).The first two chapters cover the basic factorial analysis methods of principal component analysis (PCA) and multiple correspondence analysis (MCA). The
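
    The core idea of MFA can be sketched without FactoMineR: each group of variables is standardized and then divided by the square root of the first eigenvalue of its own PCA, so that no single group dominates the subsequent global PCA. A rough numpy illustration under these assumptions (synthetic data, quantitative groups only):

        import numpy as np

        rng = np.random.default_rng(3)
        # Two quantitative variable groups measured on the same 100 individuals.
        g1 = rng.normal(size=(100, 4))
        g2 = rng.normal(size=(100, 6))

        def group_weighted(X):
            # Standardize, then divide by sqrt(first eigenvalue of the group
            # PCA) so every group's leading direction carries the same inertia.
            Xc = (X - X.mean(0)) / X.std(0)
            lam1 = np.linalg.svd(Xc, compute_uv=False)[0] ** 2 / len(Xc)
            return Xc / np.sqrt(lam1)

        Z = np.hstack([group_weighted(g1), group_weighted(g2)])
        U, s, _ = np.linalg.svd(Z - Z.mean(0), full_matrices=False)
        global_scores = U[:, :2] * s[:2]   # coordinates of individuals, axes 1-2
        print(global_scores.shape)         # (100, 2)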

  1. Analysis Of Factors Causing Delays On Harun Nafsi - Hm Rifadin Street In Samarinda East Kalimantan Maintenance Project

    Directory of Open Access Journals (Sweden)

    Fadli

    2017-12-01

    This study aims to identify, analyze and describe the factors that affect project maintenance delay on Harun Nafsi - HM. Rifadin Street in Samarinda, East Kalimantan. This research uses a qualitative research method utilizing questionnaires. The 30 participating respondents consist of 14 project implementers and 16 field implementers. The data are analyzed by descriptive statistics, factor analysis and linear regression analysis. The results show that the factors influencing the delay of the maintenance project of Harun Nafsi - HM Rifadin Street include (1) the time factor and workmanship factor, (2) human resources and natural factors, (3) geographical conditions, late approval of plan changes and labor strikes, and (4) non-optimal working levels and changes in the scope of the project while work is still ongoing. Based on the multiple linear regression analysis, a coefficient of determination of 0.824 is obtained. This means that the four factors studied account for 82.4% of project delays and the remaining 17.6% is influenced by other variables outside this study. The results of this study also indicate that the dominant factor in road maintenance project delays is the fourth of the factors mentioned. The effort that the contractor needs to undertake is not to expand the employment contract if the project is underway or the contractor does not have the capability to complete another project.

  2. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D D; Bailey, G; Martin, J; Garton, D; Noorman, H; Stelcer, E; Johnson, P [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1994-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 µm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  3. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 µm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  4. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is to provide improved plant equipment monitoring capabilities. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operations of the movers, driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude- and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce specific time and frequency signatures at the output of the specialized detectors which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors has been characterized.
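
    The amplitude-demodulation idea is easy to reproduce: load variations modulate the envelope of the motor supply current, and that envelope can be recovered with a Hilbert transform. A minimal sketch on a synthetic current waveform (the numbers are illustrative, not ORNL's patented processing):

        import numpy as np
        from scipy.signal import hilbert

        fs = 10_000.0                       # sample rate, Hz
        t = np.arange(0, 1.0, 1 / fs)
        # 60 Hz supply current whose amplitude is modulated at 7 Hz by a load.
        i_line = (1.0 + 0.05 * np.sin(2 * np.pi * 7 * t)) \
                 * np.sin(2 * np.pi * 60 * t)

        envelope = np.abs(hilbert(i_line))  # amplitude demodulation
        spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
        freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
        print("dominant modulation frequency:", freqs[spectrum.argmax()], "Hz")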

  5. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices.This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere
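
    As a concrete (and deliberately simplified) example of the factor-model idea the book builds on, factor loadings for a single asset can be estimated by ordinary least squares against observed factor returns; everything below is simulated:

        import numpy as np

        rng = np.random.default_rng(4)
        T, k = 1000, 3
        F = rng.normal(scale=0.01, size=(T, k))     # factor return series
        beta_true = np.array([0.8, -0.2, 0.5])
        r = F @ beta_true + rng.normal(scale=0.005, size=T)  # asset returns

        # Estimate loadings by least squares: r = alpha + F beta + eps
        X = np.column_stack([np.ones(T), F])        # intercept + factors
        coef, *_ = np.linalg.lstsq(X, r, rcond=None)
        print("alpha:", coef[0].round(5), "betas:", coef[1:].round(3))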

  6. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel
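
    The H-matrix method is usually described as a covariance-based chi-squared discriminant: events are scored by their Mahalanobis-style distance from a reference class. A toy sketch under that reading (synthetic kinematic variables, nothing D0-specific):

        import numpy as np

        rng = np.random.default_rng(5)
        # Toy "signal" and "background" event samples in a few variables.
        sig = rng.multivariate_normal([2.0, 1.0, 0.5], np.eye(3) * 0.5, 2000)
        bkg = rng.multivariate_normal([0.0, 0.0, 0.0], np.eye(3), 2000)

        # H matrix = inverse covariance of the reference class; chi2 measures
        # how consistent an event is with that class.
        mu = sig.mean(0)
        H = np.linalg.inv(np.cov(sig, rowvar=False))

        def chi2(x):
            d = x - mu
            return np.einsum("...i,ij,...j->...", d, H, d)

        print("median chi2, signal:    ", round(float(np.median(chi2(sig))), 2))
        print("median chi2, background:", round(float(np.median(chi2(bkg))), 2))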

  7. Development of human factors experimental evaluation techniques - Human factors design and evaluation of the main control room of atomic power plants using visual display terminals

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sung Ho; Chung, Min Kyun; Choi, Kyung Lim; Song, Young Woong; Eoh, Hong Joon; Lee, In Suk; Kim, Bom Soo; Yeo, Yun Shin; Lee, Haeo Sun [Pohang University of Science and Technology, Pohang (Korea, Republic of)

    1995-07-01

    This study was conducted to build a prototype of the operator interface on a visual display terminal, and to provide a framework for using and applying prototyping techniques. A typical subsystem of an MCR was prototyped and validated by operator testing. Lessons learned during the development, as well as the prototyping techniques themselves, are described in this report for efficient development. In addition, human factors experimental plans were surveyed and summarized for evaluating new design alternatives as well as the current design of the operator interface. The major results of this study are as follows: a method for designing an operator interface prototype on a VDT; a prototype of the operator interface of a typical subsystem; guidelines for applying prototyping techniques; characteristics and major considerations of experimental plans; guidelines for analyzing experimental data; and a paradigm for human factors experimentation including experimental designs, procedures, and data analyses. 27 refs., 60 tabs., 47 figs. (author)

  8. Variable Eddington factors and flux-limiting diffusion coefficients

    International Nuclear Information System (INIS)

    Whalen, P.P.

    1982-01-01

    Variable Eddington factors and flux limiting diffusion coefficients arise in two common techniques of closing the moment equations of transport. The first two moment equations of the full transport equation are still frequently used to solve many problems of radiative or particle transport. An approximate analysis, developed by Levermore, exhibits the relation between the coefficients of the two different techniques. This analysis is described and then used to test the validity of several commonly used flux limiters and Eddington factors. All of the ad-hoc flux limiters have limited validity. All of the variable Eddington factors derived from some underlying description of the angular distribution function are generally valid. The use of coefficients from Minerbo's elegant maximum entropy Eddington factor analysis is suggested for use in either flux limited diffusion or variable Eddington factor equations
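
    In Levermore's formulation, the flux limiter λ(R) of flux-limited diffusion and the variable Eddington factor χ(R) are functions of the same normalized energy-density gradient, which is what allows the two closure techniques to be compared. With E the radiation energy density, σ the opacity, and c the speed of light (a standard statement of the relation, reproduced here for orientation):

        \[
          \mathbf{F} = -\frac{c\,\lambda(R)}{\sigma}\,\nabla E,
          \qquad
          R = \frac{|\nabla E|}{\sigma E},
          \qquad
          \chi(R) = \lambda(R) + \lambda(R)^{2} R^{2}.
        \]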

  9. An Analysis of Trainers' Perspectives within an Ecological Framework: Factors that Influence Mine Safety Training Processes

    Directory of Open Access Journals (Sweden)

    Emily J. Haas

    2014-09-01

    Conclusion: This study offers a new technique to identify limitations in safety training systems and processes. The analysis suggests that training should be developed and disseminated with consideration of various levels—individual, interpersonal, organizational, and community—to promote skills. If factors identified within and between levels are addressed, it may be easier to sustain mineworker competencies that are established during safety training.

  10. Comparative Analysis of Some Techniques in the Biological ...

    African Journals Online (AJOL)

    The experiments involved the simulation of conditions of a major spill by pouring crude oil on the cells from perforated cans and the in-situ bioremediation of the polluted soils using the techniques that consisted in the manipulation of different variables within the soil environment. The analysis of soil characteristics after a ...

  11. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    Bourrel, F.; Courriere, Ph.

    2003-01-01

    This paper presents the different steps of the radio-labelling of a molecule for two purposes, radio-immuno-analysis and autoradiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha-, beta- and gamma-radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and obtention of labelled molecules: gamma emitters (125I, 57Co); beta emitters; obtention of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodination, tritium); purification of the labelled compound (dialysis, gel filtration or molecular exclusion chromatography, high performance liquid chromatography); quality estimation of the labelled compound (labelling efficiency calculation, immuno-reactivity conservation, stability and preservation). (J.S.)

  12. The development of human behaviour analysis techniques -The development of human factors technologies-

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Cheon, Se Woo; Shu, Sang Moon; Park, Geun Ok; Lee, Yong Hee; Lee, Han Yeong; Park, Jae Chang; Lee, Eu Jin; Lee, Seung Hee

    1994-04-01

    This project has two major areas: one is the development of operator task simulation software and the other is the development of human error analysis and application technologies. In this second year of the project, for the development of the operator task simulation software, we studied the following: - analysis of the characteristics of operator tasks, - development of operator task structures: macro structures, - development of an operator task simulation analyzer, - analysis of performance measures. And the following for the development of human error analysis and application technologies: - analysis of human error mechanisms, - analysis of human error characteristics in tasks, - analysis of human error occurrences in Korean nuclear power plants, - establishment of an experimental environment for human error data collection with the Compact Nuclear Simulator, - basic design of a multimedia-based human error representing system. (Author)

  13. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    International Nuclear Information System (INIS)

    Pham, Bihn T.; Einerson, Jeffrey J.

    2010-01-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.

  14. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
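
    Of the three techniques, control charting is the simplest to sketch. The snippet below flags a drifting thermocouple once readings leave 3-sigma limits derived from an in-control baseline window; the data and limits are hypothetical, not NDMAS output:

        import numpy as np

        rng = np.random.default_rng(6)
        # Hypothetical thermocouple record: stable readings, then a slow drift.
        tc = np.r_[rng.normal(1000, 2, 300), 1000 + np.linspace(0, 40, 100)]

        window = 50
        base = tc[:window]                        # in-control baseline
        center, sigma = base.mean(), base.std()
        ucl, lcl = center + 3 * sigma, center - 3 * sigma   # control limits

        out = np.where((tc > ucl) | (tc < lcl))[0]
        print(f"limits: [{lcl:.1f}, {ucl:.1f}]; "
              f"first out-of-control sample: {out[0]}")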

  15. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures; these can be removed without damaging existing structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when it is necessary not to damage the surface of structures. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the elaborated procedure by comparing the results with those derived from traditional measuring techniques.
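
    Marker tracking of this kind is commonly done by template matching between successive images. A self-contained sketch with OpenCV on synthetic frames (the real study used Matlab and photographs of instrumented walls):

        import numpy as np
        import cv2

        rng = np.random.default_rng(7)

        # Two synthetic frames of a wall with one bright square marker that
        # shifts by a few pixels between acquisitions (stand-in for photos).
        def frame(x, y):
            img = rng.normal(100, 5, (200, 200)).astype(np.float32)
            img[y:y + 10, x:x + 10] = 255.0
            return img

        f0, f1 = frame(80, 90), frame(83, 92)
        tmpl = f0[85:105, 75:105].copy()      # patch around the marker in f0

        res = cv2.matchTemplate(f1, tmpl, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(res)
        dx, dy = max_loc[0] - 75, max_loc[1] - 85
        print("estimated marker displacement (px):", dx, dy)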

  16. Control charts technique - a tool to data analysis for chemical experiments

    International Nuclear Information System (INIS)

    Yadav, M.B.; Venugopal, V.

    1999-01-01

    A procedure using control charts technique has been developed to analyse data of a chemical experiment which was conducted to assign a value to uranium content in Rb 2 U(SO 4 ) 3 . A value of (34.164 ± 0.031)% has been assigned against (34.167 ± 0.042)% already assigned by analysis of variance (ANOVA) technique. These values do not differ significantly. Merits and demerits of the two techniques have been discussed. (author)

  17. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new stat...

  18. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  19. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  20. Application of multivariate statistical techniques for differentiation of ripe banana flour based on the composition of elements.

    Science.gov (United States)

    Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat

    2009-01-01

    Major (sodium, potassium, calcium, magnesium) and minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) of Cavendish banana flour and Dream banana flour were determined, and data were analyzed using multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study presents the usefulness of multivariate statistical techniques for analysis and interpretation of complex mineral content data from banana flour of different varieties.
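
    The discriminant-analysis step can be reproduced generically with scikit-learn. The element concentrations below are simulated stand-ins, not the published measurements; the point is the workflow of fitting an LDA classifier and checking the assignment rate:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(8)
        # Hypothetical concentrations (e.g. Mg, Na, Fe, K) for two flour types.
        cavendish = rng.normal([40, 12, 5, 3], 1.0, size=(30, 4))
        dream = rng.normal([43, 10, 5, 4], 1.0, size=(30, 4))
        X = np.vstack([cavendish, dream])
        y = np.array([0] * 30 + [1] * 30)

        lda = LinearDiscriminantAnalysis()
        print("cross-validated assignment accuracy:",
              cross_val_score(lda, X, y, cv=5).mean())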

  1. Bad splits in bilateral sagittal split osteotomy: systematic review and meta-analysis of reported risk factors.

    Science.gov (United States)

    Steenen, S A; van Wijk, A J; Becking, A G

    2016-08-01

    An unfavourable and unanticipated pattern of the bilateral sagittal split osteotomy (BSSO) is generally referred to as a 'bad split'. Patient factors predictive of a bad split reported in the literature are controversial. Suggested risk factors are reviewed in this article. A systematic review was undertaken, yielding a total of 30 studies published between 1971 and 2015 reporting the incidence of bad split and patient age, and/or surgical technique employed, and/or the presence of third molars. These included 22 retrospective cohort studies, six prospective cohort studies, one matched-pair analysis, and one case series. Spearman's rank correlation showed a statistically significant but weak correlation between increasing average age and increasing occurrence of bad splits in 18 studies (ρ = 0.229; P < 0.05). No significant difference was found in the incidence of bad split among the different splitting techniques. A meta-analysis pooling the effect sizes of seven cohort studies showed no significant difference in the incidence of bad split between cohorts of patients with third molars present and concomitantly removed during surgery, and patients in whom third molars were removed at least 6 months preoperatively (odds ratio 1.16, 95% confidence interval 0.73-1.85, Z = 0.64, P = 0.52). In summary, there is no robust evidence to date to show that any risk factor influences the incidence of bad split.

  2. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss. However, how to predict grasshoppers accurately and effectively has been a difficult problem for a long time. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and the development of grasshopper population monitoring and the common algorithms of spectral analysis techniques are illustrated. Meanwhile, the traditional methods are compared with the spectral technology. Remote sensing has been applied in monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. The NDVI values can be derived from remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing techniques, which can be used to monitor grasshoppers more exactly, have advantages in measuring the degree of damage and classifying damage areas of grasshoppers, so they can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between the characteristic parameters of hyperspectra and leaf area index (LAI), and to indicate the intensity of grasshopper damage. The technology of near infrared reflectance spectroscopy has been employed in judging grasshopper species, examining species occurrences and monitoring hatching places by measuring the humidity and nutrients of soil, and can be used to investigate and observe grasshoppers in sample research. According to this paper, it is concluded that the spectral analysis technique could be used as a quick and exact tool in monitoring and forecasting the infestation of grasshoppers, and will become an important means in such research for its advantages in determining spatial orientation, information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring
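
    The NDVI mentioned above is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red). A minimal numpy sketch on synthetic reflectance rasters, with an illustrative (not calibrated) threshold for flagging sparsely vegetated, possibly damaged pixels:

        import numpy as np

        rng = np.random.default_rng(9)
        # Synthetic red and near-infrared reflectance bands.
        red = rng.uniform(0.05, 0.2, (100, 100))
        nir = rng.uniform(0.3, 0.6, (100, 100))

        ndvi = (nir - red) / (nir + red)
        damaged = ndvi < 0.4        # illustrative threshold, not calibrated
        print("mean NDVI:", round(float(ndvi.mean()), 3),
              "| fraction flagged:", round(float(damaged.mean()), 3))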

  3. Measurements of K shell absorption jump factors and jump ratios using EDXRF technique

    Science.gov (United States)

    Kacal, Mustafa Recep; Han, İbrahim; Akman, Ferdi

    2015-04-01

    In the present work, the K-shell absorption jump factors and jump ratios for 30 elements between Ti (Z = 22) and Er (Z = 68) were measured by the energy dispersive X-ray fluorescence (EDXRF) technique. The jump factors and jump ratios for these elements were determined by measuring K-shell fluorescence parameters such as the Kα X-ray production cross-sections, K-shell fluorescence yields, Kβ-to-Kα X-ray intensity ratios, total atomic absorption cross-sections and mass attenuation coefficients. The measurements were performed using an Am-241 radioactive point source and a Si(Li) detector in direct excitation and transmission experimental geometry. The results for jump factors and jump ratios were compared with theoretically calculated values and those available in the literature.

  4. The Factors Affecting the Sensitivity of the Ultrasonic Inter-Modulation Technique

    International Nuclear Information System (INIS)

    Courtney, C. R. P.; Drinkwater, B. W.; Neild, S. A.; Wilcox, P. D.

    2007-01-01

    A global non-destructive testing technique for detecting cracks in metal parts has been developed and the factors affecting its sensitivity investigated. A sample is excited at very-high-order modes of vibration at two frequencies and the frequency mixing measured. Experiments with fatigue-cracked steel beams demonstrate that these defects produce a strong mixing effect and that the signal relating to the frequency mixing is sensitive to the length of the crack. The sensitivity is also shown to be reliant on the modes of vibration used

  5. Techniques of sample attack used in soil and mineral analysis. Phase I

    International Nuclear Information System (INIS)

    Chiu, N.W.; Dean, J.R.; Sill, C.W.

    1984-07-01

    Several techniques of sample attack for the determination of radioisotopes are reviewed. These techniques include: 1) digestion with nitric or hydrochloric acid in a Parr digestion bomb, 2) digestion with a mixture of nitric and hydrochloric acids, 3) digestion with a mixture of hydrofluoric, nitric and perchloric acids, and 4) fusion with sodium carbonate, potassium fluoride or alkali pyrosulfates. The effectiveness of these techniques in decomposing various soils and minerals containing radioisotopes such as lead-210, uranium, thorium and radium-226 is discussed. The combined procedure of potassium fluoride fusion followed by alkali pyrosulfate fusion is recommended for radium-226, uranium and thorium analysis. This technique guarantees the complete dissolution of samples containing refractory materials such as silica, silicates, carbides, oxides and sulfates. For lead-210 analysis, the procedure of digestion with a mixture of hydrofluoric, nitric and perchloric acids followed by fusion with alkali pyrosulfate is recommended. These two procedures are detailed. Schemes for the sequential separation of the radioisotopes from a dissolved sample solution are outlined. Procedures for radiochemical analysis are suggested.

  6. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle induced X-ray emission and synchrotron-produced X-ray fluorescence, are also briefly discussed.

  7. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

    The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices, and were administered as a 'tailored cloze.' 29 references listed.…

  8. Success rate and risk factors of failure of the induced membrane technique in children: a systematic review.

    Science.gov (United States)

    Aurégan, Jean-Charles; Bégué, Thierry; Rigoulot, Guillaume; Glorion, Christophe; Pannier, Stéphanie

    2016-12-01

    The induced membrane technique was designed by Masquelet et al. to address segmental bone defects of critical size in adults. It has been used for bone defects of traumatic, infectious and tumoral origin with satisfactory results. Recently, it has been used in children but, after an initial enthusiasm, several cases of failure have been reported. The purpose of this study was to assess the success rate and the risk factors of failure of the induced membrane technique for children. We conducted a systematic review of all the studies reporting the results of the induced membrane technique to address bone defects of critical size in children. Our primary outcome was the success rate of the technique, defined as bone union before any iterative surgery. Our secondary outcomes were the complications and the risk factors of failure. We searched Medline via Pubmed, EMBASE and the Cochrane Library. Twelve studies, including 69 patients, met the inclusion criteria. There were 41 boys and 28 girls. Mean age at surgery was 10 years. Mean size of resection was 12.38 cm and the mean time between the two stages was 5.86 months. The mean rate of bone union after the two stages of the induced membrane technique was 58% (40/69), but this rate increased to 87% after revision surgeries (60/69). The main complications were non-unions (19/69), lysis of the graft (6/69) and fractures of the bone graft (6/69). Only 1/69 deep infection was reported. Other non-specific complications were regularly reported, such as limb length discrepancies, joint stiffness and protruding wires. Suspected risk factors of failure comprised the resection of a malignant tumour, a bone defect located at the femur, a wide resection, a long time between the two stages, an unstable osteosynthesis and a bone graft combining autograft with other graft materials. The induced membrane technique is suitable for bone defects of critical size in children. It is a reliable technique with no need for microvascular surgery.

  9. Sentiment Analysis in Geo Social Streams by using Machine Learning Techniques

    OpenAIRE

    Twanabasu, Bikesh

    2018-01-01

    Master's thesis, Erasmus Mundus Master in Geospatial Technologies (2013 syllabus). Code: SIW013. Academic year 2017-2018. Massive amounts of sentiment-rich data are generated on social media in the form of tweets, status updates, blog posts, reviews, etc. Different people and organizations use this user-generated content for decision making. Symbolic techniques (knowledge-based approaches) and machine learning techniques are the two main techniques used for sentiment analysis ...

  10. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    William S. Charlton

    1999-01-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels
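
    The Bayesian comparison of a measured isotopic ratio against a precomputed database can be sketched as a discrete grid update. Everything below is fabricated for illustration (the grid, the "physics" model and the measurement); NOVA's actual database came from Monteburns calculations:

        import numpy as np

        # Discrete Bayes update over a burnup grid: compare a measured isotopic
        # ratio with precomputed (here: fabricated) model predictions.
        burnup = np.linspace(1, 50, 200)            # hypothetical grid, GWd/tU
        predicted_ratio = 0.02 + 0.001 * burnup     # stand-in physics model
        measured, sigma = 0.045, 0.002              # measurement and its error

        prior = np.full_like(burnup, 1.0 / burnup.size)   # flat prior
        like = np.exp(-0.5 * ((measured - predicted_ratio) / sigma) ** 2)
        post = prior * like
        post /= post.sum()

        print("posterior-mean burnup:", round(float((burnup * post).sum()), 1))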

  11. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
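
    As a small illustration of the first technique in the list, LOESS (locally weighted regression) can be used to screen how strongly a sampled input drives a model output; statsmodels provides an implementation. Synthetic data and an illustrative variance summary:

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        rng = np.random.default_rng(10)
        x = rng.uniform(0, 10, 300)                  # sampled model input
        y = np.sin(x) + 0.1 * x ** 2 + rng.normal(0, 0.5, 300)  # model output

        fit = lowess(y, x, frac=0.3)                 # sorted (x, smoothed y)
        # A large spread of the smoothed curve relative to the total output
        # variance suggests y is sensitive to this input.
        explained = np.var(fit[:, 1]) / np.var(y)
        print("output variance tracked by LOESS curve:", round(explained, 2))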

  12. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations were reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. These pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for obtaining most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 µm thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As a result, elemental concentrations were obtained for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  13. The composite sequential clustering technique for analysis of multispectral scanner data

    Science.gov (United States)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
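    A minimal sketch of the two-part idea, with a crude stand-in for the sequential first pass and scikit-learn's K-means for the iterative refinement (both choices are assumptions for illustration):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
pixels = np.vstack([rng.normal(m, 0.4, size=(300, 4)) for m in (0.0, 2.0, 4.0)])  # fake 4-band data

# Part (1) stand-in: initial cluster centers from a coarse first pass
# (here simply a crude split; the paper uses a sequential variance analysis).
initial_centers = np.array([pixels[:300].mean(0), pixels[300:600].mean(0), pixels[600:].mean(0)])

# Part (2): generalized K-means refinement of the initial clusters.
km = KMeans(n_clusters=3, init=initial_centers, n_init=1).fit(pixels)
print(np.bincount(km.labels_))
```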

  14. Analysis of pulse-shape discrimination techniques for BC501A using GHz digital signal processing

    International Nuclear Information System (INIS)

    Rooney, B.D.; Dinwiddie, D.R.; Nelson, M.A.; Rawool-Sullivan, Mohini W.

    2001-01-01

    A comparison study of pulse-shape analysis techniques was conducted for a BC501A scintillator using digital signal processing (DSP). In this study, output signals from a preamplifier were input directly into a 1 GHz analog-to-digital converter. The digitized data obtained with this method were post-processed for both pulse-height and pulse-shape information. Several different analysis techniques were evaluated for neutron and gamma-ray pulse-shape discrimination. Surprisingly, one of the simplest and fastest techniques produced some of the best pulse-shape discrimination results. This technique, referred to here as the Integral Ratio technique, was able to effectively process several thousand detector pulses per second. This paper presents the results and findings of this study for various pulse-shape analysis techniques with digitized detector signals.
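    The Integral Ratio technique as described lends itself to a few lines of code: integrate the pulse tail and divide by the total integral. The split point and pulse shapes below are hypothetical:

```python
import numpy as np

def integral_ratio(pulse, t_split):
    """Ratio of the tail integral to the total integral of a digitized pulse.

    Neutron pulses in BC501A carry a larger slow-decay component than gamma
    pulses, so they show a larger tail/total ratio.
    """
    total = np.trapz(pulse)
    tail = np.trapz(pulse[t_split:])
    return tail / total

# Hypothetical 1 ns/sample pulses: gammas decay fast, neutrons have a slow tail.
t = np.arange(200)
gamma = np.exp(-t / 10.0)
neutron = 0.8 * np.exp(-t / 10.0) + 0.2 * np.exp(-t / 80.0)
print(integral_ratio(gamma, 30), integral_ratio(neutron, 30))  # neutron ratio is larger
```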

  15. Worldwide analysis of marine oil spill cleanup cost factors

    International Nuclear Information System (INIS)

    Etkin, D.S.

    2000-01-01

    The many factors that influence oil spill response costs were discussed with particular emphasis on how spill responses differ around the world because of differing cultural values, socio-economic factors and labor costs. This paper presented an analysis of marine oil spill cleanup costs based on the country, proximity to shoreline, spill size, oil type, degree of shoreline oiling and cleanup methodology. The objective was to determine how each factor impacts per-unit cleanup costs. Near-shore spills and in-port spills were found to be 4-5 times more expensive to clean than offshore spills. Responses to spills of heavy fuels also cost 10 times more than for lighter crudes and diesel. Responses to spills under 30 tonnes are, on a per-unit basis, 10 times more costly than responses to spills of 300 tonnes. A newly developed modelling technique that can be used on different types of marine spills was described. It is based on updated cost data acquired from case studies of more than 300 spills in 40 countries. The model determines a per-unit cleanup cost estimate by taking into consideration oil type, location, spill size, cleanup methodology, and shoreline oiling. It was concluded that actual spill costs are totally dependent on the actual circumstances of the spill. 13 refs., 10 tabs., 3 figs.
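    A minimal sketch of such a per-unit cost model, assuming a multiplicative structure; the baseline cost and modifier values below are illustrative stand-ins derived loosely from the ratios quoted above, not the model's published figures:

```python
# Per-tonne cleanup cost = baseline * location * oil-type * size modifiers.
base_cost_per_tonne = 5000.0  # hypothetical baseline, USD/tonne

modifiers = {
    "location": {"offshore": 1.0, "nearshore": 4.5, "in-port": 5.0},
    "oil_type": {"light crude": 1.0, "diesel": 1.0, "heavy fuel": 10.0},
    "size":     {"under 30 t": 10.0, "about 300 t": 1.0},
}

def per_unit_cost(location, oil_type, size):
    return (base_cost_per_tonne * modifiers["location"][location]
            * modifiers["oil_type"][oil_type] * modifiers["size"][size])

print(per_unit_cost("nearshore", "heavy fuel", "under 30 t"))
```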

  16. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    Full Text Available The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.
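    The kind of computer-aided stability analysis described can be sketched in a few lines: find an equilibrium of a ratio-dependent predator-prey model numerically and classify it by the eigenvalues of the Jacobian (the model form and parameter values below are assumptions for illustration):

```python
import numpy as np
from scipy.optimize import fsolve

a, b, c, d, e = 1.0, 0.5, 1.2, 0.4, 0.3  # illustrative parameter values

def rhs(z):
    n, p = z
    pred = c * n * p / (n + p)           # ratio-dependent functional response
    return np.array([a * n - b * n**2 - pred, d * pred - e * p])

eq = fsolve(rhs, [1.0, 1.0])             # locate a coexistence equilibrium

def jacobian(f, z, h=1e-6):
    # central-difference numerical Jacobian
    J = np.zeros((2, 2))
    for j in range(2):
        dz = np.zeros(2); dz[j] = h
        J[:, j] = (f(z + dz) - f(z - dz)) / (2 * h)
    return J

eigs = np.linalg.eigvals(jacobian(rhs, eq))
print("equilibrium:", eq, "locally stable:", bool(np.all(eigs.real < 0)))
```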

  17. A review of residual stress analysis using thermoelastic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S [University of Southampton, School of Engineering Sciences, Highfield, Southampton, SO17 1BJ (United Kingdom); Burguete, R L [Airbus UK Ltd., New Filton House, Filton, Bristol, BS99 7AR (United Kingdom)

    2009-08-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress, it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid, or where departures in material properties caused by manufacturing procedures occur, and these have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.

  18. A review of residual stress analysis using thermoelastic techniques

    International Nuclear Information System (INIS)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S; Burguete, R L

    2009-01-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress, it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid, or where departures in material properties caused by manufacturing procedures occur, and these have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.

  19. Incorporating organizational factors into Probabilistic Risk Assessment (PRA) of complex socio-technical systems: A hybrid technique formalization

    International Nuclear Information System (INIS)

    Mohaghegh, Zahra; Kazemi, Reza; Mosleh, Ali

    2009-01-01

    This paper is a result of a research with the primary purpose of extending Probabilistic Risk Assessment (PRA) modeling frameworks to include the effects of organizational factors as the deeper, more fundamental causes of accidents and incidents. There have been significant improvements in the sophistication of quantitative methods of safety and risk assessment, but the progress on techniques most suitable for organizational safety risk frameworks has been limited. The focus of this paper is on the choice of 'representational schemes' and 'techniques.' A methodology for selecting appropriate candidate techniques and their integration in the form of a 'hybrid' approach is proposed. Then an example is given through an integration of System Dynamics (SD), Bayesian Belief Network (BBN), Event Sequence Diagram (ESD), and Fault Tree (FT) in order to demonstrate the feasibility and value of hybrid techniques. The proposed hybrid approach integrates deterministic and probabilistic modeling perspectives, and provides a flexible risk management tool for complex socio-technical systems. An application of the hybrid technique is provided in the aviation safety domain, focusing on airline maintenance systems. The example demonstrates how the hybrid method can be used to analyze the dynamic effects of organizational factors on system risk.
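    A toy sketch of the hybrid idea, assuming illustrative probabilities: a two-node belief network for an organizational factor is marginalized into a basic-event probability, which then feeds an OR-gate fault tree:

```python
# All probabilities below are illustrative assumptions, not the authors' data.
p_weak_training = 0.2                          # BBN root: organizational factor
p_error_given = {"weak": 0.05, "good": 0.01}   # BBN leaf: maintenance error rate

# Marginalize the BBN to get the basic-event probability fed to the fault tree.
p_maint_error = (p_weak_training * p_error_given["weak"]
                 + (1 - p_weak_training) * p_error_given["good"])

p_hw_failure = 0.002                           # independent hardware basic event

# Fault tree: TOP = maintenance error OR hardware failure (exact OR-gate form).
p_top = 1 - (1 - p_maint_error) * (1 - p_hw_failure)
print(f"P(top event) = {p_top:.5f}")
```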

  20. Tribological analysis of nano clay/epoxy/glass fiber by using Taguchi’s technique

    International Nuclear Information System (INIS)

    Senthil Kumar, M.S.; Mohana Sundara Raju, N.; Sampath, P.S.; Vivek, U.

    2015-01-01

    Highlights: • To study the tribological property of modified epoxy with and without E-glass fiber. • To analyze the tribological property of specimens by Taguchi's technique and ANOVA. • To investigate the surface morphology of test specimens with SEM. - Abstract: In this work, a detailed analysis was performed to profoundly study the tribological property of various nano clay (Cloisite 25A) loaded epoxy, with and without the inclusion of E-glass fiber, using Taguchi's technique. For this purpose, the test samples were prepared according to the ASTM standard, and the test was carried out with the assistance of a pin-on-disk machine. An L25 orthogonal array was constructed to evaluate the tribological property with four control variables, namely filler content, normal load, sliding velocity and sliding distance, at each level. The results indicated that the combination of factors greatly influenced the process to achieve the minimum wear and coefficient of friction. Overall, the experimental results showed the least wear and friction coefficient for the fiber-reinforced laminates, while appreciably higher wear and friction coefficients were noted for the laminates without fiber. The S/N ratio results exhibited the same trend. Moreover, ANOVA revealed that, for the fiber-reinforced laminates, fiber inclusion contributes less to the coefficient of friction and wear than the other factors do in laminates without fiber. Finally, the microstructure of the test samples was investigated with the assistance of a Scanning Electron Microscope (SEM) to analyze the surface morphology.
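    The S/N ratio used in such smaller-is-better Taguchi analyses has a standard form, sketched below with hypothetical wear data (the measurements are made up for illustration):

```python
import numpy as np

# Smaller-is-better signal-to-noise ratio used in Taguchi analysis:
# S/N = -10 * log10(mean(y^2)), computed per trial of the orthogonal array.
def sn_smaller_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical replicated wear measurements (mm^3) for two L25 trials.
print(sn_smaller_is_better([0.012, 0.014, 0.013]))
print(sn_smaller_is_better([0.031, 0.029, 0.035]))  # higher wear -> lower S/N
```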

  1. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant (composition of macro components and amounts of organic and inorganic impurities); coolant during and after operation (determination of gases and organic compounds produced by pyrolysis and radiolysis, i.e. degradation and polymerization products); control of systems for purifying and regenerating the coolant after use (dissolved pressurization gases); detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation (tests to determine potential formation of films); corrosion of structural elements and canning materials; and health and safety (toxicity, inflammability and impurities that can be activated). Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity

  2. A microhistological technique for analysis of food habits of mycophagous rodents.

    Science.gov (United States)

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  3. The impact of information and other factors on on-farm agrobiodiversity conservation: evidence from a duration analysis of Portuguese fruit growers

    Energy Technology Data Exchange (ETDEWEB)

    Botelho, A.; Dinis, I.; Costa Pinto, L.

    2012-11-01

    In spite of the increasing awareness of the importance of in situ and on-farm conservation of agrobiodiversity, there is still limited knowledge about the factors that influence farmers' choices in variety adoption. The purpose of this paper is to contribute to a better understanding of the factors that influence farmers' adoption of traditional varieties of fruit trees, so that better and more effective policy measures aimed at their preservation can be designed. While studies in this area have mainly employed standard probit/logit techniques, in this paper an econometric technique which simultaneously addresses the issue of sample censoring and the joint determination of the occurrence and timing of adoption, duration analysis, was applied. The use of this technique in the analysis of adoption data uncovers a sizably higher effect of information on farmers' decisions than that obtained by standard approaches. The results strongly support the idea that good extension services providing reliable and accessible information, as well as technical guidance adapted to local conditions, are fundamental components in determining the adoption of landraces. (Author)
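    The paper does not name its software; the sketch below uses the lifelines library as one common Python route to a duration (Cox proportional hazards) model, with entirely hypothetical adoption data:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical adoption data: time until a grower adopts a traditional variety,
# with right-censoring for growers who had not adopted by the survey date.
df = pd.DataFrame({
    "years_to_adoption": [3, 7, 2, 10, 5, 8, 4, 12],
    "adopted":           [1, 1, 1, 0, 1, 0, 1, 0],   # 0 = censored
    "extension_contact": [1, 0, 1, 1, 1, 0, 0, 0],   # information access
    "farm_size_ha":      [2.1, 5.0, 1.2, 8.3, 3.0, 6.5, 2.8, 9.1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_adoption", event_col="adopted")
cph.print_summary()  # hazard ratios: effect of information on adoption timing
```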

  4. An Extended-Tag-Induced Matrix Factorization Technique for Recommender Systems

    Directory of Open Access Journals (Sweden)

    Huirui Han

    2018-06-01

    Full Text Available Social tag information has been used by recommender systems to handle the problem of data sparsity. Recently, the relationships between users/items and tags are considered by most tag-induced recommendation methods. However, sparse tag information is challenging to most existing methods. In this paper, we propose an Extended-Tag-Induced Matrix Factorization technique for recommender systems, which exploits correlations among tags derived by co-occurrence of tags to improve the performance of recommender systems, even in the case of sparse tag information. The proposed method integrates coupled similarity between tags, which is calculated by the co-occurrences of tags in the same items, to extend each item’s tags. Finally, item similarity based on extended tags is utilized as an item relationship regularization term to constrain the process of matrix factorization. MovieLens dataset and Book-Crossing dataset are adopted to evaluate the performance of the proposed algorithm. The results of experiments show that the proposed method can alleviate the impact of tag sparsity and improve the performance of recommender systems.
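    A minimal numpy sketch of matrix factorization with an item-similarity regularizer in the spirit described (the toy ratings, similarity matrix, and hyperparameters are assumptions, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
R = np.array([[5, 3, 0], [4, 0, 2], [0, 2, 5]], float)   # toy ratings, 0 = unknown
S = np.array([[1.0, 0.8, 0.1],                           # item-item similarity
              [0.8, 1.0, 0.2],                           # derived from extended tags
              [0.1, 0.2, 1.0]])

k, lam, beta, lr = 2, 0.05, 0.1, 0.02
U, V = rng.normal(0, 0.1, (3, k)), rng.normal(0, 0.1, (3, k))

for _ in range(2000):
    for u, i in zip(*R.nonzero()):                       # iterate known ratings
        err = R[u, i] - U[u] @ V[i]
        # similarity regularizer pulls item i toward its tag-similar neighbors
        neigh = sum(S[i, j] * V[j] for j in range(3)) / S[i].sum()
        U[u] += lr * (err * V[i] - lam * U[u])
        V[i] += lr * (err * U[u] - lam * V[i] - beta * (V[i] - neigh))

print(np.round(U @ V.T, 2))   # completed rating matrix
```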

  5. RELIABILITY OF CERTAIN TESTS FOR EVALUATION OF JUDO TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Slavko Obadov

    2007-05-01

    Full Text Available The sample included 106 judokas. Assessment of the level of mastery of judo techniques was carried out through evaluation by five competent judges. Each subject performed a technique three times and each performance was evaluated by the judges. In order to evaluate the measurement of each technique, Cronbach's coefficient of reliability α was calculated. During the procedure the subjects' results were also transformed to factor scores, i.e. the results of each performer on the main component of the five evaluations. These factor scores could be used in the subsequent procedure of multivariate statistical analysis.
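    Cronbach's coefficient has a standard closed form; a short sketch with hypothetical judge scores:

```python
import numpy as np

def cronbach_alpha(X):
    """X: performers x judges matrix of scores for one technique."""
    X = np.asarray(X, float)
    k = X.shape[1]                              # number of judges
    item_vars = X.var(axis=0, ddof=1).sum()     # sum of per-judge variances
    total_var = X.sum(axis=1).var(ddof=1)       # variance of summed scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical scores: 6 judokas rated by 5 judges on one throwing technique.
scores = [[8, 7, 8, 9, 8], [5, 6, 5, 5, 6], [9, 9, 8, 9, 9],
          [6, 6, 7, 6, 5], [7, 8, 7, 7, 8], [4, 5, 4, 5, 4]]
print(round(cronbach_alpha(scores), 3))
```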

  6. VLBI FOR GRAVITY PROBE B. IV. A NEW ASTROMETRIC ANALYSIS TECHNIQUE AND A COMPARISON WITH RESULTS FROM OTHER TECHNIQUES

    International Nuclear Information System (INIS)

    Lebach, D. E.; Ratner, M. I.; Shapiro, I. I.; Bartel, N.; Bietenholz, M. F.; Lederman, J. I.; Ransom, R. R.; Campbell, R. M.; Gordon, D.; Lestrade, J.-F.

    2012-01-01

    When very long baseline interferometry (VLBI) observations are used to determine the position or motion of a radio source relative to reference sources nearby on the sky, the astrometric information is usually obtained via (1) phase-referenced maps or (2) parametric model fits to measured fringe phases or multiband delays. In this paper, we describe a 'merged' analysis technique which combines some of the most important advantages of these other two approaches. In particular, our merged technique combines the superior model-correction capabilities of parametric model fits with the ability of phase-referenced maps to yield astrometric measurements of sources that are too weak to be used in parametric model fits. We compare the results from this merged technique with the results from phase-referenced maps and from parametric model fits in the analysis of astrometric VLBI observations of the radio-bright star IM Pegasi (HR 8703) and the radio source B2252+172 nearby on the sky. In these studies we use central-core components of radio sources 3C 454.3 and B2250+194 as our positional references. We obtain astrometric results for IM Peg with our merged technique even when the source is too weak to be used in parametric model fits, and we find that our merged technique yields astrometric results superior to the phase-referenced mapping technique. We used our merged technique to estimate the proper motion and other astrometric parameters of IM Peg in support of the NASA/Stanford Gravity Probe B mission.

  7. Quantitative mineralogical analysis of sandstones using x-ray diffraction techniques

    International Nuclear Information System (INIS)

    Ward, C.R.; Taylor, J.C.

    1999-01-01

    Full text: X-ray diffraction has long been used as a definitive technique for mineral identification, based on measuring the internal atomic or crystal structures present in powdered rocks, soils and other mineral mixtures. Recent developments in data gathering and processing, however, have provided an improved basis for its use as a quantitative tool, determining not only the nature of the minerals but also the relative proportions of the different minerals present. The mineralogy of a series of sandstone samples from the Sydney and Bowen Basins of eastern Australia has been evaluated by X-ray diffraction (XRD) on a quantitative basis using the Australian-developed SIROQUANT data processing technique. Based on Rietveld principles, this technique generates a synthetic X-ray diffractogram by adjusting and combining full-profile patterns of minerals nominated as being present in the sample, and interactively matches the synthetic diffractogram under operator instructions to the observed diffractogram of the sample being analysed. The individual mineral patterns may be refined in the process, to allow for variations in crystal structure of individual components or for factors such as preferred orientation in the sample mount. The resulting output provides mass percentages of the different minerals in the mixture, and an estimate of the error associated with each individual percentage determination. The chemical composition of the mineral mixtures indicated by SIROQUANT for each individual sandstone studied was estimated using a spreadsheet routine, and the indicated proportion of each oxide in each sample was compared to the actual chemical analysis of the same sandstone as determined independently by X-ray fluorescence spectrometry. The results show a high level of agreement for all major chemical constituents, indicating consistency between the SIROQUANT XRD data and the whole-rock chemical composition. Supplementary testing with a synthetic corundum spike further

  8. Identifying factors affecting optimal management of agricultural water

    Directory of Open Access Journals (Sweden)

    Masoud Samian

    2015-01-01

    In addition to quantitative methodology such as descriptive statistics and factor analysis, a qualitative methodology was employed for dynamic simulation among variables through Vensim software. In this study, the factor analysis technique was used together with the Kaiser-Meyer-Olkin (KMO) and Bartlett tests. From the results, four key elements were identified as factors affecting the optimal management of agricultural water in the Hamedan area. These factors were institutional and legal factors, technical and knowledge factors, economic factors and social factors.

  9. The prognosis and prognostic risk factors of patients with hepatic artery complications after liver transplantation treated with the interventional techniques

    International Nuclear Information System (INIS)

    Shan Hong; Huang Mingsheng; Jiang Zaipo; Zhu Kangshun; Yang Yang; Chen Guihua

    2008-01-01

    Objective: To investigate the prognosis and prognostic risk factors of hepatic artery complications after orthotopic liver transplantation (OLT) treated with interventional techniques. Methods: The clinical data of 21 patients with hepatic artery complications after liver transplantation receiving thrombolysis, PTA, and stent placement in our institute from November 2003 to April 2007 were retrospectively analyzed. Based on the prognosis of the grafts, the 21 patients were divided into a poor-prognosis group and a non-poor-prognosis group. Fifteen variables (including biliary complication, hepatic artery restenosis, early or late artery complication, and so on) were analyzed in both groups with binary logistic regression analysis to screen out the risk factors related to the prognosis of percutaneous interventional treatment for hepatic artery complications after OLT. Results: The 21 patients were followed for a mean of 436 days and a median of 464 days (3-1037 days). The poor-prognosis group included 11 patients (5 cases received retransplantation, and 6 died). The mean survival time of grafts in the poor-prognosis group was 191 days, and the median survival time was 73 days (3-616 days). The mean survival time of grafts in the non-poor-prognosis group, which included 10 patients, was 706 days, and the median survival time was 692 days (245-1037 days). Univariate analysis showed there were significant differences in biliary complication, total bilirubin and indirect bilirubin between the two groups. The binary logistic regression analysis showed that the risk factor related to prognosis was biliary complication before the interventional management (P=0.027, OR=22.818). Conclusion: Biliary complication before interventional management is the risk factor related to poor prognosis of patients with hepatic artery stenosis or thrombosis receiving interventional treatment. (authors)

  10. Logistic regression analysis of risk factors for postoperative recurrence of spinal tumors and analysis of prognostic factors.

    Science.gov (United States)

    Zhang, Shanyong; Yang, Lili; Peng, Chuangang; Wu, Minfei

    2018-02-01

    The aim of the present study was to investigate the risk factors for postoperative recurrence of spinal tumors by logistic regression analysis and analysis of prognostic factors. In total, 77 male and 48 female patients with spinal tumor were selected in our hospital from January, 2010 to December, 2015 and divided into the benign (n=76) and malignant groups (n=49). All the patients underwent microsurgical resection of spinal tumors and were reviewed regularly 3 months after operation. The McCormick grading system was used to evaluate the postoperative spinal cord function. Data were subjected to statistical analysis. Of the 125 cases, 63 cases showed improvement after operation, 50 cases were stable, and deterioration was found in 12 cases. The improvement rate of patients with cervical spine tumor, which reached 56.3%, was the highest. Fifty-two cases of sensory disturbance, 34 cases of pain, 30 cases of inability to exercise, 26 cases of ataxia, and 12 cases of sphincter disorders were found after operation. Seventy-two cases (57.6%) underwent total resection, 18 cases (14.4%) received subtotal resection, 23 cases (18.4%) received partial resection, and 12 cases (9.6%) were only treated with biopsy/decompression. Postoperative recurrence was found in 57 cases (45.6%). The mean recurrence time of patients in the malignant group was 27.49±6.09 months, and the mean recurrence time of patients in the benign group was 40.62±4.34 months; the results were significantly different (P<0.05). Logistic regression analysis of total resection-related factors showed that total resection should be the preferred treatment for patients with benign tumors, thoracic and lumbosacral tumors, and lower McCormick grade, as well as patients without syringomyelia and intramedullary tumors. Logistic regression analysis of recurrence-related factors revealed that the recurrence rate was relatively higher in patients with malignant, cervical, thoracic and lumbosacral, and intramedullary tumors, and higher McCormick grades.

  11. Application of optimal estimation techniques to FFTF decay heat removal analysis

    International Nuclear Information System (INIS)

    Nutt, W.T.; Additon, S.L.; Parziale, E.A.

    1979-01-01

    The verification and adjustment of plant models for decay heat removal analysis using a mix of engineering judgment and formal techniques from control theory are discussed. The formal techniques facilitate dealing with typical test data which are noisy, redundant and do not measure all of the plant model state variables directly. Two pretest examples are presented. 5 refs.
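    The abstract does not detail the estimator; a scalar Kalman-style update, a standard optimal-estimation building block, illustrates the flavor of blending a model prediction with noisy test data (all numbers are illustrative):

```python
# Adjust a model-predicted coolant temperature using a noisy test measurement.
x_model, P = 540.0, 25.0      # model prediction (deg F) and its variance
z, R = 548.0, 9.0             # test measurement and measurement-noise variance

K = P / (P + R)               # Kalman gain weights model vs. data by uncertainty
x_adj = x_model + K * (z - x_model)
P_adj = (1 - K) * P           # adjusted estimate is less uncertain than either input
print(f"adjusted estimate {x_adj:.1f}, variance {P_adj:.1f}")
```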

  12. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  13. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, 200 milliseconds residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  14. EXPLORATORY FACTOR ANALYSIS (EFA) IN CONSUMER BEHAVIOR AND MARKETING RESEARCH

    Directory of Open Access Journals (Sweden)

    Marcos Pascual Soler

    2012-06-01

    Full Text Available Exploratory Factor Analysis (EFA) is one of the most widely used statistical procedures in social research. The main objective of this work is to describe the most common practices used by researchers in the consumer behavior and marketing area. Through a literature review methodology, the practices of EFA in five consumer behavior and marketing journals (2000-2010) were analyzed. Then, the choices made by the researchers concerning factor model, retention criteria, rotation, factor interpretation and other issues relevant to factor analysis were analyzed. The results suggest that researchers routinely conduct analyses using questionable methods. Suggestions for improving the use of factor analysis and the reporting of results are presented, and a checklist (Exploratory Factor Analysis Checklist, EFAC) is provided to help editors, reviewers, and authors improve the reporting of exploratory factor analysis.

  15. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects that the research group has underway, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  16. Techniques to extract physical modes in model-independent analysis of rings

    International Nuclear Information System (INIS)

    Wang, C.-X.

    2004-01-01

    A basic goal of Model-Independent Analysis is to extract the physical modes underlying the beam histories collected at a large number of beam position monitors so that beam dynamics and machine properties can be deduced independent of specific machine models. Here we discuss techniques to achieve this goal, especially the Principal Component Analysis and the Independent Component Analysis.
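    Both techniques are available off the shelf; the sketch below (scikit-learn as an assumed implementation) mixes two synthetic "modes" into many BPM-like readings and extracts them with PCA and ICA:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(2)
t = np.linspace(0, 50, 2000)
modes = np.c_[np.sin(1.3 * t), np.sign(np.sin(0.4 * t))]       # two "physical modes"
mixing = rng.normal(size=(2, 40))                              # 40 BPM readings mix the modes
bpm = modes @ mixing + 0.02 * rng.normal(size=(2000, 40))      # turn-by-turn histories

pcs = PCA(n_components=2).fit_transform(bpm)     # orthogonal directions of max variance
ics = FastICA(n_components=2).fit_transform(bpm) # statistically independent modes
print(pcs.shape, ics.shape)
```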

  17. Review of sample preparation techniques for the analysis of pesticide residues in soil.

    Science.gov (United States)

    Tadeo, José L; Pérez, Rosa Ana; Albero, Beatriz; García-Valcárcel, Ana I; Sánchez-Brunete, Consuelo

    2012-01-01

    This paper reviews the sample preparation techniques used for the analysis of pesticides in soil. The present status and recent advances made during the last 5 years in these methods are discussed. The analysis of pesticide residues in soil requires the extraction of analytes from this matrix, followed by a cleanup procedure, when necessary, prior to their instrumental determination. The optimization of sample preparation is a very important part of the method development that can reduce the analysis time, the amount of solvent, and the size of samples. This review considers all aspects of sample preparation, including extraction and cleanup. Classical extraction techniques, such as shaking, Soxhlet, and ultrasonic-assisted extraction, and modern techniques like pressurized liquid extraction, microwave-assisted extraction, solid-phase microextraction and QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) are reviewed. The different cleanup strategies applied for the purification of soil extracts are also discussed. In addition, the application of these techniques to environmental studies is considered.

  18. Physics Metacognition Inventory Part II: Confirmatory factor analysis and Rasch analysis

    Science.gov (United States)

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-11-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition, planning, monitoring, evaluation, debugging, and information management. The college students' scores on the inventory were found to be reliable and related to students' physics motivation and physics grade. However, the results of the exploratory factor analysis indicated that the questionnaire could be revised to improve its construct validity. The goal of this study was to revise the questionnaire and establish its construct validity through a confirmatory factor analysis. In addition, a Rasch analysis was applied to the data to better understand the psychometric properties of the inventory and to further evaluate the construct validity. Results indicated that the final, revised inventory is a valid, reliable, and efficient tool for assessing student metacognition for physics problem solving.

  19. Comparative study of macrotexture analysis using X-ray diffraction and electron backscattered diffraction techniques

    International Nuclear Information System (INIS)

    Serna, Marilene Morelli

    2002-01-01

    The macrotexture is one of the main characteristics of metallic materials whose physical properties depend on the crystallographic direction. Until the mid-1980s, the analysis of macrotexture was accomplished only by the techniques of X-ray diffraction and neutron diffraction. The possibility of analyzing the macrotexture using the technique of electron backscatter diffraction in the scanning electron microscope, which allows the measured orientation to be correlated with its location in the microstructure, was a very welcome tool in the area of materials engineering. In this work, the theoretical aspects of the two techniques were studied, and both techniques were used for the analysis of the macrotexture of 1050 and 3003 aluminum sheets with intensities, measured through the texture index 'J', from 2.00 to 5.00. The results obtained by the two techniques were reasonably similar, considering that the statistics of the data obtained by the technique of electron backscatter diffraction are much poorer than those obtained by X-ray diffraction. (author)

  20. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by the chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
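    A minimal sketch of the PLS calibration step using scikit-learn (an assumed implementation; the spectra and composition values are synthetic):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
n_samples, n_channels = 18, 500
spectra = rng.random((n_samples, n_channels))          # stand-in LIBS spectra
sio2_wt = 40 + 30 * rng.random(n_samples)              # hypothetical SiO2 content
spectra[:, 100] += 0.05 * sio2_wt                      # plant a composition-dependent line

# Calibrate on the reference set, then predict compositions of "unknowns".
pls = PLSRegression(n_components=4).fit(spectra, sio2_wt)
pred = pls.predict(spectra[:2])
print(pred.ravel(), sio2_wt[:2])
```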

  1. Impact analysis of critical success factors on the benefits from statistical process control implementation

    Directory of Open Access Journals (Sweden)

    Fabiano Rodrigues Soriano

    Full Text Available Statistical Process Control (SPC) is a set of statistical techniques focused on process control, monitoring and analyzing causes of variation in the quality characteristics and/or in the parameters used for control and process improvement. Implementing SPC in organizations is a complex task. The reasons for its failure are related to organizational or social factors such as lack of top management commitment and little understanding of its potential benefits. Other aspects concern technical factors such as lack of training on and understanding of the statistical techniques. The main aim of the present article is to understand the interrelations between conditioning factors associated with top management commitment (Support), SPC Training and Application, as well as to understand the relationships between these factors and the benefits associated with the implementation of the program. Partial Least Squares Structural Equation Modeling (PLS-SEM) was used in the analysis, since the main goal is to establish the causal relations. A cross-sectional survey was used as the research method to collect information from a sample of Brazilian auto-parts companies, which were selected according to guides from the auto-parts industry associations. A total of 170 companies were contacted by e-mail and by phone in order to be invited to participate in the survey. However, only 93 companies agreed to participate, and only 43 answered the questionnaire. The results showed that senior management support considerably affects the way companies develop their training programs. In turn, this training affects the way companies apply the techniques, and this is reflected in the benefits gained from implementing the program. It was observed that the managerial and technical aspects are closely connected to each other and that they are represented by the relation between top management support and training. The technical aspects observed through SPC

  2. Image-analysis techniques for investigating localized corrosion processes

    International Nuclear Information System (INIS)

    Quinn, M.J.; Bailey, M.G.; Ikeda, B.M.; Shoesmith, D.W.

    1993-12-01

    We have developed a procedure for determining the mode and depth of penetration of localized corrosion by combining metallography and image analysis of corroded coupons. Two techniques, involving either a face-profiling or an edge-profiling procedure, have been developed. In the face-profiling procedure, successive surface grindings and image analyses were performed until corrosion was no longer visible. In this manner, the distribution of corroded sites on the surface and the total area of the surface corroded were determined as a function of depth into the specimen. In the edge-profiling procedure, surface grinding exposed successive cross sections of the corroded region. Image analysis of the cross section quantified the distribution of depths across the corroded section, and a three-dimensional distribution of penetration depths was obtained. To develop these procedures, we used artificially creviced Grade-2 titanium specimens that were corroded in saline solutions containing various amounts of chloride maintained at various fixed temperatures (105 to 150 degrees C) using a previously developed galvanic-coupling technique. We discuss some results from these experiments to illustrate how the procedures developed can be applied to a real corroded system. (author). 6 refs., 4 tabs., 21 figs

  3. Performance evaluation using bootstrapping DEA techniques: Evidence from industry ratio analysis

    OpenAIRE

    Halkos, George; Tzeremes, Nickolaos

    2010-01-01

    In the Data Envelopment Analysis (DEA) context, financial data/ratios have been used in order to produce a unified performance metric. However, several scholars have indicated that the inclusion of financial ratios creates biased efficiency estimates with implications for the performance evaluation of firms and industries. There have been several DEA formulations and techniques dealing with this problem, including sensitivity analysis, Prior-Ratio-Analysis and DEA/output–input ratio analysis ...

  4. A task analysis-linked approach for integrating the human factor in reliability assessments of nuclear power plants

    International Nuclear Information System (INIS)

    Ryan, T.G.

    1988-01-01

    This paper describes an emerging Task Analysis-Linked Evaluation Technique (TALENT) for assessing the contributions of human error to nuclear power plant systems unreliability and risk. Techniques such as TALENT are emerging from a recognition that human error is a primary contributor to plant risk; however, it has to date been a peripheral consideration in plant reliability evaluations. TALENT also recognizes that the involvement of persons with behavioral science expertise is required to support plant reliability and risk analyses. A number of state-of-knowledge human reliability analysis tools are also discussed which support the TALENT process. The core of TALENT is comprised of task, timeline and interface analysis data which provide the technology base for event and fault tree development, serve as criteria for selecting and evaluating performance shaping factors, and provide a basis for auditing TALENT results. Finally, programs and case studies used to refine the TALENT process are described along with future research needs in the area. (author)

  5. Classification analysis of organization factors related to system safety

    International Nuclear Information System (INIS)

    Liu Huizhen; Zhang Li; Zhang Yuling; Guan Shihua

    2009-01-01

    This paper analyzes the different types of organization factors which influence system safety. Organization factors can be divided into interior and exterior organization factors. The latter include political, economic, technical, legal, socio-cultural and geographical factors, as well as the relationships among different interest groups. The former include organization culture, communication, decision making, training, process, supervision and management, and organization structure. This paper focuses on the description of these organization factors. The classification analysis of the organization factors is preliminary work for their quantitative analysis. (authors)

  6. Strategic Management Tools and Techniques Usage: a Qualitative Review

    Directory of Open Access Journals (Sweden)

    Albana Berisha Qehaja

    2017-01-01

    Full Text Available This paper is one of the few studies to review the empirical literature on strategic management tools and techniques usage. There are many techniques, tools and methods, models, frameworks, approaches and methodologies available to support strategic managers in decision making. They are developed and designed to support managers in all stages of the strategic management process to achieve better performance. Management schools provide knowledge of these tools, but their use in organizations should be seen in a practice-based context. Consequently, some questions arise: Do managers use these strategic tools and techniques in their workplace? Which strategic tools and techniques are used more in organizations? To answer these questions we have made a review of empirical studies using the textual narrative synthesis method. Initially, this study presents a tabulation summarizing empirical research for the period 1990–2015. The included studies are organized by clustering them by enterprise size and sector and by country development level. A synopsis of the ten most used strategic tools and techniques worldwide resulted as follows: SWOT analysis, benchmarking, PEST analysis, "what if" analysis, vision and mission statements, Porter's five forces analysis, business financial analysis, key success factors analysis, cost-benefit analysis and customer satisfaction.

  7. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    High-speed and high-resolution optical detection systems were used to capture the image of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. An easier approach was made by using the MathCAD program as a programming tool, which is nevertheless powerful enough for such calculation, plotting and file transfer. (Author)
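    The Abel inversion itself is compact enough to sketch numerically; the version below differentiates the projection and evaluates the inverse-transform integral on a grid, checked against a Gaussian test pair:

```python
import numpy as np

def abel_invert(F, r):
    """Numerical inverse Abel transform.

    Recovers the radial profile f(r) from line-integrated projection data F(y)
    via f(r) = -(1/pi) * integral_r^R (dF/dy) / sqrt(y^2 - r^2) dy.
    """
    dFdy = np.gradient(F, r)
    f = np.zeros_like(F)
    for i, ri in enumerate(r[:-1]):
        y = r[i + 1:]                     # skip y = ri to avoid the singular point
        f[i] = -np.trapz(dFdy[i + 1:] / np.sqrt(y**2 - ri**2), y) / np.pi
    return f

# Test on a known pair: f(r) = exp(-r^2) projects to F(y) = sqrt(pi)*exp(-y^2).
r = np.linspace(0, 4, 400)
F = np.sqrt(np.pi) * np.exp(-r**2)
err = np.max(np.abs(abel_invert(F, r)[:300] - np.exp(-r[:300]**2)))
print(err)   # small residual set by the grid discretization
```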

  8. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  9. Analysis of fresh fallout from Chinese tests by beta counting technique

    International Nuclear Information System (INIS)

    Mishra, U.C.; Lalit, B.Y.; Shukla, V.K.; Ramachandran, T.V.

    1979-01-01

    The paper describes beta counting techniques used in the analysis of fresh radioactive fallout samples from nuclear weapon tests. Fresh fallout samples were collected by swiping the exposed portion of the engine covers of commercial aircraft arriving at Bombay from New York after the Chinese tests on September 26, 1976 and September 17, 1977. Activities of short-lived radionuclides such as Ag 111, Sr 89, Mo 99, U 237 and Np 239 were determined using these techniques. The results obtained from this analysis are discussed in brief in relation to the kind of fissile material, the extent of thermonuclear reaction in the weapon and the mode of detonation. (orig.)

  10. Incorporating organizational factors into Probabilistic Risk Assessment (PRA) of complex socio-technical systems: A hybrid technique formalization

    Energy Technology Data Exchange (ETDEWEB)

    Mohaghegh, Zahra [Center for Risk and Reliability, University of Maryland, College Park, MD 20742 (United States)], E-mail: mohagheg@umd.edu; Kazemi, Reza; Mosleh, Ali [Center for Risk and Reliability, University of Maryland, College Park, MD 20742 (United States)

    2009-05-15

    This paper is a result of a research with the primary purpose of extending Probabilistic Risk Assessment (PRA) modeling frameworks to include the effects of organizational factors as the deeper, more fundamental causes of accidents and incidents. There have been significant improvements in the sophistication of quantitative methods of safety and risk assessment, but the progress on techniques most suitable for organizational safety risk frameworks has been limited. The focus of this paper is on the choice of 'representational schemes' and 'techniques.' A methodology for selecting appropriate candidate techniques and their integration in the form of a 'hybrid' approach is proposed. Then an example is given through an integration of System Dynamics (SD), Bayesian Belief Network (BBN), Event Sequence Diagram (ESD), and Fault Tree (FT) in order to demonstrate the feasibility and value of hybrid techniques. The proposed hybrid approach integrates deterministic and probabilistic modeling perspectives, and provides a flexible risk management tool for complex socio-technical systems. An application of the hybrid technique is provided in the aviation safety domain, focusing on airline maintenance systems. The example demonstrates how the hybrid method can be used to analyze the dynamic effects of organizational factors on system risk.

  11. Analysis of Factors Affecting the Success of Onions Development Program in Kampar Regency

    Science.gov (United States)

    Amalia; Putri, Asgami

    2017-12-01

    The purpose of this study is to analyze the factors influencing the success of the onion plant development program in Kampar regency. The research method used was an applied survey method using interview and observation techniques, with direct supervision at the location of the object. The conduct of the interviews, as well as the accuracy of collecting the required data, was guided by structured questionnaires. The location/region sampling was done purposively, based on the potency and capacity of commodity development, while the respondents were taken by the cluster purposive sampling method in order to classify the samples in accordance with the purpose of the study; 100 respondents were taken from members of the farmer groups. The analytical technique used is Logistic Regression Analysis, to determine the factors that influence the success of the program as seen from the characteristics of the farmers. From the results of this study it can be concluded that the factors influencing the success of the onion development program in Kampar regency, namely age (X1), education (X2), income (X3), ethnicity (X4), jobs (X5) and family responsibility (X6), give the fitted model: log(p/(1-p)) = -1.778 + 0.021 X1 + 0.028 X2 - 0.213 X3 + 1.986 X4 + 2.930 X5 - 0.455 X6. From the above equation, it can be seen that the positively related attributes are X1 (age), X2 (education), X4 (ethnicity) and X5 (jobs), while the negative correlates are X3 (income) and X6 (family responsibility). From the logistic regression results, considering the significance values of the coefficients, the factors affecting the success rate of the red onion development program in Kampar regency were found to be X2 (education), X4 (ethnicity), X5 (jobs), and X6 (family responsibility).
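    With the fitted coefficients above, predicted probabilities follow from the logistic function; the respondent profile below is hypothetical:

```python
import numpy as np

# Coefficient order: age, education, income, ethnicity, jobs, family responsibility.
coef = np.array([0.021, 0.028, -0.213, 1.986, 2.930, -0.455])
intercept = -1.778

x = np.array([45, 9, 3, 1, 1, 4])        # hypothetical respondent profile
logit = intercept + coef @ x             # linear predictor log(p/(1-p))
p = 1 / (1 + np.exp(-logit))             # inverse-logit gives the probability
print(f"predicted probability of program success: {p:.2f}")
```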

  12. A Comparative Analysis of Uranium Ore using Laser Fluorimetric and gamma Spectrometry Techniques

    International Nuclear Information System (INIS)

    Madbouly, M.; Nassef, M. H.; El-Mongy, S.A.; Diab, A.M.

    2009-01-01

    A developed chemical separation method was used for the analysis of uranium in a standard U-ore (IAEA-RGU-1) by the laser fluorimetric technique. The non-destructive gamma assay technique was also applied to verify and compare the uranium content analyzed using the laser technique. The results of the uranium analysis obtained by laser fluorimetry were found to be in the range of 360-420 μg/g, with an average value of 390 μg/g. The bias between the measured and the certified value does not exceed 9.9%. For gamma-ray spectrometric analysis, the results of the measured uranium content were found to be in the range of 393.8-399.4 μg/g, with an average value of 396.3 μg/g. The percentage difference in the case of the γ-assay was 1.6%. In general, the methods of analysis used in this study are applicable for a precise determination of uranium. It can be concluded that laser analysis is preferred for the assay of uranium ore due to the small sample weight required and the low sample preparation time and analysis cost.

  13. FOREIGN DIRECT INVESTMENTS AND THEIR NON-TRADITIONAL QUALITY FACTORS. A VAR ANALYSIS IN ROMANIA AND BULGARIA

    Directory of Open Access Journals (Sweden)

    Magdalena RADULESCU

    2016-05-01

    Full Text Available The aim of this paper is to present an econometric analysis using VAR techniques to emphasize the political-institutional factors, economic freedom factors and quality-of-labor-force factors impacting FDI attracted to Bulgaria and Romania. We used yearly data series between 2000 and 2014, provided by the World Bank. These two countries display a very investor-friendly climate (low corporate income tax), but they attracted large amounts of FDI only for a short period of time in the mid-2000s. Foreign investment dropped sharply during the crisis, and the prospects are not good. Foreign investors claim that high corruption and bureaucracy greatly diminish the advantages of an attractive fiscal environment in these two specific countries.
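    The paper does not name its estimation software; a minimal sketch with statsmodels (an assumption), using illustrative column names for the World Bank series:

```python
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical yearly file, 2000-2014; column names are illustrative stand-ins
# for FDI inflows, a corruption index, and a labor-quality proxy.
data = pd.read_csv("fdi_series.csv", index_col=0)

model = VAR(data[["fdi", "corruption", "labor_quality"]])
results = model.fit(maxlags=2, ic="aic")   # lag order chosen by AIC
print(results.summary())
irf = results.irf(5)                       # impulse responses over 5 years
```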

  14. Fault tree technique: advances in probabilistic and logical analysis

    International Nuclear Information System (INIS)

    Clarotti, C.A.; Amendola, A.; Contini, S.; Squellati, G.

    1982-01-01

    Fault tree reliability analysis is used for assessing the risk associated with systems of increasing complexity (phased mission systems, systems with multistate components, systems with non-monotonic structure functions). Much care must be taken to make sure that the fault tree technique is not used beyond its correct validity range. To this end, a critical review of the mathematical foundations of reliability fault tree analysis is carried out. Limitations are highlighted and potential solutions to open problems are suggested. Moreover, an overview is given of the most recent developments in the implementation of integrated software (the SALP-MP, SALP-NOT and SALP-CAFT codes) for the analysis of a wide class of systems.

  15. Sensitivity analysis techniques applied to a system of hyperbolic conservation laws

    International Nuclear Information System (INIS)

    Weirs, V. Gregory; Kamm, James R.; Swiler, Laura P.; Tarantola, Stefano; Ratto, Marco; Adams, Brian M.; Rider, William J.; Eldred, Michael S.

    2012-01-01

    Sensitivity analysis is comprised of techniques to quantify the effects of the input variables on a set of outputs. In particular, sensitivity indices can be used to infer which input parameters most significantly affect the results of a computational model. With continually increasing computing power, sensitivity analysis has become an important technique by which to understand the behavior of large-scale computer simulations. Many sensitivity analysis methods rely on sampling from distributions of the inputs. Such sampling-based methods can be computationally expensive, requiring many evaluations of the simulation; in this case, the Sobol' method provides an easy and accurate way to compute variance-based measures, provided a sufficient number of model evaluations are available. As an alternative, meta-modeling approaches have been devised to approximate the response surface and estimate various measures of sensitivity. In this work, we consider a variety of sensitivity analysis methods, including different sampling strategies, different meta-models, and different ways of evaluating variance-based sensitivity indices. The problem we consider is the 1-D Riemann problem. By a careful choice of inputs, discontinuous solutions are obtained, leading to discontinuous response surfaces; such surfaces can be particularly problematic for meta-modeling approaches. The goal of this study is to compare the estimated sensitivity indices with exact values and to evaluate the convergence of these estimates with increasing samples sizes and under an increasing number of meta-model evaluations. - Highlights: ► Sensitivity analysis techniques for a model shock physics problem are compared. ► The model problem and the sensitivity analysis problem have exact solutions. ► Subtle details of the method for computing sensitivity indices can affect the results.
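    Variance-based (Sobol') indices of the kind compared here can be sketched with the SALib package (an assumption; the discontinuous toy model echoes the Riemann-problem difficulty noted above):

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {"num_vars": 3,
           "names": ["x1", "x2", "x3"],
           "bounds": [[0, 1], [0, 1], [0, 1]]}

# Saltelli sampling generates the cross-sampled design Sobol' analysis needs.
X = saltelli.sample(problem, 1024)
Y = np.where(X[:, 0] > 0.5, 1.0, 0.0) + 0.3 * X[:, 1]   # discontinuous response
Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], np.round(Si["S1"], 2))))  # first-order indices
```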

  16. Using BMDP and SPSS for a Q factor analysis.

    Science.gov (United States)

    Tanner, B A; Koning, S M

    1980-12-01

    While Euclidean distances and Q factor analysis may sometimes be preferred to correlation coefficients and cluster analysis for developing a typology, commercially available software does not always facilitate their use. Commands are provided for using BMDP and SPSS in a Q factor analysis with Euclidean distances.
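    In the spirit of the record, a hedged numpy sketch of the Q-technique step (factoring persons rather than variables); the data are random placeholders, and the paper's preferred Euclidean-distance variant would start from pairwise distances instead of correlations:

    ```python
    import numpy as np

    X = np.random.default_rng(1).normal(size=(30, 8))  # 30 persons x 8 variables (toy)

    # Q-technique: correlate persons across variables (30 x 30 matrix) ...
    R_q = np.corrcoef(X)

    # ... then extract factors from the person-by-person matrix.
    eigvals, eigvecs = np.linalg.eigh(R_q)
    order = np.argsort(eigvals)[::-1]
    k = 2                                     # number of person "types" kept
    loadings = eigvecs[:, order[:k]] * np.sqrt(eigvals[order[:k]])
    print(loadings.round(2))                  # each person's loading per type

    # The Euclidean-distance variant would begin instead from
    # scipy.spatial.distance.squareform(pdist(X)).
    ```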

  17. Novel technique for coal pyrolysis and hydrogenation production analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.

    1990-01-01

    The overall objective of this study is to establish vacuum ultraviolet photoionization-MS and VUV pulsed EI-MS as useful tools for a simpler and more accurate direct mass spectrometric measurement of a broad range of hydrocarbon compounds in complex mixtures for ultimate application to the study of the kinetics of coal hydrogenation and pyrolysis processes. The VUV-MS technique allows ionization of a broad range of species with minimal fragmentation. Many compounds of interest can be detected with the 118 nm wavelength, but additional compound selectivity is achievable by tuning the wavelength of the photo-ionization source in the VUV. Resonant four wave mixing techniques in Hg vapor will allow near continuous tuning from about 126 to 106 nm. This technique would facilitate the scientific investigation of coal upgrading processes such as pyrolysis and hydrogenation by allowing accurate direct analysis of both stable and intermediate reaction products.

  18. I. Forensic data analysis by pattern recognition. Categorization of white bond papers by elemental composition. II. Source identification of oil spills by pattern recognition analysis of natural elemental composition. III. Improving the reliability of factor analysis of chemically measured analytical data by utilizing the measured analytical uncertainty. IV. Elucidating the structure of some clinical data

    International Nuclear Information System (INIS)

    Duewer, D.L.

    1977-01-01

    Pattern recognition techniques are applied to the analysis of white bond papers and the problem of determining the source of an oil spill. In each case, an elemental analysis by neutron activation is employed. For the determination of the source of oil spills, the field sample was weathered prior to activation analysis. A procedure for including measured analytical uncertainty into data analysis methodology is discussed, with particular reference to factor analysis. The suitability of various dispersion matrices and matrix rank determination criteria for data having analytical uncertainty is investigated. A criterion useful for judging the number of factors insensitive to analytical uncertainty is presented. A model data structure for investigating the behavior of factor analysis techniques in a known, controlled manner is described and analyzed. A chemically interesting test data base having analytical uncertainty is analyzed and compared with the model data. The data structure of 22 blood constituents in three categories of liver disease (viral or toxic hepatitis, alcoholic liver diseases and obstructive processes) is studied using various statistical and pattern recognition techniques. Comparison of classification results on the original data, in combination with principal component analysis, suggests a possible underlying structure for the data. This model structure is tested by the application of two simple data transformations. Analysis of the transformed data appears to confirm that some basic understanding of the studied data has been achieved.

  19. Exploring Technostress: Results of a Large Sample Factor Analysis

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2016-06-01

    Full Text Available With reference to the results of a large sample factor analysis, the article aims to propose a frame for examining technostress in a population. A survey and principal component analysis of a sample consisting of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combining 68 questions explain 59.13 per cent of the dispersion of the answers. Based on the factor analysis, the questionnaire was reframed and prepared to reasonably analyze the respondents' answers, revealing technostress causes and consequences as well as technostress prevalence in the population in a statistically validated pattern. Key elements of technostress identified by the factor analysis can serve for the construction of technostress measurement scales in further research.
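    An illustrative sketch of the principal component step described, assuming a hypothetical CSV of the 68 items; it is not the authors' instrument or software:

    ```python
    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    answers = pd.read_csv("technostress_survey.csv")   # hypothetical 68-item file
    Z = StandardScaler().fit_transform(answers)

    pca = PCA(n_components=13)                 # 13 components, as in the study
    scores = pca.fit_transform(Z)
    print(f"variance explained: {pca.explained_variance_ratio_.sum():.2%}")
    ```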

  20. New quantitative safety standards: different techniques, different results?

    International Nuclear Information System (INIS)

    Rouvroye, J.L.; Brombacher, A.C.

    1999-01-01

    Safety Instrumented Systems (SIS) are used in the process industry to perform safety functions. Many factors can influence the safety of a SIS, such as system layout, diagnostics, testing and repair. In standards like the German DIN, no quantitative analysis is demanded (DIN V 19250 Grundlegende Sicherheitsbetrachtungen fuer MSR-Schutzeinrichtungen, Berlin, 1994; DIN/VDE 0801 Grundsaetze fuer Rechner in Systemen mit Sicherheitsaufgaben, Berlin, 1990). The analysis according to these standards is based on expert opinion and qualitative analysis techniques. New standards like the IEC 61508 (IEC 61508 Functional safety of electrical/electronic/programmable electronic safety-related systems, IEC, Geneve, 1997) and the ISA-S84.01 (ISA-S84.01.1996 Application of Safety Instrumented Systems for the Process Industries, Instrument Society of America, Research Triangle Park, 1996) require quantitative risk analysis but do not prescribe how to perform the analysis. Earlier publications of the authors (Rouvroye et al., Uncertainty in safety, new techniques for the assessment and optimisation of safety in process industry, D.W. Pyatt (ed), SERA-Vol. 4, Safety engineering and risk analysis, ASME, New York 1995; Rouvroye et al., A comparison study of qualitative and quantitative analysis techniques for the assessment of safety in industry, P.C. Cacciabue, I.A. Papazoglou (eds), Proceedings PSAM III conference, Crete, Greece, June 1996) have shown that different analysis techniques cover different aspects of system behaviour. This paper shows, by means of a case study, that different (quantitative) analysis techniques may lead to different results. The consequence is that the application of the standards to practical systems will not always lead to unambiguous results. The authors therefore propose a technique to overcome this major disadvantage.

  1. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and it is currently the focus of many research efforts. Besides its scientific interest in the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Exploring Technostress: Results of a Large Sample Factor Analysis

    OpenAIRE

    Jonušauskas, Steponas; Raišienė, Agota Giedrė

    2016-01-01

    With reference to the results of a large sample factor analysis, the article aims to propose the frame examining technostress in a population. The survey and principal component analysis of the sample consisting of 1013 individuals who use ICT in their everyday work was implemented in the research. 13 factors combine 68 questions and explain 59.13 per cent of the answers dispersion. Based on the factor analysis, questionnaire was reframed and prepared to reasonably analyze the respondents’ an...

  3. Analysis of factors influencing adoption of okra production ...

    African Journals Online (AJOL)

    Socio-economic factors influencing the adoption of okra technology packages in Enugu State, Nigeria were studied and analyzed in 2012. Purposive and multistage random sampling techniques were used in selecting communities, villages and okra farmers. The sample size was 90 okra farmers (45 Awgu and 45 Aninri okra ...

  4. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    Directory of Open Access Journals (Sweden)

    Peeyush Sahay

    2009-10-01

    Full Text Available Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high-sensitivity and high-selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS, cavity ringdown spectroscopy (CRDS, integrated cavity output spectroscopy (ICOS, cavity enhanced absorption spectroscopy (CEAS, cavity leak-out spectroscopy (CALOS, photoacoustic spectroscopy (PAS, quartz-enhanced photoacoustic spectroscopy (QEPAS, and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS. Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using the laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide, nitric oxide, etc. are commercially available. This review presents an update on the latest developments in laser-based breath analysis.

  5. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Matthew W. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing-waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.

  6. Comparison of Spares Logistics Analysis Techniques for Long Duration Human Spaceflight

    Science.gov (United States)

    Owens, Andrew; de Weck, Olivier; Mattfeld, Bryan; Stromgren, Chel; Cirillo, William

    2015-01-01

    As the durations and distances involved in human exploration missions increase, the logistics associated with the repair and maintenance becomes more challenging. Whereas the operation of the International Space Station (ISS) depends upon regular resupply from the Earth, this paradigm may not be feasible for future missions. Longer mission durations result in higher probabilities of component failures as well as higher uncertainty regarding which components may fail, and longer distances from Earth increase the cost of resupply as well as the speed at which the crew can abort to Earth in the event of an emergency. As such, mission development efforts must take into account the logistics requirements associated with maintenance and spares. Accurate prediction of the spare parts demand for a given mission plan and how that demand changes as a result of changes to the system architecture enables full consideration of the lifecycle cost associated with different options. In this paper, we utilize a range of analysis techniques - Monte Carlo, semi-Markov, binomial, and heuristic - to examine the relationship between the mass of spares and probability of loss of function related to the Carbon Dioxide Removal System (CRS) for a notional, simplified mission profile. The Exploration Maintainability Analysis Tool (EMAT), developed at NASA Langley Research Center, is utilized for the Monte Carlo analysis. We discuss the implications of these results and the features and drawbacks of each method. In particular, we identify the limitations of heuristic methods for logistics analysis, and the additional insights provided by more in-depth techniques. We discuss the potential impact of system complexity on each technique, as well as their respective abilities to examine dynamic events. This work is the first step in an effort that will quantitatively examine how well these techniques handle increasingly more complex systems by gradually expanding the system boundary.
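    As a much-simplified stand-in for tools like EMAT, the binomial/Poisson flavor of the comparison can be sketched as the probability that a fixed spares stock fails to cover random failures over a mission; the rate, duration, and stock level below are invented:

    ```python
    # Minimal Poisson sketch: P(loss of function) when k spares must
    # cover random failures over a mission (all numbers assumed).
    from scipy.stats import poisson

    failure_rate_per_day = 1 / 500.0     # assumed constant failure rate
    mission_days = 900                   # notional long-duration mission
    k_spares = 4

    lam = failure_rate_per_day * mission_days
    p_loss = poisson.sf(k_spares, lam)   # P(failures > spares on hand)
    print(f"P(loss of function) with {k_spares} spares = {p_loss:.3f}")
    ```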

  7. Evaluation of syngas production unit cost of bio-gasification facility using regression analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Yangyang; Parajuli, Prem B.

    2011-08-10

    Evaluation of the economic feasibility of a bio-gasification facility requires understanding its unit cost under different production capacities. The objective of this study was to evaluate the unit cost of syngas production at capacities from 60 through 1,800 Nm³/h using an economic model with three regression analysis techniques (simple regression, reciprocal regression, and log-log regression). The preliminary results of this study showed that the reciprocal regression technique gave the best-fit curve between unit cost and production capacity, with a sum of squared errors (SES) below 0.001 and a coefficient of determination (R²) of 0.996. The regression analysis determined a minimum unit cost of syngas production for micro-scale bio-gasification facilities of $0.052/Nm³, at a capacity of 2,880 Nm³/h. The results of this study suggest that to reduce cost, facilities should run at a high production capacity. In addition, the contribution of this technique could be a new categorical criterion for evaluating micro-scale bio-gasification facilities from the perspective of economic analysis.
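    A sketch of fitting the reciprocal form the study reports as best, with invented data points; the fitted coefficients are illustrative, not the paper's:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    capacity = np.array([60, 120, 300, 600, 900, 1200, 1800], float)    # Nm3/h
    unit_cost = np.array([0.50, 0.28, 0.14, 0.09, 0.07, 0.06, 0.055])   # $/Nm3

    def reciprocal(x, a, b):
        return a + b / x          # unit cost falls as capacity rises

    (a, b), _ = curve_fit(reciprocal, capacity, unit_cost)
    residuals = unit_cost - reciprocal(capacity, a, b)
    print(f"a={a:.4f}, b={b:.2f}, SSE={np.sum(residuals**2):.5f}")
    ```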

  8. Advanced analysis technique for the evaluation of linear alternators and linear motors

    Science.gov (United States)

    Holliday, Jeffrey C.

    1995-01-01

    A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.

  9. Lower Education Level Is a Risk Factor for Peritonitis and Technique Failure but Not a Risk for Overall Mortality in Peritoneal Dialysis under Comprehensive Training System.

    Science.gov (United States)

    Kim, Hyo Jin; Lee, Joongyub; Park, Miseon; Kim, Yuri; Lee, Hajeong; Kim, Dong Ki; Joo, Kwon Wook; Kim, Yon Su; Cho, Eun Jin; Ahn, Curie; Oh, Kook-Hwan

    2017-01-01

    Lower education level could be a risk factor for higher peritoneal dialysis (PD)-associated peritonitis, potentially resulting in technique failure. This study evaluated the influence of lower education level on the development of peritonitis, technique failure, and overall mortality. Patients over 18 years of age who started PD at Seoul National University Hospital between 2000 and 2012 with information on their academic background were enrolled. Patients were divided into three groups: middle school or lower (academic year ≤ 9, n = 102), high school (9 < academic year ≤ 12, n = 229), and higher than high school (academic year > 12, n = 324). Outcomes were analyzed using Cox proportional hazards models and competing risk regression. A total of 655 incident PD patients (60.9% male, age 48.4±14.1 years) were analyzed. During follow-up for 41 (interquartile range, 20-65) months, 255 patients (38.9%) experienced more than one episode of peritonitis, 138 patients (21.1%) underwent technique failure, and 78 patients (11.9%) died. After adjustment, the middle school or lower education group was at independent risk for peritonitis (adjusted hazard ratio [HR], 1.61; 95% confidence interval [CI], 1.10-2.36; P = 0.015) and technique failure (adjusted HR, 1.87; 95% CI, 1.10-3.18; P = 0.038), compared with the higher than high school education group. However, lower education was not associated with increased mortality either by as-treated (adjusted HR, 1.11; 95% CI, 0.53-2.33; P = 0.788) or intent-to-treat analysis (P = 0.726). Although lower education was a significant risk factor for peritonitis and technique failure, it was not associated with increased mortality in PD patients. Comprehensive training and multidisciplinary education may help overcome the disadvantage of lower education level in PD.

  10. Study on Performance Shaping Factors (PSFs) Quantification Method in Human Reliability Analysis (HRA)

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, Inseok Jang; Seong, Poong Hyun; Park, Jinkyun; Kim, Jong Hyun

    2015-01-01

    The purpose of HRA implementation is 1) to achieve the human factors engineering (HFE) design goal of providing operator interfaces that will minimize personnel errors and 2) to conduct an integrated activity to support probabilistic risk assessment (PRA). For these purposes, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), simplified plant analysis risk human reliability assessment (SPAR-H), the cognitive reliability and error analysis method (CREAM), and so on. In performing HRA, the conditions that influence human performance have been represented via several context factors called performance shaping factors (PSFs). PSFs are aspects of the human's individual characteristics, environment, organization, or task that specifically decrement or improve human performance, thus respectively increasing or decreasing the likelihood of human errors. Most HRA methods evaluate the weightings of PSFs by expert judgment, and explicit guidance for evaluating the weightings is not provided. It has been widely known that the performance of the human operator is one of the critical factors determining the safe operation of NPPs. HRA methods have been developed to identify the possibility and mechanism of human errors. In performing HRA methods, the effect of PSFs which may increase or decrease human error should be investigated. However, the effects of PSFs have so far been estimated by expert judgment. Accordingly, in order to estimate the effect of PSFs objectively, a quantitative framework to estimate PSFs by using PSF profiles is introduced in this paper.

  11. Use of nuclear techniques for coal analysis in exploration, mining and processing

    International Nuclear Information System (INIS)

    Clayton, C.G.; Wormald, M.R.

    1982-01-01

    Nuclear techniques have a long history of application in the coal industry, during exploration and especially during coal preparation, for the measurement of ash content. The preferred techniques are based on X- and gamma-ray scattering and borehole logging, and on-line equipment incorporating these techniques is now in worldwide routine use. However, gamma-ray techniques are mainly restricted to density measurement, and X-ray techniques are principally used for ash determinations. They have a limited range, and when used on-line some size reduction of the coal is usually required and a full elemental analysis is not possible. In particular, X- and gamma-ray techniques are insensitive to the principal elements in the combustible component and to many of the important elements in the mineral fraction. Neutron techniques, on the other hand, have a range which is compatible with on-line requirements, and all elements in the combustible component and virtually all elements in the mineral component can be observed. A complete elemental analysis of coal then allows the ash content and the calorific value to be determined on-line. This paper surveys the various nuclear techniques now in use and gives particular attention to the present state of development of neutron methods and to their advantages and limitations. Although it is shown that considerable further development and operational experience are still required, equipment now being introduced has a performance which matches many of the identified requirements, and an early improvement in specification can be anticipated.

  12. Factor analysis improves the selection of prescribing indicators

    DEFF Research Database (Denmark)

    Rasmussen, Hanne Marie Skyggedal; Søndergaard, Jens; Sokolowski, Ineta

    2006-01-01

    OBJECTIVE: To test a method for improving the selection of indicators of general practitioners' prescribing. METHODS: We conducted a prescription database study including all 180 general practices in the County of Funen, Denmark, approximately 472,000 inhabitants. Principal factor analysis was used ... appropriate and inappropriate prescribing, as revealed by the correlation of the indicators in the first factor. CONCLUSION: Correlation and factor analysis is a feasible method that assists the selection of indicators and gives better insight into prescribing patterns.

  13. Human factors analysis of incident/accident report

    International Nuclear Information System (INIS)

    Kuroda, Isao

    1992-01-01

    Human factors analysis of accidents and incidents involves difficulties of not only a technical but also a psychosocial nature. This report introduces some experiments with the 'variation diagram method', which can be extended to operational and managerial factors. (author)

  14. Text mining factor analysis (TFA) in green tea patent data

    Science.gov (United States)

    Rahmawati, Sela; Suprijadi, Jadi; Zulhanif

    2017-03-01

    Factor analysis has become one of the most widely used multivariate statistical procedures in applied research across a multitude of domains. There are two main types of analysis based on factor analysis: exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). Both EFA and CFA aim to model the relationships among a group of indicators and a latent variable, but they differ fundamentally in the a priori restrictions placed on the factor model. The method is applied to patent data for the green tea technology sector to determine the development of green tea technology in the world. Patent analysis is useful in identifying future technological trends in a specific field of technology. The patent database is obtained from the European Patent Organization (EPO). In this paper, the CFA model is applied to nominal data obtained from a presence-absence matrix; the CFA for nominal data is based on a tetrachoric correlation matrix. Meanwhile, the EFA model is applied to the titles from the dominant technology sector, after the titles are pre-processed using text mining.

  15. Technique for information retrieval using enhanced latent semantic analysis generating rank approximation matrix by factorizing the weighted morpheme-by-document matrix

    Science.gov (United States)

    Chew, Peter A; Bader, Brett W

    2012-10-16

    A technique for information retrieval includes parsing a corpus to identify a number of wordform instances within each document of the corpus. A weighted morpheme-by-document matrix is generated based at least in part on the number of wordform instances within each document of the corpus and based at least in part on a weighting function. The weighted morpheme-by-document matrix separately enumerates instances of stems and affixes. Additionally or alternatively, a term-by-term alignment matrix may be generated based at least in part on the number of wordform instances within each document of the corpus. At least one lower rank approximation matrix is generated by factorizing the weighted morpheme-by-document matrix and/or the term-by-term alignment matrix.
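    A generic latent-semantic sketch of the core factorization step in the claim, a lower-rank approximation of a weighted term-by-document matrix; TF-IDF stands in here for the patent's morpheme-aware weighting function:

    ```python
    from sklearn.decomposition import TruncatedSVD
    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = ["stems and affixes are counted separately",
            "a weighting function scales morpheme counts",
            "low rank factors support retrieval"]

    X = TfidfVectorizer().fit_transform(docs)   # weighted (here TF-IDF) matrix
    svd = TruncatedSVD(n_components=2)          # lower-rank approximation
    doc_vecs = svd.fit_transform(X)             # documents in latent space
    print(doc_vecs.round(3))
    ```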

  16. Metabolomic analysis using porcine skin: a pilot study of analytical techniques

    OpenAIRE

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-01-01

    Background: Metabolic byproducts serve as indicators of chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples would serve as a critical foundation for this field but have not been developed. Objectives: We sought to determine the optimal cell lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. ...

  17. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis" (Figure 1-1). The objectives of this analysis are to develop BDCFs for the

  18. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Jondhale, K C; Waghmare, L M

    2010-01-01

    Illumination variation is one of the major challenges in face recognition. To deal with this problem, this paper presents a comparative analysis of three different techniques. First, the DCT is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low-frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest-neighbor classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on normalized images. PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA-based methods give satisfactory results under controlled lighting conditions, but their performance under large illumination variation is not satisfactory, so the performance of LDA is also checked on normalized images. Experimental results on the Yale B and ORL databases show that applying PCA and LDA to the normalized dataset improves the performance significantly for face images with large illumination variations.
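    A hedged sketch of the DCT-based illumination compensation described: low-frequency coefficients of the log image are discarded while the overall brightness (DC term) is kept; the cutoff and test image are arbitrary choices, not the paper's settings:

    ```python
    import numpy as np
    from scipy.fft import dctn, idctn

    def normalize_illumination(img, cut=5):
        """img: 2-D grayscale array; zero out an upper-left (low-frequency)
        block of DCT coefficients, keeping the DC term's mean level."""
        log_img = np.log1p(img.astype(float))
        C = dctn(log_img, norm="ortho")
        mean_level = C[0, 0]
        C[:cut, :cut] = 0.0          # remove slow illumination variation
        C[0, 0] = mean_level         # restore overall brightness
        return np.expm1(idctn(C, norm="ortho"))

    face = np.random.default_rng(0).uniform(0, 255, (64, 64))  # stand-in image
    print(normalize_illumination(face).shape)
    ```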

  19. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    Won, Il Ko; Jong, Won Choi; Chul, Hyung Kang; Jae, Sol Lee; Kun, Jai Lee

    1998-01-01

    A simple approach is described for incorporating the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e., changes of distribution type for input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used for converting cost inputs into the statistical parameters of assumed probabilistic distributions. It was found that Monte Carlo simulation by a Latin Hypercube Sampling technique and subsequent sensitivity analyses were useful for examining the uncertainty propagation of fuel cycle costs, and could provide information to decision makers more efficiently than a deterministic method. The change of distribution types of input parameters showed that the values calculated by the deterministic method sit around the 40th-50th percentile of the output distribution function calculated by probabilistic simulation. Assuming lognormal distributions of inputs, however, the values calculated by the deterministic method sit around the 85th percentile of the output distribution function. The sensitivity analysis indicated that the front-end components are generally more sensitive than the back-end components, of which the uranium purchase cost is the most important factor of all. The discount rate also contributes substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in applications to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which has high cost uncertainty.
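    A minimal sketch of the Latin Hypercube propagation described, with invented lognormal cost components; it reproduces the kind of percentile comparison the abstract reports, not the paper's actual inputs:

    ```python
    # Hedged sketch: LHS propagation of cost uncertainty (all values assumed).
    from scipy.stats import qmc, lognorm

    sampler = qmc.LatinHypercube(d=3, seed=42)
    u = sampler.random(n=10_000)                    # stratified [0,1) samples

    # Map to three lognormal cost components ($/kgU, illustrative params).
    uranium  = lognorm(s=0.4, scale=60).ppf(u[:, 0])
    enrich   = lognorm(s=0.3, scale=90).ppf(u[:, 1])
    disposal = lognorm(s=0.5, scale=40).ppf(u[:, 2])

    total = uranium + enrich + disposal
    det = 60 + 90 + 40                              # deterministic point estimate
    pct = (total < det).mean() * 100
    print(f"deterministic value sits at the {pct:.0f}th percentile")
    ```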

  20. Analysis of technological, institutional and socioeconomic factors ...

    African Journals Online (AJOL)

    Analysis of technological, institutional and socioeconomic factors that influence poor reading culture among secondary school students in Nigeria. ... Proliferation and availability of smart phones, chatting culture and social media were identified as technological factors influencing poor reading culture among secondary ...

  1. Exploratory Analysis of the Factors Affecting Consumer Choice in E-Commerce: Conjoint Analysis

    Directory of Open Access Journals (Sweden)

    Elena Mazurova

    2017-05-01

    Full Text Available According to previous studies of online consumer behaviour, three factors are the most influential on purchasing behavior: brand, colour, and position of the product on the screen. However, the simultaneous influence of these three factors on the consumer decision-making process has not been investigated previously. In this work we aim to execute a comprehensive study of the influence of these three factors. To answer our main research questions, we conducted an experiment with 96 different combinations of the three attributes and, using statistical analyses such as conjoint analysis, t-tests and Kendall analysis, identified that the most influential factor in the online consumer decision-making process is brand; the second most important attribute is colour, estimated half as important as brand; and the least important attribute is the position on the screen. Additionally, we identified the main differences regarding consumers' stated and revealed preferences for these three attributes.
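    A hedged sketch of one common conjoint formulation, part-worth estimation by dummy-coded OLS; the file and column names are hypothetical, and the authors' exact estimation procedure may differ:

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical ratings file: one row per profile shown to a respondent.
    df = pd.read_csv("conjoint_ratings.csv")  # columns: rating, brand, colour, position

    fit = smf.ols("rating ~ C(brand) + C(colour) + C(position)", data=df).fit()
    print(fit.params)   # part-worth utilities relative to the base levels

    # Relative importance of an attribute: the range of its part-worths
    # divided by the summed ranges across all attributes.
    ```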

  2. Analysis of Cell Phone Usage Using Correlation Techniques

    OpenAIRE

    T S R MURTHY; D. SIVA RAMA KRISHNA

    2011-01-01

    The present paper is a sample survey analysis examined using correlation techniques. The usage of mobile phones is clearly almost unavoidable these days, and as such the authors have made a systematic survey, through a well-prepared questionnaire, on the use of mobile phones to the maximum extent. The samples cover various economic groups across a population of over one lakh people. The results are scientifically categorized and interpreted to match the ground reality.

  3. Rate transient analysis for homogeneous and heterogeneous gas reservoirs using the TDS technique

    International Nuclear Information System (INIS)

    Escobar, Freddy Humberto; Sanchez, Jairo Andres; Cantillo, Jose Humberto

    2008-01-01

    In this study, pressure test analysis for wells flowing under constant wellbore pressure in homogeneous and naturally fractured gas reservoirs using the TDS technique is introduced. Although constant-rate production is assumed in the development of conventional well test analysis methods, constant-pressure production conditions are sometimes used in the oil and gas industry. The constant-pressure technique, or rate transient analysis, is better known as decline curve analysis, under which the rate is allowed to decline instead of the wellbore pressure. The TDS technique, used ever more widely (even in the best-known software packages, although without its trade name), uses the log-log plot to analyze pressure and pressure derivative test data to identify unique features from which exact analytical expressions are derived to easily estimate reservoir and well parameters. For this case, the fingerprint characteristics from the log-log plot of the reciprocal rate and the reciprocal rate derivative were employed to obtain the analytical expressions used for the interpretation analysis. Many simulation experiments demonstrate the accuracy of the new method. Synthetic examples are shown to verify the effectiveness of the proposed methodology.
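    A small numpy sketch of the log-log diagnostic underlying this approach for constant-pressure tests: the reciprocal rate and its derivative with respect to ln(t), computed here on synthetic data rather than the paper's examples:

    ```python
    import numpy as np

    t = np.logspace(-2, 3, 200)                 # hours (synthetic)
    q = 1000.0 / np.sqrt(t + 1.0)               # synthetic declining rate
    recip_q = 1.0 / q

    # t * d(1/q)/dt, computed as d(1/q)/d(ln t) on the nonuniform grid.
    deriv = np.gradient(recip_q, np.log(t))

    # Flow regimes are read off the slopes and levels of recip_q and
    # deriv on a log-log plot versus t (the "fingerprint" features).
    print(deriv[:5])
    ```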

  4. A factor analysis to find critical success factors in retail brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    2013-03-01

    Full Text Available The present exploratory study aims to find the critical components of retail brand among retail stores. The study seeks to build a brand name at the retail level and to find the important factors affecting it. Customer behavior is largely influenced when the first retail customer experience is formed. These factors have direct impacts on customer experience and satisfaction in the retail industry. The proposed study performs an empirical investigation on two well-known retail stores located in the city of Tehran, Iran. Using a sample of 265 regular customers, the study applies factor analysis and extracts four main factors (related brand, product benefits, customer welfare strategy and corporate profits) from the 31 factors existing in the literature.

  5. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and presence of several interferences. Sample preparation is a critical step and the main source of uncertainties in the analysis of environmental samples, and it is usually laborious, high cost, time consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedure, and application of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  6. Factors that impact the outcome of endoscopic correction of vesicoureteral reflux: a multivariate analysis.

    Science.gov (United States)

    Kajbafzadeh, Abdol-Mohammad; Tourchi, Ali; Aryan, Zahra

    2013-02-01

    To identify independent factors that may predict vesicoureteral reflux (VUR) resolution after endoscopic treatment using dextranomer/hyaluronic acid copolymer (Deflux) in children free of anatomical anomalies. A retrospective study was conducted in our pediatric referral center from 1998 to 2011 on children with primary VUR who underwent endoscopic injection of Deflux with or without concomitant autologous blood injection (called HABIT or HIT, respectively). Children with secondary VUR or incomplete records were excluded from the study. Potential factors were divided into three categories: preoperative, intraoperative and postoperative. Success was defined as no sign of VUR on postoperative voiding cystourethrogram. Univariate and multivariate logistic regression models were constructed to identify independent factors that may predict success. The odds ratio (OR) and 95 % confidence interval (95 % CI) for prediction of success were estimated for each factor. Of 485 children who received Deflux injection, a total of 372, with a mean age of 3.10 years (range 6 months to 12 years), were included in the study, and endoscopic management was successful in 322 (86.6 %) of them. Of the patients, 185 (49.7 %) underwent the HIT and 187 (50.3 %) the HABIT technique. On univariate analysis, VUR grade from the preoperative category (OR = 4.79, 95 % CI = 2.22-10.30, p = 0.000), operation technique (OR = 0.33, 95 % CI = 0.17-0.64, p = 0.001) and presence of a mound on postoperative sonography (OR = 0.06, 95 % CI = 0.02-0.16, p = 0.000) were associated with success. On multivariate analysis, preoperative VUR grade (OR = 4.85, 95 % CI = 2.49-8.96, p = 0.000) and identification of a mound on postoperative sonography (OR = 0.07, 95 % CI = 0.01-0.18, p = 0.000) remained independent success predictors. Based on this study, successful VUR correction after the endoscopic injection of Deflux can be predicted with respect to preoperative VUR grade and presence of a mound after operation.
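    A generic sketch of the univariate/multivariate logistic step the authors describe; the data file and variable names are hypothetical stand-ins:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("vur_outcomes.csv")   # hypothetical: success, vur_grade, mound_on_sono

    X = sm.add_constant(df[["vur_grade", "mound_on_sono"]])
    fit = sm.Logit(df["success"], X).fit(disp=False)

    odds_ratios = np.exp(fit.params)
    ci = np.exp(fit.conf_int())            # 95% CI on the OR scale
    print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
    ```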

  7. Potential barriers to the application of multi-factor portfolio analysis in public hospitals: evidence from a pilot study in the Netherlands.

    Science.gov (United States)

    Pavlova, Milena; Tsiachristas, Apostolos; Vermaeten, Gerhard; Groot, Wim

    2009-01-01

    Portfolio analysis is a business management tool that can assist health care managers to develop new organizational strategies. The application of portfolio analysis to US hospital settings has been frequently reported. In Europe however, the application of this technique has received little attention, especially concerning public hospitals. Therefore, this paper examines the peculiarities of portfolio analysis and its applicability to the strategic management of European public hospitals. The analysis is based on a pilot application of a multi-factor portfolio analysis in a Dutch university hospital. The nature of portfolio analysis and the steps in a multi-factor portfolio analysis are reviewed along with the characteristics of the research setting. Based on these data, a multi-factor portfolio model is developed and operationalized. The portfolio model is applied in a pilot investigation to analyze the market attractiveness and hospital strengths with regard to the provision of three orthopedic services: knee surgery, hip surgery, and arthroscopy. The pilot portfolio analysis is discussed to draw conclusions about potential barriers to the overall adoption of portfolio analysis in the management of a public hospital. Copyright (c) 2008 John Wiley & Sons, Ltd.

  8. Current applications of miniaturized chromatographic and electrophoretic techniques in drug analysis.

    Science.gov (United States)

    Aturki, Zeineb; Rocco, Anna; Rocchi, Silvia; Fanali, Salvatore

    2014-12-01

    In the last decade, miniaturized separation techniques have become greatly popular in pharmaceutical analysis. Miniaturized separation methods are increasingly utilized in all processes of drug discovery as well as quality control of pharmaceutical preparation. The great advantages presented by the analytical miniaturized techniques, including high separation efficiency and resolution, rapid analysis and minimal consumption of reagents and samples, make them an attractive alternative to the conventional chromatographic methods for drug analysis. The purpose of this review is to give a general overview of the applicability of capillary electrophoresis (CE), capillary electrochromatography (CEC) and micro/capillary/nano-liquid chromatography (micro-LC/CLC/nano-LC) for the analysis of pharmaceutical formulations, active pharmaceutical ingredients (API), drug impurity testing, chiral drug separation, determination of drugs and metabolites in biological fluids. The results concerning the use of CEC, micro-LC, CLC, and nano-LC in the period 2009-2013, while for CE, those from 2012 up to the review draft are here summarized and some specific examples are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Quantitative chemical analysis of lead in canned chillis by spectrophotometric and nuclear techniques

    International Nuclear Information System (INIS)

    Sanchez Paz, L.A.

    1991-01-01

    The objectives of this work are to quantify the lead content in two types of canned chilli from three trademarks, determining whether it is within the maximum permissible level (2 ppm); to compare two trademarks offered in both flask and can presentations, to determine the effect of the filling on the final lead content; and to make a comparative study of the techniques in terms of accuracy, linearity and sensitivity. The techniques used were atomic absorption spectrophotometry, plasma emission spectrometry and X-ray fluorescence. The preliminary treatment of the samples was by calcination, followed by dissolution of the ashes in acid medium and dilution to a determined volume for analysis by atomic absorption and plasma emission. For the analysis by X-ray fluorescence, after solubilizing the ashes, the lead is precipitated with PCDA (pyrrolidine carbodithioic ammonium acid) and filtered, and the filter paper is dried and counted directly. The standards are prepared following the same procedure as for the samples, using lead Titrisol solution. For each technique the recovery percentage is determined by the addition of a known amount. Calibration curves were plotted for each technique, and all three were found to be linear in the established working range. The recovery percentage in all three cases is above ninety-five percent. By means of a variance analysis it was determined that the lead content in the samples does not exceed 2 ppm, and that the lead content in canned chillis is higher than that in glass containers (1.7 and 0.4 ppm, respectively). The X-ray fluorescence results differ from those obtained by the other two techniques because its sensitivity is lower. The most advisable techniques for this kind of analysis are atomic absorption spectrophotometry and plasma emission. (Author)

  10. A novel graphical technique for Pinch Analysis applications: Energy Targets and grassroots design

    International Nuclear Information System (INIS)

    Gadalla, Mamdouh A.

    2015-01-01

    Graphical abstract: a new HEN graphical design. Highlights: a new graphical technique for heat exchanger network design; Pinch Analysis principles and design rules are better interpreted; graphical guidelines for optimum heat integration; new temperature-based graphs provide user-interactive features. Abstract: Pinch Analysis has for decades been a leading tool for energy integration in retrofit and design. This paper presents a new graphical technique, based on Pinch Analysis, for the grassroots design of heat exchanger networks. In the new graph, the temperatures of hot streams are plotted against those of the cold streams. The temperature-temperature graph is constructed with the temperatures of hot and cold streams as straight lines: horizontal lines for hot streams and vertical lines for cold streams. The graph is applied to determine the pinch temperatures and Energy Targets. It is then used to graphically synthesise a complete exchanger network achieving the Energy Targets. Within the new graph, exchangers are represented by inclined straight lines whose slopes are proportional to the ratio of heat capacities and flows. Pinch Analysis principles for design are easily interpreted using this new graphical technique to design a complete exchanger network. Network designs achieved by the new technique can guarantee maximum heat recovery. The new technique can also be employed to simulate basic designs of heat exchanger networks. The strengths of the new tool are that it is simply applied using computers, requires no commercial software, and can be used for academic purposes and engineering education.
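    For comparison, the Energy Targets that the new graph is designed to reproduce can be computed with the classical problem-table (cascade) algorithm; the sketch below is that textbook method, not the paper's graphical technique, run on a standard four-stream example:

    ```python
    def energy_targets(streams, dt_min=10.0):
        """streams: list of (T_supply, T_target, CP); hot streams have
        T_supply > T_target. Returns (Q_hot_min, Q_cold_min, T_pinch)."""
        # Shift hot streams down and cold streams up by dt_min/2.
        shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp) if ts > tt
                   else (ts + dt_min / 2, tt + dt_min / 2, cp)
                   for ts, tt, cp in streams]

        bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)
        cascade = [0.0]
        for hi, lo in zip(bounds, bounds[1:]):
            net = 0.0
            for ts, tt, cp in shifted:
                if max(ts, tt) >= hi and min(ts, tt) <= lo:  # stream spans interval
                    net += cp * (hi - lo) * (1.0 if ts > tt else -1.0)
            cascade.append(cascade[-1] + net)

        q_hot = -min(cascade)                    # minimum hot utility
        feasible = [c + q_hot for c in cascade]
        # Pinch = bound where the feasible cascade touches zero (shifted scale).
        t_pinch = bounds[min(range(len(bounds)), key=lambda i: abs(feasible[i]))]
        return q_hot, feasible[-1], t_pinch

    # Classic four-stream example: targets ~7.5 (hot), ~10.0 (cold), pinch 145.
    streams = [(250, 40, 0.15), (200, 80, 0.25),   # hot streams (T in C, CP in MW/K)
               (20, 180, 0.20), (140, 230, 0.30)]  # cold streams
    print(energy_targets(streams))
    ```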

  11. Activated sludge characterization through microscopy: A review on quantitative image analysis and chemometric techniques

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Daniela P. [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Amaral, A. Luís [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); Ferreira, Eugénio C., E-mail: ecferreira@deb.uminho.pt [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal)

    2013-11-13

    Highlights: quantitative image analysis shows potential to monitor activated sludge systems; staining techniques increase the potential for detection of operational problems; chemometrics combined with quantitative image analysis is valuable for process monitoring. Abstract: In wastewater treatment processes, and particularly in activated sludge systems, efficiency is quite dependent on the operating conditions, and a number of problems may arise due to sludge structure and proliferation of specific microorganisms. In fact, bacterial communities and protozoa identification by microscopy inspection is already routinely employed in a considerable number of cases. Furthermore, quantitative image analysis techniques have been increasingly used throughout the years for the assessment of aggregates and filamentous bacteria properties. These procedures are able to provide an ever growing amount of data for wastewater treatment processes in which chemometric techniques can be a valuable tool. However, the determination of microbial communities' properties remains a current challenge in spite of the great diversity of microscopy techniques applied. In this review, activated sludge characterization is discussed highlighting the aggregates structure and filamentous bacteria determination by image analysis on bright-field, phase-contrast, and fluorescence microscopy. An in-depth analysis is performed to summarize the many new findings that have been obtained, and future developments for these biological processes are further discussed.

  12. Technique for determining cesium-137 in milk excluding ashing

    International Nuclear Information System (INIS)

    Antonova, V.A.; Prokof'ev, O.N.

    1984-01-01

    To simplify the preparation of milk samples for the radiochemical determination of 137Cs in laboratory sanitary and hygienic investigations, a rapid analysis technique that excludes sample ashing has been developed. The technique comprises mixing the sample with silica gel for an hour, desorbing 137Cs from the silica gel into a 1N solution of hydrochloric acid, and radiochemical analysis of the hydrochloric solution to determine 137Cs. Comparison of the results obtained by this method with those obtained with ashing shows perfect agreement. To take into account the incomplete adsorption of 137Cs by silica gel, a correction factor is applied in the calculation formulae.

  13. Use of the neutron activation technique: soil-plant transfer factor

    International Nuclear Information System (INIS)

    Silva, Wellington Ferrari da; Menezes, Maria Ângela de B.C.; Marques, Douglas José

    2017-01-01

    Recent studies have demonstrated the importance of the soil-plant transfer factor in the absorption and translocation of chemical elements, making it possible to support better decision-making for successive plantings. To determine these values, the content of a chemical element in the plant, or a part of it, is compared with the total content in the soil where it is grown. The objective of this study was to determine the concentrations of the chemical elements present in soil, leaves and corn grains by neutron activation analysis and to compare the resulting soil-plant transfer factors. The samples were collected on a property located in the region of Biquinhas, MG, and irradiated in the TRIGA MARK I IPR-R1 CDTN/CNEN nuclear reactor. The concentrations of Br, Ce, Fe, K, La, Na, Rb and Zn were determined. The soil-plant transfer factors for the elements varied, indicating a greater absorption capacity for potassium (K). (author)

  14. Use of the neutron activation technique: soil-plant transfer factor

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Wellington Ferrari da, E-mail: wferrari250@yahoo.com.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Programa de Pós-Graduação em Ciências e Técnicas Nucleares; Menezes, Maria Ângela de B.C., E-mail: menezes@cdtn.br [Centro Desenvolvimento da Tecnologia Nuclear (SERTA/CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Serviço de Técnicas Analíticas. Laboratório de Ativação Neutrônica; Marques, Douglas José, E-mail: douglasjmarques81@yahoo.com.br [Universidade José do Rosário Vellano, Alfenas, MG (Brazil). Setor de Olericultura e Experimentação em Agricultura Orgânica

    2017-07-01

    Recent studies have demonstrated the importance of the soil-plant transfer factor in the absorption and translocation of chemical elements, making it possible to support better decision-making for successive plantings. To determine these values, the content of a chemical element in the plant, or a part of it, is compared with the total content in the soil where it is grown. The objective of this study was to determine the concentrations of the chemical elements present in soil, leaves and corn grains by neutron activation analysis and to compare the resulting soil-plant transfer factors. The samples were collected on a property located in the region of Biquinhas, MG, and irradiated in the TRIGA MARK I IPR-R1 CDTN/CNEN nuclear reactor. The concentrations of Br, Ce, Fe, K, La, Na, Rb and Zn were determined. The soil-plant transfer factors for the elements varied, indicating a greater absorption capacity for potassium (K). (author)
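    The transfer factor itself is the simple ratio TF = C_plant / C_soil on a dry-weight basis; a sketch with made-up concentrations (the records above do not report numeric values):

    ```python
    # Invented concentrations, mg/kg dry weight.
    soil  = {"K": 1200.0, "Fe": 900.0, "Zn": 45.0, "Br": 6.0}
    grain = {"K": 3100.0, "Fe": 30.0,  "Zn": 20.0, "Br": 1.2}

    tf = {el: grain[el] / soil[el] for el in soil}   # TF = C_plant / C_soil
    for el, v in sorted(tf.items(), key=lambda kv: -kv[1]):
        print(f"{el}: TF = {v:.2f}")   # K comes out highest, as in the study
    ```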

  15. Multi-criterion analysis technique in a process of quality management

    OpenAIRE

    A. Gwiazda

    2007-01-01

    Purpose: The aim of this paper is to present a critical analysis of some multi-criteria techniques applied in the area of quality management. It is strongly argued that some solutions in this scientific area are characterized by non-methodological approaches. Design/methodology/approach: The research methodology in the presented work has been based on theoretical analysis of quality management tools and on empirical research. Findings: The proposals for improving the main quality to...

  16. Nonlinear analysis techniques for use in the assessment of high-level waste tank structures

    International Nuclear Information System (INIS)

    Moore, C.J.; Julyk, L.J.; Fox, G.L.; Dyrness, A.D.

    1991-01-01

    Reinforced concrete in combination with a steel liner has had a wide application to structures containing hazardous material. The buried double-shell waste storage tanks at the US Department of Energy's Hanford Site use this construction method. The generation and potential ignition of combustible gases within the primary tank is postulated to develop beyond-design-basis internal pressure and possible impact loading. The scope of this paper includes the illustration of analysis techniques for the assessment of these beyond-design-basis loadings. The analysis techniques include the coupling of the gas dynamics with the structural response, the treatment of reinforced concrete in regimes of inelastic behavior, and the treatment of geometric nonlinearities. The techniques and software tools presented provide a powerful nonlinear analysis capability for storage tanks

  17. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.

  18. Factor analysis for exercise stress radionuclide ventriculography

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Yasuda, Mitsutaka; Oku, Hisao; Ikuno, Yoshiyasu; Takeuchi, Kazuhide; Takeda, Tadanao; Ochi, Hironobu

    1987-01-01

    Using factor analysis, a new image-processing technique in exercise stress radionuclide ventriculography, changes in factors associated with exercise were evaluated in 14 patients with angina pectoris or old myocardial infarction. The patients were imaged in the left anterior oblique projection, and three factor images were presented on a color-coded scale. Abnormal factors (AF) were observed in 6 patients before exercise, 13 during exercise, and 4 after exercise. In 7 patients, the occurrence of AF was associated with exercise. Five of them became free from AF after exercise. Three patients showing AF before exercise had aggravation of AF during exercise. Overall, the occurrence or aggravation of AF was associated with exercise in ten (71%) of the patients. The other three patients, however, had disappearance of AF during exercise. In the last patient, no AF was observed throughout the study. In view of the high incidence of AF associated with exercise, factor analysis may have potential for evaluating cardiac reserve from the viewpoint of left ventricular wall motion abnormality. (Namekawa, K.)

  19. The Thoron Issue: Monitoring Activities, Measuring Techniques and Dose Conversion Factors (invited paper)

    International Nuclear Information System (INIS)

    Nuccetelli, C.; Bochicchio, F.

    1998-01-01

    The health risk due to the presence of thoron indoors is usually neglected because of its generally low concentration in indoor environments, which is essentially caused by its short half-life. However, in certain not uncommon situations, such as when thorium-rich building materials are used, thoron (220Rn) may represent a significant source of radioactive exposure. In recent years, renewed interest has led to more intensive monitoring of thoron gas and its decay products. A tentatively comprehensive summary of these measurement results and a review of the most innovative measurement techniques for 220Rn are presented here. Finally, the dose-exposure conversion factors currently used for thoron decay products are analysed, highlighting their weaker basis compared with those for radon. (author)

  20. Analysis of corrosion-product transport using nondestructive XRF and MS techniques

    International Nuclear Information System (INIS)

    Sawicka, B.D.; Sawicki, J.A.

    1998-01-01

    This paper describes the application of X-ray fluorescence (XRF) and Moessbauer spectroscopy (MS) techniques to monitor corrosion-product transport (CPT) in water circuits of nuclear reactors. The combination of XRF and MS techniques was applied in studies of CPT crud filters from both primary- and secondary-side water circuits (i.e., radioactive and nonradioactive specimens) of CANDU reactors. The XRF-MS method allows nondestructive analysis of species collected on filters and provides more complete information about corrosion products than commonly used digestive methods of chemical analysis. Recent analyses of CPT specimens from the Darlington Nuclear Generating Station (NGS) primary side and the Bruce B NGS feedwater system are shown as examples. Some characteristics of primary and secondary water circuits are discussed using these new data. (author)

  1. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    Science.gov (United States)

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. These are important tools on which to base drug approval, clinical protocols, guideline formulation, and decision-making. However, this traditional technique yields only part of the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. Regardless of the clinical condition under evaluation, many interventions are usually available on the market, and few of them have been studied in head-to-head trials. This scenario precludes drawing conclusions from comparisons of all intervention profiles (e.g. efficacy and safety). The recent development and introduction of a new technique – usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons – has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting, and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions from pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of statistical methods, assumptions, and steps for performing the analysis. PMID:28503228
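
    To make the core idea concrete, here is a minimal sketch of the adjusted indirect comparison (the Bucher method) that network meta-analysis generalizes: the A vs C effect is estimated through a common comparator B. The effect estimates and standard errors below are invented for illustration.

```python
# Minimal sketch of the Bucher adjusted indirect comparison.
import math

d_AB, se_AB = -0.50, 0.15   # e.g., log odds ratio, A vs B (invented)
d_CB, se_CB = -0.20, 0.20   # log odds ratio, C vs B (invented)

d_AC = d_AB - d_CB                      # indirect A vs C estimate
se_AC = math.sqrt(se_AB**2 + se_CB**2)  # variances add for independent trials
lo, hi = d_AC - 1.96 * se_AC, d_AC + 1.96 * se_AC
print(f"indirect A vs C: {d_AC:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```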

  2. Depth profile analysis of thin TiOxNy films using standard ion beam analysis techniques and HERDA

    International Nuclear Information System (INIS)

    Markwitz, A.; Dytlewski, N.; Cohen, D.

    1999-01-01

    Ion beam assisted deposition is used to fabricate thin titanium oxynitride films (TiOxNy) at Industrial Research (typical film thickness 100 nm). At the Institute of Geological and Nuclear Sciences, the thin films are analysed using non-destructive standard ion beam analysis (IBA) techniques. High-resolution titanium depth profiles are measured with RBS using 1.5 MeV 4He+ ions. Non-resonant nuclear reaction analysis (NRA) is performed to investigate the amounts of O and N in the deposited films using the reactions 16O(d,p)17O at 920 keV and 14N(d,α)12C at 1.4 MeV. Using a combination of these nuclear techniques, the stoichiometry as well as the thickness of the layers is revealed. However, when oxygen and nitrogen depth profiles are required for investigating stoichiometric changes in the films, additional nuclear analysis techniques such as heavy ion elastic recoil detection (HERDA) have to be applied. With HERDA, depth profiles of N, O, and Ti are measured simultaneously. In this paper comparative IBA measurements of TiOxNy films with different compositions are presented and discussed

  3. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2003-07-25

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports (BSC 2003 [DIRS 160964]; BSC 2003 [DIRS 160965]; BSC 2003 [DIRS 160976]; BSC 2003 [DIRS 161239]; BSC 2003 [DIRS 161241]) contain detailed descriptions of the model input parameters. This report describes the biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs and conversion factors for the TSPA. The BDCFs will be used in performance assessment for calculating annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose from beta- and photon-emitting radionuclides.
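
    As an illustration of how BDCFs feed the downstream dose calculation described above, the sketch below multiplies assumed groundwater concentrations by assumed BDCF values and sums over radionuclides; none of the numbers come from the report.

```python
# Minimal sketch of applying BDCFs: annual dose = sum over radionuclides of
# (groundwater activity concentration) x (BDCF). All values are invented
# placeholders, not values from the analysis report.
conc_Bq_per_m3 = {"Tc-99": 120.0, "I-129": 3.5, "Np-237": 0.8}
bdcf_Sv_per_Bq_m3 = {"Tc-99": 1.1e-9, "I-129": 2.4e-7, "Np-237": 6.0e-7}

annual_dose_Sv = sum(conc_Bq_per_m3[n] * bdcf_Sv_per_Bq_m3[n]
                     for n in conc_Bq_per_m3)
print(f"all-pathway annual dose: {annual_dose_Sv:.3e} Sv")
```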

  4. Analysis of Factors Related to Hypopituitarism in Patients with Nonsellar Intracranial Tumor.

    Science.gov (United States)

    Lu, Song-Song; Gu, Jian-Jun; Luo, Xiao-Hong; Zhang, Jian-He; Wang, Shou-Sen

    2017-09-01

    Previous studies have suggested that postoperative hypopituitarism in patients with nonsellar intracranial tumors is caused by traumatic surgery. However, with the development of minimally invasive and precise neurosurgical techniques, the degree of injury to brain tissue has been reduced significantly, especially for parenchymal tumors. Therefore, understanding preexisting hypopituitarism and related risk factors can improve perioperative management for patients with nonsellar intracranial tumors. Chart data were collected retrospectively from 83 patients with nonsellar intracranial tumors admitted to our hospital from May 2014 to April 2015. Pituitary function of each subject was determined based on results of preoperative serum pituitary hormone analysis. Univariate and multivariate logistic regression methods were used to analyze relationships between preoperative hypopituitarism and factors including age, sex, history of hypertension and secondary epilepsy, course of disease, tumor mass effect, site of tumor, intracranial pressure (ICP), cerebrospinal fluid content, and pituitary morphology. A total of 30 patients (36.14%) presented with preoperative hypopituitarism in either 1 axis or multiple axes; 23 (27.71%) were affected in 1 axis, and 7 (8.43%) were affected in multiple axes. Univariate analysis showed that risk factors for preoperative hypopituitarism in patients with a nonsellar intracranial tumor include an acute or subacute course (≤3 months), intracranial hypertension (ICP >200 mm H2O), and mass effect; these factors remained statistically significant in the multivariate analysis. The incidence of preoperative hypopituitarism is thus high in patients with nonsellar intracranial tumors, and its occurrence is correlated with an acute or subacute course, intracranial hypertension, and mass effect. Copyright © 2017 Elsevier Inc. All rights reserved.
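
    A minimal sketch of the univariate and multivariate logistic regression workflow the study describes, run on simulated data (the patient data are not reproduced here); the factor names and effect sizes are assumptions.

```python
# Minimal sketch: univariate screens followed by a multivariate logistic
# model for binary risk factors. Data are simulated, not the study's.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 83
X = rng.integers(0, 2, size=(n, 3))          # acute course, high ICP, mass effect
logit_p = -1.5 + X @ np.array([0.9, 0.8, 0.7])
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Univariate screens, one candidate factor at a time
for j, name in enumerate(["acute course", "high ICP", "mass effect"]):
    res = sm.Logit(y, sm.add_constant(X[:, j])).fit(disp=0)
    print(f"{name}: OR={np.exp(res.params[1]):.2f}, p={res.pvalues[1]:.3f}")

# Multivariate model with all factors entered together
res = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print("adjusted odds ratios:", np.round(np.exp(res.params[1:]), 2))
```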

  5. Confirmatory Factor Analysis in the study of the Structure and Stability of Assessment Instruments: An example with the Self-Esteem Questionnaire (CA-14)

    Directory of Open Access Journals (Sweden)

    Juan Herrero

    2010-12-01

    This research presents a study of the factor structure and temporal stability of the Self-Esteem Questionnaire (CA-14) using Confirmatory Factor Analysis (CFA) techniques. The paper aims to offer a practical guide for researchers, paying special attention to data requirements, estimation methods suggested in the literature, recommended fit indices, and other circumstances that need to be taken into account when estimating CFA models. Various methodological strategies used when implementing CFA models are also shown: correlated errors, use of parameter constraints, multigroup analysis, etc.

  6. A Total Factor Productivity Toolbox for MATLAB

    NARCIS (Netherlands)

    B.M. Balk (Bert); J. Barbero (Javier); J.L. Zofío (José)

    2018-01-01

    Total Factor Productivity Toolbox is a new package for MATLAB that includes functions to calculate the main Total Factor Productivity (TFP) indices and their decompositions, based on Shephard's distance functions and using Data Envelopment Analysis (DEA) programming techniques.

  7. Hand function evaluation: a factor analysis study.

    Science.gov (United States)

    Jarus, T; Poremba, R

    1993-05-01

    The purpose of this study was to investigate hand function evaluations. Factor analysis with varimax rotation was used to assess the fundamental characteristics of the items included in the Jebsen Hand Function Test and the Smith Hand Function Evaluation. The study sample consisted of 144 subjects without disabilities and 22 subjects with Colles fracture. Results suggest a four factor solution: Factor I--pinch movement; Factor II--grasp; Factor III--target accuracy; and Factor IV--activities of daily living. These categories differentiated the subjects without Colles fracture from the subjects with Colles fracture. A hand function evaluation consisting of these four factors would be useful. Such an evaluation that can be used for current clinical purposes is provided.
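
    A minimal sketch of a four-factor varimax solution of the kind reported, assuming the Python factor_analyzer package and hypothetical item scores in place of the Jebsen/Smith item data.

```python
# Minimal sketch: exploratory factor analysis with varimax rotation.
# The item scores are random stand-ins, so the loadings are not meaningful.
import numpy as np
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(1)
items = rng.normal(size=(144, 12))           # 144 subjects x 12 test items

fa = FactorAnalyzer(n_factors=4, rotation="varimax")
fa.fit(items)
loadings = fa.loadings_                      # 12 x 4 rotated loading matrix
print(np.round(loadings, 2))
print("proportional variance:", np.round(fa.get_factor_variance()[1], 3))
```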

  8. Salivary SPECT and factor analysis in Sjoegren's syndrome

    International Nuclear Information System (INIS)

    Nakamura, T.; Oshiumi, Y.; Yonetsu, K.; Muranaka, T.; Sakai, K.; Kanda, S.; National Fukuoka Central Hospital

    1991-01-01

    Salivary SPECT and factor analysis in Sjoegren's syndrome were performed in 17 patients and 6 volunteers as controls. The ability of SPECT to detect small differences in the level of uptake can be used to separate glands from background even when uptake is reduced, as in patients with Sjoegren's syndrome. In the control and probable Sjoegren's syndrome groups, the uptake ratio of the submandibular gland to parotid gland on salivary SPECT (S/P ratio) was less than 1.0. However, in the definite Sjoegren's syndrome group, the ratio was more than 1.0. Moreover, the ratio in all patients with sialectasia, which is characteristic of Sjoegren's syndrome, was more than 1.0. Salivary factor analysis of normal parotid glands showed slowly increasing patterns of uptake, and normal submandibular glands had rapidly increasing patterns of uptake. However, in the definite Sjoegren's syndrome group, the factor analysis patterns were altered, with slowly increasing patterns dominating in both the parotid and submandibular glands. These results suggest that the S/P ratio in salivary SPECT and salivary factor analysis provide additional radiologic criteria for diagnosing Sjoegren's syndrome. (orig.)

  9. Recent development in mass spectrometry and its hyphenated techniques for the analysis of medicinal plants.

    Science.gov (United States)

    Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan

    2018-04-23

    Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and they are highly dependent on the comprehensive analysis of the chemical components in the medicinal plants. With the advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of medicinal plant analysis. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and their applications in the comprehensive analysis of medicinal plants. Applications of various MS and hyphenated MS techniques for the analysis of medicinal plants, including but not limited to one-dimensional and multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, are reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a major role as the platform for further research aimed at understanding both empirical therapeutic efficacy and quality control. Copyright © 2018 John Wiley & Sons, Ltd.

  10. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques using fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermoacoustic oscillations and background noise affect flame stability. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to determine flame stability automatically. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
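
    A minimal sketch of the PSD step described above, using scipy.signal.welch on a synthetic flame-luminosity signal; the 120 Hz oscillation and the sampling rate are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: estimate the power spectral density of a flame signal and
# locate the dominant oscillation peak, a proxy for (in)stability analysis.
import numpy as np
from scipy.signal import welch

fs = 2000.0                                  # sampling rate, Hz (assumed)
t = np.arange(0, 4.0, 1 / fs)
rng = np.random.default_rng(2)
signal = np.sin(2 * np.pi * 120 * t) + 0.5 * rng.normal(size=t.size)

f, pxx = welch(signal, fs=fs, nperseg=1024)  # Welch PSD estimate
print(f"dominant oscillation near {f[np.argmax(pxx)]:.1f} Hz")
```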

  11. Immunoautoradiographic analysis of epidermal growth factor receptors: a sensitive method for the in situ identification of receptor proteins and for studying receptor specificity

    International Nuclear Information System (INIS)

    Fernandez-Pol, J.A.

    1982-01-01

    The use of an immunoautoradiographic system for the detection and analysis of epidermal growth factor (EGF) receptors in human epidermoid carcinoma A-431 cells is reported. By utilizing this technique, the interaction between EGF and its membrane receptor in A-431 cells can be rapidly visualized. The procedure is simple, rapid, and very sensitive, and it provides conclusive evidence that the 150-kDa protein is the receptor for EGF in A-431 cells. In summary, the immunoautoradiographic procedure brings to the analysis of hormone receptor proteins the power that the radioimmunoassay technique has brought to the analysis of hormones. Thus, this assay system is potentially applicable across many fields of nuclear medicine and biology

  12. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    Beck, L.

    2013-01-01

    The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take some requirements into account. The complexity of these past materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, owing to the precious nature of artworks, it is necessary to use non-destructive methods that respect the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composite or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportion of the two phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author)

  13. Techniques for the quantitative analysis of fission-product noble metals

    International Nuclear Information System (INIS)

    Lautensleger, A.W.; Hara, F.T.

    1982-08-01

    Analytical procedures for the determination of ruthenium, rhodium, and palladium in precursor waste, solvent metal, and final glass waste forms have been developed. Two procedures for the analysis of noble metals in the calcine and glass waste forms are described in this report. The first is a fast and simple technique that combines inductively coupled argon plasma atomic emission spectrometry (ICP) and x-ray fluorescence techniques and can only be used on nonradioactive materials. The second procedure is based on a noble metal separation step, followed by an analysis using ICP. This second method is more complicated than the first, but it will work on radioactive materials. Also described is a procedure for the ICP analysis of noble metals in the solvent metal matrix. The only solvent metal addressed in this procedure is lead, but with minor changes the procedure could be applied to any of the solvent metals being considered in the Pacific Northwest Laboratory (PNL) extraction process. A brief explanation of atomic spectroscopy and the ICP analytical process, as well as of certain aspects of ICP performance (interelement spectral line interferences and certain matrix effects) is given

  14. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    Science.gov (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction from 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize

  15. Fault Tree Analysis with Temporal Gates and Model Checking Technique for Qualitative System Safety Analysis

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2010-01-01

    Fault tree analysis (FTA) has been one of the most widely used safety analysis techniques in the nuclear industry, but it suffers from several drawbacks: it uses only static gates and hence cannot precisely capture dynamic behaviors of complex systems; it lacks rigorous semantics; and the reasoning process of checking whether basic events really cause top events is done manually, which is labor-intensive and time-consuming for complex systems. Although several attempts have been made to overcome these problems, they still cannot model absolute (actual) time, because they adopt a relative time concept and can capture only sequential behaviors of the system. In this work, to resolve these problems, FTA and model checking are integrated to provide formal, automated, and qualitative assistance to informal and/or quantitative safety analysis. Our approach proposes to build a formal model of the system together with fault trees. We introduce several temporal gates based on timed computation tree logic (TCTL) to capture absolute-time behaviors of the system and to give concrete semantics to fault tree gates, reducing errors during the analysis, and we use model checking to automate the reasoning process of FTA
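
    For contrast with the temporal gates proposed in the paper, the sketch below evaluates a plain static fault tree (AND/OR gates only), the baseline the paper extends; the tree structure and basic-event states are invented for illustration.

```python
# Minimal sketch: recursive evaluation of a static fault tree.
# A node is either a basic-event name (leaf) or a (gate, children) pair.
def evaluate(node, basic):
    if isinstance(node, str):                 # leaf: look up basic event state
        return basic[node]
    gate, children = node
    results = [evaluate(c, basic) for c in children]
    return all(results) if gate == "AND" else any(results)

# Top event occurs if (pump fails AND valve sticks) OR power is lost.
tree = ("OR", [("AND", ["pump_fails", "valve_stuck"]), "power_loss"])
states = {"pump_fails": True, "valve_stuck": False, "power_loss": False}
print(evaluate(tree, states))                 # -> False
```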

  16. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Jang, S. C.; Kang, H. G.; Kim, M. C.; Eom, H. S.; Lee, H. J.

    2006-09-01

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for nuclear power plants' safety-critical networks. It is necessary to make a comprehensive survey of current methodologies for communication network reliability. The major outputs of this study are the design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks
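
    One standard quantification technique of the kind surveyed is Monte Carlo estimation of two-terminal reliability; the sketch below assumes an invented four-link topology with independent link failures, not any plant network.

```python
# Minimal sketch: estimate source-to-terminal network reliability by
# sampling which links are up and checking s-t connectivity.
import random

links = {("s", "a"): 0.95, ("a", "t"): 0.95, ("s", "b"): 0.90, ("b", "t"): 0.90}

def connected(up, src="s", dst="t"):
    seen, stack = {src}, [src]
    while stack:                              # depth-first search over up links
        u = stack.pop()
        for (x, y) in up:
            for v in ((y,) if x == u else (x,) if y == u else ()):
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
    return dst in seen

random.seed(0)
trials = 100_000
ok = sum(connected([e for e, p in links.items() if random.random() < p])
         for _ in range(trials))
print(f"estimated s-t reliability: {ok / trials:.4f}")  # exact: 0.9815
```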

  17. Service Interaction Flow Analysis Technique for Service Personalization

    DEFF Research Database (Denmark)

    Korhonen, Olli; Kinnula, Marianne; Syrjanen, Anna-Liisa

    2017-01-01

    Service interaction flows are difficult to capture, analyze, outline, and represent for research and design purposes. We examine how variation of personalized service flows in technology-mediated service interaction can be modeled and analyzed to provide information on how service personalization could support interaction. We have analyzed service interaction cases in the context of a technology-mediated car rental service. With the analysis technique we propose, inspired by the Interaction Analysis method, we were able to capture and model the situational service interaction. Our contribution regarding technology-mediated service interaction design is twofold: First, with the increased understanding of the role of personalization in managing variation in technology-mediated service interaction, our study contributes to designing service management information systems and human-computer interfaces...

  18. Determining The Factors Affecting Fruit Hardness of Different Peach Types with Meta Analysis

    Directory of Open Access Journals (Sweden)

    Hande Küçükönder

    2014-09-01

    The aim of this study is to determine the factors effective in determining the hardness of Caterina, Suidring, Royal Glory and Tirrenia peach types using meta-analysis. In the study, the impact force (Fi) and the contact time (tc) were detected, and the impulse values (I), expressed as the area under the force-time curve, were calculated from measurements performed with a low-mass lateral impactor applied to the peaches. Using the theory of elasticity, the independent variables were taken as Fmax (maximum impact force), contact time (tmax), Fmax/tmax, 1/tmax, 1/tmax^2.5, Fmax/tmax^1.25 and Fmax^2.5. The correlation coefficients describing the relationship between these parameters and the dependent variable, Magness-Taylor force (MT), were calculated and combined by meta-analysis using the Hunter-Schmidt and Fisher's z methods. Cohen's classification criterion was used to evaluate the resulting mean effect size (combined correlation value) and to determine its direction. As a result of the meta-analysis, the mean effect size according to the Hunter-Schmidt method was 0.436 (0.371-0.497), positively directed, in a 95% confidence interval, while it was 0.468 (0.390-0.545) according to Fisher's z method. The effect sizes for both methods were "mid-level" according to Cohen's classification. When the significance of the studies was analyzed with the Z test, all of those taken into the meta-analysis were found statistically significant. As a result of the meta-analysis in this study evaluating the relationship between peach type and fruit hardness, the mean effect size was found to reach a "strong level". Consequently, "maximum shock acceleration" was found to be a more effective factor than the others in determining fruit hardness according to the results of the meta-analysis applied with both methods.
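
    A minimal sketch of the two pooling methods named above, Hunter-Schmidt and Fisher's z, applied to invented per-study correlations and sample sizes (not the study's data).

```python
# Minimal sketch: pool correlations by the Hunter-Schmidt method
# (n-weighted average of raw r) and by Fisher's z (average in the z metric,
# then back-transform). Inputs are illustrative.
import numpy as np

r = np.array([0.35, 0.48, 0.41, 0.52])   # per-study correlations with MT force
n = np.array([40, 55, 38, 60])           # per-study sample sizes

r_hs = np.sum(n * r) / np.sum(n)         # Hunter-Schmidt pooled r

z = np.arctanh(r)                        # Fisher transform of each r
z_bar = np.sum((n - 3) * z) / np.sum(n - 3)
r_fz = np.tanh(z_bar)                    # back-transformed pooled r

print(f"Hunter-Schmidt: {r_hs:.3f}, Fisher z: {r_fz:.3f}")
```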

  19. Process sensors characterization based on noise analysis technique and artificial intelligence

    International Nuclear Information System (INIS)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos

    2005-01-01

    The time response of pressure and temperature sensors in the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer, and other sources. The output sensor noise can be considered the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since noise signal measurements are influenced by many factors, such as sensor location, extraneous noise interference, and randomness in temperature and pressure fluctuations, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural network scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)

  20. Process sensors characterization based on noise analysis technique and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: rnavarro@ipen.br; sperillo@ipen.br; rcsantos@ipen.br

    2005-07-01

    The time response of pressure and temperature sensors in the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer, and other sources. The output sensor noise can be considered the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since noise signal measurements are influenced by many factors, such as sensor location, extraneous noise interference, and randomness in temperature and pressure fluctuations, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural network scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)
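
    A toy sketch of the autoregressive noise-modeling idea behind such time-response estimates: fit an AR(1) model to simulated first-order sensor noise and read an equivalent time constant off the lag coefficient. Real response-time qualification uses higher-order models and careful validation; the signal, sampling step, and model order here are illustrative assumptions.

```python
# Toy sketch: AR(1) fit to sensor noise; for a first-order sensor,
# phi = exp(-dt / tau), so tau can be recovered from the lag coefficient.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

dt = 0.01                                 # sampling interval, s (assumed)
tau_true = 0.2                            # true sensor time constant, s (assumed)
phi = np.exp(-dt / tau_true)
rng = np.random.default_rng(3)
x = np.zeros(5000)
for k in range(1, x.size):                # simulate first-order sensor noise
    x[k] = phi * x[k - 1] + rng.normal()

res = AutoReg(x, lags=1).fit()
phi_hat = np.asarray(res.params)[1]       # estimated lag-1 coefficient
print(f"estimated time constant: {-dt / np.log(phi_hat):.3f} s")
```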

  1. Is There an Economical Running Technique? A Review of Modifiable Biomechanical Factors Affecting Running Economy.

    Science.gov (United States)

    Moore, Isabel S

    2016-06-01

    Running economy (RE) has a strong relationship with running performance, and modifiable running biomechanics are a determining factor of RE. The purposes of this review were to (1) examine the intrinsic and extrinsic modifiable biomechanical factors affecting RE; (2) assess training-induced changes in RE and running biomechanics; (3) evaluate whether an economical running technique can be recommended; and (4) discuss potential areas for future research. Based on current evidence, the intrinsic factors that appeared beneficial for RE were using a preferred stride length range, which allows for stride length deviations up to 3% shorter than preferred stride length; lower vertical oscillation; greater leg stiffness; low lower limb moment of inertia; less leg extension at toe-off; larger stride angles; alignment of the ground reaction force and leg axis during propulsion; maintaining arm swing; low thigh antagonist-agonist muscular coactivation; and low activation of lower limb muscles during propulsion. Extrinsic factors associated with a better RE were a firm, compliant shoe-surface interaction and being barefoot or wearing lightweight shoes. Several other modifiable biomechanical factors presented inconsistent relationships with RE. Running biomechanics during ground contact appeared to play an important role, specifically those during propulsion. Therefore, this phase has the strongest direct links with RE. Recurring methodological problems exist within the literature, such as cross-comparisons, assessing variables in isolation, and acute to short-term interventions. Therefore, recommending a general economical running technique should be approached with caution. Future work should focus on interdisciplinary longitudinal investigations combining RE, kinematics, kinetics, and neuromuscular and anatomical aspects, as well as applying a synergistic approach to understanding the role of kinetics.

  2. An Evaluation of the Critical Factors Affecting the Efficiency of Some Sorting Techniques

    OpenAIRE

    Olabiyisi S.O.; Adetunji A.B.; Oyeyinka F.I.

    2013-01-01

    Sorting allows information or data to be put into a meaningful order. As efficiency is a major concern of computing, data are sorted in order to gain efficiency in retrieving or searching tasks. The factors affecting the efficiency of Shell, Heap, Bubble, Quick and Merge sorting techniques in terms of running time, memory usage and the number of exchanges were investigated. Experiments were conducted on the decision variables generated from algorithms implemented in Java programming and fa...
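
    A minimal sketch of measuring two of the factors investigated, running time and number of exchanges, for bubble sort against Python's built-in sort on random input; the input sizes are illustrative and the implementation language differs from the paper's Java.

```python
# Minimal sketch: time bubble sort vs. the built-in sort and count exchanges.
import random
import time

def bubble_sort(a):
    swaps = 0
    for i in range(len(a) - 1, 0, -1):
        for j in range(i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]  # exchange adjacent items
                swaps += 1
    return swaps

for n in (500, 1000, 2000):
    data = [random.random() for _ in range(n)]
    t0 = time.perf_counter()
    swaps = bubble_sort(data[:])                 # sort a copy
    t1 = time.perf_counter()
    sorted(data)                                 # built-in Timsort, timed only
    t2 = time.perf_counter()
    print(f"n={n}: bubble {t1 - t0:.4f}s ({swaps} exchanges), "
          f"built-in {t2 - t1:.4f}s")
```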

  3. A comparison of autonomous techniques for multispectral image analysis and classification

    Science.gov (United States)

    Valdiviezo-N., Juan C.; Urcid, Gonzalo; Toxqui-Quitl, Carina; Padilla-Vivanco, Alfonso

    2012-10-01

    Multispectral imaging has given rise to important applications related to the classification and identification of objects in a scene. Because multispectral instruments can be used to estimate the reflectance of materials in the scene, these techniques constitute fundamental tools for materials analysis and quality control. During the last years, a variety of algorithms has been developed to work with multispectral data, whose main purpose has been the correct classification of the objects in the scene. The present study gives a brief review of some classical techniques, as well as a novel one, that have been used for such purposes. The use of principal component analysis and K-means clustering as important classification algorithms is discussed here. Moreover, a recent method based on the min-W and max-M lattice auto-associative memories, which was proposed for endmember determination in hyperspectral imagery, is introduced as a classification method. Besides a discussion of their mathematical foundation, we emphasize their main characteristics and the results achieved for two exemplar images composed of objects similar in appearance but spectrally different. The classification results show that the first components computed from principal component analysis can be used to highlight areas with different spectral characteristics. In addition, the use of lattice auto-associative memories provides good results for materials classification, even in cases where some spectral similarities appear in the spectral responses.
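
    A minimal sketch of the classical pipeline reviewed above, PCA followed by K-means, applied to a synthetic two-material set of pixel spectra; the data are an assumption, not the study's images.

```python
# Minimal sketch: reduce pixel spectra with PCA, then cluster with K-means.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
bands = 31                                   # number of spectral bands (assumed)
mat_a = rng.normal(0.4, 0.02, size=(500, bands))   # material A spectra
mat_b = rng.normal(0.6, 0.02, size=(500, bands))   # material B spectra
pixels = np.vstack([mat_a, mat_b])           # 1000 pixel spectra

scores = PCA(n_components=3).fit_transform(pixels)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))                   # pixels assigned to each class
```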

  4. Trace elements in lake sediments measured by the PIXE technique

    International Nuclear Information System (INIS)

    Gatti, Luciana V.; Mozeto, Antonio A.; Artaxo, Paulo

    1999-01-01

    Lakes are ecosystems with a great potential for metal accumulation in sediments due to their depositional characteristics. Total concentrations of trace elements were measured on a 50-cm-long sediment core from the Infernao Lake, an oxbow lake of the Moji-Guacu River basin in the state of Sao Paulo, Brazil. Dating of the core shows sediment layers up to 180 years old. The use of the PIXE technique for elemental analysis avoids the traditional acid digestion procedure common in other techniques. The multielemental character of PIXE allows the simultaneous determination of about 20 elements in the sediment samples, such as Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Rb, Sr, Zr, Ba, and Pb. Average values of the elemental composition were found to be similar to the bulk crustal composition. The lake flooding pattern strongly influences the time series of the elemental profiles. Factor analysis of the elemental variability shows five factors. Two of the factors represent the mineralogical matrix; the others represent the organic component, a factor with lead, and another loaded with chromium. The mineralogical component consists of elements such as Fe, Al, V, Ti, Mn, Ni, K, Zr, Sr, Cu and Zn. The variability of Si is explained by two distinct factors, because it is influenced by two different sources, aluminum-silicates and quartz, and the effects of inundation differ for each. The organic matter is strongly associated with calcium, and also bound with S, Zn, Cu and P. Lead and chromium appear as separate factors, although the evidence for their anthropogenic origin is not clear. The techniques developed for sample preparation and PIXE analysis proved advantageous and provided very good reproducibility and accuracy

  5. Factor analysis and psychometric properties of the Mother-Adolescent Sexual Communication (MASC) instrument for sexual risk behavior.

    Science.gov (United States)

    Cox, Mary Foster; Fasolino, Tracy K; Tavakoli, Abbas S

    2008-01-01

    Sexual risk behavior is a public health problem among adolescents living at or below the poverty level. Approximately 1 million pregnancies and 3 million cases of sexually transmitted infections (STIs) are reported yearly. Parenting plays a significant role in adolescent behavior, with mother-adolescent sexual communication correlated with absent or delayed sexual behavior. This study developed an instrument examining constructs of mother-adolescent communication, the Mother-Adolescent Sexual Communication (MASC) instrument. A convenience sample of 99 mothers of middle school children completed the self-administered questionnaires. The original 34-item MASC was reduced to 18 items. Exploratory factor analysis was conducted on the 18-item scale, which resulted in four factors explaining 84.63% of the total variance. Internal consistency analysis produced Cronbach alpha coefficients of .87, .90, .82, and .71 for the four factors, respectively. Convergent validity via hypothesis testing was supported by significant correlations of several subscales of the Parent-Child Relationship Questionnaire (PCRQ) with MASC factors, that is, the content and style factors with the warmth, personal relationships and disciplinary warmth subscales of the PCRQ, the context factor with personal relationships, and the timing factor with warmth. In light of these findings, the psychometric characteristics and multidimensional perspective of the MASC instrument show evidence of usefulness for measuring and advancing knowledge of mother-adolescent sexual communication techniques.

  6. Using exploratory factor analysis in personality research: Best-practice recommendations

    Directory of Open Access Journals (Sweden)

    Sumaya Laher

    2010-11-01

    Research purpose: This article presents more objective methods to determine the number of factors, most notably parallel analysis and Velicer’s minimum average partial (MAP). The benefits of rotation are also discussed. The article argues for more consistent use of Procrustes rotation and congruence coefficients in factor analytic studies. Motivation for the study: Exploratory factor analysis is often criticised for not being rigorous and objective enough in terms of the methods used to determine the number of factors, the rotations to be used and, ultimately, the validity of the factor structure. Research design, approach and method: The article adopts a theoretical stance to discuss the best-practice recommendations for factor analytic research in the field of psychology. Following this, an example located within personality assessment and using the NEO-PI-R specifically is presented. A total of 425 students at the University of the Witwatersrand completed the NEO-PI-R. These responses were subjected to a principal components analysis using varimax rotation. The rotated solution was subjected to a Procrustes rotation with Costa and McCrae’s (1992) matrix as the target matrix. Congruence coefficients were also computed. Main findings: The example indicates the use of the methods recommended in the article and demonstrates an objective way of determining the number of factors. It also provides an example of Procrustes rotation with coefficients of agreement as an indication of how factor analytic results may be presented more rigorously in local research. Practical/managerial implications: It is hoped that the recommendations in this article will have best-practice implications for both researchers and practitioners in the field who employ factor analysis regularly. Contribution/value-add: This article will prove useful to all researchers employing factor analysis and has the potential to set the trend for better use of factor analysis in the South African context.
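
    As a concrete companion to the recommendation above, here is a minimal NumPy sketch of Horn's parallel analysis: retain a factor only while the observed eigenvalue exceeds the 95th-percentile eigenvalue from random data of the same shape. The item-response matrix below is a random stand-in, not NEO-PI-R data.

```python
# Minimal sketch: Horn's parallel analysis for the number of factors.
import numpy as np

def parallel_analysis(data, n_sims=200, seed=0):
    rng = np.random.default_rng(seed)
    n, p = data.shape
    # Eigenvalues of the observed correlation matrix, descending
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.empty((n_sims, p))
    for s in range(n_sims):                  # eigenvalues from random data
        sim = rng.normal(size=(n, p))
        rand[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
    thresh = np.percentile(rand, 95, axis=0)
    return int(np.sum(obs > thresh))         # factors beating chance

X = np.random.default_rng(5).normal(size=(425, 30))   # stand-in item scores
print("factors to retain:", parallel_analysis(X))
```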

  7. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2005-04-28

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes the biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis

  8. Development of a computer code 'CRACK' for elastic and elastoplastic fracture mechanics analysis of 2-D structures by finite element technique

    International Nuclear Information System (INIS)

    Dutta, B.K.; Kakodkar, A.; Maiti, S.K.

    1986-01-01

    The fracture mechanics analysis of nuclear components is required to ensure the prevention of sudden failure due to dynamic loadings. Linear elastic analysis near a crack tip shows the presence of a stress singularity at the crack tip. Simulating this singularity in numerical methods enhances convergence. In the finite element technique this can be achieved by placing the mid-side nodes of 8-noded or 6-noded isoparametric elements at one-fourth distance from the crack tip. The present report details this characteristic of the finite element, the implementation of this element in the code 'CRACK', the implementation of the J-integral to compute the stress intensity factor, and the solution of a number of cases for elastic and elastoplastic fracture mechanics analysis. 6 refs., 6 figures. (author)

  9. Possibilities to employ noise analysis techniques in controlling nuclear power stations

    International Nuclear Information System (INIS)

    Alfonso Pallares, C.; Iglesias Ferrer, R.; Sarabia Molina, I.

    1998-01-01

    This work presents the basic requirements that, in the authors' view, must be met by monitoring systems for operational surveillance based on noise analysis techniques, which in turn can be employed in regulatory control

  10. Analysis of hairy root culture of Rauvolfia serpentina using direct analysis in real time mass spectrometric technique.

    Science.gov (United States)

    Madhusudanan, K P; Banerjee, Suchitra; Khanuja, Suman P S; Chattopadhyay, Sunil K

    2008-06-01

    The applicability of a new mass spectrometric technique, DART (direct analysis in real time), has been studied for the analysis of the hairy root culture of Rauvolfia serpentina. The intact hairy roots were analyzed by holding them in the gap between the DART source and the mass spectrometer for measurements. Two nitrogen-containing compounds, vomilenine and reserpine, were characterized from the analysis of the hairy roots almost instantaneously. Confirmation of the structures of the identified compounds was made through accurate molecular formula determinations. This is the first report of the application of the DART technique for the characterization of compounds expressed in hairy root cultures of Rauvolfia serpentina. Moreover, it also constitutes the first report of the expression of reserpine in the hairy root culture of Rauvolfia serpentina. Copyright (c) 2008 John Wiley & Sons, Ltd.

  11. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Data mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information that may be essential to an organization; the extraction of new information is predicted using existing datasets. Many approaches for analysis and prediction in data mining have been proposed, but few efforts have been made in the criminology field, and fewer still have compared the information all these approaches produce. Police stations and other criminal justice agencies hold many large databases of information which can be used to predict or analyze criminal movements and criminal activity in society. Criminals can also be predicted based on crime data. The main aim of this work is to survey the supervised and unsupervised learning techniques that have been applied to criminal identification. This paper presents a survey on crime analysis and crime prediction using several data mining techniques.

  12. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D and D) analysis guidelines.

  13. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D and D) analysis guidelines

  14. Data Collection and Analysis Techniques for Evaluating the Perceptual Qualities of Auditory Stimuli

    Energy Technology Data Exchange (ETDEWEB)

    Bonebright, T.L.; Caudell, T.P.; Goldsmith, T.E.; Miner, N.E.

    1998-11-17

    This paper describes a general methodological framework for evaluating the perceptual properties of auditory stimuli. The framework provides analysis techniques that can ensure the effective use of sound for a variety of applications including virtual reality and data sonification systems. Specifically, we discuss data collection techniques for the perceptual qualities of single auditory stimuli including identification tasks, context-based ratings, and attribute ratings. In addition, we present methods for comparing auditory stimuli, such as discrimination tasks, similarity ratings, and sorting tasks. Finally, we discuss statistical techniques that focus on the perceptual relations among stimuli, such as Multidimensional Scaling (MDS) and Pathfinder Analysis. These methods are presented as a starting point for an organized and systematic approach for non-experts in perceptual experimental methods, rather than as a complete manual for performing the statistical techniques and data collection methods. It is our hope that this paper will help foster further interdisciplinary collaboration among perceptual researchers, designers, engineers, and others in the development of effective auditory displays.
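
    A minimal sketch of the MDS step mentioned above: embedding stimuli in two dimensions from a dissimilarity matrix of the kind gathered in similarity-rating tasks; the four-stimulus matrix and labels are invented for illustration.

```python
# Minimal sketch: metric MDS on a precomputed dissimilarity matrix.
import numpy as np
from sklearn.manifold import MDS

labels = ["beep", "chirp", "click", "buzz"]
D = np.array([[0.0, 0.3, 0.7, 0.8],
              [0.3, 0.0, 0.6, 0.7],
              [0.7, 0.6, 0.0, 0.2],
              [0.8, 0.7, 0.2, 0.0]])         # symmetric, zero diagonal

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)
for name, (x, y) in zip(labels, coords):
    print(f"{name}: ({x:+.2f}, {y:+.2f})")   # nearby points = similar stimuli
```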

  15. Statistical and Machine-Learning Data Mining Techniques for Better Predictive Modeling and Analysis of Big Data

    CERN Document Server

    Ratner, Bruce

    2011-01-01

    The second edition of a bestseller, Statistical and Machine-Learning Data Mining: Techniques for Better Predictive Modeling and Analysis of Big Data is still the only book, to date, to distinguish between statistical data mining and machine-learning data mining. The first edition, titled Statistical Modeling and Analysis for Database Marketing: Effective Techniques for Mining Big Data, contained 17 chapters of innovative and practical statistical data mining techniques. In this second edition, renamed to reflect the increased coverage of machine-learning data mining techniques, the author has

  16. Factor Analysis for Clustered Observations.

    Science.gov (United States)

    Longford, N. T.; Muthen, B. O.

    1992-01-01

    A two-level model for factor analysis is defined, and formulas for a scoring algorithm for this model are derived. A simple noniterative method based on decomposition of total sums of the squares and cross-products is discussed and illustrated with simulated data and data from the Second International Mathematics Study. (SLD)

  17. Fed-state gastric media and drug analysis techniques: Current status and points to consider.

    Science.gov (United States)

    Baxevanis, Fotios; Kuiper, Jesse; Fotaki, Nikoletta

    2016-10-01

    Gastric fed-state conditions can have a significant effect on drug dissolution and absorption. In vitro dissolution tests with simple aqueous media cannot usually predict a drug's in vivo response, as several factors such as the meal content, gastric emptying, and possible interactions between food and drug formulations can affect the drug's pharmacokinetics. A good understanding of the effect of in vivo fed gastric conditions on the drug is essential for the development of biorelevant dissolution media simulating the gastric environment after administration of the standard high-fat meal proposed by the FDA and the EMA in bioavailability/bioequivalence (BA/BE) studies. The analysis of drugs in fed-state media can be quite challenging, as most analytical protocols currently employed are time-consuming and labour-intensive. In this review, an overview of the in vivo gastric conditions and of the biorelevant media used for their in vitro simulation is given. Furthermore, an analysis of the physicochemical properties of the drugs and formulations related to food effects is provided. In terms of drug analysis, the protocols currently used for fed-state media sample treatment and analysis are discussed, together with the analytical challenges and the emerging need for more efficient and time-saving techniques for a broad spectrum of compounds. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary, noninvasive and nondestructive approach to the investigation of artworks that can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high-energy-based techniques.

  19. Analysis and optimization of the TWINKLE factoring device

    NARCIS (Netherlands)

    Lenstra, A.K.; Shamir, A.; Preneel, B.

    2000-01-01

    We describe an enhanced version of the TWINKLE factoring device and analyse to what extent it can be expected to speed up the sieving step of the Quadratic Sieve and Number Field Sieve factoring algorithms. The bottom line of our analysis is that the TWINKLE-assisted factorization of 768-bit

  20. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed description of the model input parameters, their development, and the relationship between the parameters and specific features events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle

  1. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed description of the model input parameters, their development, and the relationship between the parameters and specific features events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose

  2. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments

    Directory of Open Access Journals (Sweden)

    Morgana Camacho

    2013-04-01

    Parasite findings in sambaquis (shell mounds) are scarce. Although 121 shell mound samples had previously been analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis.

  3. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments

    Science.gov (United States)

    Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana MF; Silva, Rosângela; de Souza, Sheila Mendonça; Araujo, Adauto

    2013-01-01

    Parasite findings in sambaquis (shell mounds) are scarce. Although 121 shell mound samples had previously been analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis. PMID:23579793

  4. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

    Paranjape, A.H.; Pandit, S.S.; Shinde, S.S.; Ramanujam, A.; Dhumwad, R.K.

    1991-01-01

    Flow injection analysis is increasingly used as a process control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and its gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium using a Metrohm 662 photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. This technique, with certain modifications, is used for the analysis of U(VI) in the range 0.1-3 mg/aliquot by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab

  5. Transforming Rubrics Using Factor Analysis

    Science.gov (United States)

    Baryla, Ed; Shelley, Gary; Trainor, William

    2012-01-01

    Student learning and program effectiveness is often assessed using rubrics. While much time and effort may go into their creation, it is equally important to assess how effective and efficient the rubrics actually are in terms of measuring competencies over a number of criteria. This study demonstrates the use of common factor analysis to identify…

  6. LC-NMR Technique in the Analysis of Phytosterols in Natural Extracts

    Czech Academy of Sciences Publication Activity Database

    Horník, Štěpán; Sajfrtová, Marie; Karban, Jindřich; Sýkora, Jan; Březinová, Anna; Wimmer, Zdeněk

    2013-01-01

    Vol. 2013, No. 2013 (2013), ID 526818, ISSN 2090-8865. R&D Projects: GA TA ČR TA01010578. Grant - others: GA ČR(CZ) GAP503/11/0616. Program: GA. Institutional support: RVO:67985858; RVO:61388963; RVO:61389030. Keywords: LC-NMR technique; phytosterols; analysis. Subject RIV: CC - Organic Chemistry; EE - Microbiology, Virology (UEB-Q). Impact factor: 0.948, year: 2013

  7. Exploring the potential of data mining techniques for the analysis of accident patterns

    DEFF Research Database (Denmark)

    Prato, Carlo Giacomo; Bekhor, Shlomo; Galtzur, Ayelet

    2010-01-01

    Research in road safety faces major challenges: identification of the most significant determinants of traffic accidents, recognition of the most recurrent accident patterns, and allocation of the resources necessary to address the most relevant issues. This paper intends to comprehend which data mining... and association rules) data mining techniques are implemented for the analysis of traffic accidents that occurred in Israel between 2001 and 2004. Results show that descriptive techniques are useful to classify the large amount of analyzed accidents, even though they introduce problems with respect to the clear... importance of input and intermediate neurons, and the relative importance of hundreds of association rules. Further research should investigate whether limiting the analysis to fatal accidents would simplify the task of data mining techniques in recognizing accident patterns without the "noise" probably...

  8. Comparison of heuristic optimization techniques for the enrichment and gadolinia distribution in BWR fuel lattices and decision analysis

    International Nuclear Information System (INIS)

    Castillo, Alejandro; Martín-del-Campo, Cecilia; Montes-Tadeo, José-Luis; François, Juan-Luis; Ortiz-Servin, Juan-José; Perusquía-del-Cueto, Raúl

    2014-01-01

    Highlights: • Different metaheuristic optimization techniques were compared. • The optimal enrichment and gadolinia distribution in a BWR fuel lattice was studied. • A decision making tool based on the Position Vector of Minimum Regret was applied. • Similar results were found for the different optimization techniques. - Abstract: In the present study, a comparison of the performance of five heuristic techniques for the optimization of combinatorial problems is presented. The techniques are: Ant Colony System, Artificial Neural Networks, Genetic Algorithms, Greedy Search and a hybrid of Path Relinking and Scatter Search. They were applied to obtain an “optimal” enrichment and gadolinia distribution in a fuel lattice of a boiling water reactor. All techniques used the same objective function for qualifying the different distributions created during the optimization process, as well as the same initial conditions and restrictions. The parameters included in the objective function are the k-infinite multiplication factor, the maximum local power peaking factor, the average enrichment and the average gadolinia concentration of the lattice. The CASMO-4 code was used to obtain the neutronic parameters. The criteria for qualifying the optimization techniques also include the evaluation of the best lattice with burnup and the number of evaluations of the objective function needed to obtain the best solution. In conclusion, all techniques obtained similar results, but some methods found better solutions faster than others. A decision analysis tool based on the Position Vector of Minimum Regret was applied to aggregate the criteria in order to rank the solutions according to three functions: neutronic grade at 0 burnup, neutronic grade with burnup, and global cost, which aggregates the computing time into the decision. According to the results, Greedy Search found the best lattice in terms of the neutronic grade at 0 burnup and also with burnup. However, Greedy Search is
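
    The regret-based ranking used in the decision step can be sketched as follows (our simplified minimax-regret reading of the Position Vector of Minimum Regret, with hypothetical scores; higher is better on each criterion):

        import numpy as np

        techniques = ["AntColony", "ANN", "GA", "Greedy", "PR+SS"]
        # Hypothetical scores: [grade at 0 burnup, grade with burnup, speed]
        scores = np.array([[0.90, 0.85, 0.60],
                           [0.80, 0.80, 0.90],
                           [0.88, 0.83, 0.70],
                           [0.95, 0.90, 0.85],
                           [0.87, 0.86, 0.75]])

        regret = scores.max(axis=0) - scores   # gap to the best, per criterion
        worst = regret.max(axis=1)             # worst-case regret per technique
        for name, w in sorted(zip(techniques, worst), key=lambda t: t[1]):
            print(f"{name:10s} max regret = {w:.2f}")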

  9. A Factor Analysis of the BSRI and the PAQ.

    Science.gov (United States)

    Edwards, Teresa A.; And Others

    Factor analysis of the Bem Sex Role Inventory (BSRI) and the Personality Attributes Questionnaire (PAQ) was undertaken to study the independence of the masculine and feminine scales within each instrument. Both instruments were administered to undergraduate education majors. Analysis of primary first and second order factors of the BSRI indicated…

  10. Artificial intelligence techniques for automatic screening of amblyogenic factors.

    Science.gov (United States)

    Van Eenwyk, Jonathan; Agah, Arvin; Giangiacomo, Joseph; Cibis, Gerhard

    2008-01-01

    To develop a low-cost automated video system to effectively screen children aged 6 months to 6 years for amblyogenic factors. In 1994 one of the authors (G.C.) described video vision development assessment, a digitizable analog video-based system combining Brückner pupil red reflex imaging and eccentric photorefraction to screen young children for amblyogenic factors. The images were analyzed manually with this system. We automated the capture of digital video frames and pupil images and applied computer vision and artificial intelligence to analyze and interpret results. The artificial intelligence systems were evaluated by a tenfold testing method. The best system was the decision tree learning approach, which had an accuracy of 77%, compared to the "gold standard" specialist examination with a "refer/do not refer" decision. Criteria for referral were strabismus, including microtropia, and refractive errors and anisometropia considered to be amblyogenic. Eighty-two percent of strabismic individuals were correctly identified. High refractive errors were also correctly identified and referred 90% of the time, as well as significant anisometropia. The program was less correct in identifying more moderate refractive errors, below +5 and less than -7. Although we are pursuing a variety of avenues to improve the accuracy of the automated analysis, the program in its present form provides acceptable cost benefits for detecting amblyogenic factors in children aged 6 months to 6 years.
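
    The tenfold evaluation of a decision-tree screener can be sketched in a few lines (Python, with placeholder features; the authors' image-derived features and data are not public):

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        # Placeholder features standing in for quantities such as crescent
        # asymmetry, reflex brightness and pupil size.
        X = rng.normal(size=(200, 3))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=200) > 0).astype(int)

        clf = DecisionTreeClassifier(max_depth=4, random_state=0)
        acc = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
        print(f"10-fold accuracy: {acc.mean():.2f} +/- {acc.std():.2f}")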

  11. Advances in Proteomic Techniques for Cytokine Analysis: Focus on Melanoma Research

    Directory of Open Access Journals (Sweden)

    Helena Kupcova Skalnikova

    2017-12-01

    Melanoma is a skin cancer with steadily increasing incidence and resistance to therapies in advanced stages. Reports of spontaneous regression and tumour infiltration with T-lymphocytes make melanoma a candidate for immunotherapies. Cytokines are key factors regulating the immune response and intercellular communication in the tumour microenvironment. Cytokines may be used in the therapy of melanoma to modulate the immune response. Cytokines also possess diagnostic and prognostic potential, and cytokine production may reflect the effects of immunotherapies. The purpose of this review is to give an overview of recent advances in proteomic techniques for the detection and quantification of cytokines in melanoma research. The approaches covered span from mass spectrometry to immunoassays for single-molecule detection (ELISA, western blot), multiplex assays (chemiluminescent, bead-based (Luminex) and planar antibody arrays), ultrasensitive techniques (Singulex, Simoa, immuno-PCR, proximity ligation/extension assay, immunomagnetic reduction assay), and analyses of single cells producing cytokines (ELISpot, flow cytometry, mass cytometry and emerging techniques for single-cell secretomics). Although this review is focused mainly on cancer and particularly melanoma, the discussed techniques are in general applicable to the broad research fields of biology and medicine, including stem cells, development, aging, immunology and intercellular communication.

  12. Study of factors affecting the productivity of nurses based on the ACHIEVE model and prioritizing them using analytic hierarchy process technique, 2012

    Directory of Open Access Journals (Sweden)

    Payam Farhadi

    2013-01-01

    Objective: Improving productivity is one of the most important strategies for socio-economic development. Human resources are known as the most important resource for organizations' survival and success. Aims: To determine the factors affecting human resource productivity using the ACHIEVE model from the nurses' perspective, and then to prioritize them from the perspective of head nurses using the Analytic Hierarchy Process (AHP) technique. Settings and Design: Iran, Shiraz University of Medical Sciences teaching hospitals in 2012. Materials and Methods: This was an applied, cross-sectional and analytical-descriptive study conducted in two phases. In the first phase, to determine the factors affecting human resource productivity from the nurses' perspective, 110 nurses were selected using a two-stage cluster sampling method. The required data were collected using the Persian version of Hersey and Goldsmith's Human Resource Productivity Questionnaire. In the second phase, in order to prioritize the factors affecting human resource productivity based on the ACHIEVE model using the AHP technique, pairwise comparison matrices were given to 19 randomly selected head nurses to express their opinions about those factors' relative priorities or importance. Statistical Analysis Used: The collected data and matrices from the two phases were analyzed using SPSS 15.0 and statistical tests including the independent-samples t-test and the Pearson correlation coefficient, as well as Super Decisions software (latest beta). Results: Human resource productivity had significant relationships with nurses' sex (P = 0.008), marital status (P < 0.001), education level (P < 0.001), and all questionnaire factors (P < 0.05). Nurses' productivity from their perspective was below average (44.97 ± 7.43). Also, the priorities of factors affecting the productivity of nurses based on the ACHIEVE model from the head nurses' perspective using the AHP technique, from the
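
    The AHP prioritization step can be made concrete with a small sketch (hypothetical judgements, not the study's matrices): priorities are taken from the principal eigenvector of a pairwise-comparison matrix, and a consistency ratio checks the judgements.

        import numpy as np

        # Hypothetical pairwise comparisons of four ACHIEVE factors
        # (reciprocal matrix on Saaty's 1-9 scale).
        A = np.array([[1.0, 3.0, 5.0, 1.0],
                      [1/3, 1.0, 3.0, 1/3],
                      [1/5, 1/3, 1.0, 1/5],
                      [1.0, 3.0, 5.0, 1.0]])

        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()                        # priority weights, sum to 1

        n = A.shape[0]
        ci = (vals.real[k] - n) / (n - 1)   # consistency index
        cr = ci / 0.90                      # random index RI = 0.90 for n = 4
        print("weights:", np.round(w, 3), " CR =", round(cr, 3))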

  13. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skills focused on include: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions made, to name a few.

  14. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

    Jackson, J.F.

    1975-01-01

    An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed along with an appropriate equation-of-state. The first phase of the VENUS-II/PAD comparison study was completed through the issuing of a preliminary report describing the results. A new technique to calculate a P-V-work curve as a function of the degree of core expansion following a disassembly excursion has been developed. The technique provides results that are consistent with the ANL oxide-fuel equation-of-state in VENUS-II. Evaluation and check-out of this new model are currently in progress

  15. Non destructive multi elemental analysis using prompt gamma neutron activation analysis techniques: Preliminary results for concrete sample

    Energy Technology Data Exchange (ETDEWEB)

    Dahing, Lahasen Normanshah [School of Applied Physics, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor, Malaysia and Malaysian Nuclear Agency (Nuklear Malaysia), Bangi 43000, Kajang (Malaysia); Yahya, Redzuan [School of Applied Physics, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor (Malaysia); Yahya, Roslan; Hassan, Hearie [Malaysian Nuclear Agency (Nuklear Malaysia), Bangi 43000, Kajang (Malaysia)

    2014-09-03

    In this study, the principle of prompt gamma neutron activation analysis was used as a technique to determine the elements in a sample. The system consists of a collimated isotopic neutron source, Cf-252, with an HPGe detector and a multichannel analyzer (MCA). Concrete blocks of 10×10×10 cm³ and 15×15×15 cm³ were analysed as samples. When neutrons enter and interact with elements in the concrete, neutron capture reactions occur and produce characteristic prompt gamma rays of the elements. The preliminary results of this study show that the major elements in the concrete, such as Si, Mg, Ca, Al, Fe and H, as well as other elements such as Cl, were determined by analysing their respective gamma-ray lines. The results obtained were compared with NAA and XRF techniques as a reference and for validation. The potential and capability of neutron-induced prompt gamma rays as a tool for qualitative multi-elemental analysis of concrete samples are discussed.

  16. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory, the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods, focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo
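
    A minimal sketch of one of the condensation methods the book compares (static, i.e. Guyan, condensation; our illustration on a toy spring-mass chain): slave degrees of freedom are eliminated by assuming they respond statically to the master degrees of freedom.

        import numpy as np

        def guyan_reduce(K, M, masters):
            n = K.shape[0]
            slaves = [i for i in range(n) if i not in masters]
            idx = masters + slaves
            P = np.eye(n)[idx]                 # permute to (masters, slaves)
            Kp, Mp = P @ K @ P.T, P @ M @ P.T
            m = len(masters)
            Kms, Kss = Kp[:m, m:], Kp[m:, m:]
            # Transformation x = T x_m with static slave recovery
            T = np.vstack([np.eye(m), -np.linalg.solve(Kss, Kms.T)])
            return T.T @ Kp @ T, T.T @ Mp @ T  # reduced K and M

        # 4-DOF spring-mass chain; keep DOFs 0 and 3 as masters
        K = np.array([[ 2., -1.,  0.,  0.],
                      [-1.,  2., -1.,  0.],
                      [ 0., -1.,  2., -1.],
                      [ 0.,  0., -1.,  2.]])
        M = np.eye(4)
        Kr, Mr = guyan_reduce(K, M, masters=[0, 3])
        print(Kr, Mr, sep="\n")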

  17. Application of the neutron noise analysis technique in nuclear power plants

    International Nuclear Information System (INIS)

    Lescano, Victor H.; Wentzeis, Luis M.

    1999-01-01

    Using neutron noise analysis in nuclear power plants, and without producing any perturbation of the normal operation of the plant, information on the vibration state of the reactor internals and on the behaviour of the operating conditions of the reactor primary circuit can be obtained. In Argentina, the neutron noise analysis technique is routinely applied in the Atucha I and Embalse nuclear power plants. A database was constructed and the vibration frequencies corresponding to different reactor internals were characterized. Reactor internals with particular mechanical vibrations have been detected and localized. In the framework of a cooperation project between Argentina and Germany, we participated in measurements, analysis and modelling, using the neutron noise technique, at the Obrigheim and Gundremmingen nuclear power plants. In the Obrigheim nuclear power plant (PWR, 350 MWe), correlations were made between the signals measured from self-powered neutron detectors and accelerometers located inside the reactor core. In the Gundremmingen nuclear power plant (BWR, 1200 MWe) we participated in the study of a particular mechanical vibration detected in one of the instrumentation tubes. (author)
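
    The core signal-processing step behind this kind of analysis can be sketched as follows (synthetic signals; the sampling rate and vibration frequency are assumed, not plant data): a vibration peak is located in the power spectral density of a neutron detector signal, and its correlation with an accelerometer is assessed through the coherence function.

        import numpy as np
        from scipy.signal import welch, coherence

        fs = 100.0                                  # sampling rate, Hz (assumed)
        t = np.arange(0, 60, 1 / fs)
        rng = np.random.default_rng(0)
        vib = np.sin(2 * np.pi * 8.0 * t)           # 8 Hz internals vibration
        neutron = vib + 0.5 * rng.normal(size=t.size)
        accel = 0.8 * vib + 0.5 * rng.normal(size=t.size)

        f, Pxx = welch(neutron, fs=fs, nperseg=1024)
        f2, Cxy = coherence(neutron, accel, fs=fs, nperseg=1024)
        peak = np.argmax(Pxx)
        print("PSD peak at %.1f Hz" % f[peak])
        print("coherence there: %.2f" % Cxy[peak])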

  18. "Factor Analysis Using ""R"""

    Directory of Open Access Journals (Sweden)

    A. Alexander Beaujean

    2013-02-01

    R (R Development Core Team, 2011) is a very powerful tool for analyzing data that is gaining in popularity due to its cost (it is free) and flexibility (it is open-source). This article gives a general introduction to using R (i.e., loading the program, using functions, importing data). Then, using data from Canivez, Konold, Collins, and Wilson (2009), this article walks the user through how to use the program to conduct factor analysis, from both an exploratory and a confirmatory approach.
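
    The article itself works in R; as a rough Python counterpart of the exploratory step (our sketch on simulated data, not the article's code), a two-factor model with varimax rotation can be fitted with scikit-learn:

        import numpy as np
        from sklearn.decomposition import FactorAnalysis
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        F = rng.normal(size=(300, 2))               # two latent factors
        load = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                         [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
        X = F @ load.T + 0.4 * rng.normal(size=(300, 6))

        fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
        fa.fit(StandardScaler().fit_transform(X))
        print(np.round(fa.components_.T, 2))        # loadings: 6 items x 2 factors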

  19. Assessing a new gene expression analysis technique for radiation biodosimetry applications

    Energy Technology Data Exchange (ETDEWEB)

    Manning, Grainne; Kabacik, Sylwia; Finnon, Paul; Paillier, Francois; Bouffler, Simon [Cancer Genetics and Cytogenetics, Biological Effects Department, Centre for Radiation, Chemical and Environmental Hazards, Health Protection Agency, Chilton, Didcot, Oxfordshire OX11 ORQ (United Kingdom); Badie, Christophe, E-mail: christophe.badie@hpa.org.uk [Cancer Genetics and Cytogenetics, Biological Effects Department, Centre for Radiation, Chemical and Environmental Hazards, Health Protection Agency, Chilton, Didcot, Oxfordshire OX11 ORQ (United Kingdom)

    2011-09-15

    The response to any radiation accident or incident involving actual or potential ionising radiation exposure requires accurate and rapid assessment of the doses received by individuals. The techniques available today for biodosimetry purposes are not fully adapted to rapid high-throughput measurements of exposures in large numbers of individuals. A recently emerging technique is based on gene expression analysis, as there are a number of genes which are radiation responsive in a dose-dependent manner. The present work aimed to assess a new technique which allows the detection of the level of expression of up to 800 genes without need of enzymatic reactions. In order to do so, human peripheral blood was exposed ex vivo to a range of x-ray doses from 5 mGy to 4 Gy of x-rays and the transcriptional expression of five radiation-responsive genes PHPT1, PUMA, CCNG1, DDB2 and MDM2 was studied by both the nCounter Digital Analyzer and Multiplex Quantitative Real-Time Polymerase Chain Reaction (MQRT-PCR) as the benchmark technology. Results from both techniques showed good correlation for all genes with R² values ranging between 0.8160 and 0.9754. The reproducibility of the nCounter Digital Analyzer was also assessed in independent biological replicates and proved to be good. Although the slopes of the correlation of results obtained by the techniques suggest that MQRT-PCR is more sensitive than the nCounter Digital Analyzer, the nCounter Digital Analyzer provides sensitive and reliable data on modifications in gene expression in human blood exposed to radiation without enzymatic amplification of RNA prior to analysis.

  20. Comparison of techniques for determination of soluble sugars used in feed for ruminant nutrition

    Directory of Open Access Journals (Sweden)

    Cândida Camila dos Reis

    2015-02-01

    This study aims to evaluate different techniques for the determination of soluble sugars (fraction CA) in feeds used for ruminant nutrition. The feeds analyzed were: sugar-cane, bermuda grass, corn meal and soybean meal. Dry matter (DM), ash, ether extract (EE) and crude protein (CP) were determined to allow calculation of the total carbohydrate concentration in the samples. The soluble carbohydrate fraction was determined in 15 repetitions of each sample by two different analytical techniques: one based on extraction of the soluble carbohydrates and their quantification by spectrophotometry after colour reaction of the sugars with anthrone, and another using phenol as the chromogenic agent. The experiment was conducted in a completely randomized design and the data were submitted to two-factor factorial analysis of variance (α = 0.05), with the different feeds and the two techniques as factors. There was no statistical difference between techniques, but the effect of the feed and the feed × technique interaction were significant. Therefore, a new analysis of variance was conducted to test the difference between techniques in each feed separately. Only soybean meal did not show a statistical difference between the soluble sugar contents determined by the two techniques.
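
    A minimal sketch of the two-factor ANOVA layout described (illustrative numbers, not the study's measurements), using statsmodels:

        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        # Balanced toy design: 2 feeds x 2 techniques x 2 replicates
        data = pd.DataFrame({
            "sugar":     [5.1, 5.3, 5.0, 5.2, 2.1, 2.0, 2.2, 2.3],
            "feed":      ["cane", "cane", "cane", "cane",
                          "soy", "soy", "soy", "soy"],
            "technique": ["anthrone", "phenol"] * 4,
        })
        model = ols("sugar ~ C(feed) * C(technique)", data=data).fit()
        print(sm.stats.anova_lm(model, typ=2))   # main effects and interaction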

  1. Analysis of Atorvastatin in Commercial Solid Drugs using the TT-PIGE Technique

    International Nuclear Information System (INIS)

    Younes, G; Zahraman, K; Nsouli, B; Bejjani, A; Mahmoud, R; El-Yazbi, F

    2008-01-01

    The quantification of the active ingredient (AI) in drugs is a crucial and important step in the drug quality control process. It is usually performed using wet chemical techniques such as LC-MS, UV spectrophotometry and other appropriate organic analytical methods. When the active ingredient contains specific heteroatoms (F, S, Cl), elemental IBA techniques can be explored for molecular quantification. IBA techniques permit the analysis of the sample in solid form, without any laborious sample preparation. This is an advantage when the number of samples is relatively large. In this work, we demonstrate the ability of the Thick Target PIGE technique for rapid and accurate quantification of low concentrations of Atorvastatin in three commercial anti-hyperlipidemic drugs (Lipitor, Liponorm and Storvas). (author)

  2. Analysis of Atorvastatin in Commercial Solid Drugs using the TT-PIGE Technique

    Energy Technology Data Exchange (ETDEWEB)

    Younes, G [Beirut Arab University, Faculty of Science, Chemistry Department Beirut (Lebanon); Zahraman, K; Nsouli, B; Bejjani, A [Lebanese Atomic Energy Commission, National Council for Scientific Research, Beirut (Lebanon); Mahmoud, R; El-Yazbi, F [Beirut Arab University, Faculty of Pharmacy, Department of Pharmaceutical and Analytical Chemistry, Beirut (Lebanon)

    2008-07-01

    The quantification of the active ingredient (AI) in drugs is a crucial and important step in the drug quality control process. It is usually performed using wet chemical techniques such as LC-MS, UV spectrophotometry and other appropriate organic analytical methods. When the active ingredient contains specific heteroatoms (F, S, Cl), elemental IBA techniques can be explored for molecular quantification. IBA techniques permit the analysis of the sample in solid form, without any laborious sample preparation. This is an advantage when the number of samples is relatively large. In this work, we demonstrate the ability of the Thick Target PIGE technique for rapid and accurate quantification of low concentrations of Atorvastatin in three commercial anti-hyperlipidemic drugs (Lipitor, Liponorm and Storvas). (author)

  3. A borax fusion technique for quantitative X-ray fluorescence analysis

    NARCIS (Netherlands)

    van Willigen, J.H.H.G.; Kruidhof, H.; Dahmen, E.A.M.F.

    1971-01-01

    A borax fusion technique to cast glass discs for quantitative X-ray analysis is described in detail. The method is based on the “nonwetting” properties of a Pt/Au alloy towards molten borax, on the favourable composition of the flux and finally on the favourable form of the casting mould. The

  4. Recurrent-neural-network-based Boolean factor analysis and its application to word clustering.

    Science.gov (United States)

    Frolov, Alexander A; Husek, Dusan; Polyakov, Pavel Yu

    2009-07-01

    The objective of this paper is to introduce a neural-network-based algorithm for word clustering as an extension of the neural-network-based Boolean factor analysis algorithm (Frolov, 2007). It is shown that this extended algorithm supports an even more complex model of the signals that are assumed to underlie textual documents. It is hypothesized that every topic in textual data is characterized by a set of words which coherently appear in documents dedicated to a given topic. The appearance of each word in a document is coded by the activity of a particular neuron. In accordance with the Hebbian learning rule implemented in the network, sets of coherently appearing words (treated as factors) create tightly connected groups of neurons, hence revealing them as attractors of the network dynamics. The found factors are eliminated from the network memory by the Hebbian unlearning rule, facilitating the search for other factors. Topics related to the found sets of words can be identified based on the words' semantics. To make the method complete, a special technique based on a Bayesian procedure has been developed for two purposes: first, to provide a complete description of factors in terms of component probability, and second, to enhance the accuracy of classifying whether a signal contains a given factor. Since it is assumed that every word may possibly contribute to several topics, the proposed method might be related to the method of fuzzy clustering. In this paper, we show that the results of Boolean factor analysis and fuzzy clustering are not contradictory, but complementary. To demonstrate the capabilities of this approach, the method is applied to two types of textual data on neural networks in two different languages. The obtained topics and corresponding words show a good level of agreement despite the fact that identical topics in Russian and English conferences contain different sets of keywords.

  5. Charting the trends in nuclear techniques for analysis of inorganic environmental pollutants

    International Nuclear Information System (INIS)

    Braun, T.

    1986-01-01

    Publications in Analytical Abstracts in the period 1975-1984 and papers presented at the Modern Trends in Activation Analysis international conference series in the period 1961-1986 have been used as an empirical basis for assessing general trends in research and publication activity. Some ebbs and flows in the speciality of instrumental techniques for the analysis of environmental trace pollutants are revealed by a statistical analysis of the publications. (author)

  6. Development of an optimization technique of CETOP-D inlet flow factor for reactor core thermal margin improvement

    International Nuclear Information System (INIS)

    Hong, Sung Duk; Im, Jong Sun; Yoo, Yun Jong; Kwon, Jung Taek; Park, Jong Ryool

    1995-01-01

    The recent ABB/CE (Asea Brown Boveri Combustion Engineering) type pressurized water reactors have an on-line monitoring system, the COLSS (core operating limit supervisory system), to prevent the specified acceptable fuel design limits from being violated during normal operation and anticipated operational occurrences. One of the main functions of COLSS is the on-line monitoring of the DNB (departure from nucleate boiling) overpower margin, by calculating the MDNBR (minimum DNB ratio) for the measured operating condition every second. The CETOP-D model, used in the MDNBR calculation of COLSS, is benchmarked conservatively against the TORC model, using the hot-assembly inlet flow factor in CETOP-D as an adjustment factor for TORC. In this study, a technique to optimize the CETOP-D inlet flow factor has been developed that eliminates the excessive conservatism in the ABB/CE approach. A correlation is introduced to account for the actual variation of the CETOP-D inlet flow factor within the core operating limits. This technique was applied to the core operating range of Yonggwang Units 3 and 4 Cycle 1, which results in an increase of 2% in the DNB overpower margin at the normal operating condition, compared with that from the ABB/CE method. 7 figs., 2 tabs., 10 refs. (Author)

  7. Measuring caloric response: comparison of different analysis techniques.

    Science.gov (United States)

    Mallinson, A I; Longridge, N S; Pace-Asciak, P; Ngo, R

    2010-01-01

    Electronystagmography (ENG) testing has been supplanted by newer techniques of measuring eye movement with infrared cameras (VNG). Most techniques for quantifying caloric-induced nystagmus measure the slow phase velocity in some manner. Although our analysis is carried out by very experienced assessors, some systems have computer algorithms that have been "taught" to locate and quantify maximum responses. We wondered what differences in measurement might show up when measuring calorics using different techniques and systems, the relevance being that if there were a change in slow phase velocity between ENG and VNG testing when measuring caloric response, then normative data would have to be changed. There are also some subjective but important aspects of ENG interpretation which comment on the nature of the response (e.g. responses which might be "sporadic" or "scant"). Our experiment compared caloric responses in 100 patients analyzed four different ways. Each caloric was analyzed by our old ENG system, our new VNG system, an inexperienced assessor and the computer algorithm, and the data were compared. All four systems made similar measurements, but our inexperienced assessor failed to recognize responses as sporadic or scant, and we feel this is a limitation to be kept in mind in the rural setting, as it is an important aspect of assessment in complex patients. Assessment of complex VNGs should be left to an experienced assessor.

  8. Human factor analysis and preventive countermeasures in nuclear power plant

    International Nuclear Information System (INIS)

    Li Ye

    2010-01-01

    Based on human error analysis theory and the characteristics of maintenance in a nuclear power plant, the human factors of maintenance in NPPs are divided into three different areas: human, technology, and organization. The individual factors include psychological factors, physiological characteristics, health status, level of knowledge and interpersonal skills; the technical factors include technology, equipment, tools, working order, etc.; the organizational factors include management, information exchange, education, working environment, team building, leadership management, etc. The analysis found that organizational factors can directly or indirectly affect staff behavior and technical factors, and are the most basic human error factors. On this basis, countermeasures for reducing human error in nuclear power plants are proposed. (authors)

  9. Multiscale analysis of damage using dual and primal domain decomposition techniques

    NARCIS (Netherlands)

    Lloberas-Valls, O.; Everdij, F.P.X.; Rixen, D.J.; Simone, A.; Sluys, L.J.

    2014-01-01

    In this contribution, dual and primal domain decomposition techniques are studied for the multiscale analysis of failure in quasi-brittle materials. The multiscale strategy essentially consists in decomposing the structure into a number of nonoverlapping domains and considering a refined spatial

  10. Patient size and x-ray technique factors in head computed tomography examinations. I. Radiation doses

    International Nuclear Information System (INIS)

    Huda, Walter; Lieberman, Kristin A.; Chang, Jack; Roskopf, Marsha L.

    2004-01-01

    We investigated how patient age, size and composition, together with the choice of x-ray technique factors, affect radiation doses in head computed tomography (CT) examinations. Head size dimensions, cross-sectional areas, and mean Hounsfield unit (HU) values were obtained from head CT images of 127 patients. For radiation dosimetry purposes patients were modeled as uniform cylinders of water. Dose computations were performed for 18 x 7 mm sections, scanned at a constant 340 mAs, for x-ray tube voltages ranging from 80 to 140 kV. Values of mean section dose, energy imparted, and effective dose were computed for patients ranging from newborns to adults. There was a rapid growth of head size over the first two years, followed by a more modest increase of head size until the age of 18 or so. Newborns have a mean HU value of about 50 that monotonically increases with age over the first two decades of life. Average adult A-P and lateral dimensions were 186±8 mm and 147±8 mm, respectively, with an average HU value of 209±40. An infant head was found to be equivalent to a water cylinder with a radius of ∼60 mm, whereas an adult head had an equivalent radius 50% greater. Adult male head dimensions are about 5% larger than female ones, and their average x-ray attenuation is ∼20 HU greater. For adult examinations performed at 120 kV, typical values were 32 mGy for the mean section dose, 105 mJ for the total energy imparted, and 0.64 mSv for the effective dose. Increasing the x-ray tube voltage from 80 to 140 kV increases patient doses by about a factor of 5. For the same technique factors, mean section doses in infants are 35% higher than in adults. Energy imparted for adults is 50% higher than for infants, but infant effective doses are four times higher than for adults. CT dose assessments need to take into account patient age, head size, and composition, as well as the selected x-ray technique factors.

  11. Gamma self-shielding correction factors calculation for aqueous bulk sample analysis by PGNAA technique

    International Nuclear Information System (INIS)

    Nasrabadi, M.N.; Mohammadi, A.; Jalali, M.

    2009-01-01

    In this paper, bulk sample prompt gamma neutron activation analysis (BSPGNAA) was applied to aqueous sample analysis using a relative method. For elemental analysis of an unknown bulk sample, the gamma self-shielding coefficient is required. The gamma self-shielding coefficient of unknown samples was estimated by an experimental method and also by MCNP code calculation. The proposed methodology can be used for the determination of the elemental concentration of unknown aqueous samples by BSPGNAA, where knowledge of the gamma self-shielding within the sample volume is required.
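
    One common closed-form approximation illustrates what a gamma self-shielding factor looks like (a textbook slab formula with assumed attenuation values, not the paper's MCNP calculation): for a uniform sample of thickness t and linear attenuation coefficient mu, the mean escape probability is (1 - exp(-mu*t)) / (mu*t).

        import numpy as np

        def self_shielding(mu, t):
            """Mean gamma escape probability for a uniform slab sample."""
            x = mu * t
            return (1.0 - np.exp(-x)) / x

        mu = 0.07    # cm^-1, roughly water near 1 MeV (assumed)
        for t in (1.0, 5.0, 10.0):
            print(f"t = {t:4.1f} cm  ->  factor = {self_shielding(mu, t):.3f}")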

  12. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acid, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  13. Assessment of Surface Water Quality Using Multivariate Statistical Techniques in the Terengganu River Basin

    International Nuclear Information System (INIS)

    Aminu Ibrahim; Hafizan Juahir; Mohd Ekhwan Toriman; Mustapha, A.; Azman Azid; Isiyaka, H.A.

    2015-01-01

    Multivariate statistical techniques, including cluster analysis, discriminant analysis, and principal component analysis/factor analysis, were applied to investigate the spatial variation and pollution sources in the Terengganu river basin during 5 years of monitoring of 13 water quality parameters at thirteen different stations. Cluster analysis (CA) classified the 13 stations into 2 clusters, low polluted (LP) and moderately polluted (MP), based on similar water quality characteristics. Discriminant analysis (DA) rendered significant data reduction with 4 parameters (pH, NH3-N, PO4 and EC) and a correct assignation of 95.80%. The PCA/FA applied to the data sets yielded five latent factors accounting for 72.42% of the total variance in the water quality data. The obtained varifactors indicate that the parameters responsible for water quality variations are mainly related to domestic waste, industry, runoff and agriculture (anthropogenic activities). Therefore, multivariate techniques are important for environmental management. (author)
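
    A minimal sketch of the PCA step (illustrative random data standing in for the Terengganu data set; the real analysis would use the monitored parameters): the parameters are standardized and the variance explained by each latent factor is inspected.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        # Placeholder matrix: monitoring records x 13 water quality parameters
        X = rng.normal(size=(120, 13))
        Z = StandardScaler().fit_transform(X)

        pca = PCA(n_components=5)
        pca.fit(Z)
        print("variance explained:", np.round(pca.explained_variance_ratio_, 3))
        print("cumulative:", round(pca.explained_variance_ratio_.sum(), 3))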

  14. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. With respect to problems in which time-dependent behavior is significant, it is desirable to incorporate a procedure that works with the mechanical-model formulations as well as the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent microstructural changes which often occur during the operation of structural components at increasingly high temperatures for long periods of time. Special considerations are crucial if the analysis is to be extended to the large-strain regime, where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  15. Development of DNA affinity techniques for the functional characterization of purified RNA polymerase II transcription factors

    International Nuclear Information System (INIS)

    Garfinkel, S.; Thompson, J.A.; Cohen, R.B.; Brendler, T.; Safer, B.

    1987-01-01

    Affinity adsorption, precipitation, and partitioning techniques have been developed to purify and characterize RNA Pol II transcription components from whole cell extracts (WCE) (HeLa) and nuclear extracts (K562). Titration of these extracts with multicopy constructs of the Ad2 MLP, but not pUC8, inhibits transcriptional activity. DNA-binding factors precipitated by this technique are greatly enriched by centrifugation. Using this approach, factors binding to the upstream promoter sequence (UPS) of the Ad2 MLP have been rapidly isolated by Mono Q, Mono S, and DNA affinity chromatography. By UV crosslinking to nucleotides containing specific 32P-phosphodiester bonds within the recognition sequence, this factor is identified as an Mr = 45,000 polypeptide. To generate an assay system for the functional evaluation of single transcription components, a similar approach using synthetic oligonucleotide sequences spanning single promoter binding sites has been developed. The addition of a synthetic 63-mer containing the UPS element of the Ad2 MLP to HeLa WCE inhibited transcription by 60%. The addition of partially purified UPS binding protein, but not RNA Pol II, restored transcriptional activity. The addition of synthetic oligonucleotides containing other regulatory sequences not present in the Ad2 MLP was without effect

  16. Rapid analysis of molybdenum contents in molybdenum master alloys by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Tongkong, P.

    1985-01-01

    Determination of the molybdenum content in molybdenum master alloys was performed using the energy dispersive x-ray fluorescence (EDX) technique, where analyses were made via standard additions and calibration curves. Comparison of the EDX technique with other analytical techniques, i.e., wavelength dispersive x-ray fluorescence, neutron activation analysis and inductively coupled plasma spectrometry, showed consistency in the results. This technique was found to yield reliable results when the molybdenum content of the master alloys was in the range of 13 to 50 percent, using an HPGe detector or a proportional counter. When the required error was set at 1%, the minimum analyzing time was found to be 30 and 60 seconds for Fe-Mo master alloys with molybdenum contents of 13.54 and 49.09 percent, respectively. For Al-Mo master alloys, the minimum times required were 120 and 300 seconds for molybdenum contents of 15.22 and 47.26 percent, respectively

  17. Analysis of archaeological pieces with nuclear techniques; Analisis de piezas arqueologicas con tecnicas nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Tenorio, D [Instituto Nacional de Investigaciones Nucleares, A.P. 18-1027, 11801 Mexico D.F. (Mexico)

    2002-07-01

    In this work, nuclear techniques such as neutron activation analysis, PIXE, X-ray fluorescence analysis, metallography, uranium series dating and Rutherford backscattering, used in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also referred to. (Author)

  18. DESIGN & ANALYSIS TOOLS AND TECHNIQUES FOR AEROSPACE STRUCTURES IN A 3D VISUAL ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Radu BISCA

    2009-09-01

    The main objective of this project is to develop a set of tools and to integrate techniques into a software package built on structural analysis applications, based on Romanian engineers' experience in designing and analysing aerospace structures, consolidated with the most recent methods and techniques. The applications automate the structure design and analysis processes and facilitate the exchange of technical information between the partners involved in a complex aerospace project without limiting the domain.

  19. Micro-computed tomography and bond strength analysis of different root canal filling techniques

    Directory of Open Access Journals (Sweden)

    Juliane Nhata

    2014-01-01

    Introduction: The aim of this study was to evaluate the quality and bond strength of three root filling techniques (lateral compaction, continuous wave of condensation and Tagger's hybrid technique [THT]) using micro-computed tomography (CT) images and push-out tests, respectively. Materials and Methods: Thirty mandibular incisors were prepared using the same protocol and randomly divided into three groups (n = 10): lateral condensation technique (LCT), continuous wave of condensation technique (CWCT), and THT. All specimens were filled with gutta-percha (GP) cones and AH Plus sealer. Five specimens of each group were randomly chosen for micro-CT analysis, and all of them were sectioned into 1 mm slices and subjected to push-out tests. Results: Micro-CT analysis revealed fewer empty spaces when GP was heated within the root canals in CWCT and THT than in LCT. Push-out tests showed that LCT and THT had a significantly higher displacement resistance (P < 0.05) than CWCT. Bond strength was lower in the apical and middle thirds than in the coronal thirds. Conclusions: It can be concluded that LCT and THT were associated with higher bond strengths to intraradicular dentine than CWCT. However, LCT was associated with more empty voids than the other techniques.

  20. Source-jerk analysis using a semi-explicit inverse kinetic technique

    International Nuclear Information System (INIS)

    Spriggs, G.D.; Pederson, R.A.

    1985-01-01

    A method is proposed for measuring the effective reproduction factor, k, in subcritical systems. The method uses the transient response of a subcritical system to the sudden removal of an extraneous neutron source (i.e., a source jerk). The response is analyzed using an inverse kinetic technique that least-squares fits the exact analytical solution corresponding to a source-jerk transient as derived from the point-reactor model. It has been found that the technique can provide an accurate means of measuring k in systems that are close to critical (i.e., 0.95 < k < 1.0). As a system becomes more subcritical (i.e., k << 1.0) spatial effects can introduce significant biases depending on the source and detector positions. However, methods are available that can correct for these biases and, hence, can allow measuring subcriticality in systems with k as low as 0.5. 12 refs., 3 figs
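
    A minimal sketch of the fitting idea (one delayed-neutron group and synthetic data; the paper fits the exact point-kinetics solution, which has the same sum-of-exponentials form): after the source jerk, the detector response decays as a fast prompt drop plus a slow delayed tail, and the parameters are recovered by least squares.

        import numpy as np
        from scipy.optimize import curve_fit

        def response(t, a1, a2, w1, w2):
            return a1 * np.exp(-w1 * t) + a2 * np.exp(-w2 * t)

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 20.0, 400)
        true = response(t, 0.30, 0.70, 8.0, 0.08)   # prompt drop + slow tail
        counts = true + 0.01 * rng.normal(size=t.size)

        p, _ = curve_fit(response, t, counts, p0=[0.5, 0.5, 5.0, 0.1])
        print("fitted amplitudes and decay constants:", np.round(p, 3))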

  1. Source-jerk analysis using a semi-explicit inverse kinetic technique

    International Nuclear Information System (INIS)

    Spriggs, G.D.; Pederson, R.A.

    1985-01-01

    A method is proposed for measuring the effective reproduction factor, k, in subcritical systems. The method uses the transient responses of a subcritical system to the sudden removal of an extraneous neutron source (i.e., a source jerk). The response is analyzed using an inverse kinetic technique that least-squares fits the exact analytical solution corresponding to a source-jerk transient as derived from the point-reactor model. It has been found that the technique can provide an accurate means of measuring k in systems that are close to critical (i.e., 0.95 < k < 1.0). As a system becomes more subcritical (i.e., k << 1.0) spatial effects can introduce significant biases depending on the source and detector positions. However, methods are available that can correct for these biases and, hence, can allow measuring subcriticality in systems with k as low as 0.5

  2. Artificial Intelligence Techniques for Automatic Screening of Amblyogenic Factors

    Science.gov (United States)

    Van Eenwyk, Jonathan; Agah, Arvin; Giangiacomo, Joseph; Cibis, Gerhard

    2008-01-01

    Purpose To develop a low-cost automated video system to effectively screen children aged 6 months to 6 years for amblyogenic factors. Methods In 1994 one of the authors (G.C.) described video vision development assessment, a digitizable analog video-based system combining Brückner pupil red reflex imaging and eccentric photorefraction to screen young children for amblyogenic factors. The images were analyzed manually with this system. We automated the capture of digital video frames and pupil images and applied computer vision and artificial intelligence to analyze and interpret results. The artificial intelligence systems were evaluated by a tenfold testing method. Results The best system was the decision tree learning approach, which had an accuracy of 77%, compared to the “gold standard” specialist examination with a “refer/do not refer” decision. Criteria for referral were strabismus, including microtropia, and refractive errors and anisometropia considered to be amblyogenic. Eighty-two percent of strabismic individuals were correctly identified. High refractive errors were also correctly identified and referred 90% of the time, as well as significant anisometropia. The program was less correct in identifying more moderate refractive errors, below +5 and less than −7. Conclusions Although we are pursuing a variety of avenues to improve the accuracy of the automated analysis, the program in its present form provides acceptable cost benefits for detecting amblyogenic factors in children aged 6 months to 6 years. PMID:19277222

  3. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has the advantage that it does not treat operator error as the sole contributor to human failure within a system, but rather considers a combination of all underlying factors
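
    The paper's method is qualitative, but the deductive "human error as top event" structure can be sketched as a small fault tree; the factor names and probabilities below are hypothetical, added only to make the gate logic concrete:

      # Hypothetical underlying factors and probabilities (not from the paper)
      p = {"fatigue": 0.05, "time_pressure": 0.10,
           "poor_labelling": 0.02, "inadequate_training": 0.03}

      def p_or(*qs):   # OR gate: at least one input occurs (independence assumed)
          out = 1.0
          for q in qs:
              out *= 1.0 - q
          return 1.0 - out

      def p_and(*qs):  # AND gate: all inputs must occur together
          out = 1.0
          for q in qs:
              out *= q
          return out

      # Top event "human error" deduced from two hypothetical factor combinations
      slip = p_and(p["fatigue"], p["time_pressure"])
      mistake = p_and(p["inadequate_training"], p["poor_labelling"])
      print("P(top event: human error) =", p_or(slip, mistake))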

  4. Fit Analysis of Different Framework Fabrication Techniques for Implant-Supported Partial Prostheses.

    Science.gov (United States)

    Spazzin, Aloísio Oro; Bacchi, Atais; Trevisani, Alexandre; Farina, Ana Paula; Dos Santos, Mateus Bertolini

    2016-01-01

    This study evaluated the vertical misfit of implant-supported frameworks made using different techniques to obtain passive fit. Thirty three-unit fixed partial dentures were fabricated in cobalt-chromium alloy (n = 10) using three fabrication methods: one-piece casting, framework cemented on prepared abutments, and laser welding. The vertical misfit between the frameworks and the abutments was evaluated with an optical microscope using the single-screw test. Data were analyzed using one-way analysis of variance and the Tukey test (α = .05). The one-piece cast frameworks presented significantly higher vertical misfit values than those found for the framework-cemented-on-prepared-abutments and laser-welding techniques (P < .05). Laser welding and cementing the framework on prepared abutments are effective techniques to improve the adaptation of three-unit implant-supported prostheses. These techniques presented similar fit.
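
    A brief sketch of the statistical step (one-way ANOVA followed by Tukey's test at α = .05) using SciPy and statsmodels; the misfit values below are synthetic stand-ins, not the study's data:

      import numpy as np
      from scipy import stats
      from statsmodels.stats.multicomp import pairwise_tukeyhsd

      rng = np.random.default_rng(1)
      # Synthetic vertical-misfit readings (micrometres), n = 10 per technique
      casting = rng.normal(90.0, 15.0, 10)
      cemented = rng.normal(40.0, 10.0, 10)
      welded = rng.normal(45.0, 10.0, 10)

      f_stat, p_val = stats.f_oneway(casting, cemented, welded)  # one-way ANOVA
      print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

      values = np.concatenate([casting, cemented, welded])
      groups = ["casting"] * 10 + ["cemented"] * 10 + ["welded"] * 10
      print(pairwise_tukeyhsd(values, groups, alpha=0.05))       # Tukey, alpha = .05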

  5. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

    After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique, based on mass determination by X-ray absorption, allows fast analysis through automation and easy data handling, in addition to providing the accuracy required by quality control and research applications
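
    The gravitational sedimentation approach rests on Stokes' law, which relates settling velocity to particle diameter; a minimal sketch of the size calculation, with illustrative material constants:

      import math

      def stokes_diameter(h, t, rho_p, rho_f, eta, g=9.81):
          """Stokes diameter (m) for a particle settling height h (m) in time t (s)."""
          v = h / t                                   # terminal settling velocity
          return math.sqrt(18.0 * eta * v / ((rho_p - rho_f) * g))

      # Example: silica (2650 kg/m^3) settling 5 cm in water (1.0e-3 Pa.s) in 10 min
      d = stokes_diameter(0.05, 600.0, 2650.0, 1000.0, 1.0e-3)
      print(f"Stokes diameter ~ {d * 1e6:.1f} micrometres")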

  6. [Analysis of lifestyle and risk factors of atherosclerosis in students of selected universities in Krakow].

    Science.gov (United States)

    Skrzypek, Agnieszka; Szeliga, Marta; Stalmach-Przygoda, Agata; Kowalska, Bogumila; Jabłoński, Konrad; Nowakowski, Michal

    Reduction of atherosclerosis risk factors through lifestyle modification significantly reduces the incidence, morbidity, and mortality of cardiovascular diseases (CVDs). Objective: To evaluate cardiovascular risk factors and analyze the lifestyle of students finishing the first year of studies at selected universities in Krakow. The study was performed in 2015. A total of 566 students finishing the first year of study were examined, including 319 (56.4%) men and 247 (43.6%) women. The students were aged 18 to 27 years (mean 20.11 ± 1.15 years) and represented 6 different universities in Krakow. Eating habits, lifestyle, and cardiovascular risk factors were assessed with a diagnostic survey. BMI was calculated from anthropometric measurements. Statistica 12.0 was used for the statistical analysis. The analysis showed that students of UR and AWF consumed the most fruit and vegetables, and students of AGH the least. Only 34.8% of students regularly consumed sea fish, with no significant differences between universities. Sports were practiced most often by students of AWF (93% of the students of this university). Students of the Academy of Fine Arts (ASP) drank the most coffee, and students of AGH consumed alcohol most frequently. Sixty percent of all students had never tried drugs, but only 25.7% of ASP students had never tried drugs. Overweight occurred in 12.6% of students and obesity in 1.1%. Risk factors of atherosclerosis were most prevalent among students of AGH and ASP. The results clearly indicate the necessity of implementing prevention and improving health behaviors among students of the AGH and ASP universities.
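
    The anthropometric step is the standard BMI computation (weight in kilograms divided by squared height in metres); a small sketch with hypothetical records, using the conventional 25 and 30 kg/m² cut-offs for overweight and obesity:

      # Hypothetical anthropometric records: (student id, weight kg, height m)
      students = [("S1", 70.0, 1.78), ("S2", 92.0, 1.80), ("S3", 55.0, 1.65)]

      def bmi(weight_kg, height_m):
          return weight_kg / height_m ** 2

      n_overweight = n_obese = 0
      for sid, w, h in students:
          b = bmi(w, h)
          n_overweight += 25.0 <= b < 30.0
          n_obese += b >= 30.0
          print(f"{sid}: BMI = {b:.1f}")
      print(f"overweight: {n_overweight / len(students):.1%}, "
            f"obese: {n_obese / len(students):.1%}")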

  7. Analysis of factors in successful nasal endoscopic resection of nasopharyngeal angiofibroma.

    Science.gov (United States)

    Ye, Dong; Shen, Zhisen; Wang, Guoli; Deng, Hongxia; Qiu, Shijie; Zhang, Yuna

    2016-01-01

    Endoscopic resection of nasopharyngeal angiofibroma is less traumatic, causes less bleeding, and provides a good curative effect. With pre-operative embolization and controlled hypotension, reasonable surgical strategies and techniques allowed successful resection of tumors up to Andrews-Fisch classification stage III. To investigate surgical indications, methods, surgical technique, and curative effects of transnasal endoscopic resection of nasopharyngeal angiofibroma, this study evaluated factors that improve diagnosis and treatment, prevent large intra-operative blood loss and residual tumor, and increase the cure rate. A retrospective analysis was performed of the clinical data and treatment programs of 23 patients with nasopharyngeal angiofibroma who underwent endoscopic resection with pre-operative embolization and controlled hypotension. The surgical method applied was based on the size of the tumor and the extent of invasion. Curative effects were observed. No intra-operative or perioperative complications occurred in 22 patients. Upon removal of the nasal packing material 3-7 days post-operatively, one patient experienced heavy bleeding from the nasopharyngeal wound, which was treated with compression hemostasis using post-nasal packing. The 23 patients were followed up for 6-60 months. Twenty-two patients were cured; one patient experienced recurrence 10 months post-operatively, and repeat nasal endoscopic surgery resulted in cure.

  8. Effective self-regulation change techniques to promote mental wellbeing among adolescents: a meta-analysis

    NARCIS (Netherlands)

    Genugten, L. van; Dusseldorp, E.; Massey, E.K.; Empelen, P. van

    2017-01-01

    Mental wellbeing is influenced by self-regulation processes. However, little is known about the efficacy of change techniques based on self-regulation to promote mental wellbeing. The aim of this meta-analysis is to identify effective self-regulation techniques (SRTs) in primary and secondary

  9. Nonlinear analysis techniques for use in the assessment of high-level waste storage tank structures

    International Nuclear Information System (INIS)

    Moore, C.J.; Julyk, L.J.; Fox, G.L.; Dyrness, A.D.

    1991-09-01

    Reinforced concrete in combination with a steel liner has had a wide application to structures containing hazardous material. The buried double-shell waste storage tanks at the US Department of Energy's Hanford Site use this construction method. The generation and potential ignition of combustible gases within the primary tank are postulated to develop beyond-design-basis internal pressure and possible impact loading. The scope of this paper includes the illustration of analysis techniques for the assessment of these beyond-design-basis loadings. The analysis techniques include the coupling of the gas dynamics with the structural response, the treatment of reinforced concrete in regimes of inelastic behavior, and the treatment of geometric nonlinearities. The techniques and software tools presented provide a powerful nonlinear analysis capability for storage tanks. 10 refs., 13 figs., 1 tab
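
    As a rough, purely hypothetical sketch of the coupled-analysis idea (gas pressure driving a structural degree of freedom with inelastic resistance), the toy model below couples an adiabatic gas volume to an elastic-perfectly-plastic single-degree-of-freedom wall; all parameters are invented for illustration and bear no relation to the Hanford tank analyses:

      import numpy as np

      # All values invented: a wall panel (SDOF) closing a gas volume
      A, V0, GAMMA = 2.0, 10.0, 1.4        # panel area (m^2), volume (m^3), gas index
      P0, P_ATM = 2.0e5, 1.0e5             # post-ignition and ambient pressure (Pa)
      M, K_EL, F_Y = 500.0, 4.0e6, 3.0e5   # mass (kg), stiffness (N/m), yield force (N)

      def resistance(x, x_pl):
          """Elastic-perfectly-plastic restoring force with plastic offset x_pl."""
          f = K_EL * (x - x_pl)
          if abs(f) > F_Y:                 # yielding: the plastic offset follows
              f = np.sign(f) * F_Y
              x_pl = x - f / K_EL
          return f, x_pl

      x = v = x_pl = 0.0
      dt = 1.0e-4
      for _ in range(int(0.1 / dt)):
          p = P0 * (V0 / (V0 + A * x)) ** GAMMA       # adiabatic gas <-> wall coupling
          f_r, x_pl = resistance(x, x_pl)
          v += dt * ((p - P_ATM) * A - f_r) / M       # explicit integration step
          x += dt * v
      print(f"displacement after 0.1 s: {x:.3f} m (plastic set {x_pl:.3f} m)")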

  10. Consumer Ethical Decision Making: Intensity, Self-Consciousness and Neutralization Techniques

    OpenAIRE

    Syed Afzal Moshadi Shah; Shehla Amjad

    2017-01-01

    The purpose of the study is to examine the effect of moral intensity on self-conscious emotions and neutralization techniques in the context of ethical decision making among consumers. A sample of 388 shopping mall retail consumers was recruited through a self-administered survey technique. Descriptive statistics, exploratory factor analysis, and correlation analysis were carried out in SPSS, whereas the measurement model and structural relationships were estimated using AMOS. Results indicate ...
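
    The study's factor-analytic step was done in SPSS/AMOS; an equivalent exploratory-factor-analysis sketch in Python using scikit-learn's FactorAnalysis (varimax rotation needs scikit-learn 0.24+), on synthetic Likert-style data sized to match the sample:

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(0)
      # Synthetic data: 388 respondents x 9 items driven by 3 latent constructs
      loadings = np.zeros((3, 9))
      loadings[0, 0:3] = loadings[1, 3:6] = loadings[2, 6:9] = 0.9
      scores = rng.normal(size=(388, 3))
      items = scores @ loadings + rng.normal(scale=0.5, size=(388, 9))

      fa = FactorAnalysis(n_components=3, rotation="varimax").fit(items)
      print(np.round(fa.components_, 2))   # loadings: each item on its factor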

  11. Performance values of nondestructive analysis techniques in safeguards and nuclear materials management

    International Nuclear Information System (INIS)

    Guardini, S.

    1989-01-01

    Nondestructive assay (NDA) techniques have, in the past few years, become more and more important in nuclear material accountancy and control. This is essentially due to two reasons: (1) the improvements made in most NDA techniques have brought some of them close to destructive analysis (DA) in performance (e.g., calorimetry and gamma spectrometry); (2) the parallel improvement of statistical tools and procedural inspection approaches led to abandoning the scheme of (a) NDA for semiqualitative or consistency checks only and (b) DA for quantitative measurements. As a consequence, NDA is now frequently used in scenarios that involve quantitative (by-variable) analysis. On the other hand, it also became evident that the performances of some techniques differ depending on whether they are applied in the laboratory or in the field. It has only recently been realized that, generally speaking, this is due to objective reasons rather than to an incorrect application of the instruments. Speaking of claimed versus actual status of NDA performance might in this sense be misleading; one should rather speak of performance under different conditions. This paper provides support for this assumption

  12. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular, these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As its main result, this technique will provide the design engineer with decision tables for fault handling ...
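
    A compact sketch of the matrix-formulation idea: represent direct failure propagation as a boolean matrix, take its transitive closure, and read off a decision table mapping observed end effects to candidate faults. The unit names and propagation matrix below are hypothetical:

      import numpy as np

      units = ["sensor", "controller", "actuator", "plant"]
      # M[i, j] = True if a failure of unit i propagates directly to unit j
      M = np.array([[0, 1, 0, 0],
                    [0, 0, 1, 0],
                    [0, 0, 0, 1],
                    [0, 0, 0, 0]], dtype=bool)

      # Transitive closure (Warshall): every end effect reachable from each fault
      reach = M.copy()
      for k in range(len(units)):
          reach |= np.outer(reach[:, k], reach[k, :])

      # Decision table: observed end effect -> candidate root faults
      for j, effect in enumerate(units):
          causes = [units[i] for i in range(len(units)) if reach[i, j]]
          print(f"effect at {effect:<10} <- possible faults: {causes}")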

  13. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, Mogens; Borch, Ole; Bagnoli, F.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular, these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As its main result, this technique will provide the design engineer with decision tables for fault handling ...

  14. Review of human factors guidelines and methods

    International Nuclear Information System (INIS)

    Rhodes, W.; Szlapetis, I.; Hay, T.; Weihrer, S.

    1995-04-01

    The review examines the use of human factors guidelines and methods in high technology applications, with emphasis on application to the nuclear industry. An extensive literature review was carried out, identifying over 250 applicable documents, with 30 more documents identified during interviews with experts in human factors. Surveys were sent to 15 experts, of whom 11 responded. The survey results indicated which guidelines were used and why they were favoured. Thirty-three of the most applicable guideline documents were described in detailed annotated bibliographies. A bibliographic list containing over 280 references was prepared. Thirty guideline documents were rated for their completeness, validity, applicability and practicality. The expert survey indicated the use of specific techniques. Ten human factors methods of analysis were described in general summaries, including procedures, applications, and specific techniques. Detailed descriptions of the techniques were prepared and each technique was rated for applicability and practicality. Recommendations for further study of areas of importance to human factors in the nuclear field in Canada are given. (author). 8 tabs., 2 figs

  15. Review of human factors guidelines and methods

    Energy Technology Data Exchange (ETDEWEB)

    Rhodes, W; Szlapetis, I; Hay, T; Weihrer, S [Rhodes and Associates Inc., Toronto, ON (Canada)

    1995-04-01

    The review examines the use of human factors guidelines and methods in high technology applications, with emphasis on application to the nuclear industry. An extensive literature review was carried out, identifying over 250 applicable documents, with 30 more documents identified during interviews with experts in human factors. Surveys were sent to 15 experts, of whom 11 responded. The survey results indicated which guidelines were used and why they were favoured. Thirty-three of the most applicable guideline documents were described in detailed annotated bibliographies. A bibliographic list containing over 280 references was prepared. Thirty guideline documents were rated for their completeness, validity, applicability and practicality. The expert survey indicated the use of specific techniques. Ten human factors methods of analysis were described in general summaries, including procedures, applications, and specific techniques. Detailed descriptions of the techniques were prepared and each technique was rated for applicability and practicality. Recommendations for further study of areas of importance to human factors in the nuclear field in Canada are given. (author). 8 tabs., 2 figs.

  16. Metabolomic analysis using porcine skin: a pilot study of analytical techniques.

    Science.gov (United States)

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-06-15

    Metabolic byproducts serve as indicators of chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples serve as a critical foundation to this field but have not been developed. We sought to determine the optimal cell lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. Using porcine skin samples, we pulverized the skin via various combinations of mechanical techniques for cell lysis. After extraction, the samples were subjected to GC-TOF-MS and/or UHPLC-QTOF-MS. Signal intensities from GC-TOF-MS analysis showed that ultrasonication (2.7×10⁷) was most effective for cell lysis when compared to mortar-and-pestle (2.6×10⁷), ball mill followed by ultrasonication (1.6×10⁷), mortar-and-pestle followed by ultrasonication (1.4×10⁷), and homogenization (trial 1: 8.4×10⁶; trial 2: 1.6×10⁷). Due to the similar signal intensities, ultrasonication and mortar-and-pestle were applied to additional samples and subjected to GC-TOF-MS and UHPLC-QTOF-MS. Ultrasonication yielded greater signal intensities than mortar-and-pestle for 92% of detected metabolites following GC-TOF-MS and for 68% of detected metabolites following UHPLC-QTOF-MS. Overall, ultrasonication is the preferred method for efficient cell lysis of skin tissue for both metabolomic platforms. With standardized sample preparation, metabolomic analysis of skin can serve as a powerful tool in elucidating underlying biological processes in dermatological conditions.
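
    The preparation comparison reduces to counting, per detected metabolite, which method gives the higher signal; a tiny sketch with synthetic intensities (the study's 92%/68% figures came from real spectra, this only shows the computation):

      import numpy as np

      rng = np.random.default_rng(2)
      n = 500   # number of detected metabolites (hypothetical)
      ultrasonication = rng.lognormal(mean=17.0, sigma=1.0, size=n)
      mortar_pestle = ultrasonication * rng.lognormal(mean=-0.3, sigma=0.4, size=n)

      frac = np.mean(ultrasonication > mortar_pestle)
      print(f"ultrasonication higher for {frac:.0%} of metabolites")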

  17. Comparative factor analysis models for an empirical study of EEG data, II: A data-guided resolution of the rotation indeterminacy.

    Science.gov (United States)

    Rogers, L J; Douglas, R R

    1984-02-01

    In this paper (the second in a series), we consider a (generic) pair of datasets, which have been analyzed by the techniques of the previous paper. Thus, their "stable subspaces" have been established by comparative factor analysis. The pair of datasets must satisfy two confirmable conditions. The first is the "Inclusion Condition," which requires that the stable subspace of one of the datasets is nearly identical to a subspace of the other dataset's stable subspace. On the basis of that, we have assumed the pair to have similar generating signals, with stochastically independent generators. The second verifiable condition is that the (presumed same) generating signals have distinct ratios of variances for the two datasets. Under these conditions a small elaboration of some elementary linear algebra reduces the rotation problem to several eigenvalue-eigenvector problems. Finally, we emphasize that an analysis of each dataset by the method of Douglas and Rogers (1983) is an essential prerequisite for the useful application of the techniques in this paper. Nonempirical methods of estimating the number of factors simply will not suffice, as confirmed by simulations reported in the previous paper.
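
    Under the two stated conditions (shared generators with stochastically independent signals, distinct variance ratios between datasets), the rotation can be resolved through a generalized eigenvalue problem on the two covariance matrices; a sketch with synthetic loadings, where the ridge term and dimensions are arbitrary choices for illustration:

      import numpy as np
      from scipy.linalg import eigh

      rng = np.random.default_rng(3)
      F = rng.normal(size=(6, 3))           # shared loadings: 6 channels, 3 generators
      d1 = np.array([9.0, 4.0, 0.25])       # generator variances, dataset 1
      d2 = np.array([1.0, 1.0, 1.0])        # generator variances, dataset 2

      ridge = 1.0e-6 * np.eye(6)            # keeps both matrices positive definite
      C1 = F @ np.diag(d1) @ F.T + ridge
      C2 = F @ np.diag(d2) @ F.T + ridge

      # Generalized eigenproblem C1 w = lambda C2 w: eigenvalues recover d1/d2,
      # and the corresponding eigenvectors resolve the rotation indeterminacy.
      evals, W = eigh(C1, C2)
      print("generalized eigenvalues:", np.round(evals, 2))
      # entries far from 1 (here ~0.25, 4, 9) mark the shared generators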

  18. Multiplex Ligation-Dependent Probe Amplification Technique for Copy Number Analysis on Small Amounts of DNA Material

    DEFF Research Database (Denmark)

    Sørensen, Karina; Andersen, Paal; Larsen, Lars

    2008-01-01

    The multiplex ligation-dependent probe amplification (MLPA) technique is a sensitive technique for relative quantification of up to 50 different nucleic acid sequences in a single reaction, and the technique is routinely used for copy number analysis in various syndromes and diseases. The aim of the study was to exploit the potential of MLPA when the DNA material is limited. The DNA concentration required in standard MLPA analysis is not attainable from dried blood spot samples (DBSS), which are often used in neonatal screening programs. A novel design of MLPA probes has been developed to permit MLPA analysis on small amounts of DNA. Six patients with congenital adrenal hyperplasia (CAH) were included in this study. DNA was extracted from both whole blood and DBSS and subjected to MLPA analysis using normal and modified probes. Results were analyzed using GeneMarker and manual Excel analysis. A total ...
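
    MLPA copy-number calls come from dosage quotients: probe peaks are normalized within a sample to control probes and then ratioed against a reference. A sketch with hypothetical peak heights; the CYP21A2 probe names are chosen only because CAH maps to that gene, and the 0.75/1.25 thresholds are common conventions rather than values from this paper:

      # Hypothetical peak heights: three CYP21A2 probes and three control probes
      probes = ["CYP21A2_ex1", "CYP21A2_ex3", "CYP21A2_ex6"]
      patient = {"targets": [520.0, 250.0, 510.0],
                 "controls": [1000.0, 980.0, 1020.0]}
      reference = {"targets": [500.0, 490.0, 505.0],
                   "controls": [1000.0, 1000.0, 1000.0]}

      def normalized(sample):
          ctrl = sum(sample["controls"]) / len(sample["controls"])
          return [v / ctrl for v in sample["targets"]]

      for name, p_val, r_val in zip(probes, normalized(patient), normalized(reference)):
          dq = p_val / r_val                       # dosage quotient vs. reference
          call = "deletion" if dq < 0.75 else "duplication" if dq > 1.25 else "normal"
          print(f"{name}: dosage quotient {dq:.2f} -> {call}")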

  19. Financial planning and analysis techniques of mining firms: a note on Canadian practice

    Energy Technology Data Exchange (ETDEWEB)

    Blanco, H.; Zanibbi, L.R. (Laurentian University, Sudbury, ON (Canada). School of Commerce and Administration)

    1992-06-01

    This paper reports on the results of a survey of the financial planning and analysis techniques in use in the mining industry in Canada. The study was undertaken to determine the current status of these practices within mining firms in Canada and to investigate the extent to which the techniques are grouped together within individual firms. In addition, tests were performed on the relationship between these groups of techniques and both organizational size and price volatility of end product. The results show that a few techniques are widely utilized in this industry but that the techniques used most frequently are not as sophisticated as reported in previous, more broadly based surveys. The results also show that firms tend to use 'bundles' of techniques and that the relative use of some of these groups of techniques is weakly associated with both organizational size and type of end product. 19 refs., 7 tabs.

  20. Analysis of various NDT techniques to determine their feasibility for detecting thin layers of ferrite on Type 316 stainless steel

    International Nuclear Information System (INIS)

    Dudder, G.B.; Atteridge, D.G.; Davis, T.J.

    1978-09-01

    The applicability of various NDT techniques for detecting thin layers of ferrite on Type 316 stainless steel cladding was studied. The ability to detect sodium-induced ferrite layers on fuel pins would allow an experimental determination of the fuel pin temperature distribution. The research effort was broken down into three basic phases. Phase one consisted of a theoretical determination of the ferrite detection potential of each of the proposed NDT techniques. The second phase consisted of proof-of-principle experiments on the techniques that passed phase one. The third phase consisted of in-hot-cell testing on actual EBR-II fuel pins. Most of the candidate techniques were eliminated in the first phase of analysis. Four potential techniques passed the initial phase, but only three of these passed the second phase. The three techniques that passed the proof-of-principle stage were heat tinting, magnetic force, and electromagnetic techniques. The electromagnetic technique was successfully demonstrated in the third phase on actual fuel pins irradiated in EBR-II, while the other two techniques were not carried to the hot cell analysis phase. Results of this technique screening study indicate that an electromagnetic and/or heat-tinting ferrite layer NDT technique should be readily adoptable to hot cell inspection requirements. It was also concluded that the magnetic force technique, while feasible, would not readily lend itself to hot cell fuel pin inspection